On-Page SEO covers the optimization activities carried out inside your own website. You could say that On-Page Search Engine Optimization is what you tell search engines about your website. To succeed with organic On-Page SEO you need to carry out and maintain a number of specific tasks. On this page we briefly discuss almost every aspect of organic On-Page SEO. They are described below:
Website analysis for On-Page SEO:
First, you have to analyze the subject matter of your website. This analysis includes:
- The subject of your website. For example: technology, medical services, SEO services or web design.
- Will it be a website, a blog or a forum? For example: an educational site, an e-commerce site or a company website.
- Who is your target visitor? For example: students, adults, men or women.
- What is your target region? For example: Asia, Europe or the USA.
- What is your target religion, if any? For example: Islam, Christianity, Buddhism or Hinduism.
- Will your site be for charity or commercial purposes?
- Will it be a general site or a niche site?
Keyword research and competition analysis:
Secondly, you need to research keywords (search queries) for your site, and you must analyze data and statistics from your competitors' websites or blogs based on those keywords. In this part of On-Page SEO, keyword research includes:
- If your site is new, you are advised to choose phrases rather than single words as keywords. Suppose your site's subject is "Search Engine Optimization"; in that case you can use "Search Engine Optimization services" instead of just "Search Engine Optimization".
- How many pages of your competitors' websites are already indexed by search engines, compared with yours?
- The number of backlinks your competitors' websites have.
- For a new site, keywords with medium competition should be targeted.
Website structure analysis:
Your website structure should be user-friendly. In particular, you should pay close attention to your site's navigation. You may use breadcrumbs for better navigation, and a human-readable sitemap page can also help with this.
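Breadcrumbs can be implemented as a simple trail of links near the top of each page; a minimal sketch (the page names and URLs below are only placeholders):

```html
<!-- Breadcrumb trail: Home > SEO services > On-Page SEO -->
<div class="breadcrumb">
  <a href="/">Home</a> &gt;
  <a href="/seo-services/">SEO services</a> &gt;
  On-Page SEO
</div>
```

Each link points to a higher level of the site, so both visitors and search engine bots can see where the current page sits in your structure.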
Every web page contains a large number of HTML and CSS tags, and those tags must follow certain rules. Sometimes those rules are broken unintentionally, but search engines take such errors seriously, so you need to correct your HTML and CSS. Don't worry: several websites and tools (for example the W3C validators) can find those errors for you and suggest fixes. This validation has a significant impact on the process of On-Page SEO.
Meta tags optimization:
Meta tags are placed inside the head tag of your HTML page. Before the Panda algorithm update, these meta tags were weighted very heavily, but nowadays they are much less influential for On-Page SEO. A few common meta tags are:
<head>
  <meta name="keywords" content="seo, on page seo, search engine optimization">
  <meta name="description" content="This page contains discussion on On-Page SEO">
</head>
NB: Meta keywords are not shown on the web page; they exist only for search engines. The meta description, on the other hand, is what a search engine typically displays in its results when someone searches.
Content optimization:
Every page contains text, images, videos and so on; these are called content. Content optimization includes activities such as:
- Every page should have a page title.
- Keywords: your desired keywords should appear in the page title, page description and content. Keyword density should not be more than 2-4%.
- A content description.
- Using h1, h2, h3, h4, h5 & h6 tags: every page should contain only one <h1> tag, used for the page title. You can use several subheadings with <h2> to <h6> tags, where the h1 tag carries the most weight and the h6 tag the least.
- Anchor text optimization: anchor texts are the visible texts used to make a text link. In HTML, the anchor text sits inside the link code: <a href="http://www.google.com">Google</a>. The output is Google as a link; if we click it, we go to the Google home page. In this example, "Google" is the anchor text. Sometimes such links do not work properly because of incorrect coding, so they need to be fixed with proper code.
- Image optimization: this is a very easy task. You just need to use an alt (alternative) attribute (often mistakenly called "alt text") in the image tag, such as: <img src="example-image.png" alt="example text">. Here "example text" is the value of the alt attribute. Search engines cannot understand the content of an image, so the alt attribute is used to describe it.
NB: Page content must be unique. Search engines can identify whether your content is duplicate (copy & paste) or original, and duplicate content can seriously harm your SEO. So write your own content to get better results.
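Putting the content-optimization points above together, a minimal page skeleton might look like the sketch below (the file names, URLs and texts are only placeholders):

```html
<html>
<head>
  <!-- page title containing a target keyword phrase -->
  <title>On-Page SEO services</title>
</head>
<body>
  <h1>On-Page SEO services</h1>       <!-- only one h1 per page -->
  <h2>Content optimization</h2>       <!-- subheadings use h2 to h6 -->
  <p>Descriptive text that uses your target keywords at a natural density.</p>
  <!-- the alt attribute describes the image for search engines -->
  <img src="example-image.png" alt="example text">
  <!-- descriptive anchor text instead of "click here" -->
  <a href="http://www.example.com/on-page-seo">On-Page SEO guide</a>
</body>
</html>
```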
Sitemap creation and implementation:
We need a Sitemap file, especially for use by search engine bots. This file lists the URLs of all the pages on your website that you want search engines to index. An example Sitemap is shown below:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2013-05-03</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
  <url>
    <loc>http://www.example.com/about-us</loc>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
An XML Sitemap file must be placed in your site's root directory, that is, http://example.com/sitemap.xml.
Robots.txt file creation and implementation:
The robots.txt file is used to give guidelines to search engines about which bots are allowed to crawl your pages. It also states which links or directories they are permitted to crawl and index, and which are not.
Example #01
User-agent: *
Disallow:

Example #02
User-agent: *
Disallow:
Sitemap: http://example.com/sitemap.xml
Just put these directives in a text file and save it as robots.txt.
In example #01, User-agent: * means the rules apply to all search engines, and the empty Disallow: means they are all allowed to crawl and index everything.
In example #02, all search engines are allowed to crawl and index, and they are additionally pointed to the URLs listed in the sitemap.xml file.
This file must be placed in your site's root directory, that is, http://example.com/robots.txt.
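You can also use robots.txt to keep bots out of specific directories; a hypothetical example (the /private/ and /tmp/ paths are only illustrations):

```
User-agent: *
Disallow: /private/
Disallow: /tmp/
```

Everything else on the site remains crawlable; only the listed directories are excluded from crawling.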
.htaccess file creation and implementation:
- The .htaccess file is used to redirect page URLs. You can automatically redirect a visitor from one URL to another using it.
- Sometimes a web page cannot be shown because of a broken link, or because the page has been deleted; in that case the user sees a 404 error notice. This is also bad for your site, because visitors may lose interest in it. Using the .htaccess file you can redirect those pages to a URL of your choice, and you can even create a custom 404 error page for your site.
- You can also deny or allow certain IP addresses using it.
This file must be placed in your site's root directory, that is, http://example.com/.htaccess.
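The points above can be sketched in a minimal .htaccess file, assuming an Apache web server (all paths, URLs and the IP address below are only placeholders, and the exact access-control syntax depends on your Apache version):

```
# Permanent (301) redirect from an old URL to a new one
Redirect 301 /old-page.html http://example.com/new-page.html

# Serve a custom page when a 404 (not found) error occurs
ErrorDocument 404 /404.html

# Deny a specific IP address (Apache 2.2-style syntax)
Order Allow,Deny
Allow from all
Deny from 192.0.2.10
```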
RSS feeds creation and implementation:
An RSS feed stores the recent activities and posts of your website, and it provides subscription facilities for your site's visitors. They just need to enter their email address on a subscription registration page, and information about all your latest posts will be sent to their email address automatically. You can also email those subscribers through the feed free of charge; Google's FeedBurner, for example, offers this as a free service.
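A feed is just an XML file describing your latest posts; a minimal RSS 2.0 sketch (all titles, URLs and dates are placeholders):

```
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>Example Site</title>
    <link>http://example.com/</link>
    <description>Recent posts from Example Site</description>
    <item>
      <title>An example post</title>
      <link>http://example.com/an-example-post</link>
      <pubDate>Fri, 03 May 2013 00:00:00 GMT</pubDate>
    </item>
  </channel>
</rss>
```

Feed readers and services such as FeedBurner poll this file to detect new items, so each new post only needs a new <item> entry.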
Google tools setup:
If your target is to optimize your website for the Google search engine, then you should use the following tools:
- Google AdWords tools: a great tool for keyword research and analysis, and for advertising support for your site.
- Google Webmaster Tools: it will help you get your web pages indexed, submit your sitemap.xml file, monitor links and keyword positions, remove or block URLs, and much more.
- Google Analytics: analytical information such as visitor numbers, landing pages, keyword searches, referrers, geographical locations and much more is available here.
The good news is that all three tools are free to use; you just need a Google (Gmail) account.