6 Crucial ON-Page SEO Tags to Rank High on Google in 2015


So far, 2015 has put forth posts that mainly emphasize effective content marketing or social media engagement as the way to rank better in Google SERPs. Following last year's updates, viz. Panda, Penguin, and Pigeon, the focus has shifted to off-page strategies, while not much is said about ON-Page issues that can badly affect SEO. Noticing this, I have come up with a list of ON-Page SEO factors that will help you make ON-Page changes and improve your SEO performance.

Listed below are 6 fundamental ON-Page tags that will help you rank high in Google:

1. Canonical tag:

Websites with more than one URL serving similar content come under Google's scanner owing to duplication. This adversely affects a website's presence on search engines and is commonly termed a bad SEO practice. The canonical tag is widely used on ecommerce websites, especially on product pages where the URL changes for color, size, and similar product parameters. This is where the canonical tag can come to your rescue.

The right use of this tag, however, is important; websites that make common mistakes with the canonical tag eventually fail to get its actual benefits.

Using the canonical tag is a must to keep control over the state of your search results. With it, you can choose which version of a page gets indexed and which does not. By doing so, you not only save your site from the consequences of URL duplication but also consolidate link popularity on the chosen page.

Make the most of the canonical tag:

By adding the canonical tag appropriately, you avoid the losses that occur when a website goes against Google's guidelines. At the same time, all your pages remain live for users.

[Image: How to use the canonical tag]

2. 301 Redirection:

People get confused between canonical tags and 301 redirects, but the two are entirely different. A canonical tag is used to keep pages that could harm search engine rankings due to duplicate content away from spiders. By adding it, you can de-index such pages without removing them, and your users can still view them.

A 301 redirect, on the other hand, leaves just one page live and crawlable. If a page on your website no longer serves users' interests but you don't want to remove it outright, redirect it to another relevant page. That way, one page stays live for the crawler as well as for users.

Relevance of 301 Redirection:

Using a 301 redirect is beneficial when transferring an old domain to a new web address. Instead of showing a 404 page, direct users and search engines to a suitable page (the home page or one with similar offerings). This will not only reduce the bounce rate but also pass link juice, thereby giving ranking power to the target page.

Suggestion: 

Webmasters use many types of redirects, but using too many slows down a website. Choosing a redirect that solves a problem without inviting new ones is the most appropriate way for webmasters to handle their business site. For instance, if a page is indexed and has good links from other sites but no longer offers a product/service, it must be 301 redirected to a relevant page to preserve your search engine benefits.

In short, 301 redirection needs to be part of your ON-Page plan in 2015.
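
If your site runs on Apache, a 301 redirect takes one line in the .htaccess file. A minimal sketch, assuming hypothetical old and new URLs:

# Permanently redirect a discontinued page to a relevant live page
Redirect 301 /old-product https://www.example.com/new-product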

3. 404 Errors:

404 or 'Not Found' errors are prevalent on big ecommerce sites, as online stores are updated quite frequently owing to the availability/unavailability of products. Recurrent additions and deletions lead to broken links, and this can be risky for your web presence.

Google says that 404 errors will not impact your search engine rankings directly but can eventually hurt your web presence. Too many 404 errors or broken links lead to a poor user experience, and Google gives top priority to a website's UX when ranking it in search results. If your website is lacking on this particular ground, you are likely to face constraints in SERPs too. Hence, taking care of 404 errors is a crucial point of our ON-Page SEO guide.

Fixing 404 Errors:

Check Google Webmaster Tools for website pages that show a 'Not Found' status. There could be various reasons, viz. a deleted page, a renamed page, or a misspelled link. Monitor Google Webmaster Tools, redirect the pages that were renamed, and fix the others.

[Image: 404 errors in Google Webmaster Tools]

Google Webmaster Tools is not only for checking impressions and clicks. Also use it to check for server errors, 404 errors, and other crawl issues from time to time.
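
Once the broken URLs are identified, the usual fixes are a 301 redirect for renamed pages and a friendly error page for the rest. A sketch for an Apache .htaccess file, with hypothetical paths:

# Redirect a renamed page reported as 'Not Found' in Webmaster Tools
Redirect 301 /summer-sale https://www.example.com/seasonal-offers

# Serve a custom page for any remaining 404s instead of the default error
ErrorDocument 404 /404.html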

4. Robots.txt file:

The robots.txt file is used to stop crawlers from accessing pages that you don't wish to be crawled. Blocking URLs via the robots.txt file is a common action taken by programmers during the testing stage. So, checking the robots.txt file once the site is live is essential to ensure complete crawling of the site.

Risk factor:

Googlebot first accesses the robots.txt file to check whether the site allows or blocks it from crawling any pages or URLs. None of your SEO techniques can help boost a page's search engine rankings until that page is made available to the crawler. Hence, make sure your robots.txt file is accessible and does not contain improper directives or a wrongly listed URL.

Where to use:

Not all pages on a website are meant for users and, therefore, should not be indexed in search engines; for instance, the admin page, portals, checkout pages, etc. Creating a robots.txt file is a handy way to keep such pages on the backend only, by providing instructions on which pages should be disallowed from being displayed in SERPs.

[Image: Pages blocked by the robots.txt file]
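
A minimal robots.txt sketch along those lines; the folder names are hypothetical and should be replaced with your own site's paths:

# Rules below apply to all crawlers
User-agent: *
# Keep backend and transactional pages out of the crawl
Disallow: /admin/
Disallow: /checkout/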

5. Robots Meta tags:

Robots Meta tags are crucial for website indexing. The Robots Meta tag must be placed in the head section of an HTML page. The 'index', 'follow', 'noindex' & 'nofollow' values tell search engine spiders which pages to index and follow and which not to. The most recommended combination from a search engine point of view is 'index, follow', but you don't have to add it to your pages as it's the default option.

Let's discuss all four cases one by one (a markup sketch follows the list):

Index and follow - Use this option if you want spiders to index the page and also crawl the links on this page.

No index and follow - Use this option if you don’t want spiders to index the page but crawl all the links on this page.

Index and no follow - Use this option if you want spiders to index this page but ignore the links on it.

No index and no follow - Use this option if you don't want spiders to index the page or crawl its links.
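
In markup, each combination is a single meta tag in the page's head section. For example, to keep a page out of the index while still letting spiders follow its links:

<!-- Placed in the <head>; blocks indexing but allows link crawling -->
<meta name="robots" content="noindex, follow">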

Robots Meta tags are crucial for a website's complete indexing. This is the reason you need to take care of them while thinking about on-page SEO in 2015.

Possible loss from mishandling Robots Meta tags:

Choosing 'noindex' unknowingly or by mistake can block your pages from search indexing and throw your website out of Google SERPs.

[Image: Robots Meta tag]

This is a common cause of low presence in search results despite the application of best practices.

Where to use the Robots.txt file vs. the Robots Meta tag

The robots.txt file can be used to block the whole website, a particular folder of website pages, or even the bots of particular search engines. Robots Meta tags, on the other hand, work page by page.

For example, if I want to noindex only one page of my website, I will go for the Robots Meta tag. If I want to block my whole website or a specific folder, then the robots.txt file is the best option for me.

6. Hreflang Tag:

Websites that serve users from various regions, or that translate similar content for different users, have to pay special attention to this tag. It allows you to show the relationship between the various versions of a page and helps search engines give accurate results for a search query.

So, if your Mexico-based client searches for your business online, Google will automatically show the translated version in search results and also save you from the consequences of content duplication (provided the main content abides by the keyword, link, and other quality guidelines).


This adds value to users' searches by showing results in the appropriate language based on geographic association, and it improves the user experience too.

Best for serving a local audience:

The hreflang tag helps businesses with a wide audience spread across different regions serve their local audiences better. It adds a signal that helps spiders find the relevant page instantly. This way, you don't need multiple websites to target people from different geographical regions.
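
Hreflang annotations go in the head section of every language/region variant, and each variant should list all the alternatives, including itself. A sketch assuming a hypothetical English page with a Spanish version targeted at Mexico:

<!-- Placed in the <head> of both the English and the Spanish page -->
<link rel="alternate" hreflang="en" href="https://www.example.com/en/page/" />
<link rel="alternate" hreflang="es-mx" href="https://www.example.com/es-mx/page/" />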

Suggestion:

Avoid incorrect language and country codes, as they invalidate your hreflang annotations.

SEO planning can turn fruitless if any of the above-mentioned on-page points are mishandled or ignored. Many businesses lag behind in search engine rankings due to such trivial issues going unnoticed.

Use this ON-Page SEO guide to get the attention your business deserves in search engines. Take my word for it: Google as well as your customers will love you for it.




Comments (4)
Matija

Hi, nice post. Just noticed your graph above about errors in GWT. I would really appreciate it if you could help us explain how to solve those errors. I am having far more of them than you :) Regards, Matija

FATbit Chef

Thanks Matija.
There must be some reason behind so many errors in GWT. Usually such errors are seen in ecommerce sites that sell products, and we can use 301 redirects to fix them. But since you say you have many of them, it's better to do a website analysis and spot all the issues before they affect your search engine presence.

Tell us if you need any help in this regard.

Cheers