Fix Crawl Errors In Google Webmaster Tool

Are you buying paid traffic through an advertising program? There is no need to buy traffic; you can rank high through Search Engine Optimization. Google Webmaster Tools (GWT) is the best SEO tool for optimizing a blog or website, and it will help you rank better in Google search. The tool always keeps you informed about what is going on with your site: it regularly reports the site's crawl errors and what causes them. If crawl errors occur on your site, you should fix them, and by fixing them you can rank higher on Google.



How To Fix Crawl Errors In Google Webmaster Tool.






Types of Site Crawl Errors in Google Webmaster Tool:

There are three types of site crawl errors in GWT. We will touch briefly on each type.

  • DNS Error
  • Server Connectivity Error
  • Robots.txt fetch Errors

DNS Errors:

Google's bots report these errors when they have trouble resolving your site's domain while crawling. They are like the normal DNS errors that appear in a web browser when it can't open a web address.
These errors may occur when you change your hosting provider or your name servers.
If you see DNS errors in GWT, contact your hosting provider to fix them, or move your site to a better hosting provider.
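
Before contacting your host, you can quickly confirm whether the domain resolves at all. Below is a minimal Python sketch; example.com is a placeholder for your own domain.

    import socket

    def check_dns(domain):
        """Try to resolve a domain name and report the result."""
        try:
            ip = socket.gethostbyname(domain)
            print(f"{domain} resolves to {ip} - DNS looks fine")
        except socket.gaierror as err:
            print(f"{domain} failed to resolve: {err} - likely a DNS problem")

    check_dns("example.com")  # replace with your own domain

If the domain fails to resolve from your own machine as well, the problem is almost certainly in your DNS or name server settings rather than with Google's crawler.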

Server Connectivity Errors:

These errors occur when a web page takes too long to load and times out before it finishes, or when a server configuration problem stops Google's bots from crawling your site.
To fix server connectivity errors, you should control the crawl rate of the bots. In GWT you can use the URL Parameters tool and your robots.txt file to control how your site is crawled. You can also check for slow or timed-out responses yourself, as in the sketch below.
If you are on WordPress, you should use caching plugins like Better WordPress Minify or W3 Total Cache.
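
Here is a minimal sketch, using the Python requests library, that times a page load and flags timeouts; the 10-second limit and the URL are assumptions you should adjust for your own site.

    import requests

    def check_response_time(url, timeout=10):
        """Fetch a URL and report how long the server took to respond."""
        try:
            response = requests.get(url, timeout=timeout)
            elapsed = response.elapsed.total_seconds()
            print(f"{url} answered {response.status_code} in {elapsed:.2f}s")
        except requests.exceptions.Timeout:
            print(f"{url} timed out after {timeout}s - a crawler may give up too")
        except requests.exceptions.RequestException as err:
            print(f"{url} failed: {err}")

    check_response_time("https://example.com/")  # replace with a page from your site

If pages regularly take several seconds or time out, caching (for example with the plugins mentioned above) is usually the first thing to try.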

Robots.txt Fetch Errors:

The robots.txt file is used to stop search engines from indexing certain parts of your site, and this error can occur when your robots.txt file is not configured properly. It can also happen after you hide a page from the search engine: it takes a few days for that page to be removed from the search results, and if someone clicks that page's link in the meantime, the error shows up.
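
To confirm that your robots.txt file is reachable and parses the way a crawler would read it, you can test it with Python's built-in robotparser. A minimal sketch, with example.com standing in for your domain and an assumed post URL:

    from urllib.robotparser import RobotFileParser

    # Fetch and parse robots.txt the way a crawler would
    parser = RobotFileParser("https://example.com/robots.txt")  # your domain here
    parser.read()

    # Check whether Googlebot is allowed to fetch a given page
    url = "https://example.com/some-post/"  # a real URL on your site
    if parser.can_fetch("Googlebot", url):
        print(f"Googlebot may crawl {url}")
    else:
        print(f"robots.txt blocks Googlebot from {url}")

If the file cannot be fetched at all, GWT reports a fetch error and Google may postpone crawling the site until it can read the file again.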

One more type of error reported in GWT is URL errors.

URL Errors:

This section of GWT shows up to 1,000 URL errors. A URL error occurs when a page or post has been deleted from your site but its URL is still listed in the search engine. It has a negative impact on your users as well as on the Google crawler: when someone clicks the deleted post's URL in the search results, your site shows a 404 error page and the visitor immediately goes back to the search engine. This pogo-sticking and the higher bounce rate have a big negative impact on your blog's SEO.
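
If you want to see which of your old URLs now return 404 before going through the GWT report, a quick scan like the following Python sketch works; the URL list is a placeholder for the URLs GWT reports.

    import requests

    # Placeholder list - paste in the URLs reported by GWT
    urls = [
        "https://example.com/old-post/",
        "https://example.com/deleted-page/",
    ]

    for url in urls:
        try:
            # Some servers reject HEAD requests; switch to requests.get if needed
            status = requests.head(url, allow_redirects=True, timeout=10).status_code
        except requests.exceptions.RequestException:
            status = None
        if status == 404:
            print(f"404 - needs a redirect or removal: {url}")
        else:
            print(f"{status} - {url}")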


So now let's look at how to avoid URL errors.

  • By updating your sitemap: When you submit a fresh sitemap to GWT, Google will crawl the URLs that are currently present on your site.
  • By redirecting to a new URL: If you have deleted a post from your site, you should set up a redirect to a new post or page URL (a quick way to verify the redirect is shown below).
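
Once a redirect is in place (for example through your hosting panel or a WordPress redirect plugin), you can check that the old URL returns a permanent (301) redirect to the right page. A minimal Python sketch with assumed placeholder URLs:

    import requests

    old_url = "https://example.com/deleted-post/"  # URL that was removed
    new_url = "https://example.com/new-post/"      # URL it should redirect to

    # Request the old URL without following the redirect so we can inspect it
    response = requests.get(old_url, allow_redirects=False, timeout=10)
    location = response.headers.get("Location")

    if response.status_code == 301 and location == new_url:
        print("Permanent (301) redirect is set up correctly")
    else:
        print(f"Unexpected response: {response.status_code} -> {location}")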

After completing the above steps, you now need to fix the URL errors in GWT.

To fix these errors, go to the GWT Search Console and open Crawl >> Crawl Errors.
You can see the top 1,000 URL errors on that page. Tick the first checkbox to select all the URLs at once, then click MARK AS FIXED.

Now your work is done.

If you are consistently getting errors in GWT, or emails from GWT about problems crawling your site, you should start fixing those errors.

I hope you found this article helpful.
If you want more articles like this, please don't forget to share it using the sharing buttons below.

2 comments:

  1. Sometimes Google Webmaster also shows pages that are linked from other domains. We should redirect them via 301 to the main page.

     Reply: Yes, you can redirect those URLs too.