
What is the method for removing dead links and crawl errors from a site?


What are Crawl Errors?
Crawl errors occur when a search engine tries to reach a web page but fails to do so. There are two types of crawl errors:
Site errors (which mean your entire site can’t be crawled)
URL errors (which relate to one specific URL per error)

Site Errors in Google Search Console:
1) DNS Errors: A DNS (Domain Name System) error means that a search engine can’t communicate with your server, either because the server is down or because there is an issue with the DNS routing to your domain. This is usually a temporary issue.

2) Server Errors: When you see this type of error for your URLs, it means that Googlebot wasn’t able to access your URL, the request timed out, or your site was busy. It can also mean that your website had so many visitors that the server couldn’t handle all the requests. Most of these errors are reported as 5xx status codes.

3) Robots Failure: Before Googlebot crawls your website, it tries to fetch your robots.txt file to see if there are any pages you’d rather not have indexed. If the bot can’t reach the robots.txt file (i.e., the file doesn’t return a 200 or 404 HTTP status code), Google will postpone the crawl rather than risk crawling URLs that you don’t want crawled. That’s why you should always make sure the robots.txt file is available.
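The 200/404 rule above can be sketched in a few lines of Python. The function name and the fetch helper are illustrative, not part of any Google API; they simply model the behaviour described: 200 (rules served) or 404 (no robots.txt, so no restrictions) lets the crawl proceed, while anything else postpones it.

```python
from urllib.error import HTTPError
from urllib.request import Request, urlopen

def robots_crawl_decision(status_code):
    """Model of the rule above: 200 or 404 means Googlebot proceeds;
    any other status (e.g. a 5xx) makes it postpone the crawl."""
    if status_code in (200, 404):
        return "crawl"
    return "postpone"

def robots_status(domain, timeout=10):
    """Fetch https://<domain>/robots.txt and return its HTTP status
    (needs network access; shown here only to complete the sketch)."""
    try:
        url = "https://%s/robots.txt" % domain
        with urlopen(Request(url, method="HEAD"), timeout=timeout) as resp:
            return resp.status
    except HTTPError as err:
        return err.code
```

Running `robots_crawl_decision(robots_status("example.com"))` against your own domain is a quick way to confirm the file is reachable.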

URL Errors in Google Search Console:
1) Common URL Errors: Common URL errors occur when a search engine fails to crawl a specific web page. They can include an occasional DNS error or server error for that particular URL.

2) Mobile-Only URL Errors: This type of error refers to crawl failures that occur on smart devices. If your website is responsive, mobile-only URL errors are unlikely to appear. The most common mobile-only URL errors are faulty redirects. Some websites use different URLs for desktop and smartphone users; a faulty redirect occurs when a desktop page incorrectly redirects mobile users to a page not relevant to their query.

Method for removing dead links and crawl errors from a site

Use Google Search Console (formerly Google Webmaster Tools) to remove dead links and crawl errors.

As in the above image, click on Crawl Errors to see the list of all errors. In my case the report is blank because I have already eliminated every error.

Method 1: For broken links, you can redirect them to an existing page until you re-create those pages. If you don’t redirect, all backlinks pointing to those pages are lost. A redirect passes the link juice on to the target page, preserving your page rank to some extent.
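In practice this kind of redirect usually lives in your server configuration (for example a 301 rule in Apache or nginx). As a language-neutral sketch of the same idea, here is a minimal handler built on Python's standard http.server; the paths in REDIRECT_MAP are hypothetical examples, not real URLs from any site.

```python
from http.server import BaseHTTPRequestHandler

# Hypothetical mapping of dead URLs to existing replacement pages.
REDIRECT_MAP = {
    "/old-flat-listing": "/flats-for-sale",
}

class RedirectHandler(BaseHTTPRequestHandler):
    """Answer requests for dead URLs with a 301 (permanent redirect)
    so backlinks keep passing link juice to the replacement page."""
    def do_GET(self):
        target = REDIRECT_MAP.get(self.path)
        if target:
            self.send_response(301)            # permanent redirect
            self.send_header("Location", target)
        else:
            self.send_response(404)            # genuinely gone
        self.end_headers()
```

A 301 (rather than a 302) is the status to use here, since it tells search engines the move is permanent and the old URL's signals should transfer to the new one.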

Method 2: If some broken links still appear in the Google search index and you don’t want to redirect them but do want them removed from the search engine, use the option under Google Index -> Remove URLs.


First, copy all the dead links that appear (as in image no. 1) into a notepad file.
You can also use a free broken-link checker tool to find dead links:
Free Broken Link Checker 
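If you prefer a script to a web tool, a minimal broken-link checker needs only the Python standard library: extract every href from a page, then request each URL and record any error status. This is a sketch, not a full crawler — the `link_status` part needs network access, and helper names are my own.

```python
from html.parser import HTMLParser
from urllib.error import HTTPError, URLError
from urllib.request import Request, urlopen

class LinkExtractor(HTMLParser):
    """Collect the href of every <a> tag found in a page's HTML."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def link_status(url, timeout=10):
    """Return the HTTP status for url (e.g. 404 for a dead link),
    or None if the host could not be reached at all."""
    try:
        with urlopen(Request(url, method="HEAD"), timeout=timeout) as resp:
            return resp.status
    except HTTPError as err:
        return err.code
    except URLError:
        return None
```

Feed your page's HTML to `LinkExtractor`, then call `link_status` on each collected link; anything returning 404 (or None) belongs in your dead-link list.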

Then go to Google Index -> Remove URLs. Click on Temporarily Hide, enter the URLs one by one, click Continue, and choose a Request Type.
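Since the Remove URLs tool takes one URL at a time, it helps to tidy the notepad list first: deduplicate it and drop anything that is not a well-formed http(s) URL. A small sketch (the function name is my own):

```python
from urllib.parse import urlparse

def clean_url_list(lines):
    """Deduplicate the collected dead links (preserving order) and
    keep only well-formed http(s) URLs, ready to submit one by one."""
    seen = set()
    cleaned = []
    for line in lines:
        url = line.strip()
        if not url or url in seen:
            continue
        parts = urlparse(url)
        if parts.scheme in ("http", "https") and parts.netloc:
            seen.add(url)
            cleaned.append(url)
    return cleaned
```

Run the notepad file's lines through this before submitting, so you don't waste removal requests on duplicates or malformed entries.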




This post first appeared on Flats For Sale In Mira Road, please read the original post: here
