
How to stop search engines from crawling a file or folder using Robots.txt

Robots.txt is a file that tells search engine crawlers, spiders, and robots which parts of a site should be indexed and which parts should not. Basically...
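As a sketch of the idea, a robots.txt file placed at the root of the site might look like the following. The file path `/private-folder/` and file name `secret.html` are made-up examples, not paths from the original post:

```
# Example robots.txt — applies to all crawlers
User-agent: *

# Block an entire folder from being crawled
Disallow: /private-folder/

# Block a single file
Disallow: /secret.html
```

The `User-agent: *` line means the rules apply to every crawler, while each `Disallow` line names a path (relative to the site root) that compliant crawlers should skip. Note that robots.txt is advisory: well-behaved bots honor it, but it is not an access control mechanism.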




This post first appeared on Techgist, please read the original post: here
