
How to properly use the Crawl-delay directive in robots.txt

asked Sep 13, 2013 in google by rajesh
edited Sep 12, 2013


1 Answer

The Crawl-delay directive is used by very large, frequently updated websites (Twitter, for example) to mitigate web server load by indirectly throttling the number of pages a bot crawls. Because bandwidth, the number of pages, the content of the site, and many other factors affect how a site is crawled, the Crawl-delay directive can help reduce server overload. However, not every search engine supports Crawl-delay (Googlebot, notably, ignores it), so it is advisable to apply it to a specific crawler that honors it, for example:

User-Agent: bingbot
Crawl-delay: 5

But never apply it under the wildcard user-agent:

User-agent: *
Crawl-delay: 5

Keep in mind that the crawl-delay value is specified in seconds.
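If you are writing a crawler yourself, you can honor the directive with Python's standard-library robots.txt parser. This is a minimal sketch: the robots.txt lines mirror the example above, and `polite_fetch` and the example URLs are hypothetical names for illustration.

```python
import time
from urllib.robotparser import RobotFileParser

# The same rules as in the answer's example robots.txt.
robots_lines = [
    "User-agent: bingbot",
    "Crawl-delay: 5",
    "Disallow: /private/",
]

rp = RobotFileParser()
rp.parse(robots_lines)

# crawl_delay() returns the delay in seconds for a matching
# user-agent group, or None if no rule applies.
delay = rp.crawl_delay("bingbot")        # 5 seconds
other = rp.crawl_delay("SomeOtherBot")   # None: no matching group

def polite_fetch(urls, agent="bingbot"):
    """Yield only URLs the agent may fetch, pausing between requests.
    (Hypothetical helper to show how the delay would be used.)"""
    wait = rp.crawl_delay(agent) or 0
    for url in urls:
        if rp.can_fetch(agent, url):
            yield url
        time.sleep(wait)  # throttle to respect Crawl-delay
```

Note that `crawl_delay()` only reports what robots.txt declares; enforcing the pause between requests is still the crawler's responsibility, as `polite_fetch` sketches.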
answered Sep 13, 2013 by rajesh
edited Sep 12, 2013