The Crawl-delay directive is used by very large, frequently updated websites such as Twitter to ease web-server load by throttling the rate at which a bot requests pages. Because a site's bandwidth, number of pages, content, and many other factors affect how it is crawled, the Crawl-delay directive can help reduce server-overload issues. However, several major search engines do not support Crawl-delay (Google, for example, ignores it), so it is advisable to write the robots.txt file as:
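A reasonable robots.txt might look like the following; the 10-second delay is an illustrative value, not a recommendation from the original text:

```
User-agent: *
Crawl-delay: 10
```

This asks compliant crawlers to wait about 10 seconds between successive requests, keeping the extra load on the server modest while still allowing the whole site to be crawled over time.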
But never use it like this:
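A misconfigured directive might look like the following (hypothetical value, chosen to illustrate the mistake of treating the delay as milliseconds):

```
User-agent: *
Crawl-delay: 3600
```

Since the value is interpreted as seconds, this tells a crawler to wait a full hour between requests, which would effectively stop the site from being indexed in any reasonable time.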
Keep in mind that the crawl-delay value is expressed in seconds, not milliseconds.
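To see how a crawler reads this directive, Python's standard-library `urllib.robotparser` can parse a robots.txt and report the crawl delay. The robots.txt content below is a made-up example, not taken from any real site:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content, for illustration only.
robots_txt = """\
User-agent: *
Crawl-delay: 10
Disallow: /private/
"""

parser = RobotFileParser()
parser.modified()  # mark the rules as freshly loaded so queries are answered
parser.parse(robots_txt.splitlines())

# Delay, in seconds, that a polite crawler should wait between requests.
print(parser.crawl_delay("*"))

# The parser also answers whether a given path may be fetched.
print(parser.can_fetch("*", "/private/page.html"))
```

A well-behaved crawler would call `crawl_delay()` once and then sleep that many seconds between page fetches.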