RE: Need a Better Explanation of robots.txt

Hello,

The robots.txt file is a standard protocol (the Robots Exclusion Protocol) that websites use to communicate with web crawlers and other bots. It tells crawl bots which pages they are allowed to crawl and which pages they should skip, using Allow and Disallow rules. In effect, it instructs search engines to leave out certain pages (private sections, admin areas, etc.) while crawling. Keep in mind that robots.txt is a voluntary convention: well-behaved crawlers honor it, but it is not a security mechanism and does not actually block access to those pages.
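For example, a minimal robots.txt, served from the site root (e.g. https://example.com/robots.txt), might look like the sketch below. The domain and paths shown are just placeholders:

    User-agent: *          # rules below apply to all crawlers
    Disallow: /admin/      # do not crawl the admin area
    Disallow: /private/    # do not crawl this private section
    Allow: /               # everything else may be crawled

    Sitemap: https://example.com/sitemap.xml

You can also target a specific crawler by name (e.g. User-agent: Googlebot) and give it its own set of rules.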

That is essentially what robots.txt is for. If you need any further clarification, please let me know!
