Prevent Search Engines and Bots From Crawling or Indexing Your Site
How does a robots.txt file work?
A robots.txt file is a text file that tells web robots (also known as spiders or crawlers) which pages on your website to crawl and which to ignore.
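As an illustration, a minimal robots.txt placed at the root of a site (e.g. https://example.com/robots.txt) might look like this; the paths shown are hypothetical:

```txt
# Applies to all crawlers
User-agent: *
# Block everything under /private/
Disallow: /private/
# Allow the rest of the site
Allow: /
```

To block all well-behaved crawlers from the entire site, you would instead use `Disallow: /` with no `Allow` line.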
Before crawling a website, a well-behaved robot first requests the robots.txt file from the site's root and follows the directives it finds there. Note that the file is advisory: reputable crawlers such as search-engine bots honor it, but malicious bots are free to ignore it.
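You can check how a given set of robots.txt rules applies to a URL using Python's standard-library parser. This is a small sketch; the domain and rules are made up for illustration:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content (normally fetched from the site root)
rules = """User-agent: *
Disallow: /private/
Allow: /""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# A polite crawler asks before fetching each URL
print(rp.can_fetch("*", "https://example.com/private/page.html"))  # False
print(rp.can_fetch("*", "https://example.com/about.html"))         # True
```

In a real crawler you would call `rp.set_url("https://example.com/robots.txt")` followed by `rp.read()` to fetch the live file instead of parsing a string.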