Robots.txt

A robots.txt file is a plain-text file placed at the root of a web server (e.g., `https://example.com/robots.txt`) that gives instructions to search engine crawlers, also known as robots or bots, about which URLs they may crawl. Strictly speaking, these directives control crawling rather than indexing, and compliance is voluntary: well-behaved crawlers honor them, but they are not enforced by the server. Still, robots.txt is commonly used to keep low-value pages, such as form submission confirmation ("thank you") pages, from being crawled and surfacing in search results.
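A minimal robots.txt illustrating the "thank you" page case might look like this (the domain and paths are placeholders, not values from any real site):

```
# Rules for all crawlers
User-agent: *
# Block the form confirmation page from being crawled
Disallow: /thank-you/

# Optionally point crawlers at the sitemap
Sitemap: https://www.example.com/sitemap.xml
```

Each `User-agent` group applies to the named crawler (`*` matches all), and `Disallow` lists URL path prefixes that the crawler should not fetch. Note that a page blocked here can still appear in results if other sites link to it; a `noindex` directive on the page itself is the reliable way to keep it out of the index.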