robots.txt is a plain text file placed in a website's top-level (root) directory that tells web robots, such as search engine crawlers, which pages on the site they may or may not crawl. Site owners use it primarily to manage crawler traffic to the website.
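For illustration, a minimal robots.txt might look like the sketch below. The directory path `/private/` and the crawler name are hypothetical examples, not requirements of the format:

```
# Applies to all crawlers
User-agent: *
Disallow: /private/

# Applies only to Google's crawler
User-agent: Googlebot
Allow: /

# Optional: point crawlers to the sitemap
Sitemap: https://www.example.com/sitemap.xml
```

The file must be reachable at the site root (e.g. `https://www.example.com/robots.txt`); crawlers that honor the protocol fetch it before crawling other pages. Note that robots.txt is advisory, not an access-control mechanism: well-behaved crawlers follow it, but it does not prevent access to the listed paths.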