Robots.txt is a plain text file placed in your website's root directory that tells search engine crawlers which pages they may or may not access. It helps prevent crawlers from overwhelming your web server with requests and asks them not to crawl pages that aren't meant for public view (though compliant crawlers honor it voluntarily; it is not an access control mechanism).
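
As an illustration, a minimal robots.txt might look like the following; the directory paths and sitemap URL here are hypothetical examples, not required values:

```
# Rules apply to all crawlers ("*" matches any user agent)
User-agent: *
# Ask crawlers to skip these directories (example paths)
Disallow: /admin/
Disallow: /tmp/
# Everything else remains crawlable
Allow: /

# Optional: point crawlers at the sitemap (example URL)
Sitemap: https://example.com/sitemap.xml
```

The file must live at the root of the host (e.g. `https://example.com/robots.txt`); crawlers do not look for it in subdirectories.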