Robots.txt is a plain-text file placed in a website's root directory that tells search engine crawlers which URLs on the site they may crawl.
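For illustration, a minimal robots.txt might look like this (the paths and crawler name are placeholders):

```
# Rules for Google's crawler
User-agent: Googlebot
Disallow: /admin/

# Rules for all other crawlers
User-agent: *
Disallow: /private/
Allow: /

Sitemap: https://example.com/sitemap.xml
```

Each `User-agent` group applies to the named crawler, and `Disallow`/`Allow` lines list URL path prefixes that the crawler should skip or may visit.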
This tool lets you easily view the current contents of a domain's robots.txt file as well as its history. We check the robots.txt file of every monitored domain approximately twice a week.
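If you want to check programmatically how robots.txt rules apply to a given URL, a minimal sketch using Python's standard `urllib.robotparser` module (the rules and URLs below are illustrative placeholders) could look like this:

```python
from urllib import robotparser

rp = robotparser.RobotFileParser()
# Parse rules directly instead of fetching them over the network.
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# can_fetch(user_agent, url) reports whether the rules allow crawling the URL.
print(rp.can_fetch("*", "https://example.com/private/page"))  # False
print(rp.can_fetch("*", "https://example.com/public/page"))   # True
```

In practice you would call `rp.set_url("https://example.com/robots.txt")` followed by `rp.read()` to fetch the live file instead of supplying the lines yourself.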