
How to fix a robots.txt file that is blocked by Disallow: /*.txt

Never block .txt files in the site's root folder: doing so prevents search engines from reaching several text files they need to crawl.
One of them is ads.txt; if .txt files are disallowed, you will keep seeing the message "Earnings at risk - You need to fix some ads.txt file issues to avoid severe impact to your revenue." even when the required file is in place.
Worse, a wildcard rule like Disallow: /*.txt also applies to robots.txt itself, so Google reports that it can't crawl the robots.txt file - but there is a solution for this.
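To see why Disallow: /*.txt catches these files, here is a minimal sketch of Google-style wildcard matching, where * matches any run of characters and $ anchors the end of the path (the rule_matches helper is hypothetical, written just for illustration):

```python
import re

def rule_matches(pattern: str, path: str) -> bool:
    """Google-style robots.txt path matching: '*' matches any
    sequence of characters, a trailing '$' anchors the end."""
    regex = re.escape(pattern).replace(r"\*", ".*")
    if regex.endswith(r"\$"):
        regex = regex[:-2] + "$"
    return re.match(regex, path) is not None

# The wildcard rule blocks every .txt file, in every folder:
print(rule_matches("/*.txt", "/ads.txt"))     # True
print(rule_matches("/*.txt", "/robots.txt"))  # True
print(rule_matches("/*.txt", "/index.html"))  # False
```

So a single Disallow: /*.txt line is enough to block both ads.txt and robots.txt at once.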

Simply remove all of the contents of the robots.txt file, leaving it empty.
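If you would rather keep some rules than empty the file entirely, a safer alternative is to drop the wildcard .txt rule and explicitly allow the files Google needs (the /private/ path below is only a placeholder for whatever you actually want to block):

```
User-agent: *
Disallow: /private/
Allow: /ads.txt
```

Google supports the Allow directive, and a more specific Allow rule wins over a broader Disallow.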
Then you will need to wait around 48 hours for crawlers to check it again.
In the meantime you can periodically run a live URL test in Google Search Console.
Also, make sure the file is not blocked by .htaccess rules - you should be able to open it directly at example.com/robots.txt.
The recheck can happen sooner, but you can't force it, because Google still considers the file blocked from crawling until then.