Robots.txt testing tool
You can find this crucial tool at google.com/robots-testing-tool . I recommend keeping this link in your browser's bookmarks.
What is it for? As you well know, robots.txt is a file that communicates with search engines and indicates which parts of your site should or should not be crawled. If you make changes, you can run a test: enter the URL you want to verify in the dedicated field, and the tool tells you how the rules apply to it. This way you get confirmation and find out whether you have done a good job or are about to cause damage.
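You can sketch the same kind of check locally with Python's standard `urllib.robotparser`. The rules below are a hypothetical example file, not Google's actual robots.txt:

```python
# Check whether specific URLs are crawlable under a robots.txt policy.
# The rules here are a made-up example for illustration only.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /private/
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A public blog post is allowed; anything under /private/ is blocked.
print(parser.can_fetch("*", "https://example.com/blog/post"))  # True
print(parser.can_fetch("*", "https://example.com/private/x"))  # False
```

For a live site you would call `parser.set_url("https://example.com/robots.txt")` followed by `parser.read()` instead of parsing an inline string.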
For more information, you can read: Quick Guide to Writing a Robots.txt File
SEMrush Site Audit
Well, if we want to talk about tools for discovering and correcting blog errors, I think it is only right to also mention SEMrush's Site Audit: a utility that analyzes a domain in search of possible flaws. The objective? To identify and fix whatever bothers the people who read your web pages.
The Site Audit dashboard.
That is, both readers and Google's spider. The most interesting part is the Overview dashboard, which shows top indicators for errors, warnings and notices, so you get a summary of the situation you are facing at a glance.
Among the many tools in this suite, I suggest you look carefully at the statistics on crawl depth, broken images, broken links and resources with mixed content. That last term means HTTPS pages that embed resources loaded over plain HTTP, which can cause display problems and browser security warnings.
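A mixed-content check is simple enough to sketch yourself. This minimal example, using Python's standard `html.parser`, scans a hypothetical inline HTML sample for `src`/`href` attributes that load resources over plain HTTP; an audit tool does essentially this across every crawled page:

```python
# Minimal mixed-content finder: collect src/href URLs that use plain HTTP.
from html.parser import HTMLParser

class MixedContentFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.insecure = []  # URLs embedded over plain HTTP

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("src", "href") and value and value.startswith("http://"):
                self.insecure.append(value)

# Hypothetical page fragment: one insecure image, one secure script.
page = """
<html><body>
<img src="http://example.com/logo.png">
<script src="https://example.com/app.js"></script>
</body></html>
"""

finder = MixedContentFinder()
finder.feed(page)
print(finder.insecure)  # ['http://example.com/logo.png']
```

A real crawler would also need to fetch each page and resolve relative URLs, but the detection rule is the same.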
Site Audit has two great advantages: it scans your projects periodically and lets you check many errors at the same time. This saves you time and, if there is a problem, you notice it promptly, without having to run new analyses with other tools.