A robots.txt
file instructs search engine crawlers (and other bots) which parts
of your site should or shouldn’t be accessed. Mistakes in this file can cause unintentional
de-indexing or partial blocking of critical sections.
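For example, a minimal robots.txt might look like this (the user agents and paths here are purely illustrative):

```
User-agent: *
Disallow: /admin/

User-agent: Googlebot
Disallow: /drafts/
Allow: /drafts/published/
```

Each User-agent line opens a group of rules, and the Allow and Disallow lines in that group apply to any crawler that identifies with the named user agent.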
Tool Steps:
1. Fetch your site's /robots.txt file.
2. Parse its User-agent, Allow, and Disallow blocks.
3. Enter a test path (e.g., /private) to see if the file allows or disallows crawling.
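As a rough sketch of what such a check involves, here is a minimal example using Python's standard urllib.robotparser; the rules, domain, and test paths are hypothetical, and the tool itself need not be implemented this way:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules; a real check would fetch the live file instead, e.g. with
# parser.set_url("https://example.com/robots.txt") followed by parser.read().
ROBOTS_TXT = """\
User-agent: *
Allow: /private/help
Disallow: /private
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Test a few paths against the rules for a generic crawler ("*").
for path in ("/private", "/private/help", "/blog/post-1"):
    allowed = parser.can_fetch("*", f"https://example.com{path}")
    print(f"{path}: {'allowed' if allowed else 'disallowed'}")
```

Note that urllib.robotparser also relies on simple prefix matching, so it shares the wildcard limitation described in the note below.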
Note: This simplified parser doesn’t fully handle wildcards (*
in paths) or advanced
matching logic. If your robots.txt
uses advanced syntax, you may need a more specialized tool.
Still, this tester covers the common directives and patterns.
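For instance, directives like the following rely on the * and $ extensions that major crawlers such as Googlebot understand, but that a plain prefix-based check would misinterpret (the patterns are illustrative):

```
User-agent: *
Disallow: /*.pdf$
Disallow: /*?sessionid=
```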