Robots.txt Tester & Analyzer

Fetch and inspect your robots.txt file, review all crawler rules, and test whether specific URL paths are allowed or blocked for different user-agents.

Website URL
Enter your website URL. The tool will automatically request https://yourdomain.com/robots.txt.
Source options
Use pasted robots.txt instead of live URL
Turn this on if you want to validate a draft robots.txt or one from a private/staging environment.
Robots.txt content
Analysis results
  • Source: URL or pasted robots.txt
  • HTTP Status: response code returned for the robots.txt request
  • Content-Type: reported by the server (if fetched)
  • File Size: approximate uncompressed size
  • Last Modified: if provided by the server
  • User-agent groups: unique groups of crawler rules
  • User-agent lines: number of User-agent directives
  • Allow rules: number of Allow directives
  • Disallow rules: number of Disallow directives
  • Sitemap directives: number of Sitemap: lines in robots.txt
  • Crawl-delay lines: number of Crawl-delay directives

Test URL access

Choose a crawler and path to see whether it is allowed or blocked according to the current robots.txt rules.

User-agent
Pick a crawler you want to test.
URL path to test
You can enter either a path (e.g. /blog/post/) or a full URL. Query strings are supported.
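
Outside the tool, the same allow/block check can be approximated with Python's standard-library urllib.robotparser. This is only a rough sketch: the domain, user-agent and paths are placeholder examples, not values the tool itself uses.

    # Minimal sketch of a robots.txt allow/block check (placeholder values).
    from urllib import robotparser

    rp = robotparser.RobotFileParser()
    rp.set_url("https://example.com/robots.txt")  # placeholder domain
    rp.read()                                     # fetch and parse the live file

    # can_fetch() accepts a bare path or a full URL; query strings are part of the match.
    print(rp.can_fetch("Googlebot", "/blog/post/"))
    print(rp.can_fetch("Googlebot", "https://example.com/search?q=shoes"))
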
Issues & suggestions

Free Online Robots.txt Tester & Analyzer

The Robots.txt Tester from PowerSEOTools.online helps you understand and debug how crawlers see your site. It fetches and parses your robots.txt file, shows you user-agent groups and rules, and lets you test whether specific URL paths are allowed or blocked for different bots.
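
If you want to reproduce a similar analysis in your own scripts, Python's standard library ships a basic robots.txt parser. The sketch below is a rough approximation, not the implementation behind this tool; example.com, Bingbot and the path are placeholders.

    # Rough sketch: fetch a robots.txt file and report a few of the same details.
    from urllib import robotparser

    rp = robotparser.RobotFileParser("https://example.com/robots.txt")  # placeholder URL
    rp.read()                          # fetch and parse the live file

    print(rp.site_maps())              # Sitemap: URLs listed in the file, or None (Python 3.8+)
    print(rp.crawl_delay("Bingbot"))   # Crawl-delay for this user-agent, or None
    print(rp.can_fetch("Bingbot", "/shop/checkout"))  # allowed or blocked for this path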

When Should You Check Robots.txt?

  • After launching a new site or completing a migration, to make sure important pages are not accidentally blocked.
  • Before opening a staging or test environment to search engines.
  • When you see "Submitted URL blocked by robots.txt" warnings in Google Search Console.
  • During technical SEO audits to confirm crawler access for key sections like /blog/, /shop/ or /images/.
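
For audits, the same kind of check can be scripted in bulk. A minimal sketch, assuming a placeholder domain and an illustrative set of crawlers and sections:

    # Check a few key sections for a few common crawlers (all example values).
    from urllib import robotparser

    rp = robotparser.RobotFileParser("https://example.com/robots.txt")  # placeholder domain
    rp.read()

    for agent in ("Googlebot", "Bingbot", "*"):
        for path in ("/blog/", "/shop/", "/images/"):
            verdict = "allowed" if rp.can_fetch(agent, path) else "blocked"
            print(f"{agent:<10} {path:<10} {verdict}")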

Best Practices for Robots.txt

  • Keep the file small and simple so that it is easy to maintain and debug.
  • Use User-agent: * for generic rules and more specific user-agents when you need special cases.
  • Avoid blocking CSS and JS files that are needed to render pages correctly.
  • Add Sitemap: lines to point crawlers at your XML sitemaps.
  • Remember that robots.txt is a crawl directive, not a security feature. Do not use it to hide sensitive URLs.
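
As a rough illustration of these practices, the sketch below builds a small example robots.txt (the rules and sitemap URL are placeholders, not recommendations for any specific site) and checks it with Python's standard-library parser, much as you would test a pasted draft:

    # Parse an example robots.txt supplied as text instead of fetching it.
    from urllib import robotparser

    draft_lines = [
        "User-agent: *",
        "Allow: /cart/help",
        "Disallow: /cart/",
        "",
        "Sitemap: https://example.com/sitemap.xml",
    ]

    rp = robotparser.RobotFileParser()
    rp.parse(draft_lines)

    print(rp.can_fetch("Googlebot", "/cart/help"))  # True: the Allow rule matches this path
    print(rp.can_fetch("Googlebot", "/cart/"))      # False: blocked by Disallow: /cart/
    print(rp.site_maps())                           # ['https://example.com/sitemap.xml']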

Combine this robots.txt tester with other Website Management tools on PowerSEOTools.online like the HTTP Header Checker, Redirect Checker and XML Sitemap Validator for a complete technical SEO toolkit directly in your browser.