Robots.txt Generator
Compose a valid robots.txt with policy presets, custom rules, and Sitemaps.
Configuration (Custom preset)
Site URL: used to build absolute Sitemap URLs if needed.
User-agent: use * to target all crawlers.
Disallow Paths
Allow Paths
Crawl-delay: not supported by Google; honored by Bing and Yandex.
Sitemaps
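The configuration fields above map one-to-one onto robots.txt directives. Below is a minimal TypeScript sketch of how that configuration could be modeled; the RobotsConfig name and its fields are illustrative assumptions, not the tool's actual internals.

interface RobotsConfig {
  siteUrl?: string;      // used to build absolute Sitemap URLs if needed
  userAgent: string;     // "*" targets all crawlers
  disallow: string[];    // paths to block, e.g. ["/admin", "/api"]
  allow: string[];       // paths to explicitly permit, e.g. ["/"]
  crawlDelay?: number;   // ignored by Google; honored by Bing and Yandex
  sitemaps: string[];    // sitemap URLs or root-relative paths
}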
Preview
robots.txt
User-agent: *
Disallow: /admin
Disallow: /api
Allow: /
Sitemap: https://www.example.com/sitemap.xml
Place the file at the root of your site as /robots.txt. Use Sitemap directives to help crawlers discover your URLs.
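As a rough illustration of how the preview above could be produced from such a configuration, here is a hedged TypeScript sketch; generateRobotsTxt is a hypothetical helper built on the RobotsConfig shape sketched earlier, not the tool's real implementation.

// Serialize a RobotsConfig (see the sketch above) into robots.txt text.
// Relative sitemap paths are resolved against siteUrl when it is provided.
function generateRobotsTxt(config: RobotsConfig): string {
  const lines: string[] = [`User-agent: ${config.userAgent}`];

  for (const path of config.disallow) lines.push(`Disallow: ${path}`);
  for (const path of config.allow) lines.push(`Allow: ${path}`);

  if (config.crawlDelay !== undefined) {
    lines.push(`Crawl-delay: ${config.crawlDelay}`);
  }

  for (const sitemap of config.sitemaps) {
    const url =
      config.siteUrl && !sitemap.startsWith("http")
        ? new URL(sitemap, config.siteUrl).toString()
        : sitemap;
    lines.push(`Sitemap: ${url}`);
  }

  return lines.join("\n") + "\n";
}

// Example: reproduces the preview shown above.
console.log(
  generateRobotsTxt({
    siteUrl: "https://www.example.com",
    userAgent: "*",
    disallow: ["/admin", "/api"],
    allow: ["/"],
    sitemaps: ["/sitemap.xml"],
  })
);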