Robots.txt Generator
Compose a valid robots.txt: choose a policy preset or customize Allow/Disallow rules, add Crawl-delay and Sitemap directives, then copy the result.
Site URL: used to build absolute Sitemap URLs if needed.
User-agent: use * to target all crawlers.
Crawl-delay: not supported by Google; honored by Bing and Yandex.
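For example, a file that throttles one crawler while leaving the rest unrestricted might look like this (the bot name and 10-second delay are illustrative):

# Ask Bingbot to wait 10 seconds between requests (Google ignores this directive)
User-agent: Bingbot
Crawl-delay: 10

# All other crawlers: no restrictions
User-agent: *
Disallow: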
Preview robots.txt
User-agent: *
Disallow: /admin
Disallow: /api
Allow: /
Sitemap: https://www.example.com/sitemap.xml
Place the file at /robots.txt in the site root. Use Sitemap directives to help crawlers discover URLs.
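Multiple Sitemap directives are allowed, and each must be an absolute URL, which is why the generator asks for a site URL. A minimal file listing two sitemaps might look like this (the URLs are placeholders):

# Allow everything; an empty Disallow matches nothing
User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml
Sitemap: https://www.example.com/news-sitemap.xml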