Robots.txt Generator
Build robots.txt files with visual rule blocks, AI bot protection presets, crawl-delay configuration, and sitemap references. Client-side, no signup required.
How to Use
Building a robots.txt file for your website is fast and visual with our rule-block editor:
Step-by-Step
- Add Rule Blocks: Each block targets a specific `User-agent` (e.g., Googlebot, Bingbot, or all bots with `*`). Click Add Rule Block to create new sections.
- Set Allow/Disallow Paths: For each agent, add `Disallow: /private/` to block a path or `Allow: /public/` to explicitly permit access. Use `Disallow: /` to block the entire site for that bot.
- Block AI Scrapers: Click the Block AI Bots button to instantly add `Disallow: /` rules for 15+ AI crawlers, including GPTBot, ClaudeBot, and CCBot.
- Add Sitemaps: Enter your sitemap URLs to help search engines discover all your pages efficiently.
- Download or Copy: Use the copy button, or download the generated `robots.txt` file and upload it to your website's root directory (see the sample output below).
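As a concrete reference, here is the kind of file those steps produce; the paths and sitemap URL are placeholders:

```
# Default rules for all crawlers
User-agent: *
Disallow: /private/
Allow: /public/

# Block one AI crawler from the whole site
User-agent: GPTBot
Disallow: /

Sitemap: https://example.com/sitemap.xml
```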
About This Tool
The Robots Exclusion Protocol
The robots.txt file follows the Robots Exclusion Protocol, a convention dating to 1994 that every major search engine honors and that was standardized as RFC 9309 in 2022. The file must be placed at the root of your domain (e.g., https://example.com/robots.txt) and uses a simple text format to specify crawl permissions per user-agent.
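A minimal sketch of that format, with hypothetical paths, showing how directives are grouped per user-agent:

```
# Group applying only to Googlebot
User-agent: Googlebot
Disallow: /drafts/

# Fallback group for crawlers without a more specific match
User-agent: *
Disallow: /tmp/
```

A crawler that matches a specific group (here, Googlebot) follows only that group and ignores the `*` rules.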
AI Crawler Management in 2026
With the proliferation of large language model training, dozens of new AI crawlers have emerged. Companies including OpenAI (GPTBot), Anthropic (ClaudeBot), Common Crawl (CCBot), Google (Google-Extended), Meta (FacebookBot), ByteDance (Bytespider), and Perplexity (PerplexityBot) publish user-agent strings that robots.txt can target, and most state that they honor its directives. Our generator includes a database of 15+ AI bot user-agent strings, allowing you to block AI training crawlers with a single click while still permitting search engine indexing.
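For illustration, a sketch of what the one-click preset emits for a few of these crawlers (the tool's actual list covers 15+ user agents):

```
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Google-Extended
Disallow: /

# Search crawlers fall through to the default group and keep full access
User-agent: *
Disallow:
```

The empty `Disallow:` in the final group disallows nothing, so ordinary search indexing is unaffected.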
Crawl-Delay Compatibility
The Crawl-delay directive is supported by Bing, Yandex, and several other crawlers, but Googlebot ignores it entirely. Google recommends using Search Console's crawl rate settings instead. Our generator warns you when you add a crawl-delay to a Google-specific user-agent block.
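For example (the delay values here are arbitrary seconds between requests):

```
# Honored by Bing and Yandex
User-agent: Bingbot
Crawl-delay: 10

User-agent: Yandex
Crawl-delay: 5

# No effect: Googlebot ignores Crawl-delay, and the generator flags this
User-agent: Googlebot
Crawl-delay: 10
```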
Why Use This Tool
Writing robots.txt by hand is error-prone. A missing colon, an incorrect path, or a misplaced directive can accidentally block search engines from your entire site. Our visual builder eliminates syntax errors by construction. The rule-block interface maps directly to the spec format, so what you see is exactly what crawlers will interpret.
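To see how little it takes, compare a one-character slip with the intended rule (paths hypothetical):

```
# Wrong: blocks the entire site for every compliant crawler
User-agent: *
Disallow: /

# Intended: block only the /private/ section
User-agent: *
Disallow: /private/
```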
Everything runs 100% client-side in your browser. No server upload, no account, no tracking. Your website structure and crawl rules stay private.