Robots.txt Generator


Create a proper robots.txt file to control search engine crawling of your site.


How to Use Robots.txt Generator

  1. Pick default behavior

    Allow all crawling (recommended for most sites) or block everything.

  2. Add blocked paths

    Tick common admin/private paths or write custom rules.

  3. Block AI crawlers

    Optionally stop GPTBot, ClaudeBot, etc. from training on your content.

  4. Download & upload

    Save as robots.txt to your domain root: yoursite.com/robots.txt
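A file generated by the steps above is plain text. One that allows crawling by default, blocks a couple of private paths, and opts out of the AI crawlers mentioned could look like this (the /admin/ and /private/ paths are illustrative examples, not required values):

```
# Allow all crawling by default, except the listed paths
User-agent: *
Disallow: /admin/
Disallow: /private/

# Block AI training crawlers entirely
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /
```

An empty `Disallow:` under `User-agent: *` would instead allow everything; `Disallow: /` would block everything.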

Frequently Asked Questions

Where should I put the robots.txt file?

In your website's root directory. It must be accessible at yoursite.com/robots.txt — not in a subfolder.
Do all bots obey robots.txt?

Major search engines (Google, Bing, DuckDuckGo) respect robots.txt. Some scraper bots ignore it. For sensitive content, use authentication, not just robots.txt.
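To see how a well-behaved crawler interprets a rules file, Python's standard-library `urllib.robotparser` can evaluate it directly. This sketch uses illustrative rules (the /admin/ path and the yoursite.com URLs are placeholders):

```python
from urllib import robotparser

# Illustrative rules, similar to what the generator emits
rules = """\
User-agent: *
Disallow: /admin/

User-agent: GPTBot
Disallow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# A generic crawler is blocked from /admin/ but may fetch other pages
print(rp.can_fetch("*", "https://yoursite.com/admin/settings"))  # False
print(rp.can_fetch("*", "https://yoursite.com/blog/post"))       # True

# GPTBot is blocked from the whole site
print(rp.can_fetch("GPTBot", "https://yoursite.com/blog/post"))  # False
```

This only tells you what compliant bots will do; as noted above, bots that ignore robots.txt will fetch those URLs anyway.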
Should I block AI crawlers?

Your call. Blocking them prevents your content from being used to train AI models, but doesn't affect Google search ranking. Many publishers block them; others allow them for visibility.
What does Crawl-delay do?

Crawl-delay tells bots to wait N seconds between requests. Useful if your server is overloaded. Google ignores this directive; use Search Console crawl rate settings instead.
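The same standard-library parser also exposes Crawl-delay, which is a quick way to sanity-check the value you set (the 10-second delay here is just an example):

```python
from urllib import robotparser

# Illustrative rules with a crawl delay
rules = """\
User-agent: *
Crawl-delay: 10
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Delay (in seconds) that a generic crawler should honor
print(rp.crawl_delay("*"))  # 10
```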