How to Use Robots.txt Generator
- Pick default behavior: allow all crawling (recommended for most sites) or block everything.
- Add blocked paths: tick common admin/private paths or write custom rules.
- Block AI crawlers: optionally stop GPTBot, ClaudeBot, and similar bots from training on your content.
- Download and upload: save the file as robots.txt in your domain root: yoursite.com/robots.txt
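Putting the steps above together, a generated file might look like this (the blocked paths are illustrative examples, not required values):

```
# Allow all crawlers by default
User-agent: *
Disallow: /admin/
Disallow: /private/
Allow: /
```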
Frequently Asked Questions
Where should I put the robots.txt file?
In your website's root directory. It must be accessible at yoursite.com/robots.txt — not in a subfolder.
Do all bots obey robots.txt?
Major search engines (Google, Bing, DuckDuckGo) respect robots.txt. Some scraper bots ignore it. For sensitive content, use authentication, not just robots.txt.
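To see how a well-behaved crawler applies these rules, Python's standard-library urllib.robotparser can evaluate a robots.txt file; the rules and URLs below are illustrative:

```python
from urllib.robotparser import RobotFileParser

# Example rules: block /admin/ for everyone, block GPTBot entirely
rules = """\
User-agent: *
Disallow: /admin/

User-agent: GPTBot
Disallow: /
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# A compliant bot checks can_fetch() before requesting a URL
print(parser.can_fetch("*", "https://yoursite.com/blog/post"))       # True
print(parser.can_fetch("*", "https://yoursite.com/admin/users"))     # False
print(parser.can_fetch("GPTBot", "https://yoursite.com/blog/post"))  # False
```

Note that this check is voluntary: the parser only tells a crawler what the file asks for, which is why robots.txt is no substitute for authentication.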
Should I block AI crawlers?
Your call. Blocking them prevents your content being used to train AI models, but doesn't affect Google search ranking. Many publishers block them; others allow them for visibility.
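Blocking an AI crawler is a per-user-agent rule. For the two bots named above, the entries would look like this:

```
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /
```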
What does Crawl-delay do?
Crawl-delay tells bots to wait N seconds between requests, which is useful if crawlers are overloading your server. Google ignores this directive; use the crawl rate settings in Search Console instead.
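For bots that honor it (such as Bing's), the directive is one line under a user-agent group; the 10-second value here is just an example:

```
User-agent: *
Crawl-delay: 10
```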