Overview
generate_robots_txt creates an AI-friendly robots.txt:
- AI crawler support — allows GPTBot, ClaudeBot, PerplexityBot, and other AI bots
- Customizable rules — maintains your existing access controls
- Best practices — follows robots.txt standards
- Ready to deploy — returns content you can paste into /robots.txt
This tool is completely free and works without an API key.
Parameters
Your website URL. The tool analyzes your current robots.txt (if any) and generates an improved version.
Example prompts
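The prompts below are illustrative sketches of requests that could trigger this tool, not documented examples:

```
Generate an AI-friendly robots.txt for https://example.com
My robots.txt blocks GPTBot — fix it so AI crawlers are allowed
Create a robots.txt that allows ClaudeBot and PerplexityBot but keeps my existing Disallow rules
```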
Response structure
The tool returns a JSON object with:
- The complete robots.txt file content, ready to deploy at /robots.txt
- List of AI crawler user-agents that are now allowed
- List of modifications made to your existing robots.txt (if any)
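A response might look like the following; the field names (`robots_txt`, `ai_crawlers_allowed`, `changes`) are illustrative assumptions, not the tool's documented schema:

```json
{
  "robots_txt": "User-agent: GPTBot\nAllow: /\n\nUser-agent: *\nDisallow: /admin/",
  "ai_crawlers_allowed": ["GPTBot", "ClaudeBot", "PerplexityBot"],
  "changes": ["Added Allow rule for GPTBot", "Preserved existing Disallow: /admin/"]
}
```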
Usage example
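Assuming an MCP-style tool call, an invocation might look like this; the parameter name `url` is an assumption based on the Parameters section:

```json
{
  "tool": "generate_robots_txt",
  "arguments": {
    "url": "https://example.com"
  }
}
```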
AI crawlers supported
The generated robots.txt allows these AI crawler user-agents:
- GPTBot — OpenAI’s crawler for ChatGPT
- ClaudeBot — Anthropic’s crawler for Claude
- PerplexityBot — Perplexity AI’s crawler
- Google-Extended — Google’s AI training crawler
- anthropic-ai — Anthropic’s general AI crawler
- Applebot-Extended — Apple Intelligence crawler
- Diffbot — Knowledge graph crawler
- FacebookBot — Meta’s AI crawler
- ImagesiftBot — Image AI crawler
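Put together, the AI-crawler portion of a generated file follows this pattern (a sketch, not verbatim tool output):

```
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

# ...one block per AI crawler listed above
```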
When to use
Use generate_robots_txt when:
- audit_site shows robots_ai check fails — AI crawlers are being blocked
- New site launch — ensure AI visibility from day one
- Existing robots.txt blocks AI — fix overly restrictive rules
- Want to control AI access — explicitly allow specific AI bots
Deployment
After generating the file:
- Copy the content from the response
- Save it as robots.txt at your domain root
- Deploy to https://yourdomain.com/robots.txt
- Verify it’s accessible publicly
- Test with robots.txt validators
The file must be at your domain root (/robots.txt), not in a subdirectory.
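Once deployed, you can spot-check the rules programmatically. This Python sketch uses the standard-library `urllib.robotparser` to parse robots.txt content and confirm GPTBot is allowed; the sample content is hypothetical, and in practice you would fetch it from your live /robots.txt:

```python
from urllib.robotparser import RobotFileParser

# Sample robots.txt content — in practice, fetch https://yourdomain.com/robots.txt
robots_content = """\
User-agent: GPTBot
Allow: /

User-agent: *
Disallow: /admin/
"""

parser = RobotFileParser()
parser.parse(robots_content.splitlines())

# GPTBot should be allowed everywhere; other agents still obey Disallow rules
print(parser.can_fetch("GPTBot", "/"))              # True
print(parser.can_fetch("SomeOtherBot", "/admin/"))  # False
```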
Maintaining access controls
The generator preserves your existing access rules:
- Existing Disallow rules are maintained
- Sitemap references are preserved
- Custom user-agent rules are kept
- Only AI crawler permissions that are missing are added
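The preservation behavior can be pictured with a small sketch. This is an illustration of the merge idea under simple assumptions (the crawler list and helper are hypothetical), not the tool's actual implementation:

```python
# Hypothetical subset of the AI crawlers the tool allows
AI_CRAWLERS = ["GPTBot", "ClaudeBot", "PerplexityBot", "Google-Extended"]

def add_missing_ai_permissions(existing: str) -> str:
    """Append Allow blocks for AI crawlers not already mentioned,
    leaving existing rules and Sitemap lines untouched."""
    result = existing.rstrip("\n")
    for bot in AI_CRAWLERS:
        if f"User-agent: {bot}" not in existing:
            result += f"\n\nUser-agent: {bot}\nAllow: /"
    return result + "\n"

existing = """User-agent: *
Disallow: /admin/

User-agent: GPTBot
Allow: /

Sitemap: https://example.com/sitemap.xml
"""
merged = add_missing_ai_permissions(existing)

# Existing Disallow and Sitemap lines survive; only missing bots are added
print("Disallow: /admin/" in merged)        # True
print(merged.count("User-agent: GPTBot"))   # 1 — not duplicated
```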
Next steps
- audit_site — verify robots_ai check passes after deployment
- generate_sitemap — add sitemap reference to robots.txt