robots.txt Generator
Create a robots.txt file to guide search engine crawlers.
Build Your robots.txt
1. Use the 'Presets' buttons for common configurations (Allow All, Disallow All, Allow Google Only).
2. Click 'Add Directive' to add specific rules such as User-agent, Allow, Disallow, or Sitemap.
3. Modify the value for each directive as needed (e.g., specify bot names for User-agent, paths for Allow/Disallow, or the full URL for Sitemap). A sample of the resulting file is shown after this list.
4. Remove unwanted directives using the trash icon.
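For reference, a file combining these directives might look like the sketch below; the path, bot selector, and sitemap URL are placeholders you would replace with your own values:

```
# Allow all crawlers, block the /admin/ section, and point to the sitemap
User-agent: *
Disallow: /admin/
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```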
Get Your File
1. The generated robots.txt content appears in the text area on the right.
2. Click 'Copy Content' to copy the text to your clipboard.
3. Click 'Download robots.txt' to save the generated file.
4. Upload the downloaded `robots.txt` file to the root directory of your website so crawlers can find it at `/robots.txt`. A quick way to verify the upload is sketched below.
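As a rough check that the uploaded file is live and parses correctly, you can point Python's standard-library `urllib.robotparser` at your site. The domain and paths here are placeholders, not values produced by this tool:

```python
from urllib.robotparser import RobotFileParser

# Placeholder domain; substitute your own site.
ROBOTS_URL = "https://www.example.com/robots.txt"

parser = RobotFileParser()
parser.set_url(ROBOTS_URL)
parser.read()  # Fetches and parses the live robots.txt

# Check whether a generic crawler may fetch a given path.
print(parser.can_fetch("*", "https://www.example.com/admin/"))  # Expect False if /admin/ is disallowed
print(parser.can_fetch("*", "https://www.example.com/"))        # Expect True if the root is allowed
```

If the printed results match the rules you configured, the file is being served from the correct location.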