Robot Generator
Generate a custom robots.txt file for your website. Fill in the details below to control how search engines crawl your site.
How to Use the Robot Generator
The Robot Generator tool creates a custom robots.txt file for your website. This file tells search engine crawlers which parts of your website they are allowed (or not allowed) to visit.
To use the tool, fill in the form fields:
- User-agent: Specify which search engine robots this rule applies to. Use an asterisk (*) to target all bots.
- Disallow: Enter the URL paths you want to block from being crawled. For multiple paths, use one path per line.
- Allow: Optionally, list paths that should be crawled even if their parent directory is blocked.
- Crawl-delay: (Optional) Specify a delay (in seconds) between consecutive crawls.
- Sitemap URL: (Optional) Provide the URL where your sitemap is located.
Example: If you want to prevent search engines from accessing your private directory while still allowing them to visit your public pages, you might fill in:
User-agent: *
Disallow: /private
Allow: /public
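If you want to sanity-check rules like these before uploading the file, Python's standard library includes a robots.txt parser. The sketch below is illustrative, not part of the tool: the rules string mirrors the example above with the optional Crawl-delay and Sitemap fields added, and example.com is a placeholder domain.

```python
from urllib.robotparser import RobotFileParser

# Illustrative rules mirroring the example above; example.com is a placeholder.
rules = """\
User-agent: *
Disallow: /private
Allow: /public
Crawl-delay: 10
Sitemap: https://example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A compliant crawler skips /private but may still visit /public.
print(parser.can_fetch("*", "/private/data.html"))  # False
print(parser.can_fetch("*", "/public/index.html"))  # True
print(parser.crawl_delay("*"))                      # 10
print(parser.site_maps())                           # ['https://example.com/sitemap.xml']
```

This mirrors how most well-behaved crawlers interpret the file: Disallow blocks any path starting with the given prefix, while a matching Allow rule can re-open a subset of paths.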
After filling out the form, click "Generate robots.txt" to see the output. You can then copy the generated file to your clipboard or download it for use on your website.