Robots.txt Generator

Build a custom robots.txt file to control how search engines crawl your site.

Default Crawl Rules

Disallow Paths

robots.txt Output

How to Use

  1. Set crawl rules: allow all bots, or select specific bots.

  2. Add Disallow paths (e.g. /admin/, /private/) and optionally your Sitemap URL.

  3. Click Generate robots.txt, then copy the output and save it as robots.txt in your site root.
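For reference, a generated file using the example paths above (with a hypothetical sitemap URL) might look like this:

```
User-agent: *
Disallow: /admin/
Disallow: /private/

Sitemap: https://example.com/sitemap.xml
```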

📖 Want to learn more?

Read our in-depth guide: tips, best practices, FAQs, and real-world examples.

  • Always include your sitemap URL in robots.txt
  • Disallow /wp-admin/ to block crawlers from admin paths
  • Use Crawl-delay only if your server has limited resources (note that Googlebot ignores this directive)
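You can sanity-check your rules locally before uploading: Python's standard `urllib.robotparser` module parses a robots.txt and reports which paths a given rule set blocks. The paths and domain below are hypothetical examples.

```python
# Sketch: verify robots.txt rules locally with Python's standard library.
# The rules and URLs here are hypothetical examples.
import urllib.robotparser

rules = """\
User-agent: *
Disallow: /admin/
Disallow: /private/

Sitemap: https://example.com/sitemap.xml
""".splitlines()

parser = urllib.robotparser.RobotFileParser()
parser.parse(rules)

# Admin paths are blocked; other paths remain crawlable.
print(parser.can_fetch("*", "https://example.com/admin/login"))  # False
print(parser.can_fetch("*", "https://example.com/blog/post"))    # True
```

This catches mistakes such as a missing trailing slash or a typo in a path before search engines ever see the file.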