Robots.txt Generator

Control how search engines interact with your website using precise bot instructions.

Features

  • User-Agent Control

    Target specific bots like Googlebot or Bingbot with custom crawling instructions.

  • Smart Templates

    Pre-configured rules for WordPress, documentation sites, and high-privacy setups.

  • Sitemap Integration

    Automatically link your XML sitemap for better search engine discovery and indexing.

What is Robots.txt?

The robots.txt file is part of the Robots Exclusion Protocol (REP). It is a plain-text file located at the root of your website that tells web robots (most commonly search engine crawlers) which paths on your site they may crawl and which they should ignore. Note that it controls crawling, not indexing: a disallowed page can still appear in search results if other sites link to it.
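For example, a minimal robots.txt that blocks all bots from one folder while leaving the rest of the site crawlable might look like this (the path is illustrative):

```
User-agent: *
Disallow: /private/
```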

How to Use This Tool

  1. Start with a Template or add a new custom rule.
  2. Specify the User-agent (the name of the bot, like * for all bots).
  3. Enter comma-separated paths for Disallow (folders bots shouldn't visit) and Allow (exceptions within those folders).
  4. Provide your Sitemap URL to help bots find your content index.
  5. Review the generated file, download it, and upload it to your website's root directory so it is reachable at /robots.txt.
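A file produced by the steps above might look like the following; the paths and sitemap URL are hypothetical, and note that the generator expands your comma-separated input into one directive per line:

```
User-agent: *
Disallow: /admin/
Disallow: /tmp/
Allow: /admin/public/

Sitemap: https://example.com/sitemap.xml
```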

SEO Guidance

Be careful not to disallow your CSS, JavaScript, or images, as modern bots need to fetch these to understand how your site looks and functions. A misconfigured robots.txt (for example, `Disallow: /` under `User-agent: *`) can block your entire site from being crawled, so test every change carefully before deploying it.
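One way to sanity-check a generated file before uploading it is Python's standard-library `urllib.robotparser`. This sketch (with hypothetical rules and URLs) verifies that an admin path is blocked while a CSS asset stays crawlable:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules, as this tool might generate them.
rules = """\
User-agent: *
Disallow: /admin/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)  # parse() accepts an iterable of lines

# A path under /admin/ is blocked for any bot...
print(parser.can_fetch("Googlebot", "https://example.com/admin/settings"))  # False
# ...but CSS, JS, and images remain crawlable.
print(parser.can_fetch("Googlebot", "https://example.com/css/site.css"))    # True
```

Running a check like this against every critical asset path catches an over-broad Disallow rule before it reaches production.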