Free Robots.txt Generator

Create a robots.txt file for your website. Manage crawl rules per bot, add blocked or allowed paths, and apply presets for popular platforms.


Robots.txt Generator - Everything You Need to Know

A robots.txt file tells search engines and web crawlers which pages they can and cannot visit. It is an essential file for every website, and crawlers look for it at one fixed location: the root directory of your domain.

What is robots.txt?

Robots.txt is a plain-text file that follows the Robots Exclusion Protocol. It tells web crawlers which parts of your site they may and may not crawl. Well-behaved bots respect these rules: search engines like Google, Bing, and DuckDuckGo, as well as AI crawlers like GPTBot and ClaudeBot. Note that the rules are advisory, not enforced; a crawler that ignores the protocol is not technically blocked.
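For instance, a minimal robots.txt served from the root of the domain (the domain and path here are placeholders) might look like this:

```
# Served at https://example.com/robots.txt
User-agent: *
Disallow: /admin/
```

This tells every crawler (`*`) that it may visit anything except URLs under /admin/.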

Important directives

  • User-agent - Specifies which bot the rules apply to (* = all bots)
  • Disallow - Blocks access to a specific path
  • Allow - Allows access to a path (overrides Disallow)
  • Sitemap - Points to your XML sitemap
  • Crawl-delay - Sets a delay between requests (not supported by all bots; Googlebot ignores it)
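The directives above can be combined in one file. Here is a sketch using a placeholder domain and paths:

```
# All bots: block /private/, but allow one page inside it
User-agent: *
Disallow: /private/
Allow: /private/overview.html
Crawl-delay: 10

# Sitemap location (always a full URL)
Sitemap: https://example.com/sitemap.xml
```

Note that Allow overrides the broader Disallow for that specific path, and the Sitemap line stands outside any User-agent group.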

Blocking AI Bots

More and more websites are choosing to block AI crawlers. Bots like GPTBot (OpenAI), ClaudeBot (Anthropic), and CCBot (Common Crawl) are used to collect training data. With our generator, you can selectively block these bots while still allowing search engines to crawl your site.
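As a sketch, a file that blocks these three AI crawlers while leaving all other bots unrestricted could look like this (the user-agent tokens shown are the ones these vendors publish):

```
# Block AI training crawlers
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: CCBot
Disallow: /

# All other bots (e.g. Googlebot, Bingbot): no restrictions
User-agent: *
Disallow:
```

An empty Disallow line means nothing is blocked, so search engines keep full access while the named AI bots are denied the entire site.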