
Robots.txt Generator

Create and customize a robots.txt file for search engine crawlers


Generated robots.txt

User-agent: *
Allow: /
Disallow: /admin/
Disallow: /private/
Disallow: /api/

Sitemap: https://example.com/sitemap.xml

Save this file as robots.txt in your website's root directory

Robots.txt Generator Guide

Learn how to control search engine crawling with robots.txt

What is robots.txt?

The robots.txt file is a plain text file, defined by the Robots Exclusion Protocol, that tells search engine crawlers which pages or sections of your site they may or may not access. It is placed in the root directory of your website and is one of the first files search engines check when visiting your site.
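For example, the simplest possible robots.txt allows every crawler to access the entire site; an empty Disallow line means "disallow nothing":

```
User-agent: *
Disallow:
```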

How to Use This Tool

  1. Add rules for specific user agents (or use * for all bots)
  2. Specify paths to allow or disallow crawling
  3. Add your sitemap URL for better indexing
  4. Download and place the file in your website root directory
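The steps above can be sketched in code. This is a minimal illustration of how such a generator might assemble the file, not this tool's actual implementation; the rule values are the sample ones shown above:

```python
def build_robots_txt(user_agent, allow, disallow, sitemap=None):
    """Assemble robots.txt content from simple rule lists."""
    lines = [f"User-agent: {user_agent}"]
    lines += [f"Allow: {path}" for path in allow]
    lines += [f"Disallow: {path}" for path in disallow]
    text = "\n".join(lines)
    if sitemap:
        # Sitemap is a standalone directive, separated by a blank line
        text += f"\n\nSitemap: {sitemap}"
    return text + "\n"

content = build_robots_txt(
    "*",
    allow=["/"],
    disallow=["/admin/", "/private/", "/api/"],
    sitemap="https://example.com/sitemap.xml",
)
print(content)
```

Running this reproduces the generated output shown earlier on this page.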

SEO Best Practices

  • Never block CSS or JavaScript files - search engines need them to render your pages
  • Always include a link to your XML sitemap
  • Test your robots.txt using Google Search Console
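Applied together, these practices might look like the fragment below. The paths are examples, and note that wildcard patterns such as /*.css are honored by major engines like Google and Bing but are not part of the original standard:

```
User-agent: *
Disallow: /private/
# Explicitly keep rendering resources crawlable:
Allow: /*.css
Allow: /*.js

Sitemap: https://example.com/sitemap.xml
```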

Search Engine Support

All major search engines (Google, Bing, Yahoo, Yandex) respect robots.txt directives. However, malicious bots may ignore these rules, so do not use robots.txt for security purposes.
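As an illustration of how a well-behaved crawler applies these directives, Python's standard library ships urllib.robotparser. The rules below mirror the Disallow lines of the generated example above (the Allow: / line is omitted here because Python's parser applies rules in file order, so a leading catch-all Allow would override the later Disallow lines):

```python
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /admin/
Disallow: /private/
Disallow: /api/
""".splitlines()

parser = RobotFileParser()
# parse() takes the file's lines; a real crawler would instead call
# parser.set_url("https://example.com/robots.txt") and parser.read()
parser.parse(rules)

# A compliant crawler checks each URL before fetching it:
print(parser.can_fetch("*", "https://example.com/"))        # allowed
print(parser.can_fetch("*", "https://example.com/admin/"))  # blocked
```

This is exactly why robots.txt is not a security mechanism: the check is voluntary, and a bot that skips it can still request any URL.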