Robots.txt Generator
Create and customize a robots.txt file for search engine crawlers
Generated robots.txt

User-agent: *
Allow: /
Disallow: /admin/
Disallow: /private/
Disallow: /api/
Sitemap: https://example.com/sitemap.xml

Save this as robots.txt in the root directory of your site
Robots.txt Generator Guide
Learn how to control search engine crawling with robots.txt
What is robots.txt?
The robots.txt file is a plain-text file, defined by the Robots Exclusion Protocol, that tells search engine crawlers which pages or sections of your site they may or may not access. It must be placed in the root directory of your website (e.g. https://example.com/robots.txt) and is one of the first files search engines request when visiting your site.
How to Use This Tool
- Add rules for specific user agents (or use * for all bots)
- Specify paths to allow or disallow crawling
- Add your sitemap URL for better indexing
- Download and place the file in your website root directory
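The generation steps above can be sketched in a few lines of Python. This is a minimal illustration, not the tool's actual implementation; the `build_robots_txt` function and its rule dictionary format are assumptions made for the example.

```python
# Minimal sketch of robots.txt generation, assuming a simple rule
# model: each rule is a dict with a user agent and allow/disallow
# path lists. Names here are illustrative, not the tool's internals.

def build_robots_txt(rules, sitemap_url=None):
    """Render a list of rules into robots.txt text."""
    lines = []
    for rule in rules:
        lines.append(f"User-agent: {rule['user_agent']}")
        for path in rule.get("allow", []):
            lines.append(f"Allow: {path}")
        for path in rule.get("disallow", []):
            lines.append(f"Disallow: {path}")
        lines.append("")  # blank line separates rule groups
    if sitemap_url:
        lines.append(f"Sitemap: {sitemap_url}")
    return "\n".join(lines) + "\n"

print(build_robots_txt(
    [{"user_agent": "*", "allow": ["/"],
      "disallow": ["/admin/", "/private/", "/api/"]}],
    sitemap_url="https://example.com/sitemap.xml",
))
```

Writing the returned string to a file named robots.txt and uploading it to the site root reproduces step 4 above.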
SEO Best Practices
- Never block CSS or JavaScript files - search engines need them to render your pages correctly
- Always include a link to your XML sitemap
- Test your robots.txt using Google Search Console
Search Engine Support
All major search engines (Google, Bing, Yahoo, Yandex) respect robots.txt directives. However, malicious bots may ignore these rules, so do not rely on robots.txt to hide sensitive content - use authentication or other access controls instead.