Generate a valid robots.txt file for your website. Configure rules per user-agent, add sitemaps, and block AI crawlers with one click.
Presets
User-agent: *
Allow: /
Disallow: /api/
Disallow: /admin/
Disallow: /_next/

Sitemap: https://example.com/sitemap.xml
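A generator like this essentially concatenates per-agent rule groups and sitemap directives. The sketch below shows one way that assembly could work; the helper name, the rule groups, and the AI crawler list (GPTBot, ClaudeBot, CCBot, Google-Extended are commonly cited AI user agents) are illustrative assumptions, not the tool's actual presets.

```python
# Hypothetical sketch of assembling a robots.txt file from rule groups.
# Crawler names and presets below are assumptions for illustration.

AI_CRAWLERS = ["GPTBot", "ClaudeBot", "CCBot", "Google-Extended"]

def build_robots_txt(groups, sitemaps):
    """groups: list of (user_agents, allow_paths, disallow_paths) tuples."""
    lines = []
    for agents, allows, disallows in groups:
        for agent in agents:
            lines.append(f"User-agent: {agent}")
        for path in allows:
            lines.append(f"Allow: {path}")
        for path in disallows:
            lines.append(f"Disallow: {path}")
        lines.append("")  # blank line separates rule groups
    for url in sitemaps:
        lines.append(f"Sitemap: {url}")
    return "\n".join(lines) + "\n"

# Default preset plus a "block AI crawlers" group.
robots = build_robots_txt(
    groups=[
        (["*"], ["/"], ["/api/", "/admin/", "/_next/"]),
        (AI_CRAWLERS, [], ["/"]),
    ],
    sitemaps=["https://example.com/sitemap.xml"],
)
print(robots)
```

Blocking the AI crawlers is just another rule group that disallows `/` for each named user agent, so the "one click" amounts to appending that group before the sitemap lines.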