Robots.txt Generator API

Web Development

Create SEO-optimized robots.txt files instantly. Control search engine crawlers and improve your site's indexing with AI-powered generation.

Authentication

All API requests require a valid API key passed in the Authorization header as a Bearer token.
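
For example, each request carries a header of the following form (YOUR_API_KEY stands in for your actual key):

Authorization: Bearer YOUR_API_KEY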

Rate Limit

50 requests per minute

Endpoints

1 endpoint available

Overview

The Robots.txt Generator API creates properly formatted robots.txt files using AI. Provide your domain and describe your requirements in plain English, and the API generates a standards-compliant robots.txt file with the appropriate directives.

How It Works

Simply describe what you want to allow or block:

  • Block crawlers - "Block all crawlers from /admin/"
  • Allow specific bots - "Allow Googlebot to access all pages"
  • Specify sitemaps - "Add sitemap at /sitemap.xml"
  • Block file types - "Block .pdf and .doc files"
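
As a rough illustration, the plain-English requests above map to directives along these lines, assuming example.com as the domain (exact output depends on the AI generation, and the wildcard patterns in the last pair are a convention honored by major crawlers such as Googlebot rather than part of the original robots.txt standard):

"Block all crawlers from /admin/"        ->  User-agent: *
                                             Disallow: /admin/
"Allow Googlebot to access all pages"    ->  User-agent: Googlebot
                                             Allow: /
"Add sitemap at /sitemap.xml"            ->  Sitemap: https://example.com/sitemap.xml
"Block .pdf and .doc files"              ->  Disallow: /*.pdf$
                                             Disallow: /*.doc$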

Supported Directives

  • User-agent - Specify which crawlers the rules apply to
  • Disallow - Block access to specific paths
  • Allow - Explicitly allow access to paths
  • Sitemap - Declare sitemap locations
  • Crawl-delay - Set delays between requests
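
Taken together, a generated file that uses all five directives could look roughly like this (the paths, delay value, and sitemap URL are illustrative only):

User-agent: *
Disallow: /admin/
Allow: /admin/public/
Crawl-delay: 10

Sitemap: https://example.com/sitemap.xml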

Use Cases

  • SEO optimization and crawl control
  • Protecting private or admin areas
  • Managing crawler bandwidth usage
  • Specifying sitemap locations for search engines
  • Blocking specific bots or allowing only certain crawlers

Endpoints

POST /v1/tools/robots-txt-generator
Generate a robots.txt file based on the domain and requirements you provide.

Request Body

Content-Type: application/json

Parameter     Type    Required  Description
domain        string  Yes       The website domain (e.g., "example.com" or "https://example.com")
requirements  string  Yes       Plain English description of what you want the robots.txt to do
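
For example, a valid request body looks like this:

{
  "domain": "example.com",
  "requirements": "Block all crawlers from /admin/ and /private/. Allow Googlebot everywhere. Add sitemap location."
}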

Response Example

{
  "success": true,
  "result": "User-agent: *\\nDisallow: /admin/\\nDisallow: /private/\\n\\nUser-agent: Googlebot\\nAllow: /\\n\\nSitemap: https://example.com/sitemap.xml"
}
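
The newlines in the result field are JSON-escaped; once the response is parsed they become real line breaks. A minimal sketch, assuming a Node.js 18+ environment, that writes the generated content to a robots.txt file:

import { writeFileSync } from "node:fs";

// Hypothetical parsed response, matching the shape documented above.
const response = {
  success: true,
  result: "User-agent: *\nDisallow: /admin/\n\nSitemap: https://example.com/sitemap.xml",
};

if (response.success) {
  // After JSON parsing, result contains real newlines and can be served
  // as-is from the site root (e.g., https://example.com/robots.txt).
  writeFileSync("robots.txt", response.result, "utf8");
}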

Error Codes

400  Invalid request body or missing required parameters
401  Missing or invalid API key
429  Rate limit exceeded
500  Internal server error
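
A minimal sketch of acting on these codes on the client side, assuming res is a Response object from a fetch call like the examples below:

function checkResponse(res: Response): void {
  if (res.ok) return; // 2xx: safe to read the JSON body
  if (res.status === 400) throw new Error("Invalid request body: both domain and requirements are required");
  if (res.status === 401) throw new Error("Missing or invalid API key: check the Authorization header");
  if (res.status === 429) throw new Error("Rate limit exceeded: stay under 50 requests per minute and retry later");
  throw new Error(`Request failed with status ${res.status}`); // 500 or other server error
}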

Code Examples

curl -X POST https://api.opentools.ca/v1/tools/robots-txt-generator \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"domain": "example.com", "requirements": "Block all crawlers from /admin/ and /private/. Allow Googlebot everywhere. Add sitemap location."}'

Ready to get started?

Create an API key to start using the Robots.txt Generator API.