Docs for agents
AI agents — tools like Cursor, GitHub Copilot, and Claude Code — can answer questions about Cloudflare products, generate configuration, and call Cloudflare APIs on your behalf. Cloudflare documentation provides content in agent-friendly formats, agent skills, and MCP servers so your AI agent can look up documentation and interact with Cloudflare services directly.
This page explains the available approaches and how to set them up.
These resources cover different aspects of using AI agents with Cloudflare documentation. Start with the one most relevant to you:
Agent skills ↗ are structured, task-specific instructions that AI tools load on demand — for example, a skill might teach your agent how to deploy a Cloudflare Worker or configure a WAF (Web Application Firewall) rule. Skills give your agent Cloudflare-specific instructions it would not otherwise have. Cloudflare publishes skills covering Workers, storage, AI, networking, security, and more in the Cloudflare Skills repository ↗.
Each agent has its own installation method for skills. Refer to Agent setup for installation instructions.
The Model Context Protocol ↗ (MCP) is an open standard that defines how AI tools connect to external tools, data, and services. An MCP server is an application that exposes specific capabilities. When you connect one to your agent, the agent can use those capabilities as part of its workflow (for example, searching documentation, creating DNS records, or deploying Workers).
Cloudflare runs managed remote MCP servers that give your agent the ability to search documentation, call the Cloudflare API, and query logs and analytics while it works.
There are two approaches:
- Code Mode: A single MCP server that covers the entire Cloudflare API (over 2,500 endpoints). Use this when your agent needs broad access across multiple Cloudflare products.
- Domain-specific servers: Focused servers for documentation, observability, DNS analytics, and more. Use these when your agent only needs access to a specific area. The full catalog is in the cloudflare/mcp-server-cloudflare ↗ repository.
Each agent's Agent setup guide includes MCP server installation as part of its Quick start. For the full list of available MCP servers, refer to MCP servers for Cloudflare.
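Most MCP-capable agents are configured through a JSON file that lists remote servers. As a minimal sketch only: the top-level key, file location, and server URL all vary by agent, so treat the names below as placeholders and confirm the current server URLs in the MCP server catalog and your agent's setup guide.

```json
{
  "mcpServers": {
    "cloudflare-docs": {
      "url": "https://docs.mcp.cloudflare.com/sse"
    }
  }
}
```

Once the server is registered, the agent discovers its capabilities automatically and can call them (for example, documentation search) during a conversation.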
AI agents use large language models (LLMs) to understand your requests and generate responses. The model affects response quality, speed, and cost. How many models you can choose from depends on the agent:
- Locked: Only the vendor's own models are supported.
- BYOK (Bring Your Own Key): You supply your own API key for the model provider of your choice.
- Multi-provider: Several model providers are supported out of the box.
How much your agent remembers between sessions depends on how it retains project context:
- Project memory: The agent remembers context across sessions using stored files or memory.
- Indexed codebase: The agent builds a searchable index of your repository for fast lookups.
Each supported agent has a dedicated setup guide covering installation, skills, MCP server configuration, example prompts, tips, and troubleshooting.
AI tools work better with Markdown than HTML because Markdown's explicit structure carries less markup overhead than HTML tags, which reduces wasted tokens (the units of text that AI models process) and produces better results.
Every documentation page is available as Markdown using any of the following methods, powered by Markdown for Agents.
On any documentation page, select Copy as Markdown to copy the current page as Markdown.
Add /index.md to the end of any page URL. For example:
```
https://developers.cloudflare.com/workers/get-started/index.md
```

Request any page with the `Accept: text/markdown` header, which tells the server you prefer Markdown instead of HTML:

```sh
curl "https://developers.cloudflare.com/workers/get-started/" \
  --header "Accept: text/markdown"
```

The response includes an `x-markdown-tokens` header with an estimated token count for the document, useful for context window planning (a context window is the maximum number of tokens an AI model can consider at once).
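Both access methods are easy to script. The sketch below uses only the Python standard library; the helper names are ours for illustration, not part of any Cloudflare API. It derives a page's `/index.md` URL and builds a request that asks for Markdown via the `Accept` header:

```python
from urllib.request import Request


def markdown_url(page_url: str) -> str:
    """Return the /index.md form of a documentation page URL."""
    return page_url.rstrip("/") + "/index.md"


def markdown_request(page_url: str) -> Request:
    """Build a request that asks the server for Markdown instead of HTML."""
    return Request(page_url, headers={"Accept": "text/markdown"})


print(markdown_url("https://developers.cloudflare.com/workers/get-started/"))
# To fetch, pass the request to urllib.request.urlopen() and read the
# x-markdown-tokens response header for the estimated token count.
```
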
These endpoints follow the llms.txt standard ↗ and provide documentation content in Markdown format:
| Endpoint | Description |
|---|---|
| `/llms.txt` | Page index grouped by product category, with links to each product's own `llms.txt` |
| `/llms-full.txt` | Full content of all documentation in a single file, for offline indexing, bulk vectorization (converting content into numerical representations for similarity search), or large-context models |
Each product has its own scoped llms.txt and llms-full.txt. Use these when you only need documentation for a specific product.
| Endpoint | Description |
|---|---|
| `/workers/llms.txt` | Page index for Workers documentation |
| `/workers/llms-full.txt` | Full content of all Workers documentation pages |
Replace /workers/ with any product path. For the full list of available products, refer to /llms.txt.
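Because the product-scoped endpoints follow a predictable pattern, a script or agent can construct them from a product path. A small sketch (the helper name is ours, not a Cloudflare API):

```python
BASE = "https://developers.cloudflare.com"


def llms_url(product: str = "", full: bool = False) -> str:
    """Build a llms.txt (or llms-full.txt) URL, optionally scoped to a product."""
    name = "llms-full.txt" if full else "llms.txt"
    prefix = f"/{product.strip('/')}" if product else ""
    return f"{BASE}{prefix}/{name}"


print(llms_url())                      # site-wide page index
print(llms_url("workers"))             # Workers page index
print(llms_url("workers", full=True))  # full Workers content
```
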
An OpenAPI specification ↗ is a machine-readable description of an API — it lists every available endpoint, the parameters each one accepts, and the responses it returns. When you add this to your AI tool's context, the tool can generate API calls to Cloudflare services without you having to look up the documentation manually.
The full Cloudflare API OpenAPI specification is available for AI coding tools, API clients, and code generators:
| Endpoint | Description |
|---|---|
| cloudflare/api-schemas ↗ | Full Cloudflare API OpenAPI specification (JSON) |
For the full API reference, refer to the Cloudflare API documentation.
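Once the specification is downloaded, even a few lines of code make it navigable. This sketch uses a miniature inline stand-in spec (the real cloudflare/api-schemas file is far larger, and the paths here are illustrative, not guaranteed Cloudflare endpoints) to show how a tool might enumerate the available operations:

```python
import json

# A miniature stand-in for an OpenAPI document; the real Cloudflare
# specification ships as JSON in the cloudflare/api-schemas repository.
spec = json.loads("""
{
  "openapi": "3.0.0",
  "paths": {
    "/zones": {"get": {"summary": "List zones"}},
    "/zones/{zone_id}/dns_records": {
      "get": {"summary": "List DNS records"},
      "post": {"summary": "Create DNS record"}
    }
  }
}
""")

# Collect every (HTTP method, path) operation pair in the spec.
operations = [
    (method.upper(), path)
    for path, item in spec["paths"].items()
    for method in item
    if method in {"get", "put", "post", "delete", "patch"}
]

for method, path in operations:
    print(method, path)
```

An AI tool given the full specification performs essentially this walk to learn which endpoints exist and what parameters each accepts.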