Complete pricing guide for Cloudflare Workers AI. Compare all plans, analyze costs, and find the perfect tier for your needs.
Not sure if free is enough? See our Free vs Paid comparison →
Still deciding? Read our full verdict on whether Cloudflare Workers AI is worth it →
Pricing sourced from Cloudflare Workers AI · Last verified March 2026
The catalog spans 50+ open models, including Meta Llama 3.1/3.2/3.3 and Llama 4 Scout, Mistral 7B, Google Gemma, Qwen, DeepSeek, BGE embeddings for semantic search, OpenAI Whisper for speech-to-text, and Stable Diffusion XL and Flux for image generation, plus models for translation, classification, summarization, and sentiment analysis. The catalog is curated and optimized by Cloudflare for edge deployment, and new models are added regularly as they pass Cloudflare's optimization pipeline. Each catalog entry lists the model's published neuron cost, supported features (streaming, function calling, etc.), and maximum context window.
Pricing is based on neurons, Cloudflare's normalized unit of AI compute. The free tier includes 10,000 neurons per day at no cost, and the Workers Paid plan ($5/month) includes 10,000 neurons/day plus pay-as-you-go pricing at $0.011 per 1,000 neurons beyond the free allotment. Each model has a published neuron cost per request in the model catalog, so developers can estimate expenses before deploying. For example, a typical Llama 3.1 8B inference request costs approximately 50 neurons (~$0.00055). Enterprise customers can negotiate volume discounts and committed-use contracts. Neuron costs vary by model size and modality — text generation models consume fewer neurons per request than image generation models.
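The arithmetic above can be sketched as a small estimator. The free allotment and per-neuron price come from the published pricing; the ~50 neurons per Llama 3.1 8B request is the illustrative figure quoted above, so always check the model catalog for current numbers.

```python
# Illustrative Workers AI cost estimator, based on the published pricing:
# 10,000 free neurons/day, then $0.011 per 1,000 neurons beyond that.
FREE_NEURONS_PER_DAY = 10_000
PRICE_PER_1K_NEURONS = 0.011  # USD, beyond the free allotment

def daily_cost(requests_per_day: int, neurons_per_request: float) -> float:
    """Estimated daily USD cost for a single model at a given request volume."""
    total_neurons = requests_per_day * neurons_per_request
    billable = max(0.0, total_neurons - FREE_NEURONS_PER_DAY)
    return billable / 1_000 * PRICE_PER_1K_NEURONS

# 1,000 Llama 3.1 8B requests/day at ~50 neurons each = 50,000 neurons,
# of which 40,000 are billable: 40 * $0.011 = $0.44/day.
print(f"${daily_cost(1_000, 50):.2f}/day")
```

A single request at ~50 neurons works out to 50 / 1,000 × $0.011 ≈ $0.00055, matching the figure above.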
Yes. Workers AI supports LoRA adapters on selected base models, allowing you to load fine-tuned weights at inference time without redeploying the base model. You can also bring your own fine-tuned weights for supported architectures through the BYOM program, and Cloudflare integrates with Hugging Face for some model import workflows. Fully custom architectures that fall outside the supported model formats (such as novel attention mechanisms or proprietary model structures) still require dedicated infrastructure and cannot be deployed to Workers AI. Cloudflare continues to expand the range of supported base models and adapter formats, so checking the current documentation for the latest compatibility list is recommended.
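As a sketch of what loading an adapter at inference time looks like, the snippet below assembles a Workers AI REST call with a `lora` field in the request body, per the Workers AI docs. The account ID, API token, and adapter name are placeholders, and the exact model slug should be checked against the current catalog of LoRA-capable base models.

```python
# Sketch of a Workers AI REST call that applies a LoRA adapter at inference
# time. ACCOUNT_ID, API_TOKEN, and the adapter name are placeholders; the
# endpoint path and `lora` body field follow the Workers AI documentation.
import json

ACCOUNT_ID = "your-account-id"  # placeholder
API_TOKEN = "your-api-token"    # placeholder
MODEL = "@cf/mistral/mistral-7b-instruct-v0.2-lora"  # a LoRA-capable base model

def build_lora_request(prompt: str, lora_name: str):
    """Assemble the URL, headers, and JSON body for a LoRA inference call."""
    url = (f"https://api.cloudflare.com/client/v4/accounts/"
           f"{ACCOUNT_ID}/ai/run/{MODEL}")
    headers = {"Authorization": f"Bearer {API_TOKEN}",
               "Content-Type": "application/json"}
    body = {"prompt": prompt, "lora": lora_name}  # adapter named per request
    return url, headers, json.dumps(body)

url, headers, body = build_lora_request("Summarize this ticket", "my-finetune")
# To send: POST `body` to `url` with `headers` using any HTTP client.
```

Because the adapter is named per request rather than baked into a deployment, the same base model can serve many fine-tunes without redeploying.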
OpenAI offers higher-quality proprietary models like GPT-4o and o-series reasoners, the most mature developer ecosystem, and broader feature coverage (advanced function calling, Assistants API, fine-tuning). Workers AI offers global edge inference with lower latency for geographically distributed users, open-weight models that provide transparency and no vendor lock-in, lower price points for many workloads (especially at scale with smaller models), and tight integration with Cloudflare's storage, networking, and security stack. The choice depends on whether you prioritize frontier model quality (OpenAI) or edge distribution, cost efficiency, and platform integration (Workers AI). Many teams use both — Workers AI for latency-sensitive open-model tasks and OpenAI via AI Gateway for frontier-quality reasoning.
Requests are routed to the nearest Cloudflare data center with GPUs capable of serving the requested model. GPU capacity is deployed across Cloudflare's anycast network, which spans more than 300 cities globally, so end-user-to-inference latency is typically low for popular, widely distributed models. However, not every model is available at every location: larger models may only be served from a subset of GPU-equipped data centers, which can increase latency for those specific models. Cloudflare's routing layer automatically selects a location balancing proximity, GPU availability, and current load, and the network continues to expand GPU coverage with the goal of making all catalog models available at every major point of presence.
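The proximity/availability/load trade-off described above can be illustrated with a toy scoring function. This is not Cloudflare's actual routing algorithm; the locations, latencies, and weights are invented purely to show why the nearest data center is not always the one chosen.

```python
# Toy illustration of the routing trade-off: the chosen location must host the
# model, and among eligible locations the best blend of round-trip time and
# current load wins. Data and weights are invented for the sketch.
LOCATIONS = [
    # (name, rtt_ms from the user, hosts_model, current_load 0..1)
    ("AMS", 12, False, 0.30),  # closest, but this model isn't deployed here
    ("FRA", 18, True, 0.85),   # has the model, heavily loaded
    ("LHR", 25, True, 0.40),   # slightly farther, has the model, light load
]

def pick_location(locations, rtt_weight=1.0, load_weight=50.0):
    """Lowest combined score wins; locations without the model are ineligible."""
    eligible = [loc for loc in locations if loc[2]]
    return min(eligible, key=lambda loc: rtt_weight * loc[1] + load_weight * loc[3])

print(pick_location(LOCATIONS)[0])  # prints "LHR": farther than FRA, but lightly loaded
```

Here AMS is excluded outright because the model isn't deployed there, and LHR beats the closer FRA because FRA's load penalty outweighs its latency advantage.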
AI builders and operators use Cloudflare Workers AI to streamline their workflow.
Try Cloudflare Workers AI Now →