Complete pricing guide for Azure AI Agent Service. Compare all plans, analyze costs, and find the perfect tier for your needs.
Not sure if free is enough? See our Free vs Paid comparison →
Still deciding? Read our full verdict on whether Azure AI Agent Service is worth it →
Pricing sourced from Azure AI Agent Service · Last verified March 2026
Yes. The hosted agents feature supports Microsoft Agent Framework, LangGraph, Semantic Kernel, or custom code deployment on the same managed runtime. You can bring your existing agent codebase and run it on Azure's managed infrastructure without rewriting for Azure-specific orchestration. This dual-path support — no-code in the portal or code-first through hosted agents — is unique among cloud agent platforms and means you are not locked into a single framework or forced to rewrite existing agent logic to benefit from Azure's managed infrastructure, security, and memory capabilities.
Create prompt-based agents directly in the Microsoft Foundry portal by configuring tools, knowledge sources, and workflows through the UI. You can attach SharePoint sites, Azure AI Search indexes, Bing grounding, OpenAPI tools, and Logic Apps actions without writing code. Deploy with a single click. This path is best suited for simple agents, rapid prototyping, and business teams that need conversational AI without engineering resources. When requirements grow more complex, agents can be transitioned to code-based deployment on the same managed runtime without starting over.
Foundry's managed memory, available in public preview, provides automatic extraction of key information from conversations, consolidation across agent sessions, and intelligent retrieval based on the current request context. Agents remember customer preferences, previous requests, and ongoing project state without you building a vector store, summarization pipeline, or custom retrieval logic. This eliminates weeks of infrastructure work that teams using LangGraph or CrewAI typically invest in building and maintaining their own memory systems. The service handles storage, indexing, and context-aware recall natively within the Azure platform.
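To make concrete what that custom retrieval logic typically looks like, here is a minimal, hypothetical sketch of the kind of hand-rolled memory store teams build themselves. Every name in it is illustrative; it is not an Azure or Foundry API, and it substitutes simple keyword overlap for the embedding search a real pipeline would use.

```python
# Illustrative stand-in for a DIY agent memory system -- the storage,
# indexing, and context-aware recall that Foundry's managed memory
# handles natively. Not an Azure API; all names are hypothetical.

class NaiveMemoryStore:
    def __init__(self) -> None:
        self.facts: list[str] = []

    def remember(self, fact: str) -> None:
        # In a real system this step would extract and consolidate
        # key information from a conversation transcript.
        self.facts.append(fact)

    def recall(self, query: str, top_k: int = 3) -> list[str]:
        # Rank stored facts by word overlap with the query -- a toy
        # substitute for vector similarity search.
        q = set(query.lower().split())
        scored = sorted(
            self.facts,
            key=lambda f: len(q & set(f.lower().split())),
            reverse=True,
        )
        return scored[:top_k]

memory = NaiveMemoryStore()
memory.remember("Customer prefers weekly email summaries")
memory.remember("Project Atlas deadline is Q3")
print(memory.recall("what are the customer's email preferences?")[0])
```

Even this toy version hints at the real maintenance burden: extraction quality, cross-session consolidation, and relevance ranking all have to be built and tuned by hand when the platform does not provide them.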
Both charge for model tokens with no separate orchestration fee. Azure AI Agent Service's GPT-4o pricing runs $2.50/1M input tokens and $10/1M output tokens at pay-as-you-go rates, comparable to Bedrock's Claude and Llama pricing tiers. Azure adds unique value through Agent Commit Units — pre-purchase volume discounts for committed monthly spend starting at $5,000/month. AWS counters with a broader model marketplace. For high-volume enterprise workloads, ACU discounts can meaningfully reduce total cost versus Bedrock's strictly pay-as-you-go model.
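The pay-as-you-go rates above translate directly into a monthly estimate. The sketch below uses the published GPT-4o rates from this comparison ($2.50 per 1M input tokens, $10 per 1M output tokens); the workload volumes are illustrative assumptions, not Azure figures, and ACU discount percentages are negotiated, so they are not modeled here.

```python
# Monthly pay-as-you-go cost estimate at the GPT-4o rates cited above.
# Workload sizes are assumptions for illustration only.

INPUT_RATE = 2.50 / 1_000_000    # USD per input token
OUTPUT_RATE = 10.00 / 1_000_000  # USD per output token

def monthly_token_cost(input_tokens: int, output_tokens: int) -> float:
    """Pay-as-you-go cost (USD) for one month of token usage."""
    return input_tokens * INPUT_RATE + output_tokens * OUTPUT_RATE

# Example workload: 500M input + 100M output tokens per month
cost = monthly_token_cost(500_000_000, 100_000_000)
print(f"${cost:,.2f}")  # $2,250.00
```

A workload at this scale sits below the $5,000/month ACU commitment floor, so it would stay on pay-as-you-go; roughly doubling it is where pre-purchased Agent Commit Units start to become relevant.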
Foundry Agent Service primarily supports models available through Azure OpenAI Service (GPT-4, GPT-4o, GPT-5 family) plus a curated Foundry Models catalog that includes select Meta Llama, Mistral, and partner models. Model availability is narrower than AWS Bedrock's marketplace. Verify your preferred models are available in your target Azure region before committing, as catalog availability varies by region and new models are added on a rolling basis through the Foundry Models program.
AI builders and operators use Azure AI Agent Service to streamline their workflow.
Try Azure AI Agent Service Now →
Build, deploy, and manage autonomous AI agents that use foundation models to automate complex tasks, analyze data, call APIs, and query knowledge bases — all within the Azure ecosystem with enterprise-grade security.
Compare Pricing →
Graph-based workflow orchestration framework for building reliable, production-ready AI agents with deterministic state machines, human-in-the-loop capabilities, and comprehensive observability through LangSmith integration.
Compare Pricing →
Open-source Python framework that orchestrates autonomous AI agents collaborating as teams to accomplish complex workflows. Define agents with specific roles and goals, then organize them into crews that execute sequential or parallel tasks. Agents delegate work, share context, and complete multi-step processes like market research, content creation, and data analysis. Supports 100+ LLM providers through LiteLLM integration and includes memory systems for agent learning. The project has 48K+ GitHub stars and an active community.
Compare Pricing →
Microsoft's open-source SDK for building AI agents with planners, memory, plugins, and connectors, available for C#, Python, and Java. Integrates with Azure OpenAI and other model providers and is designed for embedding AI orchestration into existing enterprise applications.
Compare Pricing →