aitoolsatlas.ai


© 2026 aitoolsatlas.ai. All rights reserved.


Developer Tools

AI Gateway

Databricks' central AI governance layer for LLM endpoints, MCP servers, and coding agents. Provides enterprise governance with a unified UI, observability, permissions, guardrails, and capacity management across providers.

Starting at: Free
Visit AI Gateway →

Overview

Databricks AI Gateway is a Developer Tools entry in our directory: a governance layer that centralizes control over LLM endpoints, MCP servers, and coding agents, with pricing set through Databricks enterprise contracts (Beta features currently incur no charges). It is built for enterprise data platform teams, AI platform engineers, and ML governance leads operating multi-provider AI stacks who need unified visibility, access control, and cost management across dozens of model endpoints and agent integrations.

As of April 15, 2026, AI Gateway (Beta) sits natively inside the Databricks workspace left navigation, providing a single control plane for three distinct governance domains: LLM endpoints (including external models, Foundation Model APIs, and custom-served models), Model Context Protocol (MCP) servers, and coding agent integrations like Cursor, Gemini CLI, Codex CLI, and Claude Code. Account admins enable the feature via the account console Previews page, and endpoints are queryable through the standard OpenAI client plus other supported APIs, making migration from direct provider calls essentially drop-in. The gateway exposes usage analytics through Unity Catalog system tables, payload logging via inference tables stored as Delta tables, configurable rate limits for cost and capacity management, and safety guardrails applied consistently across providers.

Based on our analysis of 870+ AI tools in our directory, AI Gateway differentiates itself from standalone gateway offerings like Portkey, Kong AI Gateway, and LiteLLM Proxy by being deeply integrated with Unity Catalog for governance lineage — meaning permissions, audit logs, and inference tables all share the same Databricks RBAC model that governs the rest of your lakehouse data. Compared to the other Developer Tools in our directory focused on LLM orchestration, this is the only option that also natively governs MCP servers (both Databricks-managed and external) alongside traditional LLM endpoints, and hosts custom MCP servers as Databricks Apps. The tradeoff is that AI Gateway only makes sense if you are already a Databricks customer; teams without an existing lakehouse commitment will find lighter-weight gateways easier to adopt.

🎨

Vibe Coding Friendly?

Difficulty: Intermediate

Suitability for vibe coding depends on your experience level and the specific use case.

Learn about Vibe Coding →

Key Features

Unified governance across LLMs, MCPs, and coding agents

A single left-nav product in the Databricks workspace covers three historically separate concerns — model endpoint governance, MCP server governance, and developer coding agent governance. Permissions, audit logs, and rate limits are configured from one UI, which eliminates the common pattern of stitching together a separate LLM gateway, MCP proxy, and developer-tool audit system.

Unity Catalog inference tables

Every request and response flowing through AI Gateway endpoints can be logged to Unity Catalog Delta tables for full payload-level audit, replay, and quality monitoring. Because these inference tables are native Delta, they are immediately queryable via SQL, notebooks, BI tools, and downstream ML monitoring workflows with row- and column-level access controls already in place.

OpenAI-compatible query API

AI Gateway endpoints are queryable using the standard OpenAI client plus other supported APIs, so existing application code pointing at OpenAI or other providers can be redirected to AI Gateway with minimal refactoring. This lowers migration friction and lets teams adopt governance without rewriting their agent or application code paths.
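To make the "minimal refactoring" claim concrete, here is a stdlib-only sketch of what a redirected request looks like. The workspace host, endpoint name, token, and invocation path below are hypothetical placeholders, not values from the Databricks docs; the point is that the body stays a standard OpenAI-style chat payload, and only the URL and auth header change.

```python
import json
import urllib.request

# Hypothetical values: the real host, endpoint name, and token come from
# your own Databricks workspace.
WORKSPACE_HOST = "https://example.cloud.databricks.com"
ENDPOINT_NAME = "my-gateway-endpoint"
TOKEN = "dapi-example-token"

# The same chat-completions payload you would send to any
# OpenAI-compatible API; no changes needed here.
payload = {
    "model": ENDPOINT_NAME,
    "messages": [{"role": "user", "content": "Summarize our Q1 metrics."}],
}

# Only the base URL and the Authorization header differ from a direct
# provider call. The path below is an assumed serving-endpoint route.
req = urllib.request.Request(
    url=f"{WORKSPACE_HOST}/serving-endpoints/{ENDPOINT_NAME}/invocations",
    data=json.dumps(payload).encode(),
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "Content-Type": "application/json",
    },
)

print(req.full_url)
```

In practice you would swap `urllib` for the OpenAI client and pass the gateway URL as its base URL; application code that builds messages and reads completions does not change.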

Rate limits and guardrails

Administrators can configure per-endpoint rate limits to cap capacity and cost, and apply safety guardrails that run consistently across providers to block unsafe prompts or responses. This enforces the same policy regardless of whether traffic is routed to OpenAI, Anthropic, a Databricks Foundation Model, or a custom-served model.
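To illustrate the mechanism behind a per-endpoint cap, here is a toy token-bucket sketch. This is not AI Gateway's implementation or API (its limits are configured through the Databricks UI and endpoint settings); it only shows the behavior an admin-configured limit produces: a bounded burst, then rejections until tokens refill.

```python
import time
from dataclasses import dataclass, field

@dataclass
class EndpointLimiter:
    """Toy token bucket modeling a per-endpoint rate limit."""
    rate_per_sec: float  # sustained requests per second allowed
    burst: float         # maximum burst size
    tokens: float = 0.0
    last: float = field(default_factory=time.monotonic)

    def __post_init__(self) -> None:
        self.tokens = self.burst  # start with a full bucket

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill tokens for elapsed time, capped at the burst size.
        self.tokens = min(self.burst,
                          self.tokens + (now - self.last) * self.rate_per_sec)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        # At a real gateway this is where a 429 would be returned.
        return False

limiter = EndpointLimiter(rate_per_sec=5.0, burst=2.0)
decisions = [limiter.allow() for _ in range(3)]  # burst of 2, then rejected
print(decisions)  # → [True, True, False]
```

Because the gateway sits in front of every provider, one limiter configuration covers OpenAI, Anthropic, and custom-served models alike, which is the "same policy regardless of routing" property described above.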

Coding agent integrations

Documented first-class integrations with Cursor, Claude Code, Gemini CLI, and Codex CLI route developer traffic through AI Gateway so platform teams can attribute token spend, enforce quotas, and capture prompt/response logs. This brings developer AI tooling under the same governance umbrella as production LLM workloads — an increasingly important requirement as coding agents become standard developer infrastructure in 2026.

Pricing Plans

Beta (Current)

Free

  • ✓ Full AI Gateway feature set during Beta period
  • ✓ Unified governance for LLM endpoints, MCP servers, and coding agents
  • ✓ Unity Catalog inference tables and system tables
  • ✓ Rate limits and safety guardrails
  • ✓ Coding agent integrations (Cursor, Claude Code, Gemini CLI, Codex CLI)
  • ✓ Standard Databricks compute and serving charges still apply

Enterprise (Post-GA)

Contact Sales

  • ✓ All Beta features with enterprise SLAs
  • ✓ Pricing set through Databricks enterprise contracts
  • ✓ Bundled with the Databricks platform; no standalone purchase available
  • ✓ Volume-based pricing aligned with the existing Databricks DBU model
  • ✓ Contact your Databricks account team for a custom quote
See Full Pricing → · Free vs Paid → · Is it worth it? →

Ready to get started with AI Gateway?

View Pricing Options →

Best Use Cases

🎯

A large enterprise running multiple LLM providers (OpenAI, Anthropic, Databricks Foundation Models, custom serving endpoints) needs a single governance plane with consistent rate limits, guardrails, and audit across all of them

⚡

Platform teams rolling out Cursor, Claude Code, or Codex CLI to hundreds of developers and needing to centrally attribute token spend, enforce quotas, and capture prompt/response logs for compliance

🔧

Regulated organizations (financial services, healthcare) needing payload-level audit logs of every LLM interaction stored in Unity Catalog Delta tables with lineage and RBAC

🚀

AI platform teams deploying MCP servers — Databricks-managed, external, or custom — and needing unified access control, visibility, and audit logging across all MCP interactions

💡

Data science organizations already invested in Databricks Unity Catalog who want LLM governance to inherit the same permission model as their lakehouse tables

🔄

Cost management scenarios where finance needs per-team or per-workspace chargeback for LLM and coding agent usage, backed by system-table queries

Limitations & What It Can't Do

We believe in transparent reviews. Here's what AI Gateway doesn't handle well:

  • ⚠ Not available outside Databricks — the entire product is a Databricks-native control plane with no standalone or multi-cloud-agnostic deployment
  • ⚠ Beta status (as of April 15, 2026) means APIs, UI, and feature coverage may change before general availability
  • ⚠ Custom MCP server hosting requires packaging as a Databricks App, which adds platform-specific deployment overhead compared to hosting MCP servers on generic infrastructure
  • ⚠ Two parallel AI Gateway products (new Beta vs. previous serving-endpoints version) create migration and documentation complexity for existing customers
  • ⚠ Public pricing is not published, so budgeting and vendor comparison require direct engagement with Databricks account teams

Pros & Cons

✓ Pros

  • ✓ Native integration with Unity Catalog means permissions, audit logs, and lineage work identically to the rest of your Databricks data assets without extra IAM plumbing
  • ✓ OpenAI-compatible client interface allows existing application code to point at AI Gateway endpoints with minimal refactoring
  • ✓ Governs three distinct asset types (LLM endpoints, MCP servers, coding agents) in a single pane of glass — rare across the 870+ tools in our directory
  • ✓ No charges during Beta (confirmed in the docs as of April 15, 2026), letting teams pilot full governance workflows before committing to enterprise pricing
  • ✓ Supports major coding agents including Cursor, Claude Code, Gemini CLI, and Codex CLI, covering the dominant agent tools developers use in 2026
  • ✓ Inference tables land as Delta tables in Unity Catalog, making audit and monitoring queries trivially accessible via SQL or notebooks

✗ Cons

  • ✗ Only available inside the Databricks platform — teams not already on Databricks cannot adopt AI Gateway as a standalone product
  • ✗ Currently in Beta, meaning the feature set, APIs, and limits may shift before GA, and enterprise SLAs may not apply
  • ✗ Two parallel versions exist (the new AI Gateway in the left nav vs. the previous AI Gateway for serving endpoints), which creates documentation and migration ambiguity
  • ✗ Custom MCP server hosting requires packaging as a Databricks App, adding a layer of platform-specific deployment knowledge
  • ✗ Pricing is opaque, enterprise-contract based, with no public tier breakdown, making TCO comparisons against standalone gateways difficult

Frequently Asked Questions

How is the new AI Gateway different from the previous AI Gateway for serving endpoints?

The new AI Gateway, launched in Beta and visible in the left nav of the Databricks UI, is a broader central governance layer that covers LLM endpoints, MCP servers, and coding agents together. The previous AI Gateway was scoped only to model serving endpoints — external model endpoints, Foundation Model API endpoints, and custom model endpoints — and focused on usage tracking, payload logging, rate limits, and guardrails at the endpoint level. Both versions coexist in the documentation as of April 15, 2026, and Databricks recommends account admins enable the new version from the account console Previews page. Existing serving-endpoint governance continues to function while teams migrate.

Does AI Gateway cost extra on top of Databricks?

According to the official documentation, AI Gateway features do not incur charges during the Beta period. Standard Databricks consumption charges for model serving, DBU usage, and underlying compute still apply, and once the product moves to GA, enterprise pricing will be set through standard Databricks contracts. Because pricing is not published publicly, prospective customers should request a quote through their Databricks account team. This makes the Beta window a good opportunity to pilot full governance before any commercial commitment.

Which coding agents can I integrate with AI Gateway?

The documentation explicitly calls out support for Cursor, Gemini CLI, Codex CLI, and Claude Code, which covers most of the dominant AI coding agents developers use in 2026. Integration routes each agent's model calls through the AI Gateway, so prompt/response payloads, token usage, and cost attribution are captured in Unity Catalog inference tables. This lets platform teams apply the same rate limits and guardrails to developer coding traffic that they apply to production LLM workloads. Other OpenAI-compatible agents can also point at AI Gateway endpoints using the OpenAI client.

What can I do with the MCP server governance features?

AI Gateway supports three MCP deployment patterns: Databricks-managed MCP servers that expose native platform features, external MCP servers connected through managed connections, and custom MCP servers hosted as Databricks Apps. For each, AI Gateway enforces access control through Unity Catalog permissions and logs every MCP interaction for audit. Non-Databricks MCP clients can also connect to Databricks-hosted MCP servers through documented client connection flows. This unified governance is differentiated from pure LLM gateways — based on our analysis of 870+ AI tools, AI Gateway is the only offering that natively governs MCP servers alongside LLM endpoints.

How do I monitor usage, cost, and audit logs?

AI Gateway emits two complementary telemetry streams into Unity Catalog. System tables capture endpoint-level usage and cost aggregates for budgeting and chargeback, while inference tables capture full request and response payloads as Delta tables for granular audit, replay, and quality monitoring. Both are queryable through standard SQL, notebooks, or BI tools, and inherit Unity Catalog row- and column-level access controls. Rate limits can be configured per endpoint to cap capacity and prevent runaway cost, and guardrails can be applied to block unsafe content across providers consistently.
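The chargeback scenario above amounts to an aggregate query over the usage system table. As a sketch, here is the shape such a query could take; the table name (`system.serving.endpoint_usage`) and column names are assumptions for illustration, not the documented schema, so check your workspace's system catalog before using them.

```python
# Sketch of a per-endpoint monthly chargeback query against an assumed
# Unity Catalog system table. Table and column names are illustrative.
CHARGEBACK_SQL = """
SELECT
  workspace_id,
  served_entity_name,
  DATE_TRUNC('month', request_time) AS billing_month,
  SUM(total_tokens)                 AS tokens,
  COUNT(*)                          AS requests
FROM system.serving.endpoint_usage  -- assumed table name
WHERE request_time >= DATE_SUB(CURRENT_DATE(), 30)
GROUP BY workspace_id, served_entity_name,
         DATE_TRUNC('month', request_time)
ORDER BY tokens DESC
"""

print(CHARGEBACK_SQL.strip())
```

Because both system tables and inference tables are ordinary Delta tables, the same query runs unchanged from a SQL warehouse, a notebook, or a BI tool, with Unity Catalog access controls applied automatically.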
🦾

New to AI tools?

Learn how to run your first agent with OpenClaw

Learn OpenClaw →


What's New in 2026

As of April 15, 2026, Databricks has launched a new AI Gateway (Beta) visible in the workspace left navigation that expands governance beyond model serving endpoints to also cover MCP servers and coding agents including Cursor, Claude Code, Gemini CLI, and Codex CLI. The previous AI Gateway for serving endpoints remains available in parallel. AI Gateway features do not incur charges during the Beta period, and account admins enable access through the account console Previews page.

Alternatives to AI Gateway

LiteLLM

Deployment & Hosting

Y Combinator-backed open-source AI gateway and unified API proxy for 100+ LLM providers with load balancing, automatic failovers, spend tracking, budget controls, and an OpenAI-compatible interface for production applications.

Cloudflare AI Gateway

Deployment & Hosting

Observe and control AI applications with caching, rate limiting, and analytics for any LLM provider.

Helicone

Analytics & Monitoring

Open-source LLM observability platform and API gateway that provides cost analytics, request logging, caching, and rate limiting through a simple proxy-based integration requiring only a base URL change.

View All Alternatives & Detailed Comparison →


Quick Info

Category

Developer Tools

Website

docs.databricks.com/aws/en/ai-gateway/
🔄 Compare with alternatives →

Try AI Gateway Today

Get started with AI Gateway and see if it's the right fit for your needs.

Get Started →


More about AI Gateway

Pricing · Review · Alternatives · Free vs Paid · Pros & Cons · Worth It? · Tutorial

📚 Related Articles

5 Undiscovered AI Tools You Should Try This Week (March 2026)

Hidden gems in the AI agent tooling space — from browser infrastructure to memory platforms to observability tools. These production-ready tools solve real problems that most developers haven't discovered yet.

2026-03-10 · 5 min read