TypeScript-native AI agent framework for building agents with tools, workflows, RAG, and memory — designed for the JavaScript/TypeScript ecosystem.
Mastra is an open-source TypeScript-native framework for building AI agents, created by the team behind Gatsby. While most AI agent frameworks are Python-first, Mastra provides first-class TypeScript support with full type safety, making it the go-to choice for teams building agents in Node.js, Next.js, and other JavaScript environments.
The framework launched in early 2025 out of Y Combinator's W25 batch, backed by $13M in seed funding. It has rapidly gained traction with 22,000+ GitHub stars and over 300,000 weekly npm downloads, with production users including PayPal, Adobe, and Replit.
Mastra provides a comprehensive set of primitives for agent development: LLM integration with 40+ providers (OpenAI, Anthropic, Google, and more), tool definition with Zod-typed schemas, graph-based workflow orchestration with intuitive control flow syntax (.then(), .branch(), .parallel()), RAG with vector store integration, and persistent memory management. All of these primitives are built around TypeScript's type system, providing autocompletion, compile-time checks, and a first-rate developer experience.
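The Zod-typed tool pattern is easiest to see in code. The sketch below is a dependency-free illustration of the idea in plain TypeScript; `defineTool`, `Schema`, and the weather tool are hypothetical stand-ins, not Mastra's actual API (Mastra's real `createTool` uses Zod schemas directly):

```typescript
// Hypothetical stand-in for a Zod-style schema: a parser that
// validates unknown input and returns a typed value (or throws).
type Schema<T> = { parse: (input: unknown) => T };

// defineTool ties the handler's argument type to the input schema,
// so the execute() signature is inferred at compile time.
function defineTool<In, Out>(tool: {
  id: string;
  description: string;
  inputSchema: Schema<In>;
  execute: (input: In) => Promise<Out>;
}) {
  return {
    ...tool,
    // run() validates untrusted input (e.g. LLM function-call args)
    // before the handler ever sees it.
    run: async (raw: unknown): Promise<Out> =>
      tool.execute(tool.inputSchema.parse(raw)),
  };
}

// A minimal schema for { city: string }.
const weatherInput: Schema<{ city: string }> = {
  parse(input) {
    if (
      typeof input === "object" &&
      input !== null &&
      typeof (input as any).city === "string"
    ) {
      return { city: (input as any).city };
    }
    throw new Error("invalid input: expected { city: string }");
  },
};

const getWeather = defineTool({
  id: "get-weather",
  description: "Return a canned forecast for a city",
  inputSchema: weatherInput,
  execute: async ({ city }) => ({ city, forecast: "sunny" }),
});

getWeather.run({ city: "Oslo" }).then(console.log); // { city: 'Oslo', forecast: 'sunny' }
```

Because the schema and the handler share one type parameter, a mismatched handler signature fails at compile time, and malformed runtime input fails validation before execution.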
The framework includes full MCP (Model Context Protocol) server authoring capabilities, enabling developers to expose agents, tools, and structured resources through standardized MCP interfaces. Mastra supports human-in-the-loop workflows with suspend/resume capabilities, built-in evaluations using model-graded and rule-based methods, comprehensive observability with tracing and logging, and flexible deployment options.
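The suspend/resume idea behind human-in-the-loop workflows can be sketched in a few lines. This is purely illustrative (the `StepResult` type and `draftStep` are made up for this example); Mastra's real workflow engine persists suspended state and exposes its own resume API:

```typescript
// A step either completes with a value or suspends, handing back a
// callback that a human (or another system) later uses to resume.
type StepResult<T> =
  | { status: "done"; value: T }
  | { status: "suspended"; resume: (humanInput: string) => Promise<T> };

// A draft-then-approve step: it produces a draft, then suspends until
// a reviewer supplies the final text.
async function draftStep(topic: string): Promise<StepResult<string>> {
  const draft = `Draft about ${topic}`;
  return {
    status: "suspended",
    resume: async (approvedText) => approvedText || draft,
  };
}

async function main() {
  const result = await draftStep("pricing");
  if (result.status === "suspended") {
    // In a real system, workflow state is persisted here and the
    // process can exit; later, a human responds and the run resumes:
    const final = await result.resume("Approved copy about pricing");
    console.log(final); // Approved copy about pricing
  }
}
main();
```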
Mastra integrates seamlessly with React, Next.js, Node.js, Express, Hono, and more, while also supporting standalone server deployment. The platform includes Mastra Studio, an interactive developer UI for visualizing agents and workflows, running and iterating on agents without wiring up a frontend, and inspecting inputs/outputs/tools/memory in one view.
The framework deploys naturally to Vercel, Cloudflare Workers, AWS Lambda, and any Node.js hosting environment. A cloud platform (Mastra Platform) with deployment, observability, and studio features is available with pricing launching Q1 2026. The framework itself remains free and open-source under the Apache 2.0 license.
Built from the ground up for TypeScript with full type safety, autocompletion, and compile-time checks. Agent tools use Zod schemas for automatic validation, type inference, and LLM function calling schema generation — not a Python port with TypeScript wrappers.
Step-based workflow engine with intuitive control flow syntax (.then(), .branch(), .parallel()) supporting sequential, parallel, and conditional execution patterns. Includes human-in-the-loop pause/resume, error handling, and retries.
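The .then()/.parallel() control-flow style can be sketched with a tiny fluent builder. This is a dependency-free illustration of the pattern, not Mastra's `createWorkflow` API, which adds typed schemas, branching, retries, and persistence on top of this idea:

```typescript
// A step is an async function; workflows compose steps.
type Step<In, Out> = (input: In) => Promise<Out>;

class Workflow<In, Out> {
  constructor(private readonly runFn: Step<In, Out>) {}

  static start<T>(): Workflow<T, T> {
    return new Workflow(async (x: T) => x);
  }

  // Sequential composition: feed this workflow's output into the next step.
  then<Next>(step: Step<Out, Next>): Workflow<In, Next> {
    return new Workflow(async (input) => step(await this.runFn(input)));
  }

  // Fan-out: run two steps on the same input concurrently.
  parallel<A, B>(a: Step<Out, A>, b: Step<Out, B>): Workflow<In, [A, B]> {
    return new Workflow(async (input) => {
      const out = await this.runFn(input);
      return Promise.all([a(out), b(out)]);
    });
  }

  run(input: In): Promise<Out> {
    return this.runFn(input);
  }
}

// Usage: normalize a query, then summarize and classify it in parallel.
const wf = Workflow.start<string>()
  .then(async (q) => q.trim().toLowerCase())
  .parallel(
    async (q) => `summary of "${q}"`,
    async (q) => (q.includes("refund") ? "billing" : "general"),
  );

wf.run("  Refund my order  ").then(console.log);
// [ 'summary of "refund my order"', 'billing' ]
```

Each combinator returns a new typed workflow, so the output type of one step is checked against the input type of the next at compile time.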
Full Model Context Protocol server authoring capabilities for exposing agents, tools, and structured resources through standardized MCP interfaces, enabling seamless integration with MCP-compatible systems like Claude Desktop and other AI tools.
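Under the hood, MCP is JSON-RPC 2.0; two of its core methods are "tools/list" (advertise tool schemas to the client) and "tools/call" (invoke a tool). The toy dispatcher below shows just those two methods to make the wire shape concrete; transport, initialization, and the rest of the protocol are omitted, and Mastra generates the actual server for you:

```typescript
type JsonRpcRequest = { jsonrpc: "2.0"; id: number; method: string; params?: any };

// One registered tool with a JSON Schema for its input.
const tools = {
  echo: {
    description: "Echo a message back",
    inputSchema: { type: "object", properties: { message: { type: "string" } } },
    handler: (args: { message: string }) => args.message,
  },
};

function handle(req: JsonRpcRequest): any {
  switch (req.method) {
    case "tools/list":
      // Client asks what tools exist; server returns names + schemas.
      return {
        jsonrpc: "2.0",
        id: req.id,
        result: {
          tools: Object.entries(tools).map(([name, t]) => ({
            name,
            description: t.description,
            inputSchema: t.inputSchema,
          })),
        },
      };
    case "tools/call": {
      // Client invokes a tool by name with JSON arguments.
      const tool = tools[req.params.name as keyof typeof tools];
      const text = String(tool.handler(req.params.arguments));
      return {
        jsonrpc: "2.0",
        id: req.id,
        result: { content: [{ type: "text", text }] },
      };
    }
    default:
      return {
        jsonrpc: "2.0",
        id: req.id,
        error: { code: -32601, message: "method not found" },
      };
  }
}

console.log(handle({ jsonrpc: "2.0", id: 1, method: "tools/list" }).result);
```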
Document processing, chunking, embedding, and vector store integration (Pinecone, pgvector, and others) for building knowledge-grounded agents with semantic memory and retrieval.
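The chunk-embed-retrieve flow can be demonstrated end to end with a toy example. Here a trivial bag-of-words vector stands in for a real embedding model and an in-memory array stands in for the vector store; production setups swap in an embedding API and Pinecone or pgvector, but the pipeline shape is the same:

```typescript
// Split a document into sentence chunks.
function chunk(text: string): string[] {
  return text.split(/(?<=\.)\s+/);
}

// Bag-of-words "embedding": map each word to its count.
function embed(text: string): Map<string, number> {
  const v = new Map<string, number>();
  for (const w of text.toLowerCase().match(/[a-z]+/g) ?? []) {
    v.set(w, (v.get(w) ?? 0) + 1);
  }
  return v;
}

// Cosine similarity between two sparse vectors.
function cosine(a: Map<string, number>, b: Map<string, number>): number {
  let dot = 0, na = 0, nb = 0;
  for (const [w, x] of a) { dot += x * (b.get(w) ?? 0); na += x * x; }
  for (const [, y] of b) nb += y * y;
  return na && nb ? dot / Math.sqrt(na * nb) : 0;
}

// Return the chunk closest to the query.
function retrieve(query: string, chunks: string[]): string {
  const q = embed(query);
  return chunks
    .map((c) => ({ c, score: cosine(q, embed(c)) }))
    .sort((x, y) => y.score - x.score)[0].c;
}

const doc =
  "Refunds are processed within 5 days. Shipping is free over $50. Support is available by email.";
console.log(retrieve("how long do refunds take", chunk(doc)));
// Refunds are processed within 5 days.
```

The retrieved chunk is what gets injected into the agent's prompt so its answer is grounded in the document.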
Model routing across OpenAI, Anthropic, Google Gemini, and 40+ other providers with a unified API, allowing agents to use the best model for each task without changing code.
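The idea behind a unified model-routing API can be sketched as a registry keyed by a "provider/model" string. Everything here is a hypothetical stub (not Mastra's actual API, and the provider calls are fakes standing in for real SDKs); the point is that agent code keeps one call signature regardless of which model is configured:

```typescript
// A chat function: prompt in, completion out.
type ChatFn = (prompt: string) => Promise<string>;

// Stub provider factories standing in for real SDK clients.
const providers: Record<string, (model: string) => ChatFn> = {
  openai: (model) => async (prompt) => `[openai:${model}] ${prompt}`,
  anthropic: (model) => async (prompt) => `[anthropic:${model}] ${prompt}`,
};

// Resolve "provider/model" against the registry.
function resolveModel(spec: string): ChatFn {
  const [provider, ...rest] = spec.split("/");
  const factory = providers[provider];
  if (!factory || rest.length === 0) throw new Error(`unknown model spec: ${spec}`);
  return factory(rest.join("/"));
}

// Agent code stays identical whichever model is configured:
const chat = resolveModel("anthropic/claude-sonnet");
chat("hello").then(console.log); // [anthropic:claude-sonnet] hello
```

Swapping models becomes a one-string configuration change rather than a code change, which is what makes per-task model selection cheap.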
Interactive development interface for running and iterating on agents without wiring up a frontend. Inspect inputs, outputs, tools, and memory in one view. Configure workflows, datasets, and evals visually.