Open-source Python framework for building reliable AI applications as state machines, currently undergoing Apache Software Foundation incubation.
Build AI applications as clear state machines in Python with built-in observability, debugging, and persistence — fully open source under the Apache 2.0 license.
Apache Burr (Incubating) is a free, open-source Python framework in the AI development frameworks category that models applications as explicit state machines. Licensed under Apache 2.0 with no usage limits or gated features in the core framework, it provides built-in observability, debugging, and persistence for AI agent workflows, chatbots, and multi-step pipelines.
Originally created by DAGWorks Inc. and now incubating at the Apache Software Foundation, Burr takes a fundamentally different approach to AI orchestration compared to chain-based or graph-based frameworks like LangChain and LangGraph. Instead of implicit data flows, every application step is a defined action with typed state reads and writes, connected by explicit conditional transitions. This state-machine paradigm makes complex agent behaviors—including loops, branches, retries, and human-in-the-loop checkpoints—first-class citizens that are visible, testable, and reproducible.
The framework's core API uses Python decorators to define actions and a builder pattern to wire them into applications. Developers write standard Python functions decorated with @action, specifying which state keys each action reads and writes. Transitions between actions can be conditional, enabling dynamic routing based on runtime state. This means applications are testable with standard pytest, debuggable with standard Python debuggers, and readable without learning a custom DSL or YAML configuration. The GitHub repository (github.com/DAGWorks-Inc/burr) has accumulated approximately 2,500 stars and contributions from over 40 developers, reflecting steady community growth since the project's initial release in late 2023.
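Burr's documented API centers on an `@action(reads=..., writes=...)` decorator and an `ApplicationBuilder`; the sketch below does not use Burr itself, but illustrates the same pattern in plain Python — actions tagged with the state keys they touch, wired together by explicit conditional transitions. All names here are illustrative, not Burr's actual API.

```python
# Hypothetical plain-Python sketch of the state-machine pattern described
# above: actions declare the state keys they read and write, and routing
# between them is a list of explicit (source, destination, condition) rules.

def action(reads, writes):
    """Tag a function with the state keys it reads and writes."""
    def wrap(fn):
        fn.reads, fn.writes = reads, writes
        return fn
    return wrap

@action(reads=["question"], writes=["answer"])
def answer(state):
    # A real application would call an LLM here; we echo for illustration.
    return {**state, "answer": f"echo: {state['question']}"}

@action(reads=["answer"], writes=["approved"])
def review(state):
    return {**state, "approved": len(state["answer"]) > 0}

def run(start, transitions, state):
    """Run each action, then follow the first transition whose condition holds."""
    current = start
    while current is not None:
        state = current(state)
        current = next(
            (dest for src, dest, cond in transitions
             if src is current and cond(state)),
            None,  # no matching transition: the machine halts
        )
    return state

transitions = [
    (answer, review, lambda s: True),         # always route to review
    (review, None, lambda s: s["approved"]),  # halt once approved
]

final = run(answer, transitions, {"question": "hi"})
print(final["answer"])  # echo: hi
```

Because each action is an ordinary function with declared reads and writes, it can be unit-tested in isolation with pytest, which is the testability property the paragraph above describes.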
Burr ships with a bundled local telemetry UI at no additional cost, providing step-by-step execution traces, state inspection at every transition, and the ability to replay executions from any checkpoint. Unlike hosted observability services such as LangSmith, which run as separate subscription products, Burr's UI runs entirely locally with no external accounts or API keys required.
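The trace-and-replay idea behind that UI can be sketched in a few lines: snapshot state after every step, then resume execution from any recorded checkpoint. This is a hypothetical illustration of the mechanism, not Burr's internal implementation; the `Tracer` class and step names are invented for the example.

```python
import copy

# Hypothetical sketch of trace recording and time-travel replay: every step
# appends a deep-copied state snapshot, and replay_from() re-runs the
# remaining steps starting from a chosen checkpoint.

class Tracer:
    def __init__(self):
        self.trace = []  # list of (step_name, state_snapshot)

    def record(self, name, state):
        self.trace.append((name, copy.deepcopy(state)))

    def replay_from(self, index, steps):
        """Re-run the remaining steps starting from checkpoint `index`."""
        _, state = self.trace[index]
        for name, fn in steps[index + 1:]:
            state = fn(state)
            self.record(name, state)
        return state

steps = [
    ("fetch", lambda s: {**s, "doc": "raw text"}),
    ("summarize", lambda s: {**s, "summary": s["doc"][:3]}),
]

tracer = Tracer()
state = {}
for name, fn in steps:
    state = fn(state)
    tracer.record(name, state)

# Time-travel: replay everything after the first checkpoint ("fetch").
replayed = tracer.replay_from(0, steps)
print(replayed["summary"])  # raw
```

Deep-copying each snapshot matters: without it, later steps that mutate state in place would silently corrupt earlier checkpoints.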
Persistence is handled through a pluggable backend system supporting in-memory, SQLite, PostgreSQL, Redis, and custom implementations. This enables long-running workflows to checkpoint state and resume after failures or server restarts, making Burr suitable for production deployments where reliability is non-negotiable. Built-in FastAPI integration allows applications to be exposed as REST APIs with minimal boilerplate, and the framework supports both synchronous and asynchronous execution patterns along with streaming responses.
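A pluggable persister of the kind described above can be sketched with the standard library's `sqlite3`: checkpoint the state after each step, keyed by application ID, so a restarted process can load the newest checkpoint and resume. The class name, table schema, and method names below are illustrative assumptions, not Burr's actual persister API.

```python
import json
import sqlite3

# Hypothetical sketch of a SQLite checkpoint store: save state per
# (app_id, step), and load the latest checkpoint on restart.

class SQLitePersister:
    def __init__(self, path=":memory:"):
        self.conn = sqlite3.connect(path)
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS checkpoints "
            "(app_id TEXT, step INTEGER, state TEXT, "
            "PRIMARY KEY (app_id, step))"
        )

    def save(self, app_id, step, state):
        self.conn.execute(
            "INSERT OR REPLACE INTO checkpoints VALUES (?, ?, ?)",
            (app_id, step, json.dumps(state)),
        )
        self.conn.commit()

    def load_latest(self, app_id):
        """Return (step, state) of the newest checkpoint, or (None, None)."""
        row = self.conn.execute(
            "SELECT step, state FROM checkpoints "
            "WHERE app_id = ? ORDER BY step DESC LIMIT 1",
            (app_id,),
        ).fetchone()
        return (None, None) if row is None else (row[0], json.loads(row[1]))

persister = SQLitePersister()
persister.save("chat-42", 1, {"messages": ["hello"]})
persister.save("chat-42", 2, {"messages": ["hello", "world"]})

# After a crash or restart, resume from the latest checkpoint.
step, state = persister.load_latest("chat-42")
print(step, state["messages"])  # 2 ['hello', 'world']
```

Swapping SQLite for PostgreSQL or Redis changes only the storage calls, which is the point of keeping persistence behind a small pluggable interface.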
The project entered the Apache Software Foundation Incubator in 2025, transitioning governance from DAGWorks Inc. to a vendor-neutral community model. This incubation status brings structured IP clearance, transparent release processes, and a defined path toward graduation as a top-level Apache project. For teams evaluating the framework, the primary trade-off is ecosystem breadth versus architectural clarity: LangChain offers significantly more pre-built integrations and community extensions, while Burr provides stronger guarantees around state visibility, reproducibility, and debuggability—qualities particularly valued in regulated industries and enterprise environments that require auditable AI workflows.
Apache Burr (Incubating) establishes a clear paradigm for AI application development by treating every workflow as an explicit state machine. Its bundled telemetry UI, pluggable persistence, and framework-agnostic design make it a compelling choice for teams prioritizing observability and reliability. The trade-off is a smaller ecosystem and more upfront design effort compared to larger frameworks.
Define applications as explicit state machines with decorator-based actions and conditional transitions, then inspect and replay executions through the built-in Burr UI's trace viewer and state inspector.
Works with any LLM provider (OpenAI, Anthropic, local models) and any Python library, with no vendor lock-in or required abstraction layers.
Every installation includes a local web UI for step-by-step execution traces, state inspection, and time-travel debugging — no external services or accounts required.
Pluggable persistence backends support in-memory, SQLite, PostgreSQL, Redis, and custom stores for checkpointing, recovery, and long-running workflow state.
Deploy applications as web services with built-in FastAPI support, enabling straightforward scaling and integration into existing service architectures.
Currently incubating at the ASF, benefiting from its proven governance model with vendor-neutral oversight, transparent development, and community-driven roadmap.
Deep introspection capabilities allow examination of state at every step, enabling reproducible debugging and comprehensive audit trails for compliance.
Seamlessly manages state containing text, images, embeddings, and structured data across actions and transitions in complex AI pipelines.
Pricing: the core open-source framework is free; Burr Cloud is free during its beta, with post-GA pricing not yet announced.
Apache Burr's most significant 2025–2026 milestone is its acceptance into the Apache Software Foundation Incubator, establishing vendor-neutral governance. The project has added streaming support, async execution, improved multi-agent patterns, and Burr Cloud beta for hosted observability.