Stay free if you only need the full framework under the Apache 2.0 license and the Burr UI for local observability and debugging. Upgrade if you need hosted telemetry, an observability dashboard, and team-based access controls and collaboration. Most solo builders can start free.
Why it matters: State machine concept requires upfront design thinking and may have a learning curve for developers unfamiliar with the pattern.
Available from: Burr Cloud (Beta)
Why it matters: Smaller ecosystem compared to LangChain with fewer pre-built integrations and community plugins.
Why it matters: Python-only framework with no support for other programming languages or cross-language workflows.
Why it matters: More verbose setup compared to quick-start frameworks that hide complexity behind high-level abstractions.
Why it matters: Burr Cloud enterprise features still in beta with pricing not yet publicly finalized.
While basic understanding helps, Burr's state machine model is intentionally simple. Actions define inputs and outputs, and transitions specify which action runs next. The decorator-based API makes it feel like writing standard Python functions with clear control flow.
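To make the state machine pattern concrete, here is a minimal, dependency-free sketch of the decorator-plus-transitions idea. The names (`action`, `run`, the registry) are illustrative only and are not Burr's actual API; each action is an ordinary Python function that updates state and names the next step.

```python
# Conceptual sketch of a decorator-based state machine.
# Illustrative only -- these names are NOT Burr's actual API.

ACTIONS = {}

def action(name):
    """Register a plain function as a named action."""
    def register(fn):
        ACTIONS[name] = fn
        return fn
    return register

@action("draft")
def draft(state):
    state["text"] = "hello"
    return "review"          # transition: name of the next action

@action("review")
def review(state):
    state["approved"] = True
    return None              # terminal action: no next step

def run(start, state):
    """Run actions until one returns no next step."""
    step = start
    while step is not None:
        step = ACTIONS[step](state)
    return state

final = run("draft", {})
```

Because every transition is explicit, the whole control flow is visible in one place, which is what makes this style debuggable.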
Burr is completely framework-agnostic. Actions are standard Python functions, so you can call OpenAI, Anthropic, local models via Ollama, or any other provider. There is no built-in LLM abstraction layer that forces you into a specific integration.
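The claim above can be illustrated with a short sketch: since an action is just a function, whichever client you inject works. The two stub clients below stand in for real SDK calls (for example an OpenAI or Ollama call); they are placeholders, not real integrations.

```python
# Sketch: an action is a plain function, so any LLM client fits.
# The stubs below stand in for real provider SDK calls.

def openai_stub(prompt):
    return f"openai: {prompt}"

def ollama_stub(prompt):
    return f"ollama: {prompt}"

def answer_action(state, llm_call):
    """Works with whichever client callable is injected."""
    state["answer"] = llm_call(state["question"])
    return state

s1 = answer_action({"question": "hi"}, openai_stub)
s2 = answer_action({"question": "hi"}, ollama_stub)
```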
Burr's telemetry UI is built-in and free, providing step-by-step execution traces, state inspection, and time-travel debugging out of the box. LangSmith is a separate paid service starting at $39 per seat per month. Burr's approach requires no external accounts or API keys for local debugging.
Yes. Burr includes FastAPI integration, persistent state backends, and robust error handling suitable for production. It is released under the permissive Apache 2.0 license and actively maintained, though the project is still relatively young, so users should evaluate its maturity for their specific requirements.
Burr's overhead is minimal since it primarily orchestrates function calls and manages state transitions. The actual computational work happens in your actions (LLM calls, data processing), and Burr adds negligible latency to the orchestration layer.
Migration involves restructuring chain logic into actions and transitions. Since Burr actions are plain Python functions, existing LangChain tool integrations can often be wrapped directly. The main effort is in redesigning the flow as an explicit state machine.
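The wrapping step mentioned above can be sketched as follows. `SearchTool` and `make_action` are hypothetical names for illustration; the point is that any existing tool object with a callable interface can be closed over by a plain function, which is all an action needs to be.

```python
# Sketch: wrapping an existing tool object in an action.
# SearchTool and make_action are hypothetical illustration names.

class SearchTool:
    """Stand-in for an existing LangChain-style tool."""
    def run(self, query):
        return f"results for {query}"

def make_action(tool):
    # The wrapper is itself a plain function: a valid action body.
    def tool_action(state):
        state["results"] = tool.run(state["query"])
        return state
    return tool_action

search_action = make_action(SearchTool())
out = search_action({"query": "burr"})
```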
The open-source version includes community support via Discord and GitHub. Burr Cloud (currently in beta) is planned to offer hosted observability and team features. Beta access is currently free; post-GA pricing has not been publicly announced but is expected to follow industry-standard per-seat or usage-based models. The permissive Apache 2.0 license ensures the open-source core remains available regardless of commercial offerings.
Yes. Burr applications can run concurrently with isolated state, making it straightforward to orchestrate multiple agents or parallel workflows within a single service.
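A minimal sketch of the isolation idea, using only the standard library: each concurrent run builds and mutates its own state, so nothing is shared between agents. This is a conceptual illustration, not Burr's concurrency API.

```python
# Sketch: concurrent workflows, each with isolated per-run state.
# Conceptual illustration only -- not Burr's actual API.
from concurrent.futures import ThreadPoolExecutor

def workflow(agent_id):
    state = {"agent": agent_id, "steps": 0}  # never shared across runs
    for _ in range(3):
        state["steps"] += 1
    return state

with ThreadPoolExecutor(max_workers=2) as pool:
    # pool.map preserves input order in its results
    results = list(pool.map(workflow, ["a", "b"]))
```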
Start with the free plan; upgrade when you need more.
Get Started Free →
Still not sure? Read our full verdict →
Last verified March 2026