Complete pricing guide for Together AI. Compare all plans, analyze costs, and find the perfect tier for your needs.
Not sure if free is enough? See our Free vs Paid comparison →
Still deciding? Read our full verdict on whether Together AI is worth it →
Pricing sourced from Together AI · Last verified March 2026
Together AI provides access to open-source models (Llama, Mistral, DeepSeek) through an OpenAI-compatible API. Key advantages include 5-20x lower costs per token, faster inference speeds through custom optimizations, and access to specialized models. The tradeoff is that even the best open-source models may lag behind GPT-4 on complex reasoning tasks, though the gap is rapidly narrowing with models like Llama 3.3 and DeepSeek-V3.
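Because the API speaks the OpenAI chat-completions format, switching is mostly a matter of pointing requests at Together's base URL. A minimal sketch using only the Python standard library; the model id and placeholder key are illustrative:

```python
import json
import urllib.request

TOGETHER_URL = "https://api.together.xyz/v1/chat/completions"

def build_chat_request(model: str, prompt: str, api_key: str) -> urllib.request.Request:
    """Build an OpenAI-style chat-completions request aimed at Together AI."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        TOGETHER_URL,
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request(
    "meta-llama/Llama-3.3-70B-Instruct-Turbo",  # example model id
    "Explain LoRA in one sentence.",
    "YOUR_TOGETHER_API_KEY",
)
# urllib.request.urlopen(req) would send it; that step needs a real API key.
```

The official OpenAI Python SDK works the same way: construct the client with your Together key and a base_url of https://api.together.xyz/v1, and the rest of the calling code is unchanged.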
Yes, Together AI implements OpenAI-compatible function calling across supported models, including Llama, Mistral, and other major families. The implementation uses the same tools/function-calling request format, so existing agent code written against the OpenAI SDK works with minimal changes. Function calling quality varies by model size: larger models (70B+) generally produce more reliable tool calls than smaller ones.
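In practice, that means the same tools schema you would send to OpenAI can go to Together unchanged. A hedged sketch of the request body; the get_weather tool and model id are made up for illustration:

```python
import json

# Hypothetical tool definition in the OpenAI function-calling schema.
get_weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Look up the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}

request_body = {
    "model": "meta-llama/Llama-3.3-70B-Instruct-Turbo",  # example model id
    "messages": [{"role": "user", "content": "What's the weather in Lisbon?"}],
    "tools": [get_weather_tool],
    "tool_choice": "auto",  # let the model decide whether to call the tool
}

# POST request_body to the chat-completions endpoint; the response follows
# OpenAI's shape, with any tool calls under choices[0].message.tool_calls.
print(json.dumps(request_body, indent=2))
```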
Yes, Together AI provides comprehensive fine-tuning capabilities for customizing open-source models on your data. You can fine-tune Llama, Mistral, and other supported base models using instruction tuning, domain adaptation, or full fine-tuning. The platform supports advanced techniques like LoRA and QLoRA for efficient training. Fine-tuned models are automatically deployed for inference through the same API with usage-based pricing.
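A rough sketch of what a LoRA fine-tuning job submission might look like. The field names below follow the general shape of this kind of API but are illustrative, not Together's exact schema; the base model, file id, and adapter-rank parameter are all hypothetical:

```python
# Hypothetical fine-tuning job payload (field names are illustrative).
fine_tune_job = {
    "model": "meta-llama/Meta-Llama-3.1-8B-Instruct",  # example base model
    "training_file": "file-abc123",  # id of a previously uploaded JSONL dataset
    "n_epochs": 3,
    # LoRA trains small low-rank adapter matrices instead of all weights,
    # which is why it is much cheaper than full fine-tuning.
    "lora": True,
    "lora_r": 16,  # adapter rank (hypothetical parameter name)
    "learning_rate": 1e-5,
}

# Submitting this to the fine-tuning endpoint returns a job id you can poll;
# once trained, the model is callable through the same chat-completions API.
print(fine_tune_job["model"])
```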
Dedicated endpoints provide reserved GPU capacity with guaranteed performance and sub-100ms latency SLAs. They're ideal for production applications requiring consistent performance, high-volume workloads, or custom model hosting. Unlike serverless inference which shares resources, dedicated endpoints give you isolated infrastructure. Pricing is based on hourly GPU reservations rather than per-token usage.
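The two pricing models make break-even easy to estimate: divide the hourly GPU reservation cost by the serverless per-token price. The rates below are hypothetical placeholders, not Together's actual prices:

```python
def breakeven_tokens_per_hour(gpu_cost_per_hour: float,
                              price_per_million_tokens: float) -> float:
    """Tokens/hour at which a dedicated endpoint matches serverless cost."""
    return gpu_cost_per_hour / price_per_million_tokens * 1_000_000

# Hypothetical rates: $2.70/hour dedicated GPU vs $0.90 per 1M serverless tokens.
tokens = breakeven_tokens_per_hour(2.70, 0.90)
print(f"{tokens:,.0f} tokens/hour")  # about 3,000,000
```

Under these made-up numbers, sustained traffic above roughly 3M tokens/hour favors the reserved endpoint; below that, per-token serverless pricing is cheaper.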
Together AI offers 99.9% uptime SLA on dedicated endpoints and maintains high availability on serverless infrastructure. The platform is SOC 2 Type II certified with enterprise security features. For mission-critical applications, dedicated endpoints provide the most reliable option with guaranteed capacity and consistent performance. Enterprise plans include priority support and custom SLAs.
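For context, a 99.9% uptime SLA bounds downtime at about 43 minutes over a 30-day month. The quick arithmetic, assuming a 30-day month:

```python
def max_downtime_minutes(uptime_pct: float, days: int = 30) -> float:
    """Allowed downtime, in minutes, over `days` days at a given uptime %."""
    total_minutes = days * 24 * 60
    return round(total_minutes * (1 - uptime_pct / 100), 2)

print(max_downtime_minutes(99.9))   # 43.2
print(max_downtime_minutes(99.99))  # 4.32, for comparison
```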
AI builders and operators use Together AI to streamline their workflow.
Try Together AI Now →