Track how LLM API costs have evolved since GPT-4's launch in March 2023. Prices are per million tokens based on public API pricing from each provider.
Data from public pricing pages and announcements. Last updated: 2026-04-03.
Key figures:

- GPT-4 output: $60 → $10 per 1M tokens, an ~83% drop after GPT-4o replaced it
- Cheapest frontier-capable model: DeepSeek V3 at $1.10/1M output tokens
- Price index: 100 → 5.5, a 94.5% drop since March 2023
Average output price across frontier models, normalized to GPT-4 launch pricing. Lower means cheaper.
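The index described above can be sketched as a simple average of output prices, normalized so GPT-4's March 2023 launch price equals 100. The basket of models and the plain-average methodology here are assumptions for illustration, not necessarily the exact method used for the chart.

```python
# Sketch of a price index normalized to GPT-4 launch pricing.
# The model basket and averaging method are assumptions.

GPT4_LAUNCH_OUTPUT = 60.0  # $/1M output tokens, March 2023


def price_index(output_prices, base=GPT4_LAUNCH_OUTPUT):
    """Average output price as a fraction of the base, scaled to 100."""
    avg = sum(output_prices) / len(output_prices)
    return 100 * avg / base


# Hypothetical basket of current frontier output prices ($/1M tokens)
basket = [10.0, 15.0, 5.0, 2.19, 1.10]
print(round(price_index(basket), 1))
```

With GPT-4 alone in the basket at launch, the index is exactly 100; as cheaper models enter and prices fall, it declines toward the 5.5 shown above.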
Output price per million tokens over time. The table below lists all models from the pricing history, sorted by output price (highest first).
| Model | Provider | Input $/1M | Output $/1M |
|---|---|---|---|
| GPT-5.4 Pro | OpenAI | $30 | $180 |
| Claude 3 Opus | Anthropic | $15 | $75 |
| Claude Opus 4.6 | Anthropic | $15 | $75 |
| GPT-4 | OpenAI | $30 | $60 |
| o1 | OpenAI | $15 | $60 |
| o3 | OpenAI | $10 | $40 |
| GPT-4 Turbo | OpenAI | $10 | $30 |
| GPT-5.4 | OpenAI | $2.5 | $15 |
| Claude 3 Sonnet | Anthropic | $3 | $15 |
| Claude 3.5 Sonnet | Anthropic | $3 | $15 |
| Claude Sonnet 4.5 | Anthropic | $3 | $15 |
| Claude Sonnet 4.6 | Anthropic | $3 | $15 |
| Grok 4.1 | xAI | $3 | $15 |
| GPT-4o | OpenAI | $2.5 | $10 |
| GPT-4.1 | OpenAI | $2 | $8 |
| GPT-5.2 | OpenAI | $2 | $8 |
| GPT-5.1 | OpenAI | $1.5 | $6 |
| Mistral Large 3 | Mistral | $2 | $6 |
| Gemini 1.5 Pro | Google | $1.25 | $5 |
| Gemini 2.5 Pro | Google | $1.25 | $5 |
| Gemini 3.1 Pro | Google | $1.25 | $5 |
| GPT-5.4 mini | OpenAI | $0.75 | $4.5 |
| o3-mini | OpenAI | $1.1 | $4.4 |
| Claude Haiku 4.5 | Anthropic | $0.8 | $4 |
| Gemini 3 Flash | Google | $0.5 | $3 |
| DeepSeek R1 | DeepSeek | $0.55 | $2.19 |
| GPT-4.1 mini | OpenAI | $0.4 | $1.6 |
| Claude 3 Haiku | Anthropic | $0.25 | $1.25 |
| DeepSeek V3 | DeepSeek | $0.27 | $1.1 |
| GPT-4o mini | OpenAI | $0.15 | $0.6 |
| Gemini 2.5 Flash | Google | $0.15 | $0.6 |
| Grok 3 Mini | xAI | $0.3 | $0.5 |
| GPT-4.1 nano | OpenAI | $0.1 | $0.4 |
| Gemini 1.5 Flash | Google | $0.075 | $0.3 |
Prices per million tokens from official API pricing. Does not include batch, cached, or volume discounts.
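To turn per-million-token prices like those in the table into a per-request cost, multiply each token count by its rate and divide by one million. A minimal sketch (the token counts are made up for illustration; the GPT-4o prices are taken from the table above):

```python
# Estimate the dollar cost of one API request from $/1M-token prices.

def request_cost(input_tokens, output_tokens, input_price, output_price):
    """Cost in dollars, given input/output prices in $ per 1M tokens."""
    return (input_tokens * input_price + output_tokens * output_price) / 1_000_000


# Example: 2,000 input + 500 output tokens on GPT-4o ($2.50 in / $10 out)
cost = request_cost(2_000, 500, 2.5, 10.0)
print(f"${cost:.4f}")  # (2000*2.5 + 500*10) / 1e6 = $0.0100
```

Note that batch, cached, and volume discounts (excluded from the table) can reduce these figures substantially.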
Since March 2023, the average output price for frontier LLMs has dropped approximately 94.5%. Our price index fell from 100 to 5.5, driven by competition and efficiency improvements across all major providers.
DeepSeek currently offers the lowest per-token pricing among frontier-capable models, with DeepSeek V3 at $1.10/1M output tokens. Google and OpenAI also offer competitive pricing through Gemini 2.5 Flash and GPT-4o mini respectively.
Reasoning models like o1, o3, and DeepSeek R1 generate internal chain-of-thought tokens before producing the final answer. These thinking tokens consume compute but are billed as output tokens, making the effective cost per useful output token significantly higher.
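The effect of billed thinking tokens on the real price can be sketched as follows. The 3:1 thinking-to-answer ratio is an assumption chosen for illustration, not a measured figure for any model:

```python
# Effective output price when hidden chain-of-thought tokens are billed
# as output. The thinking-to-answer ratio below is an assumption.

def effective_output_price(list_price, thinking_tokens, answer_tokens):
    """Effective $/1M *useful* (answer) tokens when thinking tokens are billed."""
    billed = thinking_tokens + answer_tokens
    return list_price * billed / answer_tokens


# o3 at $40/1M output: if a response spends 3,000 thinking tokens to
# produce a 1,000-token answer, the effective rate is 4x the list price.
print(effective_output_price(40.0, 3_000, 1_000))  # 160.0
```

This is why a reasoning model's list price understates its cost relative to a standard model at the same rate.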
Over the past three years, prices have fallen consistently, driven by model distillation, hardware improvements, and provider competition. While frontier reasoning models may maintain premium pricing, standard inference prices are expected to continue declining.