LLM API Pricing Trends

Track how LLM API costs have evolved since GPT-4's launch in March 2023. Prices are per million tokens based on public API pricing from each provider.

Data from public pricing pages and announcements. Last updated: 2026-04-03.

Biggest Price Drop

GPT-4 output: $60 → $10 per 1M tokens (an 83% drop via the GPT-4o replacement)

Cheapest Frontier

DeepSeek V3: $1.10/1M output tokens

Price Index

100 → 5.5 (a 94.5% drop since March 2023)

LLM Price Index (March 2023 = 100)

Average output price across frontier models, normalized to GPT-4 launch pricing. Lower means cheaper.
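The index is a straightforward normalization: each period's average frontier output price divided by the March 2023 average, times 100. A minimal sketch, using illustrative placeholder prices rather than the site's actual dataset:

```python
# Normalize average frontier output prices to a base period (March 2023 = 100).
# Prices below are illustrative placeholders, not the tracker's real data.
avg_output_price = {
    "2023-03": 60.0,   # GPT-4 launch era
    "2024-06": 12.0,
    "2026-04": 3.3,
}

base = avg_output_price["2023-03"]
index = {date: round(price / base * 100, 1)
         for date, price in avg_output_price.items()}
print(index)  # {'2023-03': 100.0, '2024-06': 20.0, '2026-04': 5.5}
```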

Output Price History by Model

Output price per million tokens over time. Toggle models to compare.

Providers covered: OpenAI, Anthropic, Google, DeepSeek, Mistral, xAI.

Current LLM API Prices

All models from pricing history, sorted by output price (highest first)

Model                Provider    Input $/1M   Output $/1M
GPT-5.4 Pro          OpenAI      $30          $180
Claude 3 Opus        Anthropic   $15          $75
Claude Opus 4.6      Anthropic   $15          $75
GPT-4                OpenAI      $30          $60
o1                   OpenAI      $15          $60
o3                   OpenAI      $10          $40
GPT-4 Turbo          OpenAI      $10          $30
GPT-5.4              OpenAI      $2.50        $15
Claude 3 Sonnet      Anthropic   $3           $15
Claude 3.5 Sonnet    Anthropic   $3           $15
Claude Sonnet 4.5    Anthropic   $3           $15
Claude Sonnet 4.6    Anthropic   $3           $15
Grok 4.1             xAI         $3           $15
GPT-4o               OpenAI      $2.50        $10
GPT-4.1              OpenAI      $2           $8
GPT-5.2              OpenAI      $2           $8
GPT-5.1              OpenAI      $1.50        $6
Mistral Large 3      Mistral     $2           $6
Gemini 1.5 Pro       Google      $1.25        $5
Gemini 2.5 Pro       Google      $1.25        $5
Gemini 3.1 Pro       Google      $1.25        $5
GPT-5.4 mini         OpenAI      $0.75        $4.50
o3-mini              OpenAI      $1.10        $4.40
Claude Haiku 4.5     Anthropic   $0.80        $4
Gemini 3 Flash       Google      $0.50        $3
DeepSeek R1          DeepSeek    $0.55        $2.19
GPT-4.1 mini         OpenAI      $0.40        $1.60
Claude 3 Haiku       Anthropic   $0.25        $1.25
DeepSeek V3          DeepSeek    $0.27        $1.10
GPT-4o mini          OpenAI      $0.15        $0.60
Gemini 2.5 Flash     Google      $0.15        $0.60
Grok 3 Mini          xAI         $0.30        $0.50
GPT-4.1 nano         OpenAI      $0.10        $0.40
Gemini 1.5 Flash     Google      $0.075       $0.30

Prices per million tokens from official API pricing. Does not include batch, cached, or volume discounts.
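Per-million-token rates translate to a per-request cost with a simple formula: tokens times rate, divided by one million, summed over input and output. A minimal sketch using the GPT-4o row from the table above (the token counts are made-up example values):

```python
# Cost of one API call given per-million-token prices (USD).
def request_cost(input_tokens: int, output_tokens: int,
                 input_price_per_m: float, output_price_per_m: float) -> float:
    return (input_tokens * input_price_per_m
            + output_tokens * output_price_per_m) / 1_000_000

# GPT-4o from the table: $2.50 input / $10 output per 1M tokens.
cost = request_cost(input_tokens=2_000, output_tokens=500,
                    input_price_per_m=2.5, output_price_per_m=10.0)
print(f"${cost:.4f}")  # $0.0100
```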

Frequently Asked Questions

How much have LLM API prices dropped?

Since March 2023, the average output price for frontier LLMs has dropped approximately 94.5%. Our price index fell from 100 to 5.5, driven by competition and efficiency improvements across all major providers.

Which LLM provider is cheapest right now?

DeepSeek currently offers the lowest per-token pricing among frontier-capable models, with DeepSeek V3 at $1.10/1M output tokens. Google and OpenAI also offer competitive budget tiers with Gemini 2.5 Flash and GPT-4o mini, both at $0.60/1M output tokens.

Why do reasoning models cost more?

Reasoning models like o1, o3, and DeepSeek R1 generate internal chain-of-thought tokens before producing the final answer. These thinking tokens consume compute but are billed as output tokens, making the effective cost per useful output token significantly higher.
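This overhead is easy to quantify. A sketch, where the reasoning-token ratio is an assumed parameter (real ratios vary widely by model and task):

```python
# Effective price per useful (visible) output token when hidden reasoning
# tokens are billed at the output rate.
# overhead = reasoning tokens generated per visible output token.
def effective_output_price(output_price_per_m: float, overhead: float) -> float:
    return output_price_per_m * (1 + overhead)

# e.g. o3 at $40/1M output, assuming 4 reasoning tokens per visible token:
print(effective_output_price(40.0, 4.0))  # 200.0 effective $/1M visible tokens
```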

Will LLM prices keep dropping?

The trend over the past three years has been consistent price decreases driven by model distillation, hardware improvements, and provider competition. While frontier reasoning models may maintain premium pricing, standard inference prices are expected to continue declining.
