DeepSeek V3
DeepSeek's frontier open-weight MoE model. 671B total parameters with 37B active per token; competitive with closed-tier models at a fraction of the inference cost.
- Modality: Text
- License: DeepSeek License (open weights)
- Parameter size: 671B (37B active)
- Context window: 128,000 tokens
- Released: December 26, 2024
- Last verified: May 10, 2026
- Runs locally: Yes
Strengths
- Frontier-tier quality at open-weights cost
- MoE architecture keeps active params manageable
- Strong coding and math performance
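The MoE point above can be made concrete: a learned router scores every expert for each token, keeps only the top-k, and mixes their outputs, so only a small fraction of total weights is touched per token. A minimal sketch with toy sizes (not DeepSeek's actual router design or its real expert counts):

```python
import numpy as np

rng = np.random.default_rng(0)
n_experts, top_k, d = 8, 2, 16   # toy sizes, not DeepSeek V3's real config
experts = [rng.standard_normal((d, d)) for _ in range(n_experts)]  # toy "FFN" experts
router_w = rng.standard_normal((d, n_experts))                     # routing matrix

def moe_forward(x: np.ndarray) -> np.ndarray:
    """Route one token vector to its top-k experts and mix their outputs."""
    logits = x @ router_w
    top = np.argsort(logits)[-top_k:]                  # indices of top-k experts
    gates = np.exp(logits[top])
    gates /= gates.sum()                               # softmax over selected experts
    # Only top_k of n_experts weight matrices are used for this token.
    return sum(g * (x @ experts[i]) for g, i in zip(gates, top))

y = moe_forward(rng.standard_normal(d))
print(y.shape)
```

With top-2 of 8 experts, each token exercises a quarter of the expert weights; the same mechanism is how V3 keeps 37B parameters active out of 671B total.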
Weaknesses
- Massive deployment footprint; self-hosting is impractical for most users
- Custom DeepSeek License carries use-based restrictions and is not OSI-approved
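The footprint claim is easy to sanity-check with back-of-the-envelope arithmetic. Assuming 1 byte per parameter (FP8, the precision the V3 technical report emphasizes; BF16 doubles it), the weights alone exceed any single consumer GPU by two orders of magnitude:

```python
total_params = 671e9        # 671B total parameters
bytes_per_param = 1         # FP8 assumption; use 2 for BF16/FP16
weight_gb = total_params * bytes_per_param / 1e9
# Weights only; KV cache and activations add more at serving time.
print(f"{weight_gb:.0f} GB of weights")
```

At roughly 671 GB in FP8, serving requires a multi-GPU node, which is the basis of the self-hosting weakness above.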
Try it
| Where | Type | Notes |
|---|---|---|
| Hugging Face | weights | DeepSeek License |
| DeepSeek Platform | hosted-api | API key required |
| OpenRouter | hosted-api | API key required |
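A minimal sketch of calling the hosted API from the table above. DeepSeek's endpoint is OpenAI-compatible; the base URL and the `deepseek-chat` model alias (which maps to V3) follow DeepSeek's published convention but should be verified against current docs. The sketch builds the request body and only touches the network when a key is configured:

```python
import json
import os
import urllib.request

API_URL = "https://api.deepseek.com/chat/completions"  # OpenAI-compatible route

def build_request(prompt: str, model: str = "deepseek-chat") -> dict:
    """Assemble a chat-completions request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }

body = build_request("Say hello in one word.")
key = os.environ.get("DEEPSEEK_API_KEY")
if key:  # only hit the network when an API key is set
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(body).encode(),
        headers={"Authorization": f"Bearer {key}",
                 "Content-Type": "application/json"},
    )
    print(urllib.request.urlopen(req).read().decode())
else:
    print(json.dumps(body))  # dry run: show the payload that would be sent
```

OpenRouter uses the same request shape with its own base URL and a provider-prefixed model name.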
Version history
- DeepSeek V4 Flash (May 2026)
- DeepSeek V4 Pro (May 2026)
- DeepSeek V3 (Dec 2024): this entry; deprecated
Official sources
- Model card
- Technical report (GitHub)
Change log
- — Initial entry.