DeepSeek-Coder-V2
DeepSeek's open code model. A Mixture-of-Experts (MoE) architecture covering 338 programming languages; the open-weights coder of choice for high-end self-hosting.
- Modality: Code
- License: DeepSeek License (open weights)
- Parameter size: 236B total (21B active)
- Context window: 128,000 tokens
- Released: June 17, 2024
- Last verified: May 10, 2026
- Runs locally: Yes
Strengths
- 338 programming languages
- MoE keeps active parameters manageable
- Benchmark performance competitive with closed-source coding APIs
Weaknesses
- Massive total parameter count: serious deployment hardware required (rough memory math sketched after this list)
- Lite variant (16B/2.4B active) trades quality for tractability
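To make the hardware requirement concrete: only 21B parameters are active per token (which keeps compute cost down), but all 236B must be resident in memory to serve the model, since any expert can be routed to. A minimal back-of-the-envelope sketch of the weights-only footprint; it deliberately ignores KV cache, activations, and runtime overhead, so real deployments need more:

```python
# Back-of-the-envelope memory estimate for serving DeepSeek-Coder-V2.
# Weights-only footprint: total_params * bytes_per_param.
# KV cache, activations, and framework overhead are excluded.

TOTAL_PARAMS = 236e9   # all experts must be resident in memory
ACTIVE_PARAMS = 21e9   # activated per token -- governs compute, not memory

BYTES_PER_PARAM = {
    "fp16/bf16": 2.0,
    "int8": 1.0,
    "int4": 0.5,
}

for fmt, nbytes in BYTES_PER_PARAM.items():
    gb = TOTAL_PARAMS * nbytes / 1e9
    print(f"{fmt:>9}: ~{gb:,.0f} GB for weights alone")

# fp16/bf16: ~472 GB  -> multi-GPU server territory
#      int8: ~236 GB
#      int4: ~118 GB  -> still beyond a single consumer GPU
```

Even at 4-bit quantization the weights alone exceed any single consumer GPU, which is why the Lite variant exists as the tractable option.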
Try it
| Where | Type | Notes |
|---|---|---|
| Hugging Face | weights | DeepSeek License |
| DeepSeek Platform | hosted-api | API key required |
| Ollama | local | `ollama run deepseek-coder-v2` |
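For the local route, Ollama exposes an HTTP API on its default port (11434) once the model has been pulled. A minimal sketch using only the Python standard library; it assumes Ollama is running and `ollama run deepseek-coder-v2` has already fetched the model, and the prompt is purely illustrative:

```python
# Minimal local inference against Ollama's /api/generate endpoint.
# Assumes the Ollama daemon is running on its default port and the
# deepseek-coder-v2 model has already been pulled.
import json
import urllib.request

payload = {
    "model": "deepseek-coder-v2",
    "prompt": "Write a Python function that reverses a linked list.",
    "stream": False,  # return one JSON object instead of a token stream
}

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```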
Official sources
- Model card (Hugging Face)
- Technical report (GitHub)
Change log
- — Initial entry.