DeepSeek V4
Open Source • DeepSeek
DeepSeek's trillion-parameter MoE model activates only 32B parameters per token. It offers a 1M-token context window, native multimodality (text/image/video), and API pricing at roughly 1/20th that of GPT-5. It is the fastest-adopted open-source model in history, capturing 6% global market share within months.
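The headline spec (roughly 1T total parameters, only 32B active) falls out of sparse expert routing: each token is processed by a small subset of experts, so only those experts' weights are used. A toy arithmetic sketch; the expert counts and sizes below are illustrative assumptions, not DeepSeek V4's published configuration:

```python
# Sketch: why a ~1T-parameter MoE model can activate only ~32B
# parameters per token. All numbers are illustrative assumptions.

def active_params(total_experts: int, experts_per_token: int,
                  params_per_expert: float, shared_params: float) -> float:
    """Parameters actually used for one token in a sparse MoE stack."""
    # Only the routed experts plus the always-on shared weights count.
    return shared_params + experts_per_token * params_per_expert

# Hypothetical config: 360 experts of 2.75B params each, 8 routed per
# token, plus 10B shared (attention, embeddings, shared experts).
total = 360 * 2.75e9 + 10e9
active = active_params(360, 8, 2.75e9, 10e9)
print(f"total ≈ {total/1e12:.2f}T, active ≈ {active/1e9:.1f}B")
```

The total grows with the number of experts, but per-token compute scales only with the few experts actually routed to, which is what makes the pricing below possible.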
85
Overall Score
Core Specs
1M tokens
Context Window
66K tokens
Max Output
✓
Reasoning
✓
Open Source
Multimodal Support
text · image · video
Scenario Scores
User Feedback Highlights
Based on community feedback.
Sentiment: 👍 70% · 😐 25% · 👎 5%
Pros & Cons
Pros
- 1M token context window
- Native multimodal (text/image/video)
- Price 1/20th of proprietary models
- Open-source & self-hostable
- Matches frontier model performance
Cons
- Servers in China (latency for overseas users)
- Geopolitical supply chain concerns
- Self-hosting requires significant hardware
Reliability
SLA: 99.0%
Incidents (30d): 1
Last Incident: 2026-03-30
Self-hosting available. API servers in China.
Pricing
Input (per 1M tokens): $0.14
Output (per 1M tokens): $0.28
Free trial available
Updated on 2026-03-08
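At these per-million-token rates, estimating a request's cost is simple arithmetic. A minimal sketch using the listed prices:

```python
# Cost estimator using the listed DeepSeek V4 API prices:
# $0.14 per 1M input tokens, $0.28 per 1M output tokens.

INPUT_PER_M = 0.14   # USD per 1M input tokens
OUTPUT_PER_M = 0.28  # USD per 1M output tokens

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the USD cost for one request."""
    return (input_tokens / 1e6) * INPUT_PER_M + (output_tokens / 1e6) * OUTPUT_PER_M

# Example: a full 1M-token context with a 66K-token (max) output.
print(f"${estimate_cost(1_000_000, 66_000):.4f}")
```

Even a maximal request stays well under a dollar, which is where the "1/20th of proprietary models" claim becomes concrete.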
Get Started
1. Visit the provider's website
2. Create an account
3. Start using the model
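The steps above end at an OpenAI-compatible chat API. A sketch of building the request body; the model name "deepseek-v4" and the endpoint URL in the comment are assumptions for illustration, so check the provider's documentation for the actual values:

```python
# Sketch: build a /chat/completions-style request body.
# "deepseek-v4" and the endpoint URL below are assumed, not confirmed.
import json

def build_chat_request(prompt: str, model: str = "deepseek-v4") -> dict:
    """Return the JSON-serializable body for a chat completion request."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }

body = build_chat_request("Summarize this document.")
print(json.dumps(body, indent=2))
# To send: POST this body to the provider's chat-completions endpoint
# with an "Authorization: Bearer <API_KEY>" header.
```

Because the format is OpenAI-compatible, existing client libraries generally work by pointing their base URL at the provider's endpoint.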
Benchmarks
SWE-bench: —
MMLU: 91%
HumanEval: 94%
ARC-AGI: 2%
Note: Matches GPT-5 level on most benchmarks