
Llama 4 Maverick

Open Source

Meta · Released on 2025-04-05

Meta's flagship open-source multimodal model. 17B active parameters with 400B total (128 expert MoE). 1M context window, natively multimodal with early fusion. Extremely cost-effective at $0.15/$0.60 per M tokens. Supports 12 languages.
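To illustrate the "17B active of 400B total" figure: in a mixture-of-experts (MoE) model, a router sends each token to only a few of the 128 experts, so most parameters sit idle per token. The following is a minimal, purely illustrative top-k routing sketch (real MoE routing uses a learned gating network; the parameter counts are the ones quoted above):

```python
import random

NUM_EXPERTS = 128     # Llama 4 Maverick's reported expert count
ACTIVE_PARAMS_B = 17  # parameters used per token (billions)
TOTAL_PARAMS_B = 400  # parameters stored overall (billions)

def route(token_scores, k=1):
    """Pick the top-k experts for one token from its router scores.
    Illustrative only: a real router is a trained gating network."""
    ranked = sorted(range(len(token_scores)),
                    key=lambda i: token_scores[i], reverse=True)
    return ranked[:k]

scores = [random.random() for _ in range(NUM_EXPERTS)]
chosen = route(scores, k=1)
print(f"Token routed to expert {chosen[0]} of {NUM_EXPERTS}; "
      f"~{ACTIVE_PARAMS_B}B of {TOTAL_PARAMS_B}B params active "
      f"({ACTIVE_PARAMS_B / TOTAL_PARAMS_B:.0%})")
```

This sparsity is why an MoE model with 400B stored parameters can serve tokens at roughly the compute cost of a 17B dense model, which is what makes the low per-token pricing possible.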

80
Overall Score

Core Specs

1049K
Context Window
16K
Max Output
Reasoning
Open Source
Multimodal Support
text, image

User Feedback Highlights

Based on community feedback.

Sentiment: 👍 65% · 😐 20% · 👎 15%

Pros & Cons

Pros

  • Extremely affordable ($0.15/$0.60 per 1M tokens)
  • 1M context window
  • Native multimodal (text + image)
  • Open source (Llama 4 Community License)
  • High-throughput MoE architecture

Cons

  • Coding performance below Claude/GPT
  • Benchmark gaming controversy
  • 16K max output limit
  • Knowledge cutoff August 2024

Reliability

Incidents (30d): 0
Open-source model; reliability depends on the hosting provider.

Pricing

Input (per 1M tokens): $0.15
Output (per 1M tokens): $0.60
Free trial available
Updated on 2026-03-07
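A quick sketch of estimating a request's cost from the listed rates ($0.15 input / $0.60 output per 1M tokens). Actual provider billing may differ, so treat this as an estimate, not a quote:

```python
INPUT_PER_M = 0.15   # USD per 1M input tokens (listed rate)
OUTPUT_PER_M = 0.60  # USD per 1M output tokens (listed rate)

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the estimated USD cost of one request."""
    return (input_tokens / 1_000_000 * INPUT_PER_M
            + output_tokens / 1_000_000 * OUTPUT_PER_M)

# Example: a 100K-token prompt with a 16K-token (max) completion
# costs roughly $0.0246.
print(round(estimate_cost(100_000, 16_000), 4))
```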

Get Started with Llama 4

Use via an API provider, or self-host (open source)

💡 Open source - can be self-hosted or used via API providers.
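If you use the model through an API provider, many hosts accept OpenAI-style chat payloads, including the mixed text + image message format shown below. This is an assumption, not a guarantee: the model id `llama-4-maverick` is hypothetical and the exact schema varies by provider, so check your provider's docs:

```python
import json

def multimodal_message(text: str, image_url: str) -> dict:
    """Build an OpenAI-style chat message mixing text and an image.
    Assumption: the hosting provider accepts this common format."""
    return {
        "role": "user",
        "content": [
            {"type": "text", "text": text},
            {"type": "image_url", "image_url": {"url": image_url}},
        ],
    }

payload = {
    # Hypothetical model id; each provider names the model differently.
    "model": "llama-4-maverick",
    "messages": [
        multimodal_message("What is in this photo?",
                           "https://example.com/photo.jpg"),
    ],
}
print(json.dumps(payload, indent=2))
```

When self-hosting, the request format is instead set by your serving stack rather than by the model itself.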

Benchmarks

MMMU: 73.4%
Coding accuracy: 70%
LMArena Elo: 1417