
Gemma 4

Open Source

Google · Released on 2026-04-02

Google's most capable open model family. Four sizes optimized for local hardware: E2B and E4B for mobile/edge devices, 26B MoE for speed, 31B Dense for quality. Built on Gemini 3 technology with Apache 2.0 license. Supports 140+ languages, native function calling, agentic workflows, and multimodal input.
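Since the card highlights native function calling and JSON output, here is a minimal sketch of what a tool-calling request might look like, assuming the widely used OpenAI-style chat schema. The model id `gemma-4-26b-moe`, the tool name, and the request shape are illustrative assumptions, not confirmed by this page — check the official docs for the actual format.

```python
import json

# Hypothetical tool definition in the common OpenAI-style schema;
# Gemma 4's actual function-calling format may differ.
weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Look up current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}

def build_request(prompt: str) -> dict:
    """Assemble a chat-completion payload for a locally hosted model.

    The model id 'gemma-4-26b-moe' is a placeholder, not an official name.
    """
    return {
        "model": "gemma-4-26b-moe",
        "messages": [{"role": "user", "content": prompt}],
        "tools": [weather_tool],
        "tool_choice": "auto",
    }

# Build (but do not send) a request; a local server such as an
# OpenAI-compatible endpoint would accept this JSON body.
payload = build_request("What's the weather in Lisbon?")
print(json.dumps(payload, indent=2))
```

The payload is only constructed here, not sent; wiring it to a real endpoint depends on which local runtime you host the model with.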

Overall Score: 83

Core Specs

Context Window: 256K
Max Output: 8K
Reasoning
Open Source
Multimodal Support: text, image, video, audio

User Feedback Highlights

Based on community feedback.

Sentiment: 👍 80% · 😐 20% · 👎 0%

Pros & Cons

Pros

  • Apache 2.0 open-source license (major upgrade from Gemma 3)
  • Four sizes covering mobile to workstation
  • 26B MoE activates only 3.8B parameters per token (fast inference)
  • Native multimodal: text, image, video, audio
  • 140+ language support
  • Agentic workflow ready (function calling, JSON output)

Cons

  • Larger models require significant hardware (80GB GPU unquantized)
  • Context window smaller than cloud Gemini (256K vs 1M)
  • Brand-new release; ecosystem still developing
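The hardware caveat above can be sanity-checked with back-of-envelope arithmetic: model weights alone need roughly (parameter count × bytes per parameter) of VRAM, and KV cache plus runtime overhead add more on top, which is how a 31B model at 16-bit precision ends up near the quoted 80 GB. The helper below is a rough estimator under that simplifying assumption, not an official sizing tool.

```python
def estimate_weight_vram_gb(params_billion: float, bytes_per_param: float) -> float:
    """Rough VRAM needed just for model weights, in GB.

    Ignores KV cache, activations, and framework overhead, which can add
    tens of percent on top -- hence ~80 GB in practice for a 31B model
    whose raw bf16 weights are about 62 GB.
    """
    return params_billion * bytes_per_param

# 31B dense model at common precisions:
print(estimate_weight_vram_gb(31, 2.0))   # fp16/bf16 weights: 62.0 GB
print(estimate_weight_vram_gb(31, 0.5))   # 4-bit quantized weights: 15.5 GB
```

Quantizing to 4 bits brings the weight footprint into single-GPU consumer territory, which is consistent with the card's note that the 80 GB figure applies only to the unquantized model.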

Reliability

Incidents (30d): 0
Self-hosted model. Also available via Google AI Studio.

Pricing

Input (per 1M tokens): $0.00
Output (per 1M tokens): $0.00
Free trial available
Updated on 2026-04-03

Get Started

1. Visit the provider's website
2. Create an account
3. Start using the model

Benchmarks

Arena Elo: #3 open model
MMLU: n/a
HumanEval: n/a
ARC-AGI-2: n/a

Note: 31B Dense ranks #3 among open models on the Arena AI leaderboard (behind GLM-5, Kimi 2.5).