Google releases EmbeddingGemma — 308M parameter multilingual embedding model
AI Impact Summary
Google released EmbeddingGemma, an efficient multilingual embedding model with 308M parameters and a 2K-token context window. It is the highest-ranking open multilingual text embedding model under 500M parameters on the Massive Multilingual Text Embedding Benchmark (MMTEB), and its small memory footprint makes it suitable for on-device RAG pipelines and agents. The architecture, which combines bi-directional attention with Matryoshka Representation Learning (MRL), supports fast similarity search and clustering across 100+ languages.
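MRL trains the model so that a prefix of the full embedding is itself a usable embedding: downstream code can truncate the output vector to a smaller dimension and re-normalize before similarity search, trading a little accuracy for memory and speed. A minimal sketch of that truncate-and-renormalize step, using placeholder random vectors rather than real model outputs (the 768 and 128 dimensions are illustrative assumptions, not taken from this announcement):

```python
import numpy as np

def truncate_and_normalize(emb: np.ndarray, dim: int) -> np.ndarray:
    # MRL-style truncation: keep the first `dim` components,
    # then re-normalize so cosine similarity is a plain dot product.
    v = emb[:dim]
    return v / np.linalg.norm(v)

# Placeholder "embeddings": 3 unit vectors of full dimension 768.
rng = np.random.default_rng(0)
full = rng.normal(size=(3, 768))
full /= np.linalg.norm(full, axis=1, keepdims=True)

# Truncate every vector to 128 dims and renormalize.
small = np.stack([truncate_and_normalize(e, 128) for e in full])

# Pairwise cosine similarity on the truncated embeddings.
scores = small @ small.T
```

The renormalization matters: without it, truncated vectors have norm below 1 and dot products no longer equal cosine similarity.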
Affected Systems
- Date: not specified
- Change type: capability
- Severity: info