Graphcore Bow IPU powers IPU-optimized transformers via Hugging Face Optimum Graphcore
AI Impact Summary
Graphcore and Hugging Face have expanded the IPU-optimized transformer lineup in Hugging Face Optimum Graphcore, adding 10 pre-trained models across NLP, speech, and vision, each with IPU-specific configurations and weights. The Bow IPU, built with Wafer-on-Wafer 3D stacking, delivers up to 350 teraFLOPS, roughly 40% higher performance, and 16% better power efficiency than the prior generation; with the Poplar SDK 2.5, developers can migrate from earlier IPU generations without code changes. This integration enables faster deployment of state-of-the-art models (e.g., BERT, ViT, GPT-2, RoBERTa, DeBERTa, BART, LXMERT, T5, HuBERT, Wav2Vec2) via the Hugging Face Hub and Datasets, reducing time-to-value for enterprise AI workloads.
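As a rough illustration of the workflow the summary describes, the sketch below fine-tunes an IPU-optimized checkpoint through Optimum Graphcore's `IPUConfig`/`IPUTrainer` API. It assumes the `optimum-graphcore` package, the Poplar SDK, and attached IPU hardware are available; the checkpoint and argument values are illustrative, not taken from the announcement. Where the IPU stack is absent, the script simply prints a notice instead of running.

```python
# Hedged sketch: fine-tuning an IPU-optimized BERT checkpoint with
# Hugging Face Optimum Graphcore. Guarded so the file still runs
# cleanly where the Poplar SDK / optimum-graphcore are not installed.
try:
    from optimum.graphcore import IPUConfig, IPUTrainer, IPUTrainingArguments
    from transformers import AutoModelForSequenceClassification, AutoTokenizer
    HAVE_IPU_STACK = True
except ImportError:
    HAVE_IPU_STACK = False  # IPU software stack not available


def fine_tune_on_ipu():
    # Graphcore publishes IPU-specific configurations alongside model
    # weights on the Hugging Face Hub; IPUConfig captures the pipelining
    # and replication settings for the target IPU system.
    ipu_config = IPUConfig.from_pretrained("Graphcore/bert-base-ipu")
    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased")

    args = IPUTrainingArguments(
        output_dir="./outputs",
        per_device_train_batch_size=2,
        num_train_epochs=1,
    )
    trainer = IPUTrainer(
        model=model,
        ipu_config=ipu_config,
        args=args,
        tokenizer=tokenizer,
        # train_dataset=...  (a tokenized datasets.Dataset would go here)
    )
    trainer.train()


if HAVE_IPU_STACK:
    fine_tune_on_ipu()
else:
    print("optimum-graphcore not available; skipping IPU fine-tuning sketch")
```

Because the IPU-specific execution details live in `IPUConfig`, the same pattern is what allows the no-code-change migration between IPU generations mentioned above: the model and training loop stay the same while the configuration targets the new hardware.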
Affected Systems
- Date: not specified
- Change type: capability
- Severity: info