Hugging Face and Graphcore enable IPU-optimized Transformers via Optimum and Poplar SDK
AI Impact Summary
Graphcore and Hugging Face are enabling production-ready Transformer workloads on Graphcore IPUs by integrating Hugging Face's Optimum library with Graphcore's Poplar SDK. The partnership will ship IPU-optimized Transformer models through Optimum, with PyTorch and TensorFlow compatibility and a deployment flow through Docker and Kubernetes, accelerating adoption for NLP and computer-vision tasks. This expands the hardware choices available to teams deploying large models and could shorten time-to-market for IPU-based deployments, with the first IPU-optimized models expected on Optimum later this year.
Affected Systems
- Date: not specified
- Change type: capability
- Severity: info