Hugging Face and AMD collaborate to accelerate transformers on AMD MI2xx/MI3xx, Navi3x GPUs and Alveo V70
AI Impact Summary
AMD joins Hugging Face’s Hardware Partner Program to optimize transformer workloads on AMD CPUs and GPUs. The initial focus covers Instinct MI2xx/MI3xx and Radeon Navi3x GPUs plus the Alveo V70 AI accelerator, with the ROCm SDK integrated into the transformers stack; in early testing, an MI250 trained BERT-Large 1.2x faster and GPT2-Large 1.4x faster than a direct competitor. The collaboration spans models from BERT, DistilBERT, RoBERTa, Vision Transformer, CLIP, and Wav2Vec2 to GPT2, GPT-NeoX, T5, OPT, LLaMA, BLOOM, StarCoder, and ResNet/ResNeXt, with PyTorch, TensorFlow, and ONNX Runtime support and a planned Optimum library tailored to AMD.
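One practical consequence of the ROCm integration is that ROCm builds of PyTorch expose AMD GPUs through the same `torch.cuda` API that transformers already uses, so existing code typically needs no device-specific changes. A minimal sketch of device selection (the try/except fallback is an illustrative assumption, not part of the announcement):

```python
# Sketch: picking a device in a ROCm-aware way. On ROCm builds of PyTorch,
# AMD Instinct/Radeon GPUs appear through the standard torch.cuda interface,
# so "cuda" below would select the AMD GPU on such a system.
try:
    import torch
    device = "cuda" if torch.cuda.is_available() else "cpu"
except ImportError:
    # PyTorch not installed in this environment; fall back to CPU.
    device = "cpu"

print(f"selected device: {device}")
```

A transformers `pipeline(..., device=0)` call would then run on the AMD GPU with no further changes, which is the point of integrating ROCm at the framework level rather than per-model.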
Affected Systems
- Date: not specified
- Change type: capability
- Severity: info