TII Falcon-Edge: New 1.58-Bit Language Models Released
Action Required
Organizations can now adopt a new family of efficient language models for a range of applications, potentially reducing computational costs and enabling fine-tuning of custom models.
AI Impact Summary
This release introduces the Falcon-Edge series, a new family of 1.58-bit language models based on the BitNet architecture. These models use a pre-training paradigm that combines ternary weights ({-1, 0, +1}) with bfloat16 precision, targeting ultra-efficient model design and fine-tuning. The release provides pre-quantized model weights and a Python package, onebitllms, to support community experimentation with and fine-tuning of these models, offering a pathway to building custom, efficient LLMs.
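The "1.58-bit" figure comes from the ternary weights: three possible values per parameter require log2(3) ≈ 1.58 bits. Below is a minimal sketch of the absmean ternary quantization scheme described in the BitNet b1.58 line of work; the function name is illustrative, and Falcon-Edge's actual training-time kernels (and the onebitllms implementation) may differ in detail.

```python
import numpy as np

def absmean_ternary_quantize(w: np.ndarray, eps: float = 1e-8):
    """Quantize a weight tensor to ternary codes {-1, 0, +1}.

    Follows the absmean scheme from the BitNet b1.58 paper:
    scale by the mean absolute weight, round, then clip to [-1, 1].
    Dequantize by multiplying the codes back by the scale.
    """
    scale = np.abs(w).mean() + eps               # per-tensor absmean scale
    w_q = np.clip(np.round(w / scale), -1, 1)    # ternary codes in {-1, 0, 1}
    return w_q.astype(np.int8), float(scale)

# Quantize a small random weight matrix and inspect the codes.
rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4)).astype(np.float32)
w_q, scale = absmean_ternary_quantize(w)
print(np.unique(w_q))   # only values from {-1, 0, 1}
```

Storing int8 ternary codes plus a single float scale per tensor is what makes these models so compact relative to 16-bit weights.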
Affected Systems
- Date: not specified
- Change type: capability
- Severity: high