Segmind open-sources SD-Small and SD-Tiny models via Knowledge Distillation
AI Impact Summary
Segmind is releasing compressed SD-Small and SD-Tiny models, trained using Knowledge Distillation (KD) techniques. The models achieve 35% and 55% parameter reductions, respectively, while maintaining image fidelity comparable to the base Stable Diffusion model. The release uses a block-removal KD method, in which the smaller student network is trained to match the teacher's outputs at the feature level to preserve quality, offering a path to smaller, faster image generation.
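The feature-level matching described above can be sketched as a combined training objective: the usual task loss plus penalties for diverging from the teacher at both the final output and intermediate feature maps. This is a minimal illustrative sketch, not Segmind's actual training code; the function and weight names (`distillation_loss`, `lambda_out`, `lambda_feat`) are assumptions.

```python
import numpy as np

def mse(a, b):
    """Mean squared error between two arrays."""
    return float(np.mean((a - b) ** 2))

def distillation_loss(student_out, teacher_out, target,
                      student_feats, teacher_feats,
                      lambda_out=0.5, lambda_feat=0.5):
    """Hypothetical block-removal KD objective:
    task loss + output-level KD + feature-level KD."""
    task = mse(student_out, target)            # ordinary denoising/task loss
    out_kd = mse(student_out, teacher_out)     # match the teacher's final output
    feat_kd = sum(mse(fs, ft)                  # match intermediate feature maps
                  for fs, ft in zip(student_feats, teacher_feats))
    return task + lambda_out * out_kd + lambda_feat * feat_kd
```

When the student reproduces both the target and the teacher's features exactly, the loss is zero; any mismatch at the output or feature level raises it, pushing the pruned student toward the teacher's behavior.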
Affected Systems
- Date: not specified
- Change type: capability
- Severity: info