FlashAttention patch v5.6.1 fixes AttributeError in flash_attention_forward when s_aux=None
AI Impact Summary
Patch release v5.6.1 fixes a bug in the FlashAttention path: flash_attention_forward raised an AttributeError when s_aux was None. Any workload that routes attention through this code path could hit the crash at runtime. Upgrading to v5.6.1 restores correct behavior and removes this crash risk in production deployments.
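The release note does not include the patch diff, so the sketch below only illustrates the defensive pattern that typically resolves this class of bug: guarding an optional tensor argument before calling methods on it. The function signature, the s_aux.to(...) call, and the placeholder attention computation are assumptions for illustration, not the library's actual code.

```python
# Hypothetical sketch of the failure mode and its guard; the real
# flash_attention_forward implementation is not shown in the release note.
import torch
import torch.nn.functional as F

def flash_attention_forward(query, key, value, s_aux=None):
    # Pre-patch code presumably called a tensor method on s_aux
    # unconditionally, e.g. s_aux.to(query.dtype), which raises
    #     AttributeError: 'NoneType' object has no attribute 'to'
    # whenever the caller passes s_aux=None.
    if s_aux is not None:
        # Only touch the auxiliary tensor when it was actually provided.
        s_aux = s_aux.to(query.dtype)
    # Placeholder attention computation standing in for the real kernel.
    return F.scaled_dot_product_attention(query, key, value)

# Both calls now succeed; before the guard, the second would crash.
q = k = v = torch.randn(1, 2, 4, 8)
flash_attention_forward(q, k, v, s_aux=torch.ones(2))
flash_attention_forward(q, k, v, s_aux=None)
```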
Affected Systems
Business Impact
Applications using FlashAttention avoid these runtime errors by upgrading to v5.6.1.
Models affected
- Date: not specified
- Change type: capability
- Severity: info