Transformers v0.35.2-patch: fix offload_state_dict init, FLAX import, PyTorch 2.3.1 compatibility, and scale_shift_factor CPU issue
AI Impact Summary
This patch release improves transformers-related stability: offload_state_dict is now handled correctly during model initialization, the TRANSFORMERS_FLAX_WEIGHTS_NAME import error is fixed, a compatibility guard for PyTorch 2.3.1 is added, and scale_shift_factor is now placed on the appropriate device for wan and ltx. These changes reduce import-time errors and initialization failures, and improve deployment reliability for transformer-based workflows on PyTorch 2.3.x. Engineers should see fewer runtime exceptions when loading models and during device-specific parameter handling.
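The release notes do not show the guard's actual implementation. As a minimal sketch, a PyTorch version-compatibility guard of this kind typically compares the installed torch version against a threshold before enabling a code path; the function and helper names below are illustrative assumptions, not the library's real code:

```python
def parse_version(v: str) -> tuple:
    # Strip any build/local suffix (e.g. "2.3.1+cu121") and compare numerically,
    # so "2.10.0" correctly sorts after "2.3.1" (string comparison would not).
    core = v.split("+")[0]
    return tuple(int(p) for p in core.split("."))

def needs_compat_guard(torch_version: str) -> bool:
    # Hypothetical guard: apply the workaround on PyTorch 2.3.1 and later.
    # The exact threshold and direction in the actual patch may differ.
    return parse_version(torch_version) >= (2, 3, 1)
```

In real code the input would come from `torch.__version__`; the guard then selects between the old and new behavior at import or call time.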
Affected Systems
- Date: not specified
- Change type: capability
- Severity: info