Hugging Face Transformers adds PatchTST for time-series forecasting with zero-shot forecasting and fine-tuning
AI Impact Summary
PatchTST in Hugging Face Transformers enables patch-based time-series forecasting: the input series is split into patches, and the Transformer backbone shares weights across channels, which allows longer context windows at lower memory cost. The workflow shown covers zero-shot forecasting on ETTh1 followed by transfer-learning steps (linear probing, then full fine-tuning) using the IBM tsfm preprocessing stack, illustrating cross-domain applicability. This gives teams a reproducible path for applying pretrained time-series models across datasets (e.g., Electricity and the ETT benchmarks associated with the Informer2020 repository) with installable tooling (transformers and tsfm) for rapid deployment.
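A minimal sketch of this workflow is below, using the `PatchTSTForPrediction` class from `transformers` (available since v4.36). The checkpoint name and the dummy input batch are assumptions for illustration; in the actual workflow the inputs would come from the tsfm toolkit's preprocessing utilities applied to the raw ETTh1 data, and the `Trainer` API would drive the probing and fine-tuning stages.

```python
# Sketch: zero-shot PatchTST forecasting, then linear-probing setup.
# Assumes transformers >= 4.36; the checkpoint name below is hypothetical --
# substitute the pretrained PatchTST model you intend to transfer from.
import torch
from transformers import PatchTSTForPrediction

# Load a pretrained forecaster (hypothetical checkpoint name).
model = PatchTSTForPrediction.from_pretrained("namespace/patchtst-pretrained")

# Context length and channel count come from the model config
# (ETTh1, for example, has 7 channels).
context_length = model.config.context_length
num_channels = model.config.num_input_channels

# Dummy batch standing in for tsfm-preprocessed windows:
# shape (batch, context_length, num_input_channels).
past_values = torch.randn(8, context_length, num_channels)

# Zero-shot forecast: no gradients, no weight updates.
with torch.no_grad():
    outputs = model(past_values=past_values)
forecast = outputs.prediction_outputs  # (batch, prediction_length, channels)
print(forecast.shape)

# Linear probing: freeze the backbone, train only the prediction head.
for param in model.model.parameters():
    param.requires_grad = False
# Full fine-tuning would instead leave all parameters trainable,
# typically with a lower learning rate than linear probing.
```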
Affected Systems
- Date: not specified
- Change type: capability
- Severity: info