Amazon OpenSearch Ingestion adds ML offline batch inference
Action Required
Organizations can now process large datasets more efficiently and cost-effectively, unlocking new insights and improving data quality.
AI Impact Summary
Amazon OpenSearch Ingestion now supports machine learning offline batch inference, letting users enrich large datasets asynchronously. The capability runs models hosted on Amazon Bedrock and Amazon SageMaker, offering a cost-effective way to process data at scale. This addition expands the functionality of OpenSearch Ingestion and is particularly useful for scenarios that require asynchronous data enrichment or complex model integrations.
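To make the flow above concrete, here is a minimal sketch of assembling and sanity-checking an ingestion pipeline definition for batch enrichment. The processor name (`ml_inference`), its options, and the ARNs are illustrative assumptions, not confirmed OpenSearch Ingestion syntax.

```python
from textwrap import dedent

# Sketch of a pipeline definition for offline batch enrichment.
# The processor name ("ml_inference"), its options, and the ARNs are
# illustrative assumptions, not confirmed OpenSearch Ingestion syntax.
pipeline_body = dedent("""\
    version: "2"
    batch-enrichment-pipeline:
      source:
        s3:
          codec:
            ndjson: {}
      processor:
        - ml_inference:
            # Hypothetical option: point the processor at a Bedrock or
            # SageMaker model for asynchronous batch inference.
            model_arn: "arn:aws:bedrock:us-east-1::foundation-model/example"
      sink:
        - opensearch:
            index: "enriched-data"
    """)

def validate_pipeline_body(body: str) -> bool:
    """Minimal sanity check: a pipeline needs a source, a processor, and a sink."""
    return all(section in body for section in ("source:", "processor:", "sink:"))

print(validate_pipeline_body(pipeline_body))  # → True
```

In a real deployment, a string like this would be passed to the boto3 `osis` client's `create_pipeline` call as `PipelineConfigurationBody`; credentials, capacity units, and the exact processor schema are omitted here.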
Affected Systems
- Date: 2 Oct 2025
- Change type: capability
- Severity: high