Amazon Bedrock adds custom model deployment for on-demand inference
AI Impact Summary
Amazon Bedrock now supports deploying custom models for on-demand inference with pay-per-token pricing. This lets users avoid the cost and complexity of provisioning dedicated compute resources, offering a more flexible and cost-effective option for custom model deployments. The change expands Bedrock's capabilities and gives users greater control over inference costs.
Affected Systems
- Amazon Bedrock
Business Impact
Users can now deploy custom models in Bedrock with a pay-per-token pricing model, reducing infrastructure costs and simplifying deployment.
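To make the pay-per-token flow concrete, the sketch below assembles the parameters for a `bedrock-runtime` `InvokeModel` call against an on-demand custom model deployment. The deployment ARN, the prompt schema, and the `build_invoke_request` helper are all illustrative assumptions (the actual body format depends on the base model the custom model was trained from); only the `invoke_model` parameter names (`modelId`, `body`, `contentType`, `accept`) follow the real boto3 API.

```python
import json

# Hypothetical ARN of a custom model deployment. With on-demand
# deployment, this ARN is passed as the modelId, so usage is billed
# per token rather than per provisioned-throughput hour.
DEPLOYMENT_ARN = (
    "arn:aws:bedrock:us-east-1:123456789012:custom-model-deployment/example"
)

def build_invoke_request(prompt: str, max_tokens: int = 256) -> dict:
    """Assemble keyword arguments for a bedrock-runtime InvokeModel call.

    The JSON body schema here (prompt / max_tokens) is an assumption
    for illustration; check the docs for your base model's format.
    """
    return {
        "modelId": DEPLOYMENT_ARN,
        "contentType": "application/json",
        "accept": "application/json",
        "body": json.dumps({"prompt": prompt, "max_tokens": max_tokens}),
    }

request = build_invoke_request("Summarize last quarter's support tickets.")

# With boto3, the request would then be sent as:
#   client = boto3.client("bedrock-runtime")
#   response = client.invoke_model(**request)
print(request["modelId"])
```

Because no compute is pre-provisioned, the same call shape works whether the deployment serves one request a day or thousands; cost tracks token volume rather than reserved capacity.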
Models affected
- Not specified
- Date: not specified
- Change type: capability
- Severity: medium