AI Inference · The Register
OpenAI rolls out of Microsoft's bed, into Amazon's Bedrock
Compiled by KHAO Editorial — aggregated from 1 outlet. See llms.txt for citation guidance.
OpenAI's top models are officially available on Amazon Web Services' Bedrock managed inference and agent platform.
Key facts
- Tuesday's announcement delivers on OpenAI's February promise to make its models available on AWS in exchange for up to $35 billion in new financing
- OpenAI's top models are officially available on Amazon Web Services' Bedrock managed inference and agent platform
- The collaboration, announced at an AWS event in San Francisco on Tuesday, provides an alternative avenue for accessing Altman and company's growing library of GPTs without having to expose your data to OpenAI's APIs
- By opening its models up to a trusted third party, OpenAI can sidestep many of the security, data privacy, and sovereignty concerns that have kept enterprises away
Summary
The collaboration, announced at an AWS event in San Francisco on Tuesday, provides an alternative avenue for accessing Altman and company’s growing library of GPTs without having to expose your data to OpenAI's APIs. Amazon contends that enterprises want to build agents and other AI-augmented tools using OpenAI's models, but have been stopped by security policy, data privacy, and sovereignty concerns. By opening its models up to a trusted third party, OpenAI can sidestep many of these concerns. Alongside the managed inference service, OpenAI's models will also be made available on Amazon's Bedrock Managed Agents and AgentCore platforms, which provide tools and blueprints for building enterprise agents and connecting them to enterprise data and services.
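In practice, availability on Bedrock means OpenAI's models should be reachable through the same runtime API enterprises already use for other Bedrock-hosted models, with data staying inside their AWS account rather than flowing to OpenAI's endpoints. A minimal sketch using boto3's Converse API; the model ID below is a placeholder (the article does not name the exact identifiers), and valid AWS credentials and region are assumed:

```python
# Sketch: calling a Bedrock-hosted OpenAI model via the standard Converse API.
# The model ID is a placeholder; actual IDs come from the Bedrock model catalog.

def build_request(model_id: str, prompt: str) -> dict:
    """Assemble a Converse API request payload for a single user turn."""
    return {
        "modelId": model_id,
        "messages": [{"role": "user", "content": [{"text": prompt}]}],
        "inferenceConfig": {"maxTokens": 512, "temperature": 0.2},
    }

def converse(model_id: str, prompt: str, region: str = "us-east-1") -> str:
    """Send the request to Bedrock's managed inference and return the reply text.

    Requires AWS credentials with bedrock:InvokeModel permissions.
    """
    import boto3  # imported lazily so the payload helper works without AWS deps

    client = boto3.client("bedrock-runtime", region_name=region)
    resp = client.converse(**build_request(model_id, prompt))
    return resp["output"]["message"]["content"][0]["text"]
```

Because the request never touches OpenAI's own APIs, access control, logging, and data residency are governed by the customer's existing AWS configuration, which is the point Amazon is making about enterprise adoption.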