OpenAI Expands AWS Partnership With Bedrock Launches
OpenAI on AWS is no longer hypothetical. On April 28, 2026, one day after Microsoft's exclusive distribution rights over OpenAI models expired, Amazon Bedrock began offering three new OpenAI-powered products in limited preview: frontier models including GPT-5.5, the Codex coding agent, and a new Managed Agents service. The message to enterprise buyers is clear: you can now build with OpenAI without leaving your existing AWS setup.
What Enterprises Actually Get Access To
The first product is straightforward model access. GPT-5.5 and other OpenAI frontier models are available through the same Bedrock APIs that organizations already use for inference and orchestration. Security controls carry over automatically: IAM access management, PrivateLink connectivity, encryption, guardrails, and CloudTrail logging all apply without additional configuration.
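Since the models ride on the same Bedrock APIs organizations already use, invoking one looks like any other Bedrock inference call. A minimal sketch of the request shape for Bedrock's Converse API follows; the model ID `openai.gpt-5.5` is an assumption, as the actual preview identifier has not been published:

```python
# Sketch: calling a hypothetical OpenAI frontier model through Bedrock's
# standard Converse API. Only the model ID changes; IAM, PrivateLink,
# and CloudTrail behavior are inherited from the account configuration.

def build_converse_request(model_id: str, prompt: str, max_tokens: int = 512) -> dict:
    """Assemble keyword arguments for the bedrock-runtime converse() call."""
    return {
        "modelId": model_id,
        "messages": [
            {"role": "user", "content": [{"text": prompt}]},
        ],
        "inferenceConfig": {"maxTokens": max_tokens, "temperature": 0.2},
    }

request = build_converse_request(
    "openai.gpt-5.5",  # hypothetical ID -- check the Bedrock console once enabled
    "Summarize our Q3 incident reports.",
)

# With credentials and preview access in place, the call itself is the
# usual Bedrock pattern -- no OpenAI-specific client or API key:
#
#   import boto3
#   client = boto3.client("bedrock-runtime")
#   response = client.converse(**request)
#   print(response["output"]["message"]["content"][0]["text"])
```

The point of the sketch: existing Bedrock integrations should need a model-ID change, not a new SDK.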
The second product targets development teams. Codex, OpenAI's coding agent that currently serves over four million weekly users, now runs natively inside AWS. Enterprise developers can authenticate with their existing AWS credentials and access Codex through the CLI, desktop app, or VS Code extension. All inference runs through Bedrock infrastructure.
Usage of both the OpenAI models and Codex counts toward existing AWS cloud spending commitments. For organizations with pre-negotiated cloud contracts, that eliminates the need for a separate procurement process.
Managed Agents Fill the Production Gap
The third launch addresses a problem most enterprise AI teams know well: building an AI agent that works in a demo is easy; getting one to run reliably in production is not.
Amazon Bedrock Managed Agents, powered by OpenAI, pairs frontier models with Bedrock AgentCore, Amazon's runtime layer for agent deployment. The service handles orchestration, tool execution, memory management, and governance. Each agent operates under its own identity, logs every action for audit trails, and processes all inference within Bedrock.
The practical result: teams define what an agent should do, not how to wire up the infrastructure behind it. For regulated industries like finance and healthcare, the built-in compliance controls remove a significant barrier to deploying autonomous AI workflows.
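The "define what, not how" model described above can be sketched as a declarative agent spec. The preview has no published API, so every field name below is a hypothetical stand-in for the capabilities the service advertises (per-agent identity, action audit logs, inference confined to Bedrock):

```python
# Illustrative only: a declarative spec for a Managed Agent. The team
# supplies intent (instructions, tools); identity, auditing, and inference
# placement are governance properties the service layer enforces.

def define_agent(name: str, instructions: str, tools: list[str]) -> dict:
    """Build a declarative agent spec; infrastructure concerns stay implicit."""
    return {
        "name": name,
        "instructions": instructions,          # what the agent should do
        "tools": tools,                        # tool execution handled by the runtime
        "identity": f"agent-role/{name}",      # each agent gets its own identity
        "audit": {"logActions": True},         # every action logged for audit trails
        "inference": {"boundary": "bedrock"},  # all inference stays inside Bedrock
    }

spec = define_agent(
    "claims-triage",
    "Route incoming insurance claims to the correct review queue.",
    ["lookup_policy", "classify_claim"],
)
```

Note what is absent: no orchestration loop, no memory store wiring, no log shipping. In this model those are platform defaults rather than application code, which is precisely the compliance argument for regulated industries.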
Timing and Strategic Context
The speed of this launch was not accidental. Microsoft's exclusivity over OpenAI distribution ended April 27. AWS went live the next morning. AWS CEO Matt Garman framed the move as removing a forced choice that previously pushed customers toward specific platforms: "We don't have to force people to make that choice."
For OpenAI, Bedrock distribution opens access to millions of AWS enterprise customers who were previously locked out unless they adopted Azure. For AWS, adding OpenAI strengthens Bedrock's position as a multi-model platform and reduces dependency on any single AI provider.
All three offerings remain in limited preview, with no confirmed timeline for general availability. Full details are available in OpenAI's official announcement.