OpenAI expanded its partnership with Amazon Web Services this week, making GPT-5.5 and its full model lineup available on Amazon Bedrock. The move gives enterprise customers access to OpenAI's flagship model inside the AWS environment they already use for security, compliance, and infrastructure - removing one of the biggest friction points in enterprise AI adoption.

OpenAI launched GPT-5.5 and other frontier models on Amazon Bedrock, brought Codex to AWS, and added Bedrock Managed Agents powered by OpenAI, giving enterprises new ways to build and deploy AI in trusted AWS environments with the security, governance, and workflows they already use.

What GPT-5.5 Actually Delivers

GPT-5.5 is OpenAI's strongest agentic coding model to date. On Terminal-Bench 2.0, which tests complex command-line workflows requiring planning, iteration, and tool coordination, it achieves a state-of-the-art accuracy of 82.7%. On SWE-Bench Pro, which evaluates real-world GitHub issue resolution, it reaches 58.6%. Senior engineers who tested the model reported that GPT-5.5 was noticeably stronger than GPT-5.4 and Claude Opus 4.7 at reasoning and autonomy, catching issues in advance and anticipating testing and review needs without explicit prompting.

GPT-5.5 is meaningfully better at operating as an autonomous agent: calling tools, managing multi-step plans, recovering from errors mid-task, and maintaining coherent state across long interactions. OpenAI positioned this release as a response to growing demand from enterprise and developer customers who found that GPT-5.4 had reliability issues in truly long agentic chains.

Why the Bedrock Integration Matters

More than 4 million people now use Codex every week, and teams are using it across the software development lifecycle - to write code, explain systems, refactor applications, generate tests, and more.

For enterprise technology leaders, the Bedrock integration answers a question that has slowed AI adoption inside large organizations: how do we use frontier models without moving data outside our existing security and compliance infrastructure? The ability to run GPT-5.5 inside AWS - alongside existing identity systems, security controls, and procurement workflows - is a meaningful operational answer to that question.
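As a rough sketch of what "running GPT-5.5 inside AWS" looks like for a developer, the snippet below calls the model through Bedrock's standard Converse API via boto3, the same path used for any other Bedrock-hosted model. The model identifier `openai.gpt-5.5` is a placeholder assumption (the article does not state the actual Bedrock model ID), and a real call requires AWS credentials plus model access granted in the Bedrock console.

```python
# Sketch: single-turn request to an OpenAI model hosted on Amazon Bedrock,
# using the bedrock-runtime Converse API. Model ID below is hypothetical.
MODEL_ID = "openai.gpt-5.5"  # placeholder -- check the Bedrock console


def build_converse_request(prompt: str, max_tokens: int = 512) -> dict:
    """Assemble keyword arguments for bedrock-runtime's converse() call."""
    return {
        "modelId": MODEL_ID,
        "messages": [
            {"role": "user", "content": [{"text": prompt}]},
        ],
        "inferenceConfig": {"maxTokens": max_tokens, "temperature": 0.2},
    }


def ask(prompt: str, region: str = "us-east-1") -> str:
    """Send the request; needs boto3, AWS credentials, and Bedrock access."""
    import boto3  # imported here so the builder above stays dependency-free

    client = boto3.client("bedrock-runtime", region_name=region)
    response = client.converse(**build_converse_request(prompt))
    # Converse responses carry the assistant message under output.message.
    return response["output"]["message"]["content"][0]["text"]


if __name__ == "__main__":
    print(ask("Summarize our options for deploying GPT-5.5 on Bedrock."))
```

Because the request goes through `bedrock-runtime`, it inherits the account's existing IAM policies, VPC endpoints, and CloudTrail logging - which is precisely the operational point the integration addresses.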

The OpenAI/AWS partnership also signals a structural shift away from Microsoft's previous position as OpenAI's exclusive cloud partner. Under revised terms finalized in late 2025, Microsoft lost its right of first refusal on new OpenAI cloud workloads. The Bedrock launch is the clearest evidence yet of what that change looks like in practice. Enterprise customers now have genuine optionality in how they deploy OpenAI's models, which should translate to better pricing, better support, and less vendor lock-in over time.
