Amazon and OpenAI strike $50B deal to scale enterprise AI on AWS
The multi-year agreement brings OpenAI Frontier exclusively to AWS, introduces a stateful runtime on Amazon Bedrock, and commits billions in infrastructure to scale enterprise AI agents.
Amazon Web Services and OpenAI have announced a multi-year strategic partnership that includes a $50 billion investment from Amazon and expanded infrastructure commitments designed to scale AI deployment globally.
The agreement makes AWS the exclusive third-party cloud distribution provider for OpenAI Frontier and introduces a new Stateful Runtime Environment powered by OpenAI models on Amazon Bedrock.
The partnership comes as enterprises move from testing generative AI tools to embedding AI agents directly into production systems. Rather than focusing solely on model access, the agreement centers on runtime infrastructure, long-term compute commitments, and enterprise distribution.
Matt Garman, CEO at Amazon Web Services, said on LinkedIn he was “Excited to share a new multi-year, strategic partnership between Amazon Web Services (AWS) and OpenAI, which is going to accelerate what builders everywhere can get done.”
He wrote that the companies are co-creating “a next-generation stateful runtime, available on Amazon Bedrock, so developers can build AI agents that maintain context, memory, and continuity at production scale.”
Building persistent AI agents on Bedrock
The newly announced Stateful Runtime Environment will be delivered through Amazon Bedrock. Unlike traditional stateless prompt interactions, the runtime is designed to allow AI agents to retain context, access memory, and operate across tools and data sources over ongoing workflows.
Garman wrote that the runtime will allow developers to build agents “that maintain context, memory, and continuity at production scale,” positioning the capability as core infrastructure for enterprise AI applications.
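The difference between stateless prompting and the stateful behavior described above can be illustrated in a few lines of plain Python. This is a conceptual sketch only: every class and function name below is hypothetical, and nothing here is an AWS, Bedrock, or OpenAI API.

```python
# Conceptual illustration: contrast a stateless prompt call with a
# stateful agent that retains conversation history and memory across
# turns. All names are hypothetical; no real AWS/OpenAI API is used.

from dataclasses import dataclass, field

def stateless_call(prompt: str) -> str:
    # Each call starts from scratch: no prior context is available.
    return f"answer({prompt})"

@dataclass
class StatefulAgent:
    # History and a simple key-value memory persist between calls,
    # which is the behavior the announced runtime is described as
    # providing at production scale.
    history: list = field(default_factory=list)
    memory: dict = field(default_factory=dict)

    def remember(self, key: str, value: str) -> None:
        self.memory[key] = value

    def call(self, prompt: str) -> str:
        # A real model call (stubbed here) would see the accumulated
        # history and memory, so later turns build on earlier ones.
        prior_turns = len(self.history)
        self.history.append(prompt)
        return f"answer({prompt}, prior_turns={prior_turns})"

agent = StatefulAgent()
agent.remember("customer_id", "C-42")
first = agent.call("Open a support ticket")
second = agent.call("What ticket did I just open?")
# The second turn runs with one prior turn of context; a stateless
# call in its place would see none.
```

The point of the sketch is only the contrast: the stateless function has no way to answer the follow-up question, while the agent carries forward everything needed to do so.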
Scott Rosecrans, Vice President, Strategic Pursuits at OpenAI, wrote on LinkedIn that during his time at AWS the “#1 question” he received was, “When are OpenAI models going to be available on Bedrock?” He added, “Well now it's here!”
Rosecrans wrote that AWS and OpenAI “will co-create a Stateful Runtime Environment powered by OpenAI models, available on Amazon Bedrock for AWS customers to build generative AI applications and agents at production scale.”
The emphasis across both LinkedIn posts reflects a shift from experimentation toward structured, managed AI systems designed to operate inside enterprise environments.
Frontier’s exclusive cloud home
Under the agreement, AWS will serve as the exclusive third-party cloud distribution provider for OpenAI Frontier, OpenAI’s enterprise platform for building and managing teams of AI agents.
Garman wrote that AWS will be the “Exclusive Cloud Home for OpenAI Frontier — OpenAI's most advanced enterprise platform for deploying AI agents at scale.”
Frontier is designed for organizations integrating AI into real business systems, with governance and enterprise-grade security built in. The exclusivity provision gives AWS differentiated positioning in the enterprise AI infrastructure market.
Compute scale and long-term investment
The partnership also expands OpenAI’s infrastructure footprint on AWS. OpenAI has committed to approximately 2 gigawatts of AWS Trainium compute capacity, spanning Trainium3 and next-generation Trainium4 chips.
Garman wrote that OpenAI is “going big on our custom silicon,” adding that the expanded agreement means “more efficient, cost-effective intelligence at scale — spanning Trainium3 and the next-gen Trainium4.”
Amazon will invest $50 billion in OpenAI, beginning with an initial $15 billion investment followed by an additional $35 billion subject to conditions. Garman wrote that the investment is “a strong signal of our long-term commitment to this partnership.”
In the official announcement, Sam Altman, Co-Founder and CEO of OpenAI, said: “OpenAI and Amazon share a belief that AI should show up in ways that are practical and genuinely useful for people. Combining OpenAI’s intelligence with Amazon’s infrastructure and global reach helps us put powerful AI into the hands of businesses and users at real scale.”
Andy Jassy, President and CEO of Amazon, added: “We have lots of developers and companies eager to run services powered by OpenAI models on AWS, and our unique collaboration with OpenAI to provide stateful runtime environments will change what’s possible for customers building AI apps and agents. We continue to be impressed with what OpenAI is building, and we're excited not only about their choosing to go big on our custom AI silicon (Trainium), but also our opportunity to invest in the company and partnership over the long-term.”
The agreement aligns model development, enterprise distribution, capital investment, and custom silicon in a single framework. As AI agents become embedded in customer-facing systems and internal operations, control over runtime environments and compute capacity is becoming central to enterprise AI strategy.