A massive cloud deal for the AI era
OpenAI has entered a multi-year, $38 billion partnership with Amazon Web Services (AWS) to access large-scale cloud computing power for its artificial intelligence workloads. The collaboration will see OpenAI use AWS infrastructure, including hundreds of thousands of NVIDIA GPUs and millions of CPUs, to run and scale its AI models.
Scaling AI workloads to new levels
Under the agreement, AWS will provide OpenAI with compute clusters optimized for large-scale AI training and inference. These clusters feature NVIDIA GB200 and GB300 GPUs linked through Amazon EC2 UltraServers, a design intended to deliver low-latency performance across the interconnected systems. That setup will help OpenAI handle everything from serving ChatGPT's day-to-day responses to training its next-generation models.
AWS brings scale, reliability, and security
Demand for compute power in the AI sector has surged, and AWS is positioning itself as a key enabler. With experience running clusters of more than 500,000 chips, AWS offers the scale, reliability, and security that OpenAI needs to push AI capabilities forward. The infrastructure rollout is expected to be completed by the end of 2026, with room to expand further in 2027 and beyond.
Matt Garman, CEO of AWS, said the partnership highlights AWS’s leadership in cloud computing. “As OpenAI continues to push the boundaries of what’s possible, AWS’s best-in-class infrastructure will serve as a backbone for their AI ambitions,” he said.
Powering ChatGPT and next-gen AI models
For OpenAI, this partnership ensures a reliable supply of compute resources to support its expanding ecosystem of products and services. It will also strengthen the foundation for developing new AI models that power ChatGPT and future tools aimed at both individuals and enterprises.
OpenAI CEO Sam Altman emphasized the importance of scalable computing in AI’s evolution. “Scaling frontier AI requires massive, reliable compute,” he said. “Our partnership with AWS strengthens the broad compute ecosystem that will power this next era and bring advanced AI to everyone.”
Expanding access through Amazon Bedrock
The partnership builds on earlier collaborations between the two companies. OpenAI’s open-weight foundation models are already available on Amazon Bedrock, allowing AWS customers to integrate AI capabilities into their workflows. Thousands of companies — including Peloton, Comscore, and Thomson Reuters — now use OpenAI’s models through AWS for tasks like coding, data analysis, and scientific research.
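Neither announcement includes code, but for readers curious what that Bedrock integration looks like in practice, here is a minimal sketch of calling an OpenAI open-weight model through the Bedrock Converse API with boto3. The model identifier and region are assumptions for illustration; the exact values should be taken from the Bedrock model catalog available in your account.

```python
# Minimal sketch (not from the announcement): invoking an OpenAI open-weight
# model on Amazon Bedrock using boto3's Converse API.
import boto3

# Region is an assumption; use whichever region lists the model in your catalog.
client = boto3.client("bedrock-runtime", region_name="us-west-2")

response = client.converse(
    # Placeholder model ID for an OpenAI open-weight model on Bedrock --
    # verify the exact identifier in the Bedrock model catalog.
    modelId="openai.gpt-oss-120b-1:0",
    messages=[
        {
            "role": "user",
            "content": [{"text": "Summarize the key trends in this quarter's usage data."}],
        }
    ],
    inferenceConfig={"maxTokens": 512, "temperature": 0.2},
)

# The Converse API returns the assistant message under output.message.content.
print(response["output"]["message"]["content"][0]["text"])
```

Because the Converse API uses the same request shape across providers, teams already on Bedrock can try an OpenAI open-weight model by changing the model ID rather than rewriting their integration.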
A boost for the AI ecosystem
The deal marks one of the largest cloud computing agreements in the AI industry, reflecting how vital infrastructure has become for innovation in generative and agentic AI. By combining OpenAI’s breakthroughs with AWS’s infrastructure expertise, the two companies aim to accelerate access to advanced AI globally — securely, reliably, and at scale.