Amazon’s cloud division has signed a $38 billion contract with OpenAI to meet part of the company’s rapidly growing need for computing power.
Under the seven-year agreement, Amazon Web Services will grant the ChatGPT creator access to hundreds of thousands of Nvidia graphics processors, the companies announced on Monday.
The deal marks another step in OpenAI’s transformation from a research lab into one of the central powerhouses of the artificial intelligence industry. The company plans to spend $1.4 trillion on building and maintaining the infrastructure for its models—an investment scale that has already raised fears of a potential bubble.
For Amazon, which has been striving for a more prominent role in the AI era, the contract is an acknowledgment of its ability to build and operate massive data center networks. “As OpenAI continues to push the boundaries of what’s possible, AWS’s world-class infrastructure will be the backbone of its AI ambitions,” said Matt Garman, head of Amazon Web Services.
Amazon remains the world’s largest provider of rented computing power. Until this deal, however, AWS was one of the few major U.S. cloud companies not yet supplying OpenAI.
Microsoft—the largest investor in OpenAI and its former exclusive cloud provider—recently secured a new $250 billion commitment from OpenAI to use its Azure platform. Oracle has signed a $300 billion agreement, and earlier this year OpenAI said that ChatGPT also runs on Google Cloud infrastructure. In addition, the startup has a $22.4 billion contract with CoreWeave—one of the emerging “neo-cloud” companies offering specialized infrastructure for AI developers.
Under the terms of the agreement, OpenAI has already begun using AWS capacity, with the full allocation expected to be delivered by the end of 2026. The company will also have the option to expand the partnership further. Amazon plans to deploy hundreds of thousands of chips, including Nvidia’s GB200 and GB300 accelerators, organized into clusters that will power ChatGPT and train future generations of models.
“Scaling advanced AI systems requires enormous and reliable computing resources,” said OpenAI CEO Sam Altman. “Our partnership with AWS strengthens the broad computing ecosystem that will underpin the next era—and make advanced AI accessible to everyone.”
Amazon is also one of the largest investors in Anthropic, a company founded by former OpenAI employees. Last week, the Seattle-based firm announced that a data center equipped with hundreds of thousands of its own AWS Trainium2 chips had gone live to support Anthropic. In September, Google said it would provide Anthropic with up to one million of its specialized AI chips—a deal worth tens of billions of dollars.