AMD and OpenAI recently announced a major collaboration. AMD will supply OpenAI with a significant amount of computing power – 6 gigawatts – using its Instinct AI GPUs, to help power the next generation of OpenAI’s AI technology, including ChatGPT.
Under the agreement, OpenAI will purchase AMD’s Instinct chips, and AMD has granted OpenAI warrants that could convert into roughly a 10% ownership stake in the company as deployment milestones are met.
The announcement came about two weeks after NVIDIA revealed a $100 billion investment in OpenAI. This investment includes providing OpenAI with 10 gigawatts or more of NVIDIA’s AI GPUs to expand its AI infrastructure.
The sight of two rival chipmakers pouring money into the same AI firm has heightened concerns that the industry is becoming circular, with suppliers effectively financing the customers who buy their chips, and raised questions about whether that kind of economy is sustainable.
In a recent newsletter, cognitive scientist and AI expert Gary Marcus pointed out that the tech market’s overall value—which is meant to represent the future potential of its companies—is much higher than what those companies will probably ever achieve.
Marcus also argues that the recent $300 billion cloud deal between OpenAI and Oracle is unlikely to play out as planned: in his view, OpenAI lacks the cash to spend that much, and Oracle has neither the chips nor the capital to deliver its side of the agreement.
Many people are increasingly worried about a potential crash in the artificial intelligence market, but NVIDIA’s CEO, Jensen Huang, continues to express optimism. He recently discussed the industry’s funding patterns in an interview with CNBC’s Squawk Box.
When questioned about OpenAI’s financial capacity to contribute to the partnership, Huang responded:
The biggest hurdle right now is funding these massive AI factories. From what I’m seeing, it will take roughly $50 to $60 billion per gigawatt of capacity, and that covers everything from the land and buildings to all the computers and network infrastructure. They’ll need to finance it through a combination of things: the revenue they’re already making, which is growing very fast, along with selling additional equity and taking on debt. It’s a huge investment, but they appear to have a plan to make it work.
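To put those numbers in perspective, here is a rough back-of-the-envelope sketch (ours, not Huang’s) that applies his $50 to $60 billion-per-gigawatt estimate to the capacities announced in the AMD and NVIDIA deals:

```python
# Rough estimate: apply Huang's $50-60B-per-gigawatt figure to the
# publicly announced capacities (6 GW for AMD, at least 10 GW for NVIDIA).
COST_PER_GW_LOW = 50e9    # USD, low end of Huang's estimate
COST_PER_GW_HIGH = 60e9   # USD, high end of Huang's estimate

announced_deals_gw = {
    "AMD Instinct deal": 6,
    "NVIDIA deal (at least)": 10,
}

for name, gigawatts in announced_deals_gw.items():
    low = gigawatts * COST_PER_GW_LOW
    high = gigawatts * COST_PER_GW_HIGH
    print(f"{name}: {gigawatts} GW -> ${low / 1e9:.0f}B to ${high / 1e9:.0f}B")

# Prints:
# AMD Instinct deal: 6 GW -> $300B to $360B
# NVIDIA deal (at least): 10 GW -> $500B to $600B
```

By that math, the announced capacity alone implies something approaching a trillion dollars of build-out, which is exactly the scale skeptics like Marcus are questioning.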
The conversation then turned to the circular economy, and concerns that it might be experiencing a bubble. Rebecca Quick, the host, drew a comparison to the collapses of companies like Nortel and Lucent in the early 2000s, and asked how the current AI market is different and whether it faces similar risks.
Huang replies:
The world today is vastly different than it was in 2000. Back then, the internet was still new, with companies like Pets.com and Hospitals.com, and the entire sector was worth only around $30 to $40 billion. Now, companies focused on artificial intelligence – the so-called AI hyperscalers – already represent a $2.5 trillion industry.
According to NVIDIA’s CEO, the build-out of AI infrastructure is only just beginning and will ramp up sharply in 2025, as the industry shifts from traditional CPU-based systems to GPU-accelerated computing for generative AI. He estimates that roughly $200 billion has been invested so far, but that the total will ultimately run to several trillion dollars.
There’s another aspect of the AI economy that has Huang sitting comfortably: Tokens.
Tokens are central to how Large Language Models (LLMs) actually work: the model breaks natural-language input down into tokens, and tokens are also the unit by which usage of these models is measured and billed.
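As a minimal, self-contained illustration (not from the article), the sketch below uses OpenAI’s open-source tiktoken tokenizer to count the tokens in a prompt and turn that count into a cost estimate; the per-token price is a made-up placeholder, since real pricing varies by model and provider.

```python
# Sketch: count tokens in a prompt and estimate a usage cost.
# Uses the open-source `tiktoken` tokenizer; the price below is a
# hypothetical placeholder, not any real model's rate.
import tiktoken

PRICE_PER_1K_TOKENS_USD = 0.002  # placeholder for illustration only


def estimate_cost(text: str, encoding_name: str = "cl100k_base") -> tuple[int, float]:
    """Return (token_count, estimated_cost_usd) for a piece of text."""
    encoding = tiktoken.get_encoding(encoding_name)
    tokens = encoding.encode(text)
    cost = len(tokens) / 1000 * PRICE_PER_1K_TOKENS_USD
    return len(tokens), cost


count, cost = estimate_cost("Explain why GPUs are well suited to training large language models.")
print(f"{count} tokens, roughly ${cost:.6f} at the placeholder rate")
```

Billing by token is why the economics Huang describes scale directly with how much people actually use these models.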
Huang highlights a significant shift in AI technology. He points out that older AI models didn’t offer enough value to justify investment, but the recent advancements are a game-changer, offering genuine practical benefits.
This new technology can now think things through and do its own research. Before answering a question, it searches the internet, looking at websites and documents. It’s able to use different tools, create helpful information, and provide really useful responses. I find it so valuable that I use it daily, and it’s now even earning me money.
Jensen Huang, NVIDIA CEO
Compute constraints, energy demands, and the search for AGI

By now it’s well established that generative AI is enormously resource-hungry: it takes a staggering amount of hardware and energy just to run these models.
Leading AI companies have been open about this. OpenAI’s Sam Altman, for example, had recently said the company was no longer limited by computing power, now that it is no longer tied exclusively to Microsoft for cloud capacity.
Altman’s tone has since shifted, however. His comments began to change shortly after the NVIDIA partnership was announced, and before the AMD deal became public:
There’s a significant limitation across the tech industry, and especially at our company: we’re severely constrained by computing power. We simply can’t offer as many services as people are asking for; demand far outstrips our current capacity.
Sam Altman, OpenAI CEO
Because artificial intelligence needs so much computing power, we can likely expect large companies to continue investing in each other for the foreseeable future.
Most leading AI companies are striving for Artificial General Intelligence (AGI)—essentially, AI that’s smarter than humans. While there’s debate about when AGI will actually happen, it’s widely believed that whoever achieves it will reap enormous financial rewards. It’s a potentially unsettling development, but the economic incentives are huge.
What happens if AGI doesn’t arrive before investments dry up?
