This breakthrough tech could solve Microsoft’s AI power consumption woes and is 1,000x more energy-efficient

What you need to know

  • Researchers have developed a new prototype chip dubbed computational random-access memory (CRAM) that could cut AI’s power-hungry demands by a factor of at least 1,000.
  • In one simulation, the design achieved energy savings of up to 2,500 times compared with traditional methods.
  • CRAM could address Microsoft’s AI woes as the company’s power usage now exceeds that of over 100 countries.

As someone who has closely followed the advancements in technology and artificial intelligence (AI) over the past few years, I can’t help but be both excited and concerned about their potential impact on our world. On one hand, AI is driving incredible innovations across various industries, from healthcare to education and beyond. On the other hand, its power demands are alarmingly high, with tech giants like Microsoft and Google consuming electricity at a rate that surpasses that of over 100 countries.

As an ardent follower of technology advancements, I can’t help but feel frustrated by the current energy constraints that are hindering our tech from reaching its full potential. Even visionary innovators like Elon Musk acknowledge that we might be on the brink of groundbreaking technological breakthroughs with AI. However, by 2025, there may not be sufficient electricity to fuel these advancements.

Sam Altman, the CEO of OpenAI, is considering the use of nuclear fusion as a potential energy solution to power their artificial intelligence innovations. In contrast, Microsoft has entered into a collaboration with Helion to produce nuclear energy on a large scale by 2028 for its AI projects.

In a paper recently published in Nature, researchers have introduced an innovative prototype chip called computational random-access memory (CRAM). This chip could potentially revolutionize Microsoft’s AI initiatives by cutting power consumption roughly 1,000-fold; one simulation even showed energy savings of up to 2,500 times.

As you may know, traditional AI processing constantly shuttles data between logic and memory units, and those transfers account for much of its high power consumption. The CRAM approach instead performs computation within the memory itself, eliminating those costly transfers.
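To build intuition for why keeping data in memory matters, here is a minimal back-of-the-envelope model in Python. All energy figures are hypothetical placeholders, not values from the Nature paper; the point is only that when moving an operand costs far more than computing on it, removing the transfer dominates the savings.

```python
# Toy energy model: von Neumann (move data, then compute) vs. in-memory compute.
# The per-operation energies below are made-up illustrative numbers.

def von_neumann_energy(num_ops, transfer_pj=100.0, compute_pj=1.0):
    """Energy (picojoules) when every operation moves data to a separate logic unit."""
    return num_ops * (transfer_pj + compute_pj)

def in_memory_energy(num_ops, compute_pj=1.0):
    """Energy (picojoules) when the operation happens inside the memory array itself."""
    return num_ops * compute_pj

ops = 1_000_000
ratio = von_neumann_energy(ops) / in_memory_energy(ops)
print(f"Hypothetical energy ratio: {ratio:.0f}x")
```

With these assumed numbers the transfer cost swamps the compute cost, so eliminating it yields a roughly 100-fold saving; the real chip's gains depend on device physics, not on this toy arithmetic.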

Given their rapid growth, AI tools such as ChatGPT and Microsoft Copilot are projected to consume as much electricity by 2027 as an entire country uses in a year. Yet the researchers behind the CRAM model suggest it could reduce energy use by up to 2,500 times compared with conventional methods.

How does CRAM work?

The CRAM concept isn’t new, as Professor Jian-Ping Wang, the paper’s senior author, points out:

“Our initial concept to use memory cells directly for computing 20 years ago was considered crazy.”

As a tech enthusiast, I’d describe it this way: CRAM, or computational random-access memory, is built on spintronic devices that store and process data using the spin of electrons rather than the electrical charge conventional chips rely on. This results in faster processing and reduced energy consumption, making it a more eco-friendly option for our digital needs.

Ulya Karpuzcu, a co-author of the paper, further stated:

As a highly energy-efficient digital memory technology, CRAM allows computation to be performed anywhere within its memory array. This flexibility means it can be configured to best match the performance needs of different AI algorithms.

Although it remains unclear how well the design will scale, CRAM holds considerable potential to address one of AI development’s biggest challenges: excessive energy consumption.


2024-07-29 20:10