Stock prices for the big three memory makers have already slid.

Aside from the AI bubble bursting or the hype dying down, the other thing that could allow the RAMpocalypse to ease off is a technological change that dramatically reduces how much memory AI needs. To that end, Google has cooked up TurboQuant, a new compression algorithm that promises to cut memory demand by about 6x, and memory maker stock prices have already dropped, likely at least in part as a result.

Although we should resist being reductive and assuming Google's new algorithm alone is responsible for these market changes (lest we forget the effects on crucial material availability thanks to the war in Iran), a claimed 6x reduction in memory demand must surely account for at least some of it.

According to Google, TurboQuant "optimally addresses the challenge of memory overhead in vector quantization" and "achieves a high reduction in model size with zero accuracy loss."

In other words, it makes vector compression, which is critical because AI models understand and process information as vectors, require less memory than it has until now, and crucially without the loss of accuracy normally associated with compressing things down.

The basic idea, leaving out a lot of detail and simplifying greatly, seems to be a shift away from describing things in standard vector coordinates towards a more absolute reference system, which, to my non-mathematical ears at least, sounds a bit like moving away from vectors altogether:

"Instead of looking at a memory vector using standard coordinates (i.e., X, Y, Z) that indicate the distance along each axis, PolarQuant converts the vector from a Cartesian coordinate system into polar coordinates. This is comparable to replacing 'Go 3 blocks East, 4 blocks North' with 'Go 5 blocks total at a 37-degree angle'."
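The coordinate switch in that quote is easy to sketch in plain Python. To be clear, the actual PolarQuant/TurboQuant code isn't public as far as we know, so this only illustrates the Cartesian-to-polar conversion itself, not the quantization scheme built on top of it:

```python
import math

def to_polar(x, y):
    """Convert a 2D Cartesian vector to (magnitude, angle in degrees)."""
    r = math.hypot(x, y)                     # total distance: sqrt(x^2 + y^2)
    theta = math.degrees(math.atan2(y, x))   # angle measured from the x-axis (East)
    return r, theta

# The quote's example: 3 blocks East, 4 blocks North
r, theta = to_polar(3, 4)
print(r, round(theta, 1))  # → 5.0 53.1
```

Measured from the East axis, the usual mathematical convention, the 3-4-5 example comes out at about 53 degrees; the quote's 37-degree figure is the same direction measured from North instead.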

This, ultimately, means no need for data normalisation, which should "eliminate the memory overhead that traditional methods must carry." Google has put the new algorithm through its paces in a bunch of benchmarks, and the results, according to the Big G, at least, show that "TurboQuant achieves perfect downstream results across all benchmarks while reducing the key value memory size by a factor of at least 6x."

Again according to Google, the results also "demonstrate a transformative shift in high-dimensional search... [allowing] for bui..."

Read more: Full article on www.pcgamer.com
