The Large Hadron Collider is sucking in graphics cards at a rapidly increasing rate

The ALICE detector at the LHC
(Image credit: CERN)

Don't panic, the folk at CERN aren't hurling graphics cards into one another beneath Switzerland to see what happens when GPU particles collide. They're actually using Nvidia's graphics silicon to cut down the amount of energy needed to compute what happens when the Large Hadron Collider (LHC) collides other stuff.

Particles and things. Beauty quarks. Y'know, science stuff.

It's no secret that, while the humble GPU was originally conceived for the express purpose of chucking polygons around a screen as efficiently as possible, the parallel processing prowess of modern graphics chips makes for an incredibly powerful tool in the scientific community. And an incredibly efficient one, too. Indeed, A Large Ion Collider Experiment (ALICE) has been using GPUs in its calculations since 2010, and its work has now encouraged their increased use in various LHC experiments.
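
If you want a feel for why that is, here's a minimal, purely illustrative sketch (not anything CERN actually runs) of the shape of work a detector farm does: identical arithmetic applied independently to a million particle candidates at once. Every number in it is made up.

```python
# A toy version of the sort of work a detector farm does: the same
# arithmetic applied independently to a million particle candidates.
# Written with NumPy; CuPy mirrors most of the NumPy API, so the same
# code can, in principle, run on an Nvidia GPU with a swapped import.
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000  # one million track pairs, processed in a single pass

# Toy four-momenta (E, px, py, pz) for two particles per candidate pair
p1 = rng.normal(size=(n, 4))
p2 = rng.normal(size=(n, 4))
p1[:, 0] = np.abs(p1[:, 0]) + 5.0  # keep the toy energies positive
p2[:, 0] = np.abs(p2[:, 0]) + 5.0

# Invariant mass squared: (E1 + E2)^2 - |p1 + p2|^2, for every pair at once
total = p1 + p2
m2 = total[:, 0] ** 2 - np.sum(total[:, 1:] ** 2, axis=1)
masses = np.sqrt(np.clip(m2, 0.0, None))
print(f"reconstructed {n:,} candidate masses, mean = {masses.mean():.2f}")
```

None of those million calculations depends on any other, which is exactly the kind of workload a GPU's thousands of cores chew through far more efficiently than a handful of CPU cores can.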

The potential bad news is that it does mean there's yet another group desperate for the limited amount of GPU silicon coming out of the fabs of TSMC and Samsung. Though at least this lot will be using it for a loftier purpose than mining fake money coins.

Right, guys? You wouldn't just be mining ethereum on the side now, would you?

On the plus side, the CERN candidate nodes are currently specced with last-gen tech. For the upcoming LHC Run 3—where the machine is recommissioned for a "three-year physics production period" after a three-year hiatus—the nodes pair two of AMD's 64-core Milan CPUs with two Turing-based Nvidia Tesla T4 GPUs.

Okay, no one tell them how much more effective the Ampere architecture is in terms of straight compute power, and I think we'll be good. Anyway, as CERN calculates, if it were using purely CPU-based nodes to parse the data, it would need about eight times as many servers to run its online reconstruction and compression algorithms at the current rate. Which means it's already feeling pretty good about itself.
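
For the curious, the back-of-the-envelope version of that calculation looks something like this. Only the "about eight times" factor comes from CERN; the farm size below is a made-up placeholder.

```python
# Back-of-the-envelope version of CERN's sums. Only the "about eight
# times" factor comes from CERN; the farm size is a made-up placeholder.
gpu_nodes = 250          # hypothetical number of GPU-equipped servers
cpu_to_gpu_ratio = 8     # CERN's quoted factor for a CPU-only farm

cpu_only_nodes = gpu_nodes * cpu_to_gpu_ratio
extra_servers = cpu_only_nodes - gpu_nodes
print(f"CPU-only equivalent: {cpu_only_nodes} servers, "
      f"i.e. {extra_servers} more boxes to buy, power, and cool")
```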

Given that such efficiency increases do genuinely add up for a facility that's set to run for three years straight, shifting more and more over to GPU processing seems like a damned good plan. Especially because, from this year, the Large Hadron Collider beauty (LHCb) experiment will be processing a phenomenal 4 terabytes of data per second in real time. Quite apart from the name of that experiment—so named because it's checking out a particle called the "beauty quark"🥰—that's a frightening amount of data to be processing.  
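
To put a rough scale on that number, here's the arithmetic. The 4 TB/s figure is the real one; the rate assumed to survive to storage is a hypothetical round number, just to show why crunching the data in real time is the only option.

```python
# Scale check on LHCb's 4 TB/s. The 4 TB/s figure is from the experiment;
# the rate assumed to actually reach storage (10 GB/s) is a hypothetical
# round number, purely to show why real-time reduction is unavoidable.
raw_rate = 4e12          # bytes per second off the detector
persisted_rate = 10e9    # hypothetical bytes per second written to disk

seconds_per_day = 86_400
print(f"raw data per day: {raw_rate * seconds_per_day / 1e18:.2f} exabytes")
print(f"required reduction factor: {raw_rate / persisted_rate:.0f}x")
```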


"All these developments are occurring against a backdrop of unprecedented evolution and diversification of computing hardware," says LHCb's Vladimir Gligorov, who leads the Real Time Analysis project. "The skills and techniques developed by CERN researchers while learning how to best utilise GPUs are the perfect platform from which to master the architectures of tomorrow and use them to maximise the physics potential of current and future experiments."

Damn, that sounds like he's got at least one eye on more recent generations of Nvidia workstation GPUs. So I guess we'll end up fighting the scientists for graphics silicon after all.

Dave James
Editor-in-Chief, Hardware

