PowerColor's Edge AI aims to significantly reduce GPU power consumption without a big hit to frame rates
It's very much a work in progress right now, but the concept looks promising.
Many PC enthusiasts don't like the fact that the best top-end GPUs use lots of power, and they try all kinds of ways to reduce consumption, such as undervolting, capping the frame rate, or lowering the maximum power limit. Graphics card vendor PowerColor is experimenting with a slightly different approach: using an NPU to manage power usage in games, without impacting performance, in a system called Edge AI.
A demonstration of the work in progress was on display at PowerColor's Computex stand. While we didn't get a chance to see it ourselves (there was a huge amount to try and see at the event), tech site IT Home and X user Harukaze5719 managed to grab some pictures of the setup and watch it in action on two computers running Final Fantasy XV.
In one of the computers, PowerColor's engineers hotwired an external NPU to an AMD graphics card and programmed it to manage the GPU's power consumption while rendering. At the moment, there's no indication of exactly what's going on behind the scenes, but NPUs (neural processing units) are specialised processors for handling the math operations involved in AI routines.
What I suspect is going on is that the NPU is running a neural network that takes metrics such as the GPU's load, voltage, and temperature, as well as aspects of the game being rendered, and alters the GPU voltage in such a way that power consumption is significantly reduced on average.
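If that guess is anywhere near the mark, the control loop might look something like the Python sketch below. To be absolutely clear, this is illustrative guesswork on my part: the telemetry inputs, the size of the network, and the voltage-offset output are all assumptions, not anything PowerColor has described.

```python
# Hypothetical sketch of an NPU-style power-tuning loop. None of this
# is PowerColor's actual code: the inputs, network shape, and output
# are all invented for illustration.
import numpy as np

rng = np.random.default_rng(0)

# A tiny MLP: 3 telemetry inputs -> 8 hidden units -> 1 output.
# In a real system these weights would come from training, so the
# suggested offset saves power without starving the GPU.
W1, b1 = rng.normal(size=(3, 8)) * 0.1, np.zeros(8)
W2, b2 = rng.normal(size=8) * 0.1, 0.0

def read_gpu_telemetry():
    """Placeholder: a real agent would poll the driver/sensors here."""
    return np.array([0.92, 1.05, 71.0])  # load (0-1), core volts, deg C

def suggest_voltage_offset(telemetry):
    """Run telemetry through the network and return a core-voltage
    offset in millivolts (negative = undervolt)."""
    x = telemetry / np.array([1.0, 1.2, 100.0])  # crude normalisation
    h = np.tanh(x @ W1 + b1)
    return float(np.tanh(h @ W2 + b2) * 50.0)    # bounded to +/-50 mV

print(f"suggested offset: {suggest_voltage_offset(read_gpu_telemetry()):+.1f} mV")
```

The attraction of running a loop like this on an NPU, rather than on the CPU or the GPU itself, is that it can be evaluated many times a second while consuming next to no power of its own.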
In the Final Fantasy XV demonstration, the PC without an NPU ran the game at 118 fps, with the graphics card drawing 338 W to achieve this. The other setup, with the NPU-GPU combo, hit 107 fps at 261 W. That's a 23% reduction in power draw for a 9% drop in frame rate.
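Those percentages check out against the raw figures and, worked through as frames-per-watt, they also show the NPU-equipped machine was actually the more efficient of the two. A quick sanity check, using only the numbers quoted above:

```python
# Recomputing the demo figures quoted above.
base_fps, npu_fps = 118, 107
base_watts, npu_watts = 338, 261

power_saving = (base_watts - npu_watts) / base_watts * 100  # 22.8%
fps_drop = (base_fps - npu_fps) / base_fps * 100            # 9.3%
efficiency = ((npu_fps / npu_watts) / (base_fps / base_watts) - 1) * 100

print(f"power saving:      {power_saving:.1f}%")  # ~23% less power
print(f"frame rate drop:   {fps_drop:.1f}%")      # ~9% fewer fps
print(f"fps-per-watt gain: +{efficiency:.1f}%")   # ~17% more efficient
```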
PowerColor's demo stand actually claims that Edge AI improves performance, but the frame rates on show say otherwise. If you're going to showcase a new bit of technology, you kind of want to check that it does what you say it does before you wave it about in public. But even with that minor marketing boo-boo, the whole concept of Edge AI looks like it has quite a bit of potential.
Reducing the power consumption of a graphics card has multiple benefits—less heat is dissipated into your gaming room, the whole PC uses less electricity, and the peripheral components on the graphics card will last longer. All that seems worth the relatively small drop in performance.
"PowerColor integrated NPU to Radeon GPU. (Still developing) Through AI, they claim power consumption decreased 22%. More saved than AMD Power Saving." (Harukaze5719 on X, pic.twitter.com/9I8iEikotD, June 7, 2024)
At the moment, Edge AI requires an external NPU to be wired to various points on a graphics card, but if it's only monitoring voltages and temperatures, an internal NPU could do the same job. The trouble is that most NPUs are embedded in laptop chips, such as AMD's new Ryzen AI series, Intel's Core Ultra range, and Qualcomm's Snapdragon X: processors that are rarely going to be paired with a discrete graphics card.
The neural network that Edge AI runs could, in theory, be executed on the GPU itself, but GPUs aren't really designed to do such work using as little power as possible, unlike NPUs.
That 77 W decrease in GPU power seen in the Final Fantasy demonstration would probably be far smaller if the routines were GPU-accelerated instead (and the fps drop would likely be larger, too).
I don't think PowerColor is planning on releasing a graphics card with an NPU on the circuit board, as that would eat into the profit margins. Instead, I suspect it's getting Edge AI ready for when NPUs are routinely embedded in desktop CPUs from every vendor, and if that's the case, it's one of the few uses of AI that I'd genuinely look forward to seeing in action.
And if it's a success for PowerColor, you can bet your last dollar that every other graphics card manufacturer will want to replicate it. As long as all these systems are optional to use, that would be a positive step forward for the GPU industry as a whole.