I used AI to help Nvidia design an 800W version of the RTX 4090
Nvidia's reportedly been testing an over-powered version of its upcoming GPUs, so I figured I'd help with the overall design.
I'm not above a little derivative foolishness on a bleak Monday morning. And so I asked the AI painter du jour, DALL-E mini, to help visualise what an 800W RTX 4090 might look like should the green team go all out. I think DALL-E has actually nailed the cooling array, so I'm going to steal it and see if I can get a design credit from Jen-Hsun.
We're edging ever closer to the launch of the new Nvidia RTX 40-series graphics cards, yet we don't really have a lot to go on in terms of what they're going to look like. Or even really how they're going to be specced out.
There were rumours a while back of the green team testing AD102 boards with 800W total board power (TBP), and though that is unlikely to actually appear in a release graphics card—at least not for the consumer market—I kinda wanted to see what one might look like. The DALL-E design features a dual-fan cooling array on top, with a supplementary 120mm fan pushing air down the length of the PCB, presumably to an exhaust vent pointing towards the back of the PC.
That would result in a very chunky, maybe five-slot graphics card design, but it sure would be able to house a heatsink capable of keeping such a hot and heavy Ada Lovelace GPU cooled. And yeah, 'heavy' might well be the watchword here, because 'efficiency' certainly isn't; you're going to want a support bracket in there, I'd wager.
Thankfully, the latest rumours suggest the top-end card of the next Nvidia generation, the RTX 4090, will be a 450W board. That is still pretty ludicrous, considering the GeForce RTX 3090 was a 350W card.
Yeah, I know the RTX 3090 Ti is a 450W card, too, but that is also a ludicrous board.
> Some updates. RTX 4090, AD102-300, 16384 FP32, 384-bit 21Gbps 24GB GDDR6X. RTX 4080, AD103-300, 10240 FP32, 256-bit (18Gbps 16GB?). RTX 4070, AD104-275, 7168 FP32, 160-bit 18Gbps 10GB GDDR6. And DO NOT expect a lower MSRP.
>
> — June 23, 2022
Anyways, once we'd established what a super over-specced card would look like, I then wondered what would be going on under the hood when such a GPU might require its own discrete power supply.
This is one hell of a blurry cable-hell prediction right here…
In order to be even-handed with regard to the artificial intelligences involved, I also let Nvidia's own GauGAN2 AI art tool have a go at creating us an over-powered 800W RTX 4090 design. To be fair, it doesn't have an 'object' setting, and is better suited to creating landscape images.
Still, this feels like a tortured digital soul desperate to be loved. Or at least taken for a wee day trip to the beach.
| Card | GPU | CUDA cores | Memory | Memory bus | Memory speed | TBP |
|---|---|---|---|---|---|---|
| GeForce RTX 4090 | AD102-300 | 16,384 | 24GB GDDR6X | 384-bit | 21Gbps | 450W |
| GeForce RTX 4080 | AD103-300 | 10,240 | 16GB GDDR6 | 256-bit | 18Gbps | 420W |
| GeForce RTX 4070 | AD104-275 | 7,168 | 10GB GDDR6 | 160-bit | 18Gbps | 300W |
Dave has been gaming since the days of Zaxxon and Lady Bug on the ColecoVision, and code books for the Commodore VIC-20 (Death Race 2000!). He built his first gaming PC at the tender age of 16, and finally finished bug-fixing the Cyrix-based system around a year later, when he dropped it out of the window. He first started writing for Official PlayStation Magazine and Xbox World many decades ago, then moved on to PC Format full-time, then PC Gamer, TechRadar, and T3 among others. Now he's back, writing about the nightmarish graphics card market, CPUs with more cores than sense, gaming laptops hotter than the sun, and SSDs more capacious than a Cybertruck.