Nvidia's CEO chats about the future of AI: 'We're going to need three computers... one to create the AI… one to simulate the AI… and one to run the AI'

What’s Next in AI: NVIDIA’s Jensen Huang Talks With WIRED’s Lauren Goode - YouTube

At this year's Siggraph event, Nvidia's Jen-Hsun Huang sat down with Wired for an hour-long chat about all things Nvidia, RTX, and AI. Among the varied topics touched upon, including an acknowledgement that AI training and inference have huge energy demands, was Huang's assertion that more computers are going to be needed for AI systems in the future: specifically, three of them.

Siggraph is an annual conference normally focused on computer graphics and interactive technology (think AR and VR, that kind of thing), but it was only a matter of time before AI became the main topic of discussion. To that end, Nvidia's CEO was interviewed by Wired's Lauren Goode for an hour-long streamed discussion that covered GPUs, RTX, and ray tracing, but mostly AI.

If you've been keeping up to date with Nvidia's push for generative AI to be everywhere, then there's nothing in the discussion that will really pique your interest. However, at one point, Huang mentioned how the world of AI is now moving away from its pioneering phase and moving toward the next one, which Nvidia's CEO called the "enterprise wave."

After that comes the "physical wave", which, according to Huang, is "really, really quite extraordinary." He clarified that statement by saying three computers will be required: one computer to create the AI, another to simulate and refine the AI, and finally a third computer to run the AI itself.

"It's a three computer problem. You know, a three body problem and it's so incredibly complicated and we created three computers to do that."

Jen-Hsun is, of course, talking about Nvidia's raft of hardware and software packages: DGX H100 servers to create the AI, workstations and servers using Omniverse and RTX GPUs to simulate the AI, and Jetson embedded computers to run the AI out in the world.

Siggraph is one of my favourite tech events and I've been watching presentations and reading research papers from the conference for years. I have to say it's a bit of a shame that Nvidia's fairly blatant sales pitch for its AI systems took center stage this year (Huang's chat with Meta's Mark Zuckerberg was another example of AI promotion with little substance). There will still be plenty of discussion about computer graphics, and AI will naturally be a part of that, but Huang didn't say anything that made me think "Wow, this is going to be so cool!"

Are we really going to need three computers? PC gamers certainly won't and neither will most businesses. Even those looking to really integrate AI into their core operations may baulk at the potential cost and complexity of using and paying for three tiers of Nvidia's products.

Nvidia is clearly 100% focused on AI now. The days of it being just a graphics and gaming company are long gone, even though gaming remained a core part of the business as it morphed into a data-processing one. That's not to say PC gamers won't benefit from Nvidia's advancements in AI, of course: the likes of RTX and DLSS have arguably been a big step forward in the world of rendering.

And I certainly wouldn't expect the CEO of the world's most successful AI company to pass up any opportunity to push it, but I think we could all do with a bit of a breather from the relentless push for artificial intelligence festooning every aspect of our computing lives. It does wear a little thin after a while.

Best gaming PC: The top pre-built machines.
Best gaming laptop: Great devices for mobile gaming.

Nick Evanson
Hardware Writer

Nick, gaming, and computers all first met in 1981, with the love affair starting on a Sinclair ZX81 in kit form and a book on ZX Basic. He ended up becoming a physics and IT teacher, but by the late 1990s decided it was time to cut his teeth writing for a long-defunct UK tech site. He went on to do the same at MadOnion, helping to write the help files for 3DMark and PCMark. After a short stint working at Beyond3D.com, Nick joined Futuremark (MadOnion rebranded) full-time, as editor-in-chief of its gaming and hardware section, YouGamers. After that site shut down, he became an engineering and computing lecturer for many years, but missed the writing bug. Cue four years at TechSpot.com and over 100 long articles on anything and everything. He freely admits to being far too obsessed with GPUs and open-world grindy RPGs, but who isn't these days?