
Yes, I know: more AI. Love it or hate it, that's the way things are going. For us gamers, it started with upscaling, then frame gen, then Multi Frame Gen, and soon, it seems, fully AI-generated frames.
At GDC today Nvidia announced that "neural shading support will come to DirectX preview in April, unlocking the power of AI Tensor Cores in NVIDIA GeForce RTX GPUs inside of graphics shaders used to program video games…
"Nvidia RTX Neural Shaders SDK enables developers to train their game data and shader code on an RTX AI PC and accelerate their neural representations and model weights with Nvidia Tensor Cores at runtime. This significantly enhances the performance of neural rendering techniques, allowing for faster and more efficient real-time rendering with Tensor Cores."
In other words, AI will be used not just to interpolate frames and generate new ones based on a traditionally rendered frame but also to help render that original frame. It's AI being added to another step of the rendering pipeline.
The end goal, presumably, is to have the game engine tell the GPU about the primary in-game qualities (objects, movement, and so on) and have AI flesh out the rest of the picture.
It's difficult to imagine how that could work without any information on how to flesh out said picture, but that's where the "game data and shader code" training comes in. Developers can give the AI model a good idea of what things should look like when rendered, and then when players actually fire up the game, the model does its damnedest to replicate that.
As Nvidia's Blackwell white paper explains: "Rather than writing complex shader code to describe these [shader] functions, developers train AI models to approximate the result that the shader code would have computed."
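To make that idea a little more concrete, here's a minimal, purely illustrative sketch in Python/PyTorch of what "train a model to approximate the result the shader code would have computed" looks like in principle. This is not Nvidia's RTX Neural Shaders SDK or the DirectX Cooperative Vectors API; the shading function, network size, and training loop are all assumptions made up for the example.

```python
# Conceptual sketch only: a tiny MLP learns to approximate a hand-written
# shading function. This is NOT Nvidia's SDK or the Cooperative Vectors API,
# just the general train-then-replace idea in plain PyTorch.
import torch
import torch.nn as nn

def reference_shader(n_dot_l, n_dot_h, roughness):
    """A stand-in 'shader function': a diffuse term plus a Blinn-Phong-style
    specular highlight whose sharpness depends on roughness."""
    shininess = 2.0 / (roughness ** 2 + 1e-4)
    specular = torch.clamp(n_dot_h, 0, 1) ** shininess
    return torch.clamp(n_dot_l, 0, 1) + specular

# A small MLP standing in for the "neural shader".
model = nn.Sequential(
    nn.Linear(3, 32), nn.ReLU(),
    nn.Linear(32, 32), nn.ReLU(),
    nn.Linear(32, 1),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# "Training": sample random inputs, ask the reference shader what the answer
# should be, and fit the network to reproduce it.
for step in range(2000):
    inputs = torch.rand(4096, 3)  # n_dot_l, n_dot_h, roughness, all in [0, 1)
    target = reference_shader(inputs[:, 0], inputs[:, 1], inputs[:, 2]).unsqueeze(1)
    loss = nn.functional.mse_loss(model(inputs), target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# "Runtime": evaluate the trained network instead of the original shader code.
with torch.no_grad():
    sample = torch.tensor([[0.7, 0.9, 0.2]])
    print("reference:", reference_shader(sample[:, 0], sample[:, 1], sample[:, 2]).item())
    print("neural:   ", model(sample).item())
```

In an actual game the network would be far smaller, evaluated per pixel inside the shader itself, with its weights fed through the GPU's Tensor Cores via the new Cooperative Vectors path; the sketch is only meant to show the workflow Nvidia's white paper describes.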
This will presumably be tailored to Blackwell given Nvidia has worked with Microsoft to develop the Cooperative Vectors API, though Nvidia does say that "some of [the developer-created neural shaders] will also run on prior generation GPUs."
We already had an idea that this was in the works, as in December 2024 we saw Inno3D speak about "Neural Rendering Capabilities" in its then-upcoming graphics cards. We'd also seen mention of neural rendering from Nvidia before, but not in a context that could actually be implemented in games just yet.
And then, with the launch of the RTX 50-series cards and the RTX Blackwell architecture, we had our first look at Neural Shaders in action at CES. The likes of neural texture compression (offering a touted 7x saving in VRAM usage), RTX Skin (as seen in HL2 Remix's meaty headcrabs), RTX Neural Radiance Cache (also featured in HL2 Remix), RTX Neural Faces, and RTX Neural Materials all promise an enhanced level of realism in games without utterly tanking frame rates.
Nvidia VP of Developer Technology John Spitzer calls this "the future of graphics" and Microsoft Direct3D dev manager Shawn Hargreaves seems to agree, saying that Microsoft's addition of "Cooperative Vectors support to DirectX and HLSL… will advance the future of graphics programming by enabling neural rendering across the gaming industry."
It's almost a reflex for me to be sceptical of anything AI, but I must remember that my scepticism over frame gen has slowly abated. I remember seeing character hands clipping through in-game HUDs and writing off DLSS 3 frame gen when it launched, but now those problems are rare and even latency isn't half bad if you have a high baseline frame rate.
So I'll try to keep my mind open to at least the possibility that this could actually be a step forward. At any rate, we'll find out before long—just a few weeks until devs can start trying it out.
