Nvidia's Neural Texture Compression technology should alleviate VRAM concerns
Future generations of graphics cards might not need 50TB of memory after all.
Some of 2023's most anticipated PC games have had their fair share of troubles. Hogwarts Legacy and The Last of Us Part 1 are just two that ran horribly on cards with insufficient VRAM. It seems like a problem that's here to stay. Even if you have enough graphics memory right now, will it be enough to handle the demands of games one, two, or three years from now?
There is some good news on the horizon, courtesy of Nvidia. It's working on a new compression technology it calls Neural Texture Compression (NTC), and like most of the tech coming out of Nvidia these days, it's powered by AI.
According to Nvidia (via Hot Hardware), the new method allows material textures to store up to 16x more data in the same space as traditional block-based compression methods. This should allow developers to shrink the size of textures without any loss of quality. That means less need for huge amounts of graphics memory, which sounds good to me.
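To put that ratio in perspective, here's a back-of-the-envelope sketch in Python. The BC7 baseline of one byte per texel is standard for that block format; the NTC figure simply applies the claimed 16x ratio to it, so treat it as an illustration rather than a measured number from Nvidia's paper.

```python
# Back-of-the-envelope texture memory comparison.
# BC7 packs each 4x4 texel block into 128 bits, i.e. 1 byte per texel.
# The NTC figure below just applies Nvidia's claimed "up to 16x" ratio
# to that baseline; it's an illustration, not a measured result.

BYTES_PER_TEXEL_BC7 = 1.0   # 128-bit block / 16 texels
CLAIMED_NTC_RATIO = 16.0    # "up to 16x more data in the same space"

def texture_mib(width, height, num_maps, bytes_per_texel):
    """Size in MiB of a material made of several texture maps."""
    return width * height * num_maps * bytes_per_texel / 2**20

# A 4096x4096 material with four maps (albedo, normal, roughness, AO).
bc7 = texture_mib(4096, 4096, 4, BYTES_PER_TEXEL_BC7)
ntc = bc7 / CLAIMED_NTC_RATIO

print(f"BC7 material: {bc7:.1f} MiB")    # 64.0 MiB
print(f"NTC (claimed): {ntc:.1f} MiB")   # 4.0 MiB
```

Multiply that saving across the hundreds of materials in a modern open-world game and it's easy to see why an 8GB card might suddenly feel a lot roomier.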
Nvidia claims the NTC algorithm offers superior image quality compared to modern algorithms including AVIF and JPEG XL. NTC can make use of general-purpose GPU hardware and the Tensor cores of current-gen Nvidia hardware, and can do so in real time. AVIF and JPEG XL, by contrast, lack dedicated decompression hardware on GPUs and aren't designed for real-time decompression.
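To get a feel for why this kind of decompression suits GPU matrix hardware, here's a toy sketch of the general neural-texture idea: store the weights of a tiny network instead of the texels themselves, and evaluate it per texel at sample time. To be clear, this is not Nvidia's actual NTC architecture (the paper pairs compressed feature grids with a small MLP); every name and size below is made up for illustration.

```python
# Toy sketch of the neural-texture idea: instead of storing texels,
# store the weights of a tiny network and evaluate it per texel on demand.
# NOT Nvidia's NTC architecture; it only illustrates why decompression
# maps well onto matrix-multiply hardware like Tensor cores.

import numpy as np

rng = np.random.default_rng(0)

# The "compressed" representation: a handful of small weight matrices.
HIDDEN = 32
B  = rng.normal(size=(2, HIDDEN)) * 8.0           # random Fourier features
W1 = rng.normal(size=(2 * HIDDEN, HIDDEN)) * 0.1
W2 = rng.normal(size=(HIDDEN, 3)) * 0.1

def decode_texels(uv):
    """Evaluate RGB for an (N, 2) array of UV coordinates.

    Decompression is just two matmuls plus activations per texel,
    exactly the kind of work GPU matrix units are built for.
    """
    feat = uv @ B                                  # (N, HIDDEN)
    feat = np.concatenate([np.sin(feat), np.cos(feat)], axis=1)
    h = np.maximum(feat @ W1, 0.0)                 # ReLU
    return 1.0 / (1.0 + np.exp(-(h @ W2)))        # sigmoid -> [0, 1] RGB

# Sample a 4x4 tile of texels, as a shader might at material eval time.
u, v = np.meshgrid(np.linspace(0, 1, 4), np.linspace(0, 1, 4))
uv = np.stack([u.ravel(), v.ravel()], axis=1)
print(decode_texels(uv).shape)                     # (16, 3)
```

In a real system those weights would be optimized per material so the network reproduces the original texture set; here they're random, because the point is the shape of the decode work (a few small matrix multiplies), not the output image.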
It's important to emphasize that this is a nascent technology. In the conclusion of the paper, Nvidia's researchers say: "we have shown that decompression of our textures introduces only a modest timing overhead as compared to simple BCx algorithms (which executes in custom hardware), possibly making our method practical in disk- and memory-constrained graphics applications."
This sentence could be key. Latency is critical for gaming graphics, and the word "possibly" could just mean Nvidia is playing it safe, or it could mean there's still a lot of work to be done before NTC becomes practical.
We can expect to hear more about Neural Texture Compression at Nvidia's developer-focused GTC conferences and at Siggraph in August. It'll take a long time to go from white paper to retail, though, so don't expect to see this on GPUs anytime soon. But even if future RTX 50 or RTX 60 cards do include NTC technology, let's hope Nvidia doesn't cheap out and give us 8GB mid-range cards in 2025. GTA 6 awaits...