The true Nvidia RTX legacy isn't ray tracing, it's DLSS
It's been three years since the first Nvidia RTX cards launched, but DLSS has had far more impact on our gaming experience than ray tracing.
Graphics reinvented. That's how Nvidia introduced its first RTX graphics cards back in 2018, cards that first went on sale three years ago this week, in fact. And a lot has happened in the intervening time, but do the promises of that new era of Nvidia GPUs still hold up, and what's changed since then?
Three years is a long time in PC hardware, but somehow it still feels as though the launch of the first RTX series of Nvidia graphics cards is a recent phenomenon. Yet the RTX 30-series has been with us—often in spirit more than in any physically real form—for a full year now. It was the inaugural RTX 20-series, however, which introduced us to the promise of a whole new era of PC gaming GPUs.
And that promise? In a phrase, it all seemed to be about ray tracing; at least, that was the piece of the puzzle that felt most tangible when it was first introduced to us jobbing tech journos in an old beer factory in Cologne, Germany. Ray tracing was one of the first things on display that separated the Turing architecture from the older Pascal design.
The little green 'RTX On' badge has been used on screenshots ever since as a symbol of graphical greatness. But, almost buried in the same presentation, after highlighting the as-yet unproven potential of mesh shading and variable rate shading, we saw the first faltering steps of DLSS. The true gift for PC gamers born of the advanced RTX graphics silicon.
Three years on and it's safe to say that ray tracing hasn't gone away. In fact it's everywhere, on practically every gaming platform apart from certain handhelds. Despite the negative Nelsons out there decrying the computationally expensive nature of real-time ray tracing, it has been adopted across the board.
It's just not necessarily as transformative a feature as it might have first appeared. I mean, it is just simulated lighting after all. Nor has it always been used to the greatest effect either.
Yet, launching over a year later, both Sony's and Microsoft's next-gen consoles decided they had to have ray tracing on their respective specs sheets. And how well that must have gone down with their shared AMD hardware partner. As such, AMD's RDNA 2 GPU architecture—the graphical heart of the latest console generation—now has its own implementation of ray tracing, one which supports the same Microsoft DirectX Raytracing API that Nvidia's RT tech accelerates.
Intel, starting out as the third way in PC graphics cards, is also supporting hardware ray tracing with its upcoming Arc Alchemist GPUs.
And yet, three years on, Microsoft is still insisting that 'raytracing' is one word. Gah.
But the fact remains that, as much as the latest Nvidia RTX 30-series cards have alleviated a lot of the silicon burden of ray tracing, it's still computationally demanding, and you will see a performance penalty for turning on the realistic lighting effects in-game. That's especially true of AMD's, and therefore the consoles', implementation of the technology.
I'm still of the opinion that ray tracing is just getting started, and is something that will eventually become such a ubiquitous part of gaming's rich feature set that the idea of listing a game as 'featuring ray tracing' will become as pointless as listing that it needs 3D acceleration.
This generation of games consoles, however, despite what their spec sheets suggest, is never going to further the cause meaningfully. In fact devs are actively dropping it from their PS5 and Xbox Series X/S versions in favour of higher frame rates, and Far Cry 6 is unlikely to be the last.
But DLSS is almost the antithesis of real-time ray tracing. Ray tracing is all about enhancing the visual reality of a scene using computationally intensive algorithms on specific blocks of silicon, whereas Deep Learning Super Sampling is purely about using other blocks of silicon to speed up performance while making a scene look almost as good as native rendering.
It doesn't take a genius to see why a feature that offers higher gaming performance practically for free has become far more popular with gamers than something which can tank frame rates in order to replace the pre-baked and faked lighting we've all become inured to.
Though it did take DLSS 2.0 to really nail it, offering those higher frame rates without the muddied visuals that often accompanied the initial implementation.
AMD following suit on ray tracing may not have had the impact Sony or Microsoft might have hoped, but its creation of FidelityFX Super Resolution (FSR) as its own pseudo-DLSS could really pay dividends should console devs start using it more regularly. FSR isn't exactly the same as DLSS, but it does follow a similar pattern: taking a lower resolution input, upscaling it to a higher resolution, and enhancing the output to look better than traditional methods would allow.
And high frame rate shenanigans ensue.
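To make that pattern a little more concrete, here's a minimal, purely illustrative sketch in Python. It is emphatically not AMD's or Nvidia's actual algorithm (FSR uses its own edge-adaptive upscaler and contrast-adaptive sharpening on the GPU, and DLSS leans on a neural network); it just shows the general shape of the idea, with Pillow's stock resize and sharpen filters standing in as hypothetical substitutes for the real passes.

```python
# Toy illustration of the render-small, upscale, sharpen pattern.
# Not FSR or DLSS: Pillow's generic filters are stand-ins for the real thing.
from PIL import Image, ImageFilter

TARGET_RES = (3840, 2160)   # the resolution the player actually sees
RENDER_SCALE = 0.5          # render internally at half resolution to save GPU time

def upscale_frame(frame: Image.Image) -> Image.Image:
    """Turn a cheap low-resolution render into a display-resolution frame."""
    # Step 1: scale the low-res render up to the display resolution.
    upscaled = frame.resize(TARGET_RES, Image.LANCZOS)
    # Step 2: sharpen to claw back some of the detail lost in the upscale.
    return upscaled.filter(ImageFilter.UnsharpMask(radius=2, percent=120, threshold=2))

# Pretend this flat grey image is a frame the GPU rendered at half resolution.
low_res_render = Image.new("RGB", (int(TARGET_RES[0] * RENDER_SCALE),
                                   int(TARGET_RES[1] * RENDER_SCALE)), "grey")
display_frame = upscale_frame(low_res_render)
print(display_frame.size)   # (3840, 2160)
```

The pay-off is the same in the toy version as in the real thing: the expensive part (rendering the frame) happens at a fraction of the pixel count, and the cheap part (upscaling and sharpening) fills in the rest.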
Intel, again, has also followed suit. Its Xe Super Sampling (XeSS) feature, coming with the new Alchemist graphics cards, in fact offers two methods: a hardware-agnostic one, which looks a lot like AMD's FSR, and another rooted in the Arc GPU silicon itself that bears a striking resemblance to DLSS.
As much as AMD and Intel might want to suggest otherwise, I struggle to believe that either FSR or XeSS would have come about without DLSS.
We can't talk about the years since the first RTX cards launched without mentioning the GPU shortage-shaped woolly mammoth in the room. While ray tracing does offer some lovely visuals, it's not necessary in any sense of the word. And when new, high-end GPUs are more expensive and harder to get hold of than they've ever been, a technology which demands you sacrifice the finite computational power of the card at your disposal in the name of more accurate lighting is always going to struggle.
But a feature which takes the likes of the years-old RTX 2060 and gives it a healthy performance boost, to the point where it can make modern games actually playable, has got to feel like a winner.
DLSS is obviously not perfect, however. It has to be built into a game by the developers themselves, and though that has gotten easier with subsequent iterations, it's not a feature which you can just enable in any game and get a free fps bump.
But it is undoubtedly the legacy of Nvidia's RTX era that has ended up having the most direct impact on PC gamers. And that's true whether they're using DLSS, FSR, or will end up taking a new Alchemist GPU for a spin with XeSS.
Dave has been gaming since the days of Zaxxon and Lady Bug on the Colecovision, and code books for the Commodore Vic 20 (Death Race 2000!). He built his first gaming PC at the tender age of 16, and finally finished bug-fixing the Cyrix-based system around a year later. When he dropped it out of the window. He first started writing for Official PlayStation Magazine and Xbox World many decades ago, then moved onto PC Format full-time, then PC Gamer, TechRadar, and T3 among others. Now he's back, writing about the nightmarish graphics card market, CPUs with more cores than sense, gaming laptops hotter than the sun, and SSDs more capacious than a Cybertruck.