Oblivion struggling to nail 60 fps on my RTX 5090 was not on my 2025 bingo card, but without frame gen Oblivion Remastered's top settings bring the card to its knees
Sticking everything on Ultra in a UE5 game was always going to end this way, I guess.

Now, I did not expect Oblivion to be the game that brought my Nvidia RTX 5090 to its knees in 2025, but that's kinda what Oblivion Remastered has done. When I first checked over the system requirements my initial thought was 'oh, that's nice, it's not a resource hog that will devour every spare CUDA core or compute unit you throw at it.' Sitting here now, with the RTX 5090 running at 99% load and barely scraping 60 fps in the open world outside the Imperial City, I'm second-guessing myself.
What I will say is that this is born of me simply booting into the game and slapping every slider, from view distance to hair quality, up to Ultra (including Hardware Lumen RT, of course). But to get a vaguely comfortable frame rate I've had to enable both DLSS and Frame Generation. Obviously because I'm an Nvidia apologist.
Still, I was surprised to only be knocking on the door of 60 fps once I'd escaped the stinky Picard-infested sewers and emerged into wider Cyrodiil. This is the most powerful gaming graphics card you can buy today, and it's only managing that in a pseudo-4K mode, because with upscaling enabled the game is actually being rendered at a far lower resolution (at the DLSS Quality preset, a 4K output is rendered internally at 1440p).
I honestly wasn't expecting Oblivion Remastered to be a game that actually necessitated Frame Generation to hit the sort of frame rates I've become accustomed to with the RTX 5090. But it is 2025, and it seems like every new game comes with the stipulation that upscaling, and now frame gen, are all but requisite for the top settings.
Stick standard 2x Frame Generation on and I can relax, knowing I'm getting triple-digit frame rates. Although Nvidia's own FrameView and overlay monitoring tools don't seem to count the generated frames, claiming I'm still languishing around the 60-70 fps mark. That initially had me thinking FG was giving me practically nothing, and left me rather concerned for the thousands of dollars' worth of GPU silicon struggling away under the Elder Scrolls load.
Thankfully, both RivaTuner and Oblivion Remastered's own fps monitors show the same higher frame rate, which seems far more representative of the actual performance you'd expect from adding in some extra frame smoothing goodness. Though it is worth noting that RivaTuner can be a bit funky when it comes to the 1% Low metrics.
I'm able to push that overall frame rate up to around the 170-200 fps mark using the DLSS Override feature of the Nvidia App to enable 4x Multi Frame Gen (and so far it looks okay), but absolutely having to lean on RTX Blackwell's one neat trick to consistently nail a triple-digit frame rate is really something.
Without upscaling or Frame Generation, I'm back down to a native 4K experience of around the 50 fps mark, and even with DLSS enabled I'm often dropping below 60 fps in the outdoor areas.
That obviously changes indoors, with a far more limited view distance and the game engine having to do far less graphically intensive work. With 4x MFG I'm suddenly hitting a heady 200+ fps in those dark corridors.
But still, it's interesting to me that when you go cavalier with the in-game graphics settings in Oblivion Remastered, even the RTX 5090 will struggle. Thankfully, it's still a PC game with the bones of the OG Oblivion at its heart, so there are myriad ways to bump up the frame rate, whether that's simply stepping back the overall settings a touch, or being more aggressive with the upscaling level you're aiming for. And if you've somehow managed to bag yourself an RTX 50-series GPU, you get to use Frame Generation in all its forms, too.
You could even do what I did the first time I played the original Oblivion back in 2006 and just run it in a tiny window on your 14-inch CRT monitor, y'know, just to squeeze out a playable frame rate. My old GeForce 6600 LE really did not cope with that game…
