What we want to see from PC gaming tech in 2024

(Image credit: Getty | monsitj)

If I'm being entirely honest, I found 2023 a largely uninspiring year. Innovation was in short supply, with the expected and the iterative making up the majority of the year's hardware releases. That's not to say we didn't get some really good tech over the past 12 months—the Ryzen 7 7800X3D and 7840U were two of the best examples in terms of pure silicon—but they were essentially just extrapolations of existing technology.

But we want to see 2024 really moving the game on, both in terms of tech that's actually attainable and hardware that's doing something new and innovative within the field of PC gaming gear.

Will we finally see the long-awaited appearance of multi compute die chiplet GPUs? Will we see silicon photonics reaching a realistic release window? Will Intel release thirteen different nodes and architectures? Will anybody release a graphics card that can reasonably be given a review score of above 90%? Will we find the perfect gaming monitor? Will gaming laptops finally actually be able to play games away from a plug socket?

What follows absolutely aren't predictions. Based on the past year, that would likely make for a quite depressing read. So what I've tasked the PC Gamer hardware team with doing is to search their souls for what, in some mythical ideal world, they would like to see happening in the hobby in 2024.

Sensible pricing, please.
Jeremy Laird

What I want to see: A man eat his own head. Failing that, wouldn't it be nice to have some sensibly priced graphics cards? Just about every other component class has pretty much normalised in price terms. But not GPUs. Will Nvidia's upcoming Super series of RTX 40 GPUs change that? I doubt it. But I can still hope. That said, the rumours suggest AMD isn't bothering with high-end variants for its next-gen RDNA 4 GPUs. So maybe, just maybe, AMD is going to reboot the idea of a graphics card that offers real value for money. Wouldn't that be refreshing?

Give me weird, give me wonderful, give me something new.
Andy Edser

What I want to see: If you ask me, and let's face it, you sort of did, I reckon things have become far too safe and sensible in the PC hardware space. When was the last time you saw a manufacturer do something really wild, something different, something that made you raise your eyebrows and announce "does that really work?" OK, fair enough, it's not unheard of. But beyond some dodgy renders and ill-advised conceptual buzz-speak, it's usually nothing more than a flash in the pan. I reckon it's high time we saw some peripherals in particular get the wild and wacky treatment, in an honest-to-goodness attempt to shake up the space. The Xbox Series X controller is all well and good, but I want something with levers and switches and toggles and a switch made of jelly. I want flight sticks that are straight out of a sci-fi movie. Bring on the odd, the unusual, the just might work. But here's the crucial point: Actually make it, y'know, work. And not for thousands of bucks and the rights to my first-born child. Please and thank you. Love and stuff. Me.

AI (Actually Intelligent)
Nick Evanson

What I want to see: Naturally, I'd like to see the same as everyone else: Cheaper hardware that's fast, stable, and packed with features that will ensure it remains relevant for years to come. But I know what I will see, and it's really just going to be more of the same. Vendors will announce a raft of new models at the start of the year, all of which will be only marginally better than what they launched in 2023. But what I'd truly like to see is a genuine integration of AI, to do things that are properly useful. Forget about predicting what food I will want to buy next Tuesday; give me hardware that can work out in advance how it's going to be used, so performance is on tap when really needed and turned down to save power when it's not. Who would want a 300W CPU or GPU if the same tasks could be done with half the energy?

Clever games
Dave James

What I want to see: Like Nick, I want to see artificial intelligence used in more beneficial ways, and what more beneficial way could there be than to make all your games look awesome? I know we kind of already have that with the likes of DLSS making low-res game inputs look almost native (and in some cases better), combined with a commensurate performance boost to boot, but where is my photorealism? Where is my uncanny valley? Games have still not hit that faintly awkward point yet. They still quite obviously look like games for the most part. I want someone to start using AI image generation on the fly to enhance what game characters actually look like; start giving me some sorta-photorealism. It doesn't have to stop at characters, either. I've been playing footie games, like Pro Evo and FIFA, since I was smol, and there is such a wealth of footage and high-res imagery out there that surely you could train a model to enhance the next EA FC game to actually look like you're watching it live on the TV. TVs have been adding extra texture data to live images for an age now, so it's about time games started doing that, too.

And then, what about some actual in-game AI? What about AI as a dungeon master for that Larian Star Wars game that now has to happen? What about natural language and the ability to talk to people in a game world in a way that goes beyond just the few scripted responses a developer has been able to enter in for some side character? I don't want to replace great game writing, but I want my gameworlds to become deeper, where each NPC can hold a conversation and maybe, just maybe, even evolve into a quest giver.

Rock-solid PC performance
Jacob Ridley

What I want to see: Like my dear colleagues, I too dream of cheaper graphics cards, genuinely decent AI implementations, and something fun for the whole family—or whatever Andy said. Though before we get to weird and wonderful creations or AI models that make games sparkle, I simply ask for stable PC games at launch. Or at least mostly stable. We've had the best and the worst of PC performance this year. The Last of Us Part 1 was top among the worst offenders, with some of the funniest bugs I've ever seen. It was tragic, of course, to have a Sony game from a major studio finally come to PC and for it to be such a mess, but at least there was a silver lining in Joel's enormous eyebrows. Cities: Skylines 2 also made waves in the wrong way when it released with such serious demands on our gaming silicon that even high-end PCs struggled to run it at times. You could also expect stuttering in Star Wars: Jedi Survivor, which still has mixed reviews on Steam due to its shaky performance. Even back in February this year, Wes was writing about how PC gamers are fed up with sub-par performance. That article was later updated in May, but the argument still holds water today. At times I've wondered whether it's worth giving freshly launched games a wide berth until they have a couple of months' worth of patches under their belt.

But we've also had some incredible games arrive with nary a major issue in sight. Baldur's Gate 3, our game of the year and just about everyone else's, had some very slight jank for the first week, but it was entirely playable and ran well on a lot of hardware. Alan Wake 2 also launched with surprising scalability considering it's one of the prettiest games I've laid my eyes on. Basically, it can be done, and it'd be nice not to have to fret about performance for each new game release out of the gate next year.

Dave James
Editor-in-Chief, Hardware

Dave has been gaming since the days of Zaxxon and Lady Bug on the Colecovision, and code books for the Commodore Vic 20 (Death Race 2000!). He built his first gaming PC at the tender age of 16, and finally finished bug-fixing the Cyrix-based system around a year later. When he dropped it out of the window. He first started writing for Official PlayStation Magazine and Xbox World many decades ago, then moved onto PC Format full-time, then PC Gamer, TechRadar, and T3 among others. Now he's back, writing about the nightmarish graphics card market, CPUs with more cores than sense, gaming laptops hotter than the sun, and SSDs more capacious than a Cybertruck.
