Ultra-enthusiast hardware is strangling PC gaming

RTX 4090 and RTX 4080 with neon glow filter (Image credit: Future)
Dave James, doomsayer

This week I have been mostly playing The Witcher 3. Some luddite ripped the fibre optic cabling out of my house, which has left me bereft of interwebs, with only next-gen Geralt for company. Yes, I have been the victim of an IRL DoS attack.

This month I have been mostly testing gaming laptops. I've had four different RTX 40-series gaming laptops on the test bench, with desirability ranging from 'oh god, no' (the MSI Titan GT77) to 'oh actually, maybe' (the Asus Zephyrus M16). Not a rousing success, then.

The goose is screaming as the final, distended golden egg tears its way loose of the ruined cloaca and plops onto the floor, its fragile shell cracking on impact. In a moment the screaming stops, the goose's neck goes limp, its head drops, and it draws its last, ragged breath. After each successive golden egg, the farmers had pumped more growth hormones into the poor, weakened beast, until at last those shiny eggs had grown so big they tore up its insides.

Yes, killing the goose that laid the golden eggs is a tortured metaphor for the PC gaming market at the moment. And yes, I have been listening to a lot of Alan Partridge recently.

But I still can't get away from the feeling that the current trend in PC gaming hardware is for manufacturers to focus all their efforts purely on the high-margin ultra-enthusiast market, and that is going to turn PC gaming into the most offensively elitist gaming hobby around.

And, eventually, when there are but a handful of rich gamers running $2,500 graphics cards, typing on $800 gaming keyboards, with $500 gaming mice strewn all around their $4,000 monitors, there won't be much reason for developers to spend the money developing for such a niche platform. And it will die.

The lifeblood of PC gaming has always been the mainstream and budget end of the market. That's where most of us can afford to play, and that's historically been where most of the money has been made—from high-volume, affordable tech. There's always been the very high-end stuff, but that was always low-volume and specialist, while the vast majority of us spent our time trying to build machines that weren't much more expensive than a console but could easily outperform them.

It's easy to blame half the world's ills on 'the pandemic', but without that strange hiatus in life and the resulting unprecedented demand for PC tech, coupled with a resurgent cryptocurrency market, we might not be in this position.

As it is, the pandemic did happen, prices went through the roof on practically every facet of the suddenly supply-starved PC industry, and manufacturers realised there was a deep well of seemingly disposable income they had not previously exploited.

MSI and Asus' new RTX 4090 gaming laptops: the Titan GT77 HX and the Zephyrus M16, both with RTX 4090 and Core i9 (Image credit: Future)

If everything is priced up, and there is no alternative, people will still buy it. So, why not just make expensive things?

Here we are with a new generation of graphics cards from the two main GPU manufacturers: five different cards between them, and not one with a realistic MSRP below $799. That's obscene.

It's also baffling that the only card of the five to actually feel like it has any sense of value is the $1,600 RTX 4090 at the very top of the stack. That's only $100 more than the RTX 3090 cost, and it's a far better card in every way. And if you take the two years of inflation between the launches into account, you could easily argue the RTX 4090 is effectively cheaper than the RTX 3090 was at launch.
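For the number-crunchers, here's a minimal back-of-the-envelope sketch of that argument in Python. The ~14% cumulative US inflation figure for the two-year gap between the cards' launches is my own rough assumption for illustration, not an official stat:

    # Sanity check: does the RTX 4090 launch "cheaper" than the RTX 3090
    # once inflation is accounted for? The cumulative inflation figure is
    # an assumed approximation of US CPI between the two launch dates.
    RTX_3090_MSRP = 1499          # USD, at launch in September 2020
    RTX_4090_MSRP = 1599          # USD, at launch in October 2022
    CUMULATIVE_INFLATION = 0.14   # assumed ~14% over the two-year gap

    # Express the RTX 3090's launch price in late-2022 dollars.
    adjusted_3090 = RTX_3090_MSRP * (1 + CUMULATIVE_INFLATION)

    print(f"RTX 3090 MSRP in Oct 2022 dollars: ~${adjusted_3090:,.0f}")
    print(f"RTX 4090 MSRP:                      ${RTX_4090_MSRP:,}")
    # Roughly $1,709 vs $1,599: in real terms the newer card is cheaper.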

Now we have a new generation of gaming laptops, which launched with a raft of $4,000+ machines, and a gaming keyboard industry where, if your new keeb doesn't come with hot-swappable switches, lube, and a $300 price tag, you're a fool.

It's not much different on the gaming monitor side, either, where we're seeing screens we wouldn't want for $500 retailing for $1,800 because they use some new, short-lived screen tech that's already outstayed its welcome on the desktop. Mini LED? Not on anything over 16 inches, thanks.

Philips Evnia 34M2C7600MV (Image credit: Future)

The larger panels have terrible backlighting issues, the result of compromises demanded by the tech and the Hobson's choices manufacturers are left with. I can't see the tech sticking around for long on the desktop in this state.

But who have we got going to bat for us? Intel. Surprise! If you'd told me ten years ago that Intel would be the company trying to force down pricing on affordable gaming hardware I'd have laughed in your face. Like, a really mean laugh that you would 100% know held no humour whatsoever.

It's the value champion today, though. Its processors, the core of Intel itself, only get better value the further down the stack you go. The Core i5 13400F is an outstanding chip that delivers essentially the same gaming performance as a $600 CPU. And with Alder Lake, and now Raptor Lake, you can get affordable motherboards and memory for the platform, because Intel has kept a budget-friendly DDR4 option alive while still offering high-end DDR5 if you want it.

And its graphics cards, the first to bear the Arc name, are getting better value all the time, too. I'm not just talking about slashing the pricing of its GPUs, though it has done that, with the Arc A750 graphics card dropping to $250 | £250. I'm talking about the fact that its previously shonky driver stack is getting more reliable with each release, and the actual gaming performance is creeping up to the levels promised by the Alchemist architecture's illusory 3DMark numbers.

Intel Arc A750 Limited Edition graphics card (Image credit: Future)

There is further hope, too, and it's in consumer habits. Maybe manufacturers have been convinced that PC gamers will keep spending wildly ad infinitum, or maybe they've simply been riding the wave until it breaks on the sands of fiscal responsibility and economies in recession.

But consumers aren't buying it any more. There are regular reports that the ludicrously expensive cards, such as the RTX 4080 and, to a lesser extent, the RX 7900 XT, are being left on retailers' shelves by customers unwilling to pay those prices. There is also fresh talk of price cuts incoming, which isn't going to encourage people to spend more right now, either.

You'd hope that eventually the bubble would burst, the customer base willing to pay significantly over the odds would dwindle, and manufacturers would have to come crawling back to the mainstream market to compete again. There are signs this is happening on the consumer side, with people holding on to their money, but so far there's been no counterbalancing response from the hardware manufacturers aimed at cutting the punitive sticker prices.

I guess it will remain so until the mainstream RTX 40-series and RX 7000-series cards start to appear and either do or don't display some greater sense of value, and until we see whether cards like the RTX 4080 really do get some significant, official price cuts down the road.

Dave James
Editor-in-Chief, Hardware

Dave has been gaming since the days of Zaxxon and Lady Bug on the ColecoVision, and code books for the Commodore VIC-20 (Death Race 2000!). He built his first gaming PC at the tender age of 16, and finally finished bug-fixing the Cyrix-based system around a year later. When he dropped it out of the window. He first started writing for Official PlayStation Magazine and Xbox World many decades ago, then moved onto PC Format full-time, then PC Gamer, TechRadar, and T3 among others. Now he's back, writing about the nightmarish graphics card market, CPUs with more cores than sense, gaming laptops hotter than the sun, and SSDs more capacious than a Cybertruck.