The new Unity 6 game engine demo looks spectacular, but is it enough to convince developers to return?

Time Ghost | Unity 6 - YouTube

When it comes to 3D game engines and cutting-edge graphics, many people will instantly think of Unreal Engine 5, Ubisoft's Snowdrop, or Guerrilla Games' Decima as the best choices for creating ultra-realistic visuals. However, with the release of a new tech demo, titled Time Ghost, Unity is aiming to prove that its next-gen engine, Unity 6, is just as good as any of them.

The above video of the demo showcases some of the things that are supposed to be possible in Unity 6, with an Intel Core i9 14900K and Nvidia GeForce RTX 4090 providing the computing power to run it all. It's a cinematic piece, rather than being gameplay-focused, but I get strong Death Stranding and Battlefield 1 vibes from it all.

For me, the best parts are the sweeping vistas, lush with vegetation, along with the character, cloth, and hair animations. It really does look spectacular but then again, so it should, given that Unity wants its next-generation engine to poach developers away from Unreal Engine 5.

While I only dabble with game engines for fun and for examining GPU architectures, I've been using Unreal Engine (UE) for a good few years now—I always preferred it over Unity, even though the latter is much easier to pick up and get started with than UE. Unity has been the engine of choice for thousands of developers and it's still really good if you want to have a go at making a simple 2D game for a host of different platforms.

Well, it used to be, but then Unity's bosses decided on a set of changes to its pricing scheme that were universally reviled by the industry, resulting in significant layoffs and the resignation of its CEO. A year on from the whole debacle, Unity has reverted to how things used to be, but with considerable damage already done, it's going to need something special to bring developers back.

Is Unity 6 going to be the launch pad to make that happen, though? Unity has done some truly fantastic demos in the past (Enemies was a particular favourite of mine) but they're all in the super-duper graphics category, something that Unity, as an engine, wasn't heavily favoured for.

For example, Unreal Engine was used for CGI effects and environments in The Mandalorian, Westworld, and Fallout TV shows. Unity, on the other hand, has only been used for several short features.

However, the list of games powered by Unity is vast and, despite its reputation for being a 2D-only engine, some fantastic 3D games use the software. Subnautica, Escape from Tarkov, Rust, The Planet Crafter, and Outer Wilds are all great examples. They're not replete with cutting-edge graphics, though, which is why I'm always puzzled by Unity insisting on releasing demos with wowzer visuals.

I suspect that no matter how good Unity 6 turns out to be, the company is going to have a really hard time bringing experienced developers back into its fold, because who's to say Unity won't pull the rug out from under their feet with another pricing change? That's a shame because, as much as I like UE5, I wouldn't want the game industry to be dominated by just one engine.




Nick Evanson
Hardware Writer

Nick, gaming, and computers all first met in 1981, with the love affair starting on a Sinclair ZX81 in kit form and a book on ZX Basic. He ended up becoming a physics and IT teacher, but by the late 1990s decided it was time to cut his teeth writing for a long-defunct UK tech site. He went on to do the same at MadOnion, helping to write the help files for 3DMark and PCMark. After a short stint working at Beyond3D.com, Nick joined Futuremark (MadOnion rebranded) full-time, as editor-in-chief for its gaming and hardware section, YouGamers. After the site shut down, he became an engineering and computing lecturer for many years, but missed the writing bug. Cue four years at TechSpot.com and over 100 long articles on anything and everything. He freely admits to being far too obsessed with GPUs and open-world grindy RPGs, but who isn't these days?