Unreal's new MetaHuman Animator can turn an iPhone video into a scarily accurate game animation in three minutes
Footage from Ninja Theory's upcoming Hellblade 2 is nothing short of astonishing.
Yesterday's State of Unreal 2023 presentation was rather overshadowed by a certain surprise game announcement, but Epic showed off some genuinely amazing tech coming to Unreal Engine 5 in the near future. We got to see impressive new procedural environment tools, a look at how Lords of the Fallen takes advantage of the engine's enhancements to character creation and realistic armour and clothing, and more. But what really blew my socks off was the MetaHuman Animator stage demo.
We've seen MetaHuman before—it's a tool that allows developers to generate highly realistic human faces, fully rigged for animation. Actually animating those faces in a way that matches their realistic look—avoiding the uncanny valley—has so far been a difficult and time-consuming process, however. Motion capture—that is, getting actors to perform the movements and then turning that footage into animation—has produced the best results, but it requires specialised equipment and months of expert work.
Well, until now. On stage, Epic demonstrated a new feature called MetaHuman Animator, designed to completely streamline the process. In the demo, you can see actor and developer Melina Juergens—star of the Hellblade games—recording a video of her face on an iPhone, before a technician uploads it, processes it, and automatically turns it into a fully animated sequence in the space of literally three minutes.
The final video isn't completely finished—an animator would ideally go in and touch up elements of it, and obviously work it into the in-game scene—but it looks 90% of the way there, and that's astonishing. You don't even have to map it onto a MetaHuman modelled on that person's face—you can use it with any faces you have ready, whether photo-realistic or stylised and cartoonish.
The potential here is huge. Not only will major studios be able to create facial animations in a fraction of the time, allowing for increasingly realistic interactions in upcoming games, but smaller developers will be able to create mocap-quality scenes with just a phone and a PC, instead of an entire studio full of 4D cameras and those little white dot things.
Then, just to show off, Epic revealed some footage from the development of Hellblade 2 using the tech with Ninja Theory's mocap equipment, a video it claims "hasn't been polished or edited in any way" beyond the program's automatic processing. It's a brief look, but I think it's probably the most photorealistic videogame animation I've ever seen.
If developers are as excited about this as I am, get ready for a whole generation of games that are mostly about characters delivering Shakespearean monologues in extreme close-up. By 2030 you'll be intimately familiar with every wrinkle and pore on Doom Guy's face.