Ashes of the Singularity benchmark unites AMD and Nvidia under DX12—sort of


The developers at Oxide Games have made a name for themselves by pushing low-level APIs. They were early adopters of AMD's Mantle with their Star Swarm stress test, and they later ported that test to DX12. Then someone got the bright idea to turn the demo into an actual game, and Ashes of the Singularity was born. Now nearing its official March 22 launch date, the presumably final beta has been sent out to hardware reviewers, and it will become publicly available on Steam Early Access tomorrow.

What makes this second beta of Ashes unique is that it's the first demonstration of DX12's Explicit Multi-Adapter (EMA) rendering, a.k.a. the technology that promises to let AMD and Nvidia GPUs live together in harmony. Unlike AMD's CrossFire and Nvidia's SLI, which pair up nearly identical hardware to improve performance, EMA allows developers to utilize any and all graphics resources as they see fit. No longer are developers confined to the whims of driver teams and homogeneous hardware; EMA lets them do crazy things like rendering a game using both AMD and Nvidia GPUs.
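To get a feel for what "explicit" means here, consider how an application sees multiple GPUs under DX12. The sketch below is not Oxide's code, just a minimal, hypothetical C++ example (the helper name CreateDevicesOnAllAdapters is made up) that enumerates every adapter DXGI reports and creates an independent D3D12 device on each one; from that point on, deciding which GPU renders what is entirely the application's job.

```cpp
// Minimal sketch: create a D3D12 device on every hardware adapter.
// Build against d3d12.lib and dxgi.lib on Windows 10 or later.
#include <d3d12.h>
#include <dxgi.h>
#include <wrl/client.h>
#include <vector>

using Microsoft::WRL::ComPtr;

// Hypothetical helper: returns one device per physical GPU, whether it's
// from AMD, Nvidia, or Intel. Nothing here pairs the GPUs for the app;
// with EMA, splitting the frame across them is up to the engine.
std::vector<ComPtr<ID3D12Device>> CreateDevicesOnAllAdapters()
{
    ComPtr<IDXGIFactory1> factory;
    std::vector<ComPtr<ID3D12Device>> devices;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return devices;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0;
         factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND;
         ++i)
    {
        DXGI_ADAPTER_DESC1 desc = {};
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue; // skip the WARP software rasterizer

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
            devices.push_back(device);
    }
    return devices;
}
```

Each of those devices gets its own command queues and resources, and sharing results between them (say, copying a finished portion of the frame to the GPU that presents it) has to be coded by hand, which hints at how much work Oxide took on.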

Over at Maximum PC I've put together a massive article detailing what works—and what doesn't—with Ashes' EMA rendering. It's a complex problem, and trying to balance a workload across disparate hardware can't be easy. Look at how often SLI or CrossFire fail to work properly, and now imagine trying to do all of the rendering work without the help of the drivers. That it works at all is a testament to the dedication of Oxide's developers, but just because you have two GPUs in a system doesn't mean they're always going to play nicely.

Of course, this isn't the first time we've seen someone try to pair up GPUs from the two rivals. Back in 2010, LucidLogix created its Virtu virtualization software to try to accomplish the same thing. With little to no help from the GPU vendors, however, LucidLogix pulled the plug a couple of years later. This time, with a low-level API in hand, it's up to the game developers to make things like EMA work. Is this the shape of things to come—will we see more developers implementing EMA in the future, or will this prove to be too much work for too little benefit? Our crystal ball is a bit cloudy on the subject, but kudos to Oxide Games for being the first to tackle an extremely difficult problem.

If you're interested in learning more about the technologies behind EMA, check out Maximum PC's article for a deep dive.

Jarred Walton

Jarred's love of computers dates back to the dark ages when his dad brought home a DOS 2.3 PC and he left his C-64 behind. He eventually built his first custom PC in 1990 with a 12MHz 286, only to discover it was already woefully outdated when Wing Commander was released a few months later. He holds a BS in Computer Science from Brigham Young University and has been working as a tech journalist since 2004, writing for AnandTech, Maximum PC, and PC Gamer. From the first S3 Virge '3D decelerators' to today's GPUs, Jarred keeps up with all the latest graphics trends and is the one to ask about game performance.