Ashes of the Singularity benchmark unites AMD and Nvidia under DX12—sort of
The developers at Oxide Games have made a name for themselves by pushing low-level APIs. They were an early adopter of AMD's Mantle with their Star Swarm stress test, and they later ported the test to DX12. Then someone got the bright idea to turn that demo into an actual game, and Ashes of the Singularity was born. With the game's official launch set for March 22, the presumably final beta has been sent out to hardware reviewers, and it will become publicly available on Steam Early Access tomorrow.
What makes this second beta of Ashes unique is that it's the first demonstration of DX12's Explicit Multi-Adapter (EMA) rendering, aka the technology that promises to let AMD and Nvidia GPUs live together in harmony. Unlike AMD's CrossFire and Nvidia's SLI technologies, which pair up nearly identical hardware to improve performance, EMA allows developers to utilize any and all graphics resources as they see fit. No longer are developers confined to the whims of driver teams and homogeneous hardware; EMA now lets them do crazy things like rendering a game using both AMD and Nvidia GPUs.
Over at Maximum PC I've put together a massive article detailing what works—and what doesn't—with Ashes' EMA rendering. It's a complex problem, and trying to balance a workload across disparate hardware can't be easy. Look at how often SLI or CrossFire fail to work properly, and now imagine trying to do all of the rendering work without the help of the drivers. That it works at all is a testament to the dedication of Oxide's developers, but just because you have two GPUs in a system doesn't mean they're always going to play nicely.
Of course, this isn't the first time we've seen someone try to pair up GPUs from the two rivals. Back in 2010, LucidLogix created their Virtu virtualization software to try to accomplish the same thing. With little to no help from the GPU vendors, however, LucidLogix pulled the plug a couple of years later. This time, with a low-level API in hand, it's up to the game developers to make things like EMA work. Is this the shape of things to come—will we see more developers implementing EMA in the future, or will this prove to be too much work for too little benefit? Our crystal ball is a bit cloudy on the subject, but kudos to Oxide Games for being the first to tackle an extremely difficult problem.
If you're interested in learning more about the technologies behind EMA, check out Maximum PC's article for a deep dive.
Jarred's love of computers dates back to the dark ages when his dad brought home a DOS 2.3 PC and he left his C-64 behind. He eventually built his first custom PC in 1990 with a 286 12MHz, only to discover it was already woefully outdated when Wing Commander was released a few months later. He holds a BS in Computer Science from Brigham Young University and has been working as a tech journalist since 2004, writing for AnandTech, Maximum PC, and PC Gamer. From the first S3 Virge '3D decelerators' to today's GPUs, Jarred keeps up with all the latest graphics trends and is the one to ask about game performance.


