Unity is betting big on ray tracing—here's how they brought it to their engine
We chat with the makers of Unity about doubling down on ray tracing and making it accessible for developers.
Well in advance of the launch of the RTX series of cards, Nvidia and other major industry players were talking a very big game about ray tracing. Ray tracing, the GPU manufacturer claimed, with its power to simulate rays of light and reflections in real time in unprecedented ways, would unlock incredible new levels of immersion and realism. It would allow the most powerful and important element of the way we view the world, light, to be rendered at dizzying new levels of fidelity, and give developers the power to do in real time what it takes top Hollywood studios hours, days, or weeks to achieve.
Of course, all the hardware support in the world is irrelevant without the software to back it up, and Nvidia's forward-looking revolution has had something of a slow start. The few games in the wild that support ray tracing do so either in a limited way or at a level that isn't immediately apparent to the casual player. While the tech demos look very impressive, real-world implementation has yet to justify Nvidia's massive investment.
That may all be about to change, however. At GDC 2019, Unity, one of the leading engines for game development, announced it was adding support for ray tracing to its toolkit. After what felt like several generations of Epic's Unreal Engine dominating the conversation, Unity has made massive inroads in the past few years, finding a home on over three billion devices around the world and being installed more than 28 billion times in just the last 12 months. Bringing ray tracing to Unity could have a massive impact on the adoption and success of the technology, so I spoke to its Vice President of Graphics, Natalya Tatarchuk, about why Unity decided to bet on ray tracing and what it means for the growing crowd of devs who rely on her platform.
"Unity is the world’s most widely-used real-time 3D development platform," Tatarchuk says. "With real-time ray tracing, interactive graphics reach new levels of realism, allowing for truly dynamic global rendering effects, not possible before at fast frame rates. We needed to make amazing, offline-quality professional graphics accessible to all our users in real-time frame rates."
Bringing ray tracing to Unity has been an immensely complicated process, the work of more than a year. Tatarchuk emphasizes that Unity didn't want to get too far ahead of the ray tracing curve before proper hardware was in place to support it.
"Not every situation calls for real-time ray tracing, so we’ve been considerate when it comes to forcing innovation before the technology is in place to use it. Now, with the introduction of faster more performant GPUs such as NVIDIA’s RTX, we are nearing a place where real-time ray tracing is ready for adoption by developers at large."
Tatarchuk tells me that this year the focus has been on expanding the functionality and usability of Unity's pipeline, but that work began in earnest in 2018.
"Last year we harnessed the power of real-time ray tracing technology for accelerating lighting workflows," she says, "through a real-time path tracer to bake lighting in with our GPU progressive lightmapper, which is built utilizing the GPU for path tracing execution. But this was just the beginning of our real-time ray tracing journey. This year, we focused on expanding the tool box shipping with our High Definition Render Pipeline (HDRP) to provide more variety of sophisticated tools to the users."
Traditionally, lighting has been simulated during rasterization, the process of transforming polygonal objects into a 2D image drawn with pixels. Each pixel is shaded or colored based largely on how the scene's light sources strike the surface it represents, essentially faking the interactions with light that ray tracing attempts to simulate directly. It's very computationally efficient, but rasterization comes with a number of important limitations.
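As a rough, hand-rolled illustration (not Unity's shading code), here is what that purely local, per-pixel lighting boils down to: a Lambertian diffuse term computed from nothing but the surface normal and the direction toward a light. Everything outside that tiny bubble of data, including off-screen geometry, is invisible to it.

```csharp
// Minimal sketch of local, rasterization-style shading: the pixel's brightness
// depends only on its own surface data and the light direction, nothing else.
using System;
using System.Numerics;   // Vector3

class LocalShadingSketch
{
    // Lambertian diffuse term: brightness falls off with the angle between the
    // surface normal and the direction toward the light. No other geometry is
    // consulted, so bounced light and reflections of unseen objects can't appear.
    static float LambertDiffuse(Vector3 surfacePoint, Vector3 surfaceNormal, Vector3 lightPosition)
    {
        Vector3 toLight = Vector3.Normalize(lightPosition - surfacePoint);
        return MathF.Max(0f, Vector3.Dot(Vector3.Normalize(surfaceNormal), toLight));
    }

    static void Main()
    {
        Vector3 point = Vector3.Zero;            // the point being shaded
        Vector3 normal = new Vector3(0, 1, 0);   // surface facing straight up
        Vector3 light = new Vector3(1, 2, 0);    // a light up and to one side
        Console.WriteLine($"Diffuse term: {LambertDiffuse(point, normal, light):F3}");
    }
}
```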
"Rasterization brings a fair amount of advantages to the rendering efficiency," Tatarchuk explains. "In our HDRP we are already performing physically accurate computations for lighting and shadowing. But in the rasterization world we only operate on small, local parts of the scene. These approaches are just clever approximations that are fast on console hardware. A trained eye can easily spot where they don’t match reality. If you don’t see the object on the screen, you don’t see its reflection. It’s just not there."
Ray tracing bypasses this limitation by simulating lighting across an entire scene, regardless of which elements of that scene are currently visible to the player.
"With the addition of real-time ray tracing, developers are going to be able to implement global illumination, reflections and refractions that further blur the line between real life and real-time. With our real-time ray tracing integration you truly get accurate global reflections or full-scene lighting bounce. It doesn’t matter if something is visible in the frame or not. You will still see its reflections. We give you the ability to render primary rays with rasterization or ray tracing for best quality. With that, you can get proper refractions for multi-layer transparent materials, or global dynamic shadows, or global occlusion. You also get the full gamut of HD materials to play with: opaque, transparent, alpha-tested, double-sided, with full shader graph support. And, of course, dynamic real-time global illumination. And you also get great scaffolding to build custom effects using real-time ray tracing technology on top of our High Definition real-time ray tracing render pipeline."
For Tatarchuk, it's an odyssey that began back in 2017, when Unity first introduced the HDRP. It provided raster-based rendering solutions but was created to be extensible, meaning it was the perfect, broad foundation atop which to implement real-time ray tracing.
"Our real-time ray tracing solution allows creators to author once, and then determine whether they want to utilize it for raster-based creations, or real-time ray tracing. It's built on top of our HDRP with C# scripts and shaders, quick to reload and iterate. With C#, one of the main advantages for the graphics developers is the super quick hot reload iteration loop for quick feedback. Our solution is built with choice, flexibility and power in mind. We provide a wide variety of ray trace effects for you to play with."
One of the keys to implementing a complex technology like ray tracing is ensuring it's accessible to end users, in this case developers building games on Unity. Tatarchuk says one of the company's main focuses has always been giving devs powerful, flexible tools that are easy to understand and leverage.
"We are carrying that over to our implementation for real-time raytracing—there will be a learning curve, as is the case with all new technology, but we are doing it in a way that will be intuitive and accessible for our audience. We set out to make it easier for all developers—they create once and they have both raster and real-time ray tracing effects. We provide the ability for users to author reflection shaders in the shader graph tool so that they can customize it as needed for their objects. They can also easily create custom master nodes for their real-time ray tracing effects as well."
The goal is to integrate ray tracing tools into Unity's kit in a way that makes them as broadly applicable as possible, without compromising on power and utility. Building ray tracing into a platform with Unity's reach, one that's as broadly available and accessible as any modern game engine, is a massive boon for the technology's future, and could lead to the kind of adoption that makes Nvidia's execs drool when they imagine the future of their most capable GPUs.