Ray tracing. A quiet revolution, happening in the puddles, car bonnets, marble floors, and window panes of the handsomest games since NVIDIA’s RTX series of GPUs went live. Movie CG-level algorithms running in real time, much to the consternation of Hollywood VFX houses who just a few years ago spent actual weeks running offline renders to achieve the same results. It’s appearing in an ever-increasing number of triple-A blockbuster titles, such as the upcoming Cyberpunk 2077 and Watch Dogs: Legion, and even some of the old guard are getting in on it, giving us a breathtaking new look at their familiar climes – see Minecraft with RTX and Quake II RTX, for example.
But hang on, what exactly is ray tracing? Well, if you want the TLDR version, it’s a far more advanced way of simulating how light bounces off (and through) surfaces, one that makes your game look incredibly lifelike. If you’re playing a game with ray tracing support like Metro Exodus or Control on a GeForce RTX 20 or 30 Series card, you’ll see a world of difference every time you meet a reflective surface like water or glass, and as you watch shadows form under every object in the scene in real time.
And if you want to go deeper, which you do, or you wouldn’t be reading this sentence, there are really two questions to answer: one, what’s happening down in the mathematics that’s so different from traditional reflective surface rendering methods, and two, where do you see the benefits most clearly? Lab coats on, we’re going in deep on question one first.
Before NVIDIA’s real-time RTX ray tracing, reflections were achieved in games using one of a few different methods. Planar reflections work a bit like building two identical rooms and placing a glass panel between them to give the illusion of a mirrored wall – the engine renders the scene a second time, with a virtual camera supplying the ‘mirrored’ image. It’s best for very simple reflections, and it should go without saying that that’s not how light works in real life. Sadly there isn’t a person employed to jump over to the other side of our bathroom mirror and mimic our movements precisely every time we go in there – although life would be a lot more fun if it worked like that, wouldn’t it?
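The “identical room on the other side of the glass” trick boils down to simple maths: mirror every point (or the camera itself) across the plane of the reflective surface. Here’s a minimal sketch of that mirroring step in plain Python – the function name is our own invention for illustration, and real engines fold the same operation into a 4×4 reflection matrix:

```python
def reflect_across_plane(point, normal, plane_point):
    """Mirror a world-space point across a plane.

    Illustrative helper, not any engine's actual API. Assumes `normal`
    is already unit length.
    """
    # Signed distance from the point to the plane, measured along the normal.
    d = sum((p - q) * n for p, q, n in zip(point, plane_point, normal))
    # Step back twice that distance to land at the mirrored position.
    return tuple(p - 2.0 * d * n for p, n in zip(point, normal))

# A mirror lying on the floor (the plane y = 0): a camera at height 3
# maps to a virtual camera at height -3, directly "below" the floor.
print(reflect_across_plane((1.0, 3.0, 2.0), (0.0, 1.0, 0.0), (0.0, 0.0, 0.0)))
# → (1.0, -3.0, 2.0)
```

Render the scene again from that mirrored camera, composite the result onto the glass, and you have a planar reflection – which is exactly why the technique only holds up for flat surfaces.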
Real reflections can come from absolutely anywhere, depending on the light source and the reflective surface. And to better mimic that, cubemapped reflections build a 360-degree image of the environment using six flat images defined by directional vectors which describe a cube. This technique’s been around in games since 1999, and although it’s capable of sophisticated results, it’s still a very simplistic version of what actually happens when we see a reflection in real life.
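Under the hood, sampling a cubemap means taking the reflected direction and asking which of those six flat images it points into – the axis with the largest magnitude wins. A rough sketch of that face-selection logic, with a hypothetical function name of our own choosing:

```python
def cubemap_face(direction):
    """Pick which of the six cube faces a reflection direction samples.

    Illustrative only; GPUs do this (plus the texture-coordinate maths)
    in hardware when you sample a cube map.
    """
    x, y, z = direction
    ax, ay, az = abs(x), abs(y), abs(z)
    # The dominant axis of the direction vector decides the face.
    if ax >= ay and ax >= az:
        return "+X" if x > 0 else "-X"
    if ay >= az:
        return "+Y" if y > 0 else "-Y"
    return "+Z" if z > 0 else "-Z"

# A ray bouncing mostly upward samples the top face of the cube:
print(cubemap_face((0.1, 0.9, 0.2)))   # → +Y
```

Because the six images are captured from a single point, every surface in the room “sees” the same reflection – fine for a shiny car, but a far cry from how reflections really behave.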
Ray tracing, then, is a totally different approach. Instead of just recreating the reflective surface itself, it recreates the light source. So when you watch a ray-traced scene in real time, you’re seeing a simulation of everything that happens from the source to the reflective surface, and then where light travels after that. If you look around right now, you can probably see more than one reflective surface near you, which means light isn’t just bouncing off one thing and ending its journey – there’s an interconnected relationship between all surfaces, creating constantly changing reflections and shadows. And that’s what ray tracing is capable of bringing to the latest cutting-edge titles such as Cyberpunk 2077, Call of Duty: Black Ops Cold War, Watch Dogs: Legion, and more.
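The two core operations behind all of that bouncing can be sketched in a few lines: test where a ray hits a surface, then compute the bounced direction with the classic reflection formula r = d − 2(d·n)n. This is a bare-bones illustration under our own assumptions (a single sphere, unit-length normals), not NVIDIA’s RTX implementation:

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def reflect(d, n):
    """Bounce direction d off a surface with unit normal n: r = d - 2(d.n)n."""
    k = 2.0 * dot(d, n)
    return tuple(di - k * ni for di, ni in zip(d, n))

def hit_sphere(origin, direction, center, radius):
    """Smallest positive t where origin + t*direction meets the sphere, or None."""
    oc = tuple(o - c for o, c in zip(origin, center))
    a = dot(direction, direction)
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0:
        return None                          # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2.0 * a)
    return t if t > 0 else None

# A ray fired along the z-axis hits a unit sphere centred at z = 3 at t = 2:
print(hit_sphere((0, 0, 0), (0, 0, 1), (0, 0, 3), 1.0))   # → 2.0
# A ray grazing a floor (normal pointing up) has its vertical component flipped:
print(reflect((1.0, -1.0, 0.0), (0.0, 1.0, 0.0)))          # → (1.0, 1.0, 0.0)
```

A real-time ray tracer repeats exactly this hit-then-bounce loop millions of times per frame, following each reflected ray on to the next surface it strikes – which is why dedicated RT hardware was needed to make it playable.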
Thinking of upgrading? View the full range of GeForce RTX-powered HP OMEN gaming PCs and laptops here.