It’s been years since Nvidia launched a new GPU architecture, but the company is finally ready to formally announce the next-generation follow-up to its Volta GPU family. For decades, Nvidia pursued a strategy of using one GPU architecture across its entire product lineup, with variations in RAM loadout or professional feature support as the primary means of distinguishing workstation (and, eventually, HPC) cards from consumer variants. With Kepler, back in 2012, that changed. Nvidia began introducing higher-end variants of its existing architectures before finally launching Volta, a GPU architecture that appeared only in HPC and Quadro cards, with a single, Titan-flavored exception.
Today, Nvidia is taking the wraps off Turing, a new GPU architecture it claims will enable real-time ray tracing by combining tensor cores for AI inferencing with dedicated ray tracing cores to accelerate those workloads. Real-time ray tracing (RTRT) is a topic we’ve visited several times at ET over the years, typically with the understanding that the technology remained near-eternally “several years away.” The difference between rasterization and RTRT (extremely simplistically) is that rasterization approximates lighting by projecting geometry to the screen and shading it, while ray tracing models the behavior of light itself as it intersects with surfaces, materials, and moving objects. The two approaches have different strengths and weaknesses, with ray tracing typically reserved for non-realtime projects because of how computationally intense it is.
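To give a sense of why ray tracing is so computationally demanding, the core operation is testing each ray against scene geometry. A minimal sketch of one such test, ray-sphere intersection via a quadratic, is below; this is illustrative textbook math, not anything from Nvidia’s hardware, and a real renderer performs millions of these tests (plus shading and acceleration-structure traversal) per frame — which is precisely the workload the RT cores are meant to accelerate.

```python
import math

def ray_sphere_intersect(origin, direction, center, radius):
    """Return the distance t along the ray to the nearest hit, or None."""
    # Vector from the sphere center to the ray origin
    oc = tuple(o - c for o, c in zip(origin, center))
    # Coefficients of |origin + t*direction - center|^2 = radius^2,
    # expanded into the quadratic a*t^2 + b*t + c = 0
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0:
        return None  # the ray misses the sphere entirely
    # Nearest of the two roots that lies in front of the ray origin
    t = (-b - math.sqrt(disc)) / (2.0 * a)
    return t if t > 0 else None

# A ray fired from the origin along +z hits a unit sphere at z=5 at t=4.
hit = ray_sphere_intersect((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0)
print(hit)  # 4.0
```

Rasterization sidesteps this per-ray work entirely, which is why it has dominated real-time graphics — and why fixed-function hardware is what makes RTRT plausible now.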
Nvidia’s first GPUs based on Turing are the Quadro RTX 8000, Quadro RTX 6000, and Quadro RTX 5000, and it’s expanding its RTX development platform to give developers a greater ability to target these capabilities. The company writes:
Designers can now iterate their product model or building and see accurate lighting, shadows, and reflections in real time. Previously, they would have to use a low-fidelity approximation to get their design more or less right, then ship files out to a CPU farm to be rendered and get the results back in minutes or even hours, depending on complexity. Only then could they determine what it was truly going to look like in the real world. Now, they can do so interactively on their Quadro RTX-powered desktop.
The company claims tremendous speed-ups in a variety of applications, including Unreal Engine, Blackmagic DaVinci Resolve, Blender Cycles, finalRender, and OctaneRender. The RTX 6000 is said to be 5-8x faster than the Quadro P6000 when using the 2019 path-tracing kernel. The company hasn’t given exact specifications on its Turing-class cores, but claims they offer up to 4,608 cores in a Turing Streaming Multiprocessor (TSM). The maximum die size on an implementation is 754 mm², with 18.6B transistors and 14Gbps memory (presumably GDDR6).
Jen-Hsun Huang is claiming that this new GPU represents a fundamental shift in capabilities and could drive the entire industry towards a new mode of graphics. It’s possible he’s right — Nvidia is in a far more dominant position to shift the graphics industry than most companies. But I’m also reminded of another company that thought it could revamp the entire graphics industry with a new GPU architecture that would be a radical departure from everything anyone had done before, with a specific goal of enabling RTRT. The company was Intel, the GPU was Larrabee, and the end result amounted to very little. After a brief flurry of interest, Intel killed the card and the industry went along its path.
Obviously, that’s not going to happen here, given that Nvidia is shipping silicon, but the major question will be whether the very different techniques associated with real-time ray tracing can catch on with developers and drive a major change in how consumer graphics are created and consumed. The odds of a global market transformation in favor of real-time ray tracing will increase substantially if companies like AMD and eventually Intel throw their own weight behind it.
Now Read: How Nvidia’s RTX Ray Tracing Works, Investigating Ray Tracing, The Next Big Thing in Computer Graphics, and Photorealism in Unreal Engine 4