Anyone with an eye on the markets in the 2020s knows Nvidia has been all about artificial intelligence in recent years. (Look at its world-beating data-center efforts, and its stock price!) But the impact and benefits of AI as it pertains to computer graphics are still evolving and, by and large, not fully understood by the general public. It’s a lot easier to grok what a ChatGPT or a Stable Diffusion does.
With the upcoming, much-anticipated GeForce RTX 50 series of graphics cards, however, Nvidia will push several new AI-based technologies that could have a bigger impact on computer graphics and the gaming industry than any AI tech released to date. We’ll have to wait to see how widespread these technologies become in popular games, and how many developers adopt them and how quickly. But here’s a teaser of the AI potential the RTX 50 series will possess.
Nvidia’s RTX 50-Series AI Architecture
I have covered the new RTX 50 series, which will arrive first in desktop cards and soon after in mobile GPUs, in other articles detailing the RTX 50-series GPUs’ core counts and other basic specifications. At an all-day briefing with Nvidia during CES 2025, PCMag received additional details worth touching on before diving into the RTX 50 series’ many AI features. That’s partly because these new details about the RTX 50 series architecture, “Blackwell,” directly relate to the AI hardware and show where Nvidia has focused in improving its silicon.
AI has been a key focus for Nvidia in recent years, but the RTX 50 series is the first GPU family built with such an extensive focus on AI features and workloads. The AI hardware is now fully functional within standard graphics workloads, enabling it to boost performance even more than in previous-generation GPUs. Because the AI hardware also serves several other functions, Nvidia equipped RTX 50-series graphics cards with an AI management processor (AMP). The AMP divides the AI hardware’s time among tasks, such as running large language models (LLMs) and processing game code.
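Nvidia didn’t detail how the AMP divides that time, but conceptually it’s a classic scheduling problem. Here’s a purely illustrative Python sketch, with invented task names and timings, of time-slicing shared AI hardware between latency-critical frame work and a background LLM; it is not Nvidia’s actual algorithm:

```python
# Illustrative only: time-slicing shared AI hardware between a
# latency-critical graphics task and a background LLM task.
# This models the general idea, not Nvidia's AMP implementation.
from dataclasses import dataclass, field
import heapq

@dataclass(order=True)
class Task:
    priority: int                      # lower value = more urgent
    name: str = field(compare=False)
    remaining_ms: float = field(compare=False)

def run_frame(tasks, slice_ms=0.5, budget_ms=16.7):
    """Spend one frame's budget (~16.7 ms at 60Hz), always giving the
    most urgent remaining task the next slice of hardware time."""
    heapq.heapify(tasks)
    spent = 0.0
    while tasks and spent < budget_ms:
        task = heapq.heappop(tasks)
        work = min(slice_ms, task.remaining_ms)
        task.remaining_ms -= work
        spent += work
        if task.remaining_ms > 0:
            heapq.heappush(tasks, task)   # not done; requeue it
        else:
            print(f"{task.name} finished at {spent:.1f} ms")

run_frame([
    Task(0, "frame-generation model", 4.0),   # must land this frame
    Task(1, "game shader work", 8.0),
    Task(2, "background LLM tokens", 100.0),  # soaks up leftover time
])
```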
The RTX 50 series will also adopt GDDR7 memory, which delivers greater bandwidth than GDDR6 and GDDR6X could support. It’s also more energy efficient, which could be a significant plus for the next generation of gaming laptops. Indeed, with the Blackwell architecture, Nvidia noted that it had several key goals: to optimize the silicon for neural workloads, to reduce the memory footprint, and to address that key question of energy efficiency. The last could be a boon for laptop gaming, which has typically been bound to plugged-in use. We’ll have to see when the first RTX 50-based laptops launch, but Blackwell could show significant improvement in the off-plug gaming experience.
Neural rendering is at the core of Blackwell’s advances, however. Fundamentally, a technology dubbed Cooperative Vectors in DirectX is poised to let shaders tap into the power of the Tensor cores. This enables a new way of handling shaders, which govern challenging graphics facets such as textures and materials. That’s one way the move of AI into PC graphics portends new paradigms: having AI hardware do predictive work rather than rendering every last pixel the old-fashioned way.
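Nvidia and Microsoft haven’t published the final Cooperative Vectors interface yet, but the operation it is meant to accelerate is well understood: the small matrix-times-vector multiplies that make up a tiny per-pixel neural network. This hypothetical NumPy sketch (sizes and weights are invented) shows the kind of math a neural shader would hand off to the Tensor cores:

```python
# Sketch of the work Cooperative Vectors is meant to accelerate: a tiny
# per-pixel MLP, which is just a chain of matrix-vector multiplies.
# All sizes and weights here are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.standard_normal((16, 4)), rng.standard_normal(16)  # layer 1
W2, b2 = rng.standard_normal((3, 16)), rng.standard_normal(3)   # layer 2

def tiny_mlp(features):
    """Evaluate a two-layer network for one pixel. The matrix-vector
    products are the part Tensor cores would handle on Blackwell."""
    hidden = np.maximum(W1 @ features + b1, 0.0)  # ReLU activation
    return W2 @ hidden + b2                       # e.g., an RGB value

# One pixel's inputs: say, UV coordinates plus two view-angle terms.
print(tiny_mlp(np.array([0.25, 0.75, 0.1, 0.9])))
```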
As Nvidia puts it, image quality, image smoothness, and responsiveness are the three pillars of graphics performance. Everything done in the field is a trade-off among these three factors. For example, one way to improve perceived responsiveness is to make graphics look worse (say, shift to 1080p resolution from 4K). Another way is to add more horsepower via more GPUs or more GPU power, but there’s a limit to that, of course. The company’s DLSS technology, in its various forms, is a way to improve performance without pushing the hardware envelope.
DLSS and AI take advantage of the fact that there is a lot of structure and signal in what we see, and therefore plenty of redundancy and patterns. AI unlocks that, letting a GPU improve performance by shortcutting how some of that redundancy is rendered. Nvidia has had a supercomputer working on DLSS improvements for going on six years now, and it claims more than 80% of RTX card owners use DLSS at some point. It’s not a new technique, but the advances this time are bigger than ever.
As ever, though, DLSS and its ilk come down to adoption by developers. Today, 540 games support DLSS, and, according to Nvidia, 15 of the top 20 games of 2024 are among them. The new version, DLSS 4, should show up in 75 “Day 0” games with the launch of the RTX 50 series. More about DLSS in a bit.
RTX Neural Shaders, Neural Materials, and Mega Geometry
Integrating AI into existing tasks and applications is one of the biggest challenges with using AI technology. On the consumer front, much effort has gone into dedicated AI software, such as ChatGPT and image-generation apps, but now Nvidia is working with Microsoft to apply AI to in-game workloads that run in the background to improve your graphical and gaming experience. The idea is that you’d never know that AI processes are behind it all.
Take the RTX Neural Shaders tool. It uses AI hardware to perform functions similar to what conventional graphics hardware does. In particular, Neural Shaders can handle vector processing, a key task typically performed by a GPU’s shaders. Support for this doesn’t appear to be here quite yet, but Nvidia indicated a collaboration with Microsoft to integrate this functionality into DirectX, making it easier for game developers to implement.
Another related technology, RTX Neural Materials, is also designed for in-game use to improve image quality. This tool trains the AI hardware on texture data and then has the AI create in-game textures based on that data. In a way, this isn’t so different from how games use textures today. But when a game applies a texture, it typically takes an image containing the texture and maps it directly onto an in-game object, possibly with some post-processing effects to skew the image or adjust its lighting.
From the sounds of it, RTX Neural Materials will do more or less the same thing, but instead of using the texture image directly, it will use a unique and potentially higher-quality AI-generated image based on the texture image. In so doing, it can also take a load off other parts of the GPU.
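Nvidia hasn’t spelled out the storage format, but the usual appeal of this kind of neural approach is the memory math: a small latent grid plus a tiny decoder can stand in for several full-size texture maps. A back-of-the-envelope Python comparison, with every number an assumption rather than an Nvidia figure:

```python
# Why a neural material can shrink memory use. All sizes below are
# assumptions for illustration, not Nvidia's figures.
MIB = 1024 * 1024

# Traditional material: several 4K texture maps stored directly.
bytes_per_texel = 3 + 3 + 1 + 1     # albedo, normal, roughness, metalness
traditional = bytes_per_texel * 4096 * 4096

# Neural material: a low-res latent grid plus tiny decoder weights.
latent = 1024 * 1024 * 8            # 1K x 1K grid, 8 bytes of latent each
decoder_weights = 16 * 1024         # a small MLP's parameters
neural = latent + decoder_weights   # full detail is decoded on the fly

print(f"traditional maps: {traditional / MIB:.1f} MiB")  # 128.0 MiB
print(f"neural material:  {neural / MIB:.1f} MiB")       # ~8.0 MiB
```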
Another AI tool, RTX Mega Geometry, helps with texture and image quality. This technique examines in-game geometry to improve the level of detail on rendered objects at various distances and viewing angles. In short: Less GPU brute force, more AI shortcutting.
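Nvidia didn’t share how Mega Geometry makes its choices, but the classic level-of-detail idea it builds on is simple: use the coarsest version of a mesh whose error would stay under about a pixel on screen. Here’s a minimal sketch with invented error values, not Nvidia’s algorithm:

```python
# Classic distance-based LOD pick: choose the coarsest mesh whose
# projected geometric error stays under one pixel. Illustrative only;
# not Nvidia's Mega Geometry algorithm.
import math

# World-space error (meters) of LOD 0 (finest) through LOD 3 (coarsest).
ERRORS_M = (0.001, 0.004, 0.016, 0.064)

def pick_lod(distance_m, fov_deg=60.0, screen_px=2160):
    px_per_m = screen_px / (2 * distance_m * math.tan(math.radians(fov_deg) / 2))
    for lod in reversed(range(len(ERRORS_M))):   # try coarsest first
        if ERRORS_M[lod] * px_per_m <= 1.0:      # error under 1 pixel?
            return lod
    return 0                                     # fall back to finest

for d in (2, 10, 50, 250):
    print(f"{d:4d} m away -> LOD {pick_lod(d)}")
```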
RTX Skin, Hair, and Neural Faces
The technologies mentioned above focus on improving image quality and performance for in-game objects, but Nvidia has also worked on similar tools for character models. The RTX Skin and RTX Neural Faces efforts both attempt to improve character models by using AI to create more original and realistic features. RTX Skin, for one, is a realism advance, given how hard skin is to render believably. It uses subsurface scattering algorithms to evoke the translucent quality of skin and the way light plays across it. Nvidia noted that it was inspired by techniques used by Disney/Pixar.
Hair is hard, too. Another similar tool, RTX Hair, will try to render more realistic hair for character models while reducing the number of triangles needed to draw it. Rather than composing each strand from lots of triangles, RTX Hair cuts the workload by using what Nvidia calls “linear swept spheres,” basically cylinders with spheres as endcaps. This technique means the graphics engine needs fewer primitives to render a strand; Nvidia suggests it could take about a third of the resources that pure triangles would demand.
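A linear swept sphere with matching end radii is what graphics programmers call a capsule, and testing a point against a capsule takes only a point-to-segment distance check, versus intersection tests against dozens of triangles. Here’s a short sketch of that math; the values are invented, and Nvidia’s actual primitive may be more general:

```python
# Distance from a point to a capsule (a "linear swept sphere" with equal
# end radii): a point-to-segment check, then subtract the radius.
import numpy as np

def capsule_distance(p, a, b, radius):
    """Signed distance from point p to the capsule spanning a to b.
    Positive means outside; negative means inside the strand."""
    ab, ap = b - a, p - a
    t = np.clip(ap @ ab / (ab @ ab), 0.0, 1.0)  # closest point on segment
    return np.linalg.norm(p - (a + t * ab)) - radius

a = np.array([0.0, 0.0, 0.0])    # one end of a hair-strand segment
b = np.array([0.0, 1.0, 0.0])    # the other end
print(capsule_distance(np.array([0.05, 0.5, 0.0]), a, b, radius=0.02))
# -> 0.03: the point sits just outside this strand's 2 cm radius
```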
Nvidia DLSS 4 and Reflex 2
Unquestionably, Nvidia’s most anticipated new technology supporting its consumer graphics in 2025 is the aforementioned DLSS 4. With DLSS 4, Nvidia switches from a convolutional neural network (CNN) to a Transformer model for its AI-based upscaling and frame-generation work. According to Nvidia, the new model uses twice as many parameters and four times the compute power to create images with higher overall fidelity. The Transformer model also supports higher-quality ray reconstruction and improved upscaling. Graphical challenges like motion trails can be resolved more easily.
With DLSS 4, Nvidia is also moving to “multiframe generation.” Instead of running two models per frame, DLSS 4 will run five. As a result, using DLSS 4’s frame generation, an RTX 50-series graphics card’s AI hardware can generate up to 15 pixels for every pixel created by the GPU’s traditional rendering hardware. It’s not clear yet how effectively this will scale across the card line, but it suggests a significant increase in the frame rates RTX 50-series cards can output over previous generations of Nvidia graphics cards when the tech is engaged. It’s certainly behind CEO Jensen Huang’s CES-keynote claim that an RTX 5070 can outrun an RTX 4090; DLSS 4, not magic, comes into play there.
To reach that output level, the original frame the GPU hardware creates is first upscaled from a lower resolution using more traditional DLSS technology. Then frame generation comes in, but instead of making just one extra frame, it pushes out multiple artificially created frames; the exact number will likely depend on your GPU’s capabilities. This feature is exclusive to RTX 50-series graphics cards, and, as noted, at least 75 games will support it at launch.
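That “up to 15 pixels” figure falls out of simple arithmetic if you assume a 4x upscale (say, 1080p rendered and 4K displayed) plus three generated frames per rendered frame; those specific settings are our assumption for illustration, not an Nvidia-stated configuration:

```python
# Where "15 AI pixels per rendered pixel" can come from, assuming a 4x
# upscale (1080p -> 4K) and 3 generated frames per rendered frame.
# These settings are illustrative assumptions.
rendered_px = 1920 * 1080        # pixels the GPU traditionally renders
output_px   = 3840 * 2160        # pixels per displayed frame after upscale
gen_frames  = 3                  # extra frames from multiframe generation

total_px = output_px * (1 + gen_frames)   # all pixels actually displayed
ai_px    = total_px - rendered_px         # everything not hand-rendered
print(ai_px / rendered_px)                # -> 15.0
```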
Why was this not done sooner? The resulting image quality simply wasn’t good enough, Nvidia’s engineers note. After all, if you’ll be looking at more generated frames than classically rendered ones, the quality has to be good! It also presented issues with frame pacing, or keeping frames fed to the display hardware in sync. (Generating more frames doesn’t help the user experience if they show up at uneven or, as Nvidia put it, “lumpy” intervals.) With Blackwell, the frame-generation system should show a whopping 5x improvement in frame times.
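To see why pacing matters, consider what even spacing requires: each rendered frame and its generated companions must be presented at regular intervals across the gap until the next rendered frame arrives. A simple illustration with invented timings:

```python
# Frame pacing, illustrated: spread each rendered frame plus its
# generated frames evenly across the gap to the next rendered frame.
def present_times(render_times_ms, frames_per_render=4):
    out = []
    for t0, t1 in zip(render_times_ms, render_times_ms[1:]):
        step = (t1 - t0) / frames_per_render
        out += [round(t0 + i * step, 1) for i in range(frames_per_render)]
    return out

# Rendered frames arriving ~33 ms apart (about 30 fps)...
print(present_times([0.0, 33.3, 66.7]))
# ...become eight presents ~8.3 ms apart (about 120 fps), evenly paced.
```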
What does all of this translate to in real-world testing? We’ll have to see, but Nvidia showed off a demo of that ever-faithful benchmark game, Cyberpunk 2077. An RTX 5090 ran at 27 frames per second (fps) in Cyberpunk with no DLSS, 70fps in DLSS 2, 141fps in DLSS 3, and a stunning 250fps in DLSS 4. In theory, with an RTX 5090, 240Hz gaming at 4K could be a thing in demanding AAA games that have the proper DLSS support. Sure, that is the very leading edge of PC gaming, but high-refresh 4K is now in view.
Alongside DLSS 4, Nvidia has a newer version of its Reflex technology that improves responsiveness. According to the company, Reflex 2 provides 75% faster response times than the original version of Nvidia Reflex. This is mostly one for the esports crowd, with the feature coming to top shooters like The Finals: Next Stage and Valorant. Developers should get the tools to implement it in the next month or so, so look for the feature in actual games later this year.
Another exciting DLSS 4 feature for graphics tweakers is an Override option built into the Nvidia App. It allows you to force a different version of DLSS onto a game, set on a per-game basis. That means you can experiment, imposing DLSS 4 on a game that only supports DLSS 3, for example, or pushing a lower version of DLSS on a game that supports a higher one, should you want to for performance or quality reasons.
RTX 5090 and RTX 5080 Coming Soon!
We’ve covered the most important AI-related RTX 50-series features here, but it’s not an exhaustive list. Nvidia showed off several more, including some that it hyped in previous years, like AI-driven non-player characters (NPCs) in games, an AI lighting feature for webcams, and a tool that creates a podcast out of a PDF.
Though neat and potentially useful, these items don’t carry quite the same wide-ranging potential impact as the features covered above. It’s also unclear which of them will be exclusive to the RTX 50 series; some, like that PDF-to-podcast tool, might work on older generations of GPU. Nor is it clear when many of these features will launch; much depends on developer uptake. Still, with the release of the first RTX 50-series cards around the corner, you may not have long to wait to try these features for yourself. These days, RTX leads the way.