NVIDIA DLSS 5 arrives this fall: Which games and studios will receive this ‘Hollywood-level’ VFX update?

How does NVIDIA’s DLSS 5 work?
The neural processing model behind DLSS 5 is essentially an AI trained to understand and enhance images. It takes the game's basic rendered output (just the colors and motion information the engine already produces) and "understands" what is in the scene: Is this human skin? Hair? Fabric? Is the light coming from the front, the back, or an overcast sky? It then adds realistic touches: skin glowing faintly from within (subsurface scattering, like a real ear lit from behind), fabric with a soft sheen, hair catching light naturally. All of this stays consistent between frames (no flickering) and true to the game's original 3D world, so the AI doesn't invent random details. In effect, the AI "paints" better lighting and textures over the game's artwork in real time, producing a far more realistic image than traditional methods.
The pipeline works like this: the game renders a normal frame quickly, so its lighting and materials may lack detail. It sends that frame's color data and motion information to the DLSS 5 AI, which analyzes it and adds photorealistic upgrades (better light reflection, skin sheen, fabric detail, and so on). The enhancement is anchored to the game's 3D data, which keeps the result accurate and stable. The output is a much more realistic image at up to 4K, and because the heavy AI work runs efficiently on RTX GPUs (especially the RTX 50 series), performance stays smooth. Developers can fine-tune the effect (intensity, colors, masked areas) to fit their game's style; for example, they can mask faces to keep them from looking uncanny.
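The per-frame flow described above can be sketched in code. This is a minimal toy illustration, not NVIDIA's SDK: every function name here is hypothetical, and the "neural enhancement" is a trivial stand-in (a brightness lift) for the real network.

```python
# Hypothetical sketch of the DLSS 5 per-frame flow described above.
# All names are invented for illustration; the enhancement step is a
# trivial stand-in for the actual neural network.

def render_cheap_frame():
    # The engine produces a fast, low-detail frame: a color buffer
    # plus per-pixel motion vectors.
    color = [[0.2, 0.4, 0.1], [0.5, 0.3, 0.6]]   # toy 2-pixel RGB buffer
    motion = [(0, 0), (1, 0)]                     # per-pixel motion vectors
    return color, motion

def neural_enhance(color, motion, masks=None):
    # Stand-in for the AI pass: the real model would infer materials and
    # lighting and repaint the pixels; here we just brighten (clamped to
    # 1.0) every pixel the developer has not masked out.
    masks = masks or set()
    return [
        px if i in masks else [min(1.0, c * 1.25) for c in px]
        for i, px in enumerate(color)
    ]

color, motion = render_cheap_frame()
# Pixel 1 is masked, e.g. a face a developer wants left untouched.
final = neural_enhance(color, motion, masks={1})
print(final)
```

The masking argument mirrors the developer fine-tuning mentioned above: masked regions pass through unchanged while the rest of the frame is enhanced.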
How is DLSS 5 different from traditional Ray Tracing?
Ray tracing simulates individual paths of light, which makes it extremely demanding on hardware. NVIDIA notes that a single photorealistic frame in a film can take hours to render, while a game has to do it in milliseconds. DLSS 5 acts as a shortcut: instead of calculating every ray of light, it uses generative AI to predict and draw what those photorealistic pixels should look like, based on its deep training. Jensen Huang described it as the "GPT moment for graphics": AI no longer just sharpens the image, it actively "reinvents" how the final pixels are created.
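A quick back-of-the-envelope count shows why tracing every ray is so demanding. The sample and bounce counts below are illustrative assumptions, not NVIDIA figures, but even these modest settings put a 4K frame near a hundred million ray evaluations, versus a single AI pass over roughly 8.3 million pixels.

```python
# Rough ray-count arithmetic for one path-traced 4K frame.
# samples_per_pixel and bounces are illustrative assumptions.
width, height = 3840, 2160            # 4K resolution
pixels = width * height               # pixels one AI pass must cover

samples_per_pixel = 4                 # modest for path tracing
bounces = 3                           # light bounces traced per sample
rays_per_frame = pixels * samples_per_pixel * bounces

print(f"{pixels:,} pixels")           # 8,294,400 pixels
print(f"{rays_per_frame:,} rays")     # 99,532,800 rays per frame
```

Film-quality renders use far higher sample counts than this, which is why offline frames take hours; the AI shortcut sidesteps that multiplier entirely.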
NVIDIA DLSS 5 and Hollywood-level VFX
As stated in NVIDIA’s press release, “Hollywood VFX level” means that game graphics will look as realistic and detailed as the CGI in big Hollywood productions (think Marvel films or Pixar animation). In film, those super-realistic scenes can take minutes or hours to render per frame on powerful machines, because perfect lighting, shadows, skin sheen and fabric sheen require enormous amounts of computation. A game has only about 16 milliseconds per frame to look good and run smoothly at 60 FPS, so games have always fallen short of movie-quality visuals. NVIDIA says DLSS 5 closes this gap using artificial intelligence, so real-time games can now have lighting, materials and details that feel “photoreal” like Hollywood VFX, without slowing the game down.
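The gap between those two time budgets is easy to quantify. The two-hour film-frame figure below is an illustrative assumption (the press release only says "hours"):

```python
# Frame-time budget for a 60 FPS game vs. an offline film render.
target_fps = 60
frame_budget_ms = 1000 / target_fps
print(f"{frame_budget_ms:.1f} ms per frame")          # 16.7 ms per frame

# Assume a film frame takes 2 hours to render (illustrative figure).
film_frame_ms = 2 * 60 * 60 * 1000
print(f"{film_frame_ms / frame_budget_ms:,.0f}x slower")  # 432,000x slower
```

Even at this conservative estimate, the offline render is several hundred thousand times over a game's real-time budget, which is the gap DLSS 5 claims to bridge.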
Which NVIDIA GeForce RTX series will support the DLSS 5 update?
The press release states that the GeForce RTX 5090 has the path tracing and neural shaders needed to push the boundaries of this technology in 2025, while DLSS 5 is built on the existing NVIDIA Streamline framework. This suggests that the latest RTX 50 series will likely see the biggest benefit, but the technology is designed to integrate with the standard DLSS pipeline used by existing RTX cards. The system operates at up to 4K resolution, ensuring AI enhancements don’t compromise the fluid, interactive performance gamers expect.
DLSS 5 certified games
DLSS 5 has backing from NVIDIA and the industry’s biggest publishers, including Bethesda, CAPCOM and Ubisoft. Major confirmed games include Starfield, Hogwarts Legacy, Assassin’s Creed Shadows and Resident Evil Requiem. Other upcoming titles such as Black State, Phantom Blade Zero, and The Elder Scrolls IV: Oblivion Remastered are also among the early adopters. Developers from CAPCOM and Vantage Studios said the technology allowed them to build “cinematic and highly believable” worlds that had previously been held back by the traditional limits of real-time rendering.