Real-time ray tracing is finally becoming common in high-end PC games and even some PS5 and Xbox Series X titles. In most of these games it comes with a performance cost, so players must choose between smoother gameplay and more realistic visuals. Which raises the question: should gamers even care about ray tracing?
Why ray tracing is a big deal (even if it hurts performance)
Although ray tracing is a relatively new buzzword in the gaming world, the technique has been a mainstay of computer graphics in film and television for years. It simply refers to the process of tracing the path of light rays as they bounce around a scene. This allows computers to accurately render things like shadows, highlights, and reflected light. The result is a scene that looks more realistic with less hand-tuning. The only downside is that ray tracing typically requires so much processing power that movie studios spend days rendering highly detailed scenes.
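To make "tracing the path of light rays" concrete, here is a minimal sketch of the core operation every ray tracer repeats millions of times per frame: firing a ray into the scene and asking where it hits something. This is an illustrative toy, not how any real game engine is implemented; the function name and the single-sphere scene are made up for the example.

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance along the ray to the nearest sphere hit, or None.

    origin, direction, and center are (x, y, z) tuples; direction must be
    unit-length. Solves the quadratic |origin + t*direction - center|^2 = r^2.
    """
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c  # 'a' term is 1 because direction is unit-length
    if disc < 0:
        return None  # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

# A ray fired down the z-axis toward a unit sphere centered 5 units away
# first touches its surface at z = 4:
hit = ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0)  # -> 4.0
```

A renderer runs a test like this for every ray against every object (with clever acceleration structures), then spawns new rays at each hit point for shadows and reflections, which is exactly why the workload balloons so quickly.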
The real breakthrough for video games is real-time ray tracing. Modern consoles and graphics cards finally have enough processing power to handle ray tracing's brute-force workload. Even so, it is often limited to certain effects only. Cyberpunk 2077, for example, has separate toggles for ray-traced reflections and ray-traced shadows, so you can choose which aspects of the game's graphics get the upgrade.
But should you really care about sharper shadows? They may not seem like a big deal, but neither do most graphical enhancements in isolation. We tend to notice when graphics look bad; when they look good, we simply sink deeper into the game.
Just as importantly, ray tracing makes developers' jobs easier. Most current games still offer non-ray-traced graphics options for players without the hardware to enable ray tracing, but making those versions look good takes far more work. The more time developers spend getting faked shadows in a scene to look right, the less time they have for everything else.
In the long run, as gaming hardware becomes more powerful, ray-traced graphics will become the standard and allow developers to create beautiful experiences with less effort than before. There is still something to be said for simply appreciating the beauty of a video game, and a game will almost always look its best with ray tracing enabled.
Why you might still want to turn it off
For now, however, ray tracing remains a demanding feature. The current generation of consoles in particular has arrived at a difficult time. 4K output is becoming the norm, even for players who don't yet own a 4K TV. Games are increasingly targeting 60 fps, and those that don't stick out like sore thumbs. Some games are even aiming for 120 frames per second.
All of these advances demand massive amounts of processing power compared to previous generations. All other things being equal, 4K requires around four times the processing power of 1080p. Games running at 60 fps require roughly twice as much processing as 30 fps, because they render exactly twice as many frames in the same amount of time. And at 120 fps, the workload doubles again. In other words, there is an enormous amount of data that new consoles and graphics cards need to move very quickly just to keep up with modern features.
Adding ray tracing on top of that is like trying to take on a fourth job when you're already working 90 hours a week. Eventually, something has to give. And that's exactly where the current crop of ray-traced games sits.
Depending on your graphics card, Cyberpunk 2077 can be painfully slow on PC with ray tracing enabled. Call of Duty: Black Ops Cold War can run at 120 frames per second or turn on ray tracing, but not both. Even on hardware that can handle ray tracing and high frame rates at the same time, you may get a smoother, more consistent experience by skipping ray tracing altogether.