Ray tracing uses lighting techniques to bring more realism to video games. Using algorithms, the technology traces and imitates the path light would take in the real world.
Taiwanese chipmaker MediaTek recently announced the Dimensity 9200, a flagship chipset aimed at premium smartphones. The Dimensity 9200 brings several upgrades, including the industry's first hardware-based ray-tracing engine, powered by the chip's ARM Immortalis-G715 GPU. With ray tracing coming to smartphones as early as next year, here is everything to know about the technology:
What is ray tracing
Ray tracing is a technology that makes light in video games behave (reflect and refract) like it does in real life. Using an algorithm, it traces and imitates the path light would take in the real world. With this technique, game developers can make virtual rays of light reflect, refract and be absorbed to cast lifelike shadows in their games for enhanced realism. Ray tracing is not a new technology: it was first ideated in 1969 and has long been used in the film industry.
If you look at the way light works in video games, you might think all the real-life light phenomena, i.e. reflection, refraction and absorption, are already there. But games without ray tracing depend on static lighting baked into the scene, which always plays out the same way. Developers place a light source in the environment and pre-render its effects: objects and surfaces can reflect light, but only light emitted from those fixed sources, and the result is not dynamic. If the player alters the game environment, the path of light and the look of the scene would not change unless the developer had anticipated that possibility. With ray tracing, the lighting adjusts in real time.
How it works
In real life, light comes from a source and bounces around until it reaches our eyes; the brain then interprets the different rays as one picture. Ray tracing essentially works the same way, except that the process runs in reverse. That is because, in real life, most light never reaches our eyes, and simulating those wasted rays would be useless work in a video game.
In ray tracing, a ray begins from the viewer's perspective and, after bouncing across objects in the scene, reaches the appropriate light source. Tracing light's path backward this way is far more efficient, as the only paths traced and rendered are those within the user's field of view.
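The backward-tracing idea above can be sketched in a few lines of Python. This is a hypothetical toy renderer, not how a GPU ray-tracing engine is actually built: one sphere, one point light, and a ray fired from the camera through each "pixel" of a tiny ASCII image. All the scene values are made-up illustrations.

```python
import math

# Toy backward ray tracer: rays start at the camera (viewer), not the
# light source, so only paths inside the field of view are traced.
WIDTH, HEIGHT = 16, 8
CAMERA = (0.0, 0.0, -3.0)
SPHERE_CENTER, SPHERE_RADIUS = (0.0, 0.0, 0.0), 1.0
LIGHT = (5.0, 5.0, -5.0)

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def hit_sphere(origin, direction):
    """Return nearest intersection distance t along the ray, or None."""
    oc = sub(origin, SPHERE_CENTER)
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - SPHERE_RADIUS ** 2
    disc = b * b - 4.0 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

def render():
    rows = []
    for j in range(HEIGHT):
        row = ""
        for i in range(WIDTH):
            # Map the pixel to a point on the image plane, then shoot a
            # ray from the camera through it: the "backward" direction.
            x = (i + 0.5) / WIDTH * 2 - 1
            y = 1 - (j + 0.5) / HEIGHT * 2
            direction = normalize(sub((x, y, 0.0), CAMERA))
            t = hit_sphere(CAMERA, direction)
            if t is None:
                row += "."  # ray escaped the scene: background
            else:
                hit = tuple(o + t * d for o, d in zip(CAMERA, direction))
                normal = normalize(sub(hit, SPHERE_CENTER))
                to_light = normalize(sub(LIGHT, hit))
                shade = max(0.0, dot(normal, to_light))  # Lambert term
                row += "#" if shade > 0.5 else "+"
        rows.append(row)
    return rows

for line in render():
    print(line)
```

Only rays that hit the sphere get a follow-up ray toward the light; everything outside the viewer's field of view is never computed, which is the efficiency argument made above.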
It takes less computing power to render only what is in the user's view than to render every ray coming from a light source. Even so, a game needs to run at over 60 frames per second, whereas a single film frame is pre-rendered and can take hours or even days. Hence, many gaming PCs and video game consoles cannot support ray tracing unless they have dedicated hardware for it.
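A quick back-of-the-envelope check makes the gap concrete. The two-hour film-frame figure below is an assumed example within the "hours to days" range mentioned above, not a measurement:

```python
# Frame-time budgets; the film figure is an assumption for illustration.
GAME_FPS = 60
frame_budget_ms = 1000 / GAME_FPS             # time a game has per frame
film_frame_hours = 2.0                        # assumed offline render time
film_frame_ms = film_frame_hours * 3600 * 1000
ratio = film_frame_ms / frame_budget_ms

print(f"Game frame budget: {frame_budget_ms:.1f} ms")
print(f"Film frame (assumed): {film_frame_ms / 1000:.0f} s")
print(f"The film renderer gets ~{ratio:,.0f}x more time per frame")
# -> a 60 fps game has ~16.7 ms per frame; a 2-hour film frame
#    gives the offline renderer ~432,000x more time.
```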
Games can also use partway solutions. Rather than mapping out every single ray of light, developers can trace only a select number of rays, then use machine-learning algorithms to fill in the gaps and smooth everything out by blending colours and the differences in brightness and darkness. This process is known as denoising.
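A heavily simplified sketch of that idea: we "trace" only about a quarter of the pixels of a tiny image, then fill the gaps from known neighbours. Real denoisers use trained neural networks and far richer inputs; the neighbour average here merely stands in for the learned filter, and all names and numbers are illustrative.

```python
import random

random.seed(0)  # deterministic for the example
SIZE = 8

# Stand-in for the fully ray-traced image: brightness rises left to right.
truth = [[x / (SIZE - 1) for x in range(SIZE)] for _ in range(SIZE)]

# Step 1: "trace" only ~25% of the pixels, chosen at random.
sampled = [[None] * SIZE for _ in range(SIZE)]
for y in range(SIZE):
    for x in range(SIZE):
        if random.random() < 0.25:
            sampled[y][x] = truth[y][x]

def denoise(img):
    """Fill missing pixels with the mean of known neighbours,
    repeating until the whole image is covered."""
    out = [row[:] for row in img]
    while any(v is None for row in out for v in row):
        nxt = [row[:] for row in out]
        for y in range(SIZE):
            for x in range(SIZE):
                if out[y][x] is None:
                    vals = [out[ny][nx]
                            for ny in range(max(0, y - 1), min(SIZE, y + 2))
                            for nx in range(max(0, x - 1), min(SIZE, x + 2))
                            if out[ny][nx] is not None]
                    if vals:  # fill only when a neighbour is known
                        nxt[y][x] = sum(vals) / len(vals)
        out = nxt
    return out

filled = denoise(sampled)
```

The filled-in image approximates the true one at a fraction of the tracing cost, which is the trade-off denoising makes in real games.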
When is it coming
Ray tracing is already here if you have a PC with a high-end graphics card that supports it. It is available in games such as Battlefield V, Metro Exodus and Shadow of the Tomb Raider, as well as titles like Cyberpunk 2077 and Wolfenstein: Youngblood. Next up are games developed for smartphones, as support for ray-tracing engines comes to new-age processors.
Future of mobile games
Ray tracing is currently not available in mobile games. But with several chipmakers working to bring the technology to smartphones, it is expected to arrive by 2024. Like MediaTek, Qualcomm is expected to launch a chip with a hardware-based ray-tracing engine soon.