Intel Announces Ambitious Plans for Ultimate Path-Tracing Eye Candy with Integrated Graphics

The company's ambitious plans to enable real-time path tracing not only on Arc discrete GPUs, but eventually on integrated graphics, have been revealed in a new Intel blog post.

Intel says it will reveal several new approaches to path tracing through research papers to be presented at an industry conference later this year.

"With Intel's ubiquitous integrated GPUs and emerging discrete graphics, researchers are pushing the efficiency of the most demanding photorealistic rendering called path tracing," Intel says.

"Across the entire path tracing process, the research presented in these papers demonstrates efficiency gains in the key components of path tracing: ray tracing, shading, and sampling. These are key components to making photorealistic rendering with path tracing available on more affordable GPUs, such as the Intel Arc GPU, a step toward real-time performance on integrated GPUs."

Roughly speaking, path tracing is a higher-fidelity version of ray tracing that more accurately simulates the path of each ray as it bounces off multiple surfaces. Inevitably, it is more demanding on GPU resources than conventional ray tracing. So far, path tracing has been implemented in games like "Quake II RTX" and "Portal with RTX," both older titles that are undemanding when ray tracing is switched off.
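
To make that difference concrete, the heart of a path tracer is a recursive estimator that follows a ray bounce after bounce and terminates it at random. Below is a deliberately tiny Python sketch in which the whole "scene" is collapsed into two constants; ALBEDO, EMISSION, and trace_path are illustrative names of our own, not code from Intel's research, but the recursion-plus-random-termination shape is what a real path tracer's inner loop looks like.

```python
import random

ALBEDO = 0.5    # fraction of light reflected at every bounce in this toy scene
EMISSION = 1.0  # radiance emitted at every bounce
MAX_DEPTH = 16  # hard cap on path length

def trace_path(depth: int = 0) -> float:
    """One Monte Carlo path sample. A real path tracer does the same
    recursion, but with actual ray-scene intersection, BRDF evaluation
    (e.g. GGX), and importance sampling at each bounce."""
    if depth >= MAX_DEPTH:
        return 0.0
    # Russian-roulette-style termination: continue with probability ALBEDO,
    # so deep bounces are sampled in proportion to their contribution.
    if random.random() > ALBEDO:
        return EMISSION
    return EMISSION + trace_path(depth + 1)

# Averaging many path samples per pixel is what makes the method costly;
# the analytic answer here is EMISSION / (1 - ALBEDO) = 2.0.
samples = [trace_path() for _ in range(100_000)]
print(sum(samples) / len(samples))
```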

According to Intel, one of the papers proposes a novel and efficient way to compute reflections from what are known as GGX microfacet surfaces. GGX is the microfacet model most widely used in the gaming, animation, and VFX industries, and it covers the most common real-world materials, such as rough and textured surfaces.

The idea is that most material surfaces are composed of countless microfacets, each with a specific slope, or angle, relative to the viewer or camera. These slopes can be modeled collectively by a statistical facet-slope distribution function.
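
For context, the GGX distribution itself is compact. Here is a minimal Python sketch of the textbook GGX normal distribution function, the "D" term of a microfacet BRDF; the function name and roughness convention are our own illustration, not code from Intel's paper, which proposes a new way of computing reflections on top of this model.

```python
import math

def ggx_ndf(n_dot_h: float, alpha: float) -> float:
    """Textbook GGX (Trowbridge-Reitz) normal distribution function D(h).

    n_dot_h -- cosine between the surface normal and the half-vector
    alpha   -- roughness parameter (many engines use alpha = roughness**2)
    """
    a2 = alpha * alpha
    d = n_dot_h * n_dot_h * (a2 - 1.0) + 1.0
    return a2 / (math.pi * d * d)

# A rougher surface spreads facet slopes out, so the distribution flattens:
print(ggx_ndf(1.0, 0.1))  # fairly smooth surface: sharply peaked (~31.8)
print(ggx_ndf(1.0, 0.5))  # rough surface: much flatter (~1.27)
```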

The second paper provides a faster and more stable way to render glinty surfaces, which are often avoided because of the high computational cost of simulating them, especially in game engines.
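
To give a sense of why glints are so expensive: instead of one smooth statistical distribution, a glinty material behaves like a huge set of discrete facets, and the renderer must work out how many of the facets under each pixel happen to reflect the light toward the camera. The brute-force toy sketch below (all names and numbers are hypothetical, and the geometry is heavily simplified) makes the cost obvious.

```python
import math
import random

def count_glinting_facets(n_facets: int, roughness: float,
                          cone_deg: float = 1.0) -> int:
    """Brute-force toy: draw n_facets random facet tilts (in radians,
    spread controlled by roughness) and count how many lie within a
    small cone around the reflection direction. For simplicity the
    half-vector is assumed to coincide with the mean surface normal,
    so a facet "glints" when its tilt is smaller than the cone angle.
    Glint-rendering research exists precisely to avoid this loop."""
    cone = math.radians(cone_deg)
    hits = 0
    for _ in range(n_facets):
        tilt = abs(random.gauss(0.0, roughness))  # crude stand-in for a
        if tilt <= cone:                          # GGX-like slope sample
            hits += 1
    return hits

# A sparkly material can put millions of discrete facets under one pixel,
# and a naive renderer would repeat this count per pixel, light, and frame.
print(count_glinting_facets(1_000_000, roughness=0.3))
```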

It is not clear exactly how much of this is specific to Intel graphics, or whether these are lessons from which the graphics industry as a whole can benefit.

In any case, the general tone of Intel's blog post suggests that these are long-term aspirations rather than firm goals for the near future. In other words, don't expect to see Intel's integrated GPUs running fully path-traced Cyberpunk 2077 at 4K next year.

But the basic ambition of real-time path tracing on integrated GPUs, and Intel's emphasis on "affordability," is very welcome all the same. It would certainly be a more democratic route to high-fidelity graphics than buying even the low-spec graphics cards that Nvidia and AMD currently sell for around $300.
