The Quest for the Ray Tracing API

Tuesday, 26 July, 9:00 am - 12:15 pm, Anaheim Convention Center, Ballroom B

While photorealistic image synthesis, especially in movie production and product design, is almost exclusively based on ray tracing, rendering under real-time constraints still relies on the performance of rasterization hardware. But in the near future, given the latest parallel algorithms and upcoming hardware, ray tracing performance will reach the real-time domain and may become feasible even in games. Replacing the screen-space approximations of rasterization with more accurate ray-traced effects will certainly simplify game rendering engines, and the possibilities go far beyond the obvious. Is it possible to efficiently ray trace any content that is now rasterized? Will the transition be exclusive, or do we need both rasterization and ray tracing? What if ray tracing outperforms rasterization?

Besides rendering, ray tracing has numerous applications in simulation, for example in sound propagation or in antenna design and placement. Ray tracing relies on auxiliary acceleration data structures, which may also be used for occlusion culling, collision detection, range searches, and distributed spatial databases.

This course provides a glimpse into the future of ray tracing and its applications.




Prerequisites

Basic understanding of ray tracing and its applications.

Intended Audience

Everyone concerned with ray tracing and with off-line and real-time rendering, in both industry and academia.

Presenters


Alexander Keller
NVIDIA Corporation

Ingo Wald
Intel Corporation

Takahiro Harada
Advanced Micro Devices, Inc.

Dmitry Kozlov
Advanced Micro Devices, Inc.

Ralf Karrenberg
NVIDIA Corporation

Luke Peterson
Imagination Technologies Limited

Tobias Hector
Imagination Technologies Limited