Real-Time Rendering of Caustics on Programmable Graphics Hardware
By Liang Wei, November 13, 2006
Caustics are a complex lighting phenomenon often seen around refractive and reflective objects in the real world. They occur when rays coming from a light source converge, through reflection or refraction, onto a diffuse surface, creating a pattern of alternating dark and bright regions. A typical example is the pattern seen on the floor of a swimming pool under direct sunlight. Figure 1 shows an example: a Cornell Box containing a glass sphere that casts caustics on the floor, rendered with photon mapping.

Figure 1 A rendered image of the Cornell Box, in which a glass sphere focuses light into caustics on the floor.
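The convergence of light described above is governed by Snell's law at each refractive interface. As a minimal sketch (illustrative code, not the renderer's actual shader), the following Python function refracts a unit direction through a surface in the standard vector form; parallel rays bent by a curved surface such as the glass sphere converge toward a focal region, producing the bright spot in Figure 1.

```python
import math

def refract(incident, normal, eta):
    """Refract a unit incident direction through a surface with unit
    normal, where eta = n1 / n2 is the ratio of refractive indices.
    Returns the refracted unit direction, or None on total internal
    reflection. Vector form of Snell's law; a sketch only."""
    cos_i = -sum(i * n for i, n in zip(incident, normal))
    sin2_t = eta * eta * (1.0 - cos_i * cos_i)
    if sin2_t > 1.0:
        return None  # total internal reflection
    cos_t = math.sqrt(1.0 - sin2_t)
    return tuple(eta * i + (eta * cos_i - cos_t) * n
                 for i, n in zip(incident, normal))
```

For example, a ray entering glass (eta = 1/1.5) at an oblique angle bends toward the normal, while a steep ray trying to exit glass (eta = 1.5) undergoes total internal reflection.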
To date, several techniques exist for rendering caustics, most of them global illumination (GI) algorithms such as Monte Carlo path tracing and photon mapping. These GI algorithms are mostly offline and time-consuming, which makes them unsuitable for interactive applications such as computer games or virtual reality. Recently, with the advent of programmable graphics hardware, more and more applications are exploiting the parallel computing power of the GPU for real-time processing, and there has also been work on applying GPUs to real-time rendering of caustics. The most recent and successful example is by Chris Wyman [1], who applied his earlier work on two-surface refraction and refraction of nearby geometry [2][3] to the rendering of caustics. Though his method is approximate and limited, its visual appeal and real-time performance make it attractive for applications whose timing constraints allow a trade-off against realism.
Our approach is based on the general idea developed by Wyman. The only difference between his approach and ours is how the energy of the caustics is estimated on the receiver surface. While Wyman used a radiosity-based approximation to estimate the attenuation of light energy, our method estimates the location of the "caustic surface" (not to be confused with the caustic patterns we are trying to render). As noted by Swaminathan, Grossberg, and Nayar [4], reflected and refracted light rays envelop a surface called the caustic surface, near which the light energy is most concentrated. We have developed a method to estimate the caustic surface from a group of neighboring rays. This estimation ports easily to the GPU; combined with Wyman's two-surface refraction algorithm, it achieves real-time rendering of caustics for a simple scene containing a single point light source.
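The text above does not give the estimation formulas, but the core intuition that energy concentrates where neighboring rays converge can be sketched with a flux-conservation argument: the flux carried by a small triangle of rays is fixed, so the intensity on the receiver scales inversely with the triangle's footprint area there. The following Python helpers are hypothetical illustrations of that idea, not the paper's GPU formulation.

```python
def triangle_area(p0, p1, p2):
    """Area of the triangle spanned by three 3-D points."""
    u = [b - a for a, b in zip(p0, p1)]
    v = [b - a for a, b in zip(p0, p2)]
    cx = u[1] * v[2] - u[2] * v[1]
    cy = u[2] * v[0] - u[0] * v[2]
    cz = u[0] * v[1] - u[1] * v[0]
    return 0.5 * (cx * cx + cy * cy + cz * cz) ** 0.5

def caustic_intensity(emit_tri, hit_tri, eps=1e-8):
    """Relative flux density on the receiver for a bundle of three
    neighboring rays: flux through the emitted triangle is conserved,
    so intensity is the ratio of emitted area to receiver footprint.
    Illustrative helper; names and formulation are assumptions."""
    return triangle_area(*emit_tri) / max(triangle_area(*hit_tri), eps)
```

A converging bundle (footprint smaller than the emitted triangle) yields an intensity above 1, i.e. a bright caustic region; a diverging bundle yields an intensity below 1.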
Below are some sample images from our renderer.




Figure 2 Real-time rendering of caustics and refraction on the GPU, using a sphere model


Figure 3 Real-time rendering of caustics and refraction on the GPU, using a Buddha model
[1] Chris Wyman and Scott Davis. "Interactive Image-Space Techniques for Approximating Caustics." Proceedings of the ACM Symposium on Interactive 3D Graphics and Games, 153-160, March 2006.
[2] Chris Wyman. "An Approximate Image-Space Approach for Interactive Refraction." ACM Transactions on Graphics 24(3), 1050-1053, August 2005.
[3] Chris Wyman. "Interactive Image-Space Refraction of Nearby Geometry." Proceedings of GRAPHITE, 205-211, December 2005.
[4] R. Swaminathan, M. D. Grossberg, and S. K. Nayar. "Caustics of Catadioptric Cameras." Proceedings of the IEEE International Conference on Computer Vision (ICCV), Vol. 2, 2-9, July 2001.