Voxel Ray Tracing and Physically-Based Rendering
Light is made up of individual particles called photons. A photon is a discrete quantum of electromagnetic energy. Photons are special because they have properties of both a particle and a wave. Although photons are massless, they carry momentum and can interact with physical matter. The phenomenon of "solar pressure" is caused by photons bombarding a surface and exerting force. (This force actually has to be accounted for in orbital mechanics.) However, light also has a wavelength and frequency, similar to sound or other wave phenomena.
Things are made visible when millions of photons scatter around the environment and eventually go right into your eyes, interacting with photoreceptor cells on the retina, the surface at the back of your eye. A "grid" of receptors connects into the optic nerve, which travels to the rear of your skull, where your brain constructs an image from the stimulus, allowing you to see the world around you.
The point of that explanation is to demonstrate that lighting is a scatter problem. Rendering, on the other hand, is a gather problem. We don't care about what happens to every photon emitted from a light source, we only care about the final lighting on the screen pixels we can see. Physically-based rendering is a set of techniques and equations that attempt to model lighting as a gather problem, which is more efficient for real-time rendering. The technique allows us to model some behaviors of lighting without calculating the path of every photon.
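To make the gather idea concrete, here is a minimal sketch of shading as a gather: for each visible pixel we sum the contribution of the lights that can reach it, rather than simulating every photon a light emits. The types and names here (Vec3, Light, Surface, ShadePixel) are hypothetical illustrations, not Leadwerks API.

```cpp
#include <vector>
#include <algorithm>

struct Vec3 { float x, y, z; };

static float Dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3 Scale(const Vec3& v, float s) { return { v.x * s, v.y * s, v.z * s }; }
static Vec3 Add(const Vec3& a, const Vec3& b) { return { a.x + b.x, a.y + b.y, a.z + b.z }; }
static Vec3 Mul(const Vec3& a, const Vec3& b) { return { a.x * b.x, a.y * b.y, a.z * b.z }; }

struct Light { Vec3 directionToLight; Vec3 color; };   // directional light, direction already normalized
struct Surface { Vec3 normal; Vec3 albedo; };          // what the shaded pixel knows about itself

// Gather: loop over the lights affecting this pixel and accumulate their diffuse contribution.
Vec3 ShadePixel(const Surface& surface, const std::vector<Light>& lights)
{
    Vec3 result = { 0.0f, 0.0f, 0.0f };
    for (const Light& light : lights)
    {
        float ndotl = std::max(0.0f, Dot(surface.normal, light.directionToLight));
        result = Add(result, Mul(Scale(light.color, ndotl), surface.albedo));
    }
    return result;
}
```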
One important behavior we want to model is the Fresnel effect. If you have ever been driving on a desert highway and seen a mirage on the road in the distance, you have experienced this. The road in the image below is perfectly dry but appears to be submerged in water.
What's going on here? Remember when I explained that every bit of light you see is shot directly into your eyeballs? At a glancing angle, the light that is most likely to reach your eyes is light bouncing off the surface from the opposite direction. Since more light is arriving from one single direction, instead of being scattered in from all around, a reflection becomes visible.
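This rise in reflectance at grazing angles is commonly approximated with Schlick's formula. Here is a minimal sketch of it, not the engine's actual shader code:

```cpp
#include <cmath>
#include <algorithm>

// Schlick's approximation of the Fresnel term: reflectance rises toward 1.0
// as the view becomes more grazing (cosTheta approaches 0).
// f0 is the reflectance at normal incidence (~0.04 for most dielectrics).
float FresnelSchlick(float cosTheta, float f0)
{
    cosTheta = std::clamp(cosTheta, 0.0f, 1.0f);
    return f0 + (1.0f - f0) * std::pow(1.0f - cosTheta, 5.0f);
}
```

A head-on view (cosTheta = 1) of smooth concrete returns roughly f0, while a grazing view (cosTheta near 0) returns nearly 1.0, which is why the surface looks mirror-like at a glancing angle.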
PBR models this behavior using a BRDF (bidirectional reflectance distribution function) image. These are red/green images that act as a look-up table, indexed by the angle between the camera-to-surface vector and the incoming light vector. They look something like this:
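In many PBR implementations (for example the split-sum approach), such a two-channel table is indexed by the view angle and the surface roughness and returns a scale and bias applied to the base reflectance. The sketch below only illustrates the look-up idea; the BRDFLut structure and its indexing are assumptions and may differ from how the engine indexes its BRDF images.

```cpp
#include <vector>
#include <algorithm>

// Hypothetical two-channel (red/green) BRDF look-up table.
struct BRDFLut
{
    int width = 0, height = 0;
    std::vector<float> red;    // scale channel
    std::vector<float> green;  // bias channel

    // Nearest-neighbor sample for brevity; a real renderer would use
    // hardware bilinear filtering on the GPU.
    void Sample(float ndotv, float roughness, float& scale, float& bias) const
    {
        int x = std::clamp(int(ndotv * (width - 1) + 0.5f), 0, width - 1);
        int y = std::clamp(int(roughness * (height - 1) + 0.5f), 0, height - 1);
        scale = red[y * width + x];
        bias = green[y * width + x];
    }
};

// Specular reflectance for an environment lookup: f0 * scale + bias.
float EnvironmentSpecular(const BRDFLut& lut, float ndotv, float roughness, float f0)
{
    float scale = 0.0f, bias = 0.0f;
    lut.Sample(ndotv, roughness, scale, bias);
    return f0 * scale + bias;
}
```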
You can have different BRDFs for leather, plastic, and all different types of materials. These cannot be calculated, but must be measured with a photometer from real-world materials. It's actually incredibly hard to find any collection of this data measured from real-world materials. I was only able to find one lab in Germany that was able to create these. There are also some online databases that provide this data as text tables. I have not tried converting any of these into images.
Now with PBR lighting, the surrounding environment plays a more important role in the light calculation than it does with Blinn-Phong lighting. Therefore, PBR is only as good as the lighting environment data you have. For simple demos it's fine to use a skybox for this data, but that approach won't work for anything more complicated than a single model onscreen. In Leadwerks 4 we used environment probes, which create a small "skybox" for separate areas in the scene. These have two drawbacks. First, they are still 2D projections of the surrounding environment and do not provide accurate 3D reflections. Second, they are tedious to set up, so most of the screenshots you see in Leadwerks are not using them.
Voxel ray tracing overcomes these problems. The 3D voxel structure provides better reflections with depth and volume, and it's dynamically constructed from the scene geometry around the camera, so there is no need to manually create anything.
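To give a rough idea of what tracing into that voxel structure involves, here is a small sketch of marching a reflection ray through a voxel grid and returning the color of the first solid voxel hit. This is illustrative only and is not the engine's traversal, which would use a sparse structure, mipmapped data, and a proper stepping scheme.

```cpp
#include <array>
#include <vector>
#include <cmath>

struct Voxel { bool solid = false; std::array<float, 3> color = { 0, 0, 0 }; };

struct VoxelGrid
{
    int size = 0;                 // grid is size x size x size voxels
    std::vector<Voxel> voxels;

    const Voxel* At(int x, int y, int z) const
    {
        if (x < 0 || y < 0 || z < 0 || x >= size || y >= size || z >= size) return nullptr;
        return &voxels[(z * size + y) * size + x];
    }
};

// March from 'origin' along 'dir' (normalized, in voxel-space units) with a
// fixed step. Returns true and writes the hit color if a solid voxel is found.
bool TraceReflection(const VoxelGrid& grid, const std::array<float, 3>& origin,
                     const std::array<float, 3>& dir, std::array<float, 3>& hitColor)
{
    const float step = 0.5f;              // half-voxel steps, purely for illustration
    const int maxSteps = grid.size * 4;
    for (int i = 1; i <= maxSteps; ++i)
    {
        float t = i * step;
        int x = int(std::floor(origin[0] + dir[0] * t));
        int y = int(std::floor(origin[1] + dir[1] * t));
        int z = int(std::floor(origin[2] + dir[2] * t));
        const Voxel* v = grid.At(x, y, z);
        if (v == nullptr) return false;   // ray left the grid
        if (v->solid) { hitColor = v->color; return true; }
    }
    return false;
}
```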
I finally got the voxel reflection data integrated into the PBR equation so that the BRDF is being used for the reflectance. In the screenshot below you can see the column facing the camera appears dull.
When we view the same surface at a glancing angle, it becomes much more reflective, almost mirror-like, as it would in real life:
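Putting the pieces together, the traced reflection color can be weighted by a view-dependent reflectance term so that a head-on view stays dull and a grazing view approaches a mirror. The sketch below uses Schlick's approximation and an illustrative blend; the names and weights are assumptions, not the engine's actual equation.

```cpp
#include <algorithm>
#include <array>
#include <cmath>

// Blend a surface's base color with a traced reflection color using a
// Fresnel-style weight: little reflection head-on, nearly all reflection
// at a grazing angle.
std::array<float, 3> CombineReflection(const std::array<float, 3>& baseColor,
                                       const std::array<float, 3>& tracedReflection,
                                       float ndotv,   // cosine of the angle between normal and view direction
                                       float f0)      // reflectance at normal incidence
{
    ndotv = std::clamp(ndotv, 0.0f, 1.0f);
    // Schlick's approximation: reflectance rises toward 1 at grazing angles.
    float fresnel = f0 + (1.0f - f0) * std::pow(1.0f - ndotv, 5.0f);
    std::array<float, 3> result;
    for (int i = 0; i < 3; ++i)
        result[i] = baseColor[i] * (1.0f - fresnel) + tracedReflection[i] * fresnel;
    return result;
}
```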
You can observe this effect with any building that has a smooth concrete exterior. Also note that the scene above has no ambient light. The shaded areas would be pure black if not for the global illumination effect. These details will give your environments a realistic, lifelike look in our new engine.