Voxel Cone Tracing
I've begun working on an implementation of voxel cone tracing for global illumination. This technique could potentially offer a way to perform real-time indirect lighting on the entire scene, as well as real-time reflections that don't depend on the reflected surface being onscreen, the way screen-space reflection does.
I plan to perform the GI calculations on a background CPU thread, compress the resulting textures using DXTC, and upload them to the GPU as they are completed. This means the cost of GI should be quite low, although there will be some latency before the indirect lighting catches up with changes to the scene. We might continue to use SSR for detailed reflections and only use GI for semi-static light bounces, or it might be fast enough for moving real-time reflections. The GPU-based implementations of this technique I have seen are technically impressive but suffer from terrible performance, and we want something fast enough to run in VR.
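To give an idea of the structure I have in mind, here is a minimal C++ sketch of the hand-off between a background GI thread and the renderer. Every name in it (VoxelizeScene, ComputeBounces, CompressDXT5, UploadToGPU) is a hypothetical placeholder standing in for the real work, not actual engine API:

#include <atomic>
#include <cstdint>
#include <mutex>
#include <thread>
#include <vector>

// Hypothetical placeholders for each stage; all assumed for illustration.
std::vector<uint8_t> VoxelizeScene() { return {}; }                               // rasterize triangles into a voxel grid
std::vector<uint8_t> ComputeBounces(const std::vector<uint8_t>& v) { return v; }  // gather indirect light per voxel
std::vector<uint8_t> CompressDXT5(const std::vector<uint8_t>& v) { return v; }    // CPU-side DXTC encode
void UploadToGPU(const std::vector<uint8_t>&) {}                                  // e.g., a texture upload wrapper

struct GISystem
{
	std::thread worker;
	std::atomic<bool> running{true};
	std::mutex uploadMutex;
	std::vector<uint8_t> pending; // newest compressed result, waiting for upload
	bool pendingReady = false;

	// The worker loops forever, always recomputing GI from the latest scene state.
	void Start()
	{
		worker = std::thread([this]
		{
			while (running)
			{
				auto lighting = ComputeBounces(VoxelizeScene());
				auto compressed = CompressDXT5(lighting);
				std::lock_guard<std::mutex> lock(uploadMutex);
				pending = std::move(compressed);
				pendingReady = true;
			}
		});
	}

	// Called once per frame on the render thread: GI lags the scene a
	// little, but rendering never waits on the GI computation.
	void UploadIfReady()
	{
		std::lock_guard<std::mutex> lock(uploadMutex);
		if (pendingReady)
		{
			UploadToGPU(pending);
			pendingReady = false;
		}
	}

	void Stop()
	{
		running = false;
		if (worker.joinable()) worker.join();
	}
};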
The first step is to be able to voxelize models. The result of the voxelization operation is a bunch of points. These can be fed into a geometry shader that generates a box around each one:
// Geometry shader: expand each voxel point into a cube of quads.
void main()
{
	// Project the eight corners of the voxel cube.
	vec4 points[8];
	points[0] = projectioncameramatrix[0] * (geometry_position[0] + vec4(-0.5f * voxelsize.x, -0.5f * voxelsize.y, -0.5f * voxelsize.z, 0.0f));
	points[1] = projectioncameramatrix[0] * (geometry_position[0] + vec4(0.5f * voxelsize.x, -0.5f * voxelsize.y, -0.5f * voxelsize.z, 0.0f));
	points[2] = projectioncameramatrix[0] * (geometry_position[0] + vec4(0.5f * voxelsize.x, 0.5f * voxelsize.y, -0.5f * voxelsize.z, 0.0f));
	points[3] = projectioncameramatrix[0] * (geometry_position[0] + vec4(-0.5f * voxelsize.x, 0.5f * voxelsize.y, -0.5f * voxelsize.z, 0.0f));
	points[4] = projectioncameramatrix[0] * (geometry_position[0] + vec4(-0.5f * voxelsize.x, -0.5f * voxelsize.y, 0.5f * voxelsize.z, 0.0f));
	points[5] = projectioncameramatrix[0] * (geometry_position[0] + vec4(0.5f * voxelsize.x, -0.5f * voxelsize.y, 0.5f * voxelsize.z, 0.0f));
	points[6] = projectioncameramatrix[0] * (geometry_position[0] + vec4(0.5f * voxelsize.x, 0.5f * voxelsize.y, 0.5f * voxelsize.z, 0.0f));
	points[7] = projectioncameramatrix[0] * (geometry_position[0] + vec4(-0.5f * voxelsize.x, 0.5f * voxelsize.y, 0.5f * voxelsize.z, 0.0f));

	// One normal per cube face.
	vec3 normals[6];
	normals[0] = vec3(-1, 0, 0);
	normals[1] = vec3(1, 0, 0);
	normals[2] = vec3(0, -1, 0);
	normals[3] = vec3(0, 1, 0);
	normals[4] = vec3(0, 0, -1);
	normals[5] = vec3(0, 0, 1);

	// Emit each face as a triangle strip.
	//Left
	geometry_normal = normals[0];
	gl_Position = points[0]; EmitVertex();
	gl_Position = points[4]; EmitVertex();
	gl_Position = points[3]; EmitVertex();
	gl_Position = points[7]; EmitVertex();
	EndPrimitive();

	//Right
	geometry_normal = normals[1];
	gl_Position = points[1]; EmitVertex();
	gl_Position = points[2]; EmitVertex();
	...
}
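On the CPU side, the voxelizer itself can be fairly simple. Here's a rough sketch of one possible approach, sampling points across each triangle and recording which grid cells get hit; this is my illustration under assumptions, not the engine's actual code, and a production version would want a proper conservative triangle/box overlap test (e.g., separating axis):

#include <cmath>
#include <set>
#include <tuple>
#include <vector>

struct Vec3 { float x, y, z; };

// Sketch: sample each triangle densely in barycentric coordinates and
// record the grid cells it touches, then return the occupied cell centers.
std::vector<Vec3> VoxelizeTriangles(const std::vector<Vec3>& verts,
                                    const std::vector<int>& indices,
                                    float voxelsize)
{
	std::set<std::tuple<int, int, int>> cells;
	for (size_t i = 0; i + 2 < indices.size(); i += 3)
	{
		const Vec3& a = verts[indices[i]];
		const Vec3& b = verts[indices[i + 1]];
		const Vec3& c = verts[indices[i + 2]];
		const int steps = 32; // assumption: dense enough for small triangles
		for (int u = 0; u <= steps; ++u)
		{
			for (int v = 0; v <= steps - u; ++v)
			{
				float fu = float(u) / steps, fv = float(v) / steps;
				float fw = 1.0f - fu - fv;
				Vec3 p{ a.x * fu + b.x * fv + c.x * fw,
				        a.y * fu + b.y * fv + c.y * fw,
				        a.z * fu + b.z * fv + c.z * fw };
				cells.insert({ int(std::floor(p.x / voxelsize)),
				               int(std::floor(p.y / voxelsize)),
				               int(std::floor(p.z / voxelsize)) });
			}
		}
	}
	// Convert occupied cells to voxel-center points for the geometry shader.
	std::vector<Vec3> points;
	for (auto& [cx, cy, cz] : cells)
		points.push_back({ (cx + 0.5f) * voxelsize,
		                   (cy + 0.5f) * voxelsize,
		                   (cz + 0.5f) * voxelsize });
	return points;
}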
Here's a goblin whose polygons have been turned into Lego blocks.
Now the thing most folks nowadays don't realize is that if you can voxelize a goblin, well then you can voxelize darn near anything.
Global illumination will then be calculated on the voxels and fed to the GPU as a 3D texture. It's pretty complicated stuff, but I am very excited to be working on this right now.
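For the upload step, creating and updating a 3D texture in raw OpenGL might look something like the sketch below. I'm showing an uncompressed RGBA8 path, since driver support for S3TC formats on GL_TEXTURE_3D varies; with DXTC data the update would go through glCompressedTexSubImage3D instead. A loaded GL context (here via GLEW) is assumed:

#include <GL/glew.h>

// Create a 3D texture that will hold the per-voxel lighting.
GLuint CreateGITexture(int w, int h, int d)
{
	GLuint tex;
	glGenTextures(1, &tex);
	glBindTexture(GL_TEXTURE_3D, tex);
	glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
	glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
	glTexImage3D(GL_TEXTURE_3D, 0, GL_RGBA8, w, h, d, 0,
	             GL_RGBA, GL_UNSIGNED_BYTE, nullptr);
	return tex;
}

// Called whenever the background thread finishes a GI pass.
void UpdateGITexture(GLuint tex, int w, int h, int d, const unsigned char* data)
{
	glBindTexture(GL_TEXTURE_3D, tex);
	glTexSubImage3D(GL_TEXTURE_3D, 0, 0, 0, 0, w, h, d,
	                GL_RGBA, GL_UNSIGNED_BYTE, data);
	// With compressed voxel data, glCompressedTexSubImage3D would be used
	// here instead, given a driver that accepts S3TC on 3D textures.
}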
If this works, then I think environment probes are going to completely go away forever. SSR might continue to be used as a low-latency high-resolution first choice when those pixels are available onscreen. We will see.
It is also interesting that the whole second-pass reflective water technique will probably go away as well, since voxel cone tracing should be able to handle water reflections just like any other material.