**A Graphics Diary**

*Approximate Subsurface Scattering With 3D Distance Fields*

*27.07.17*

*[←](../index.html)*

*Subsurface scattering* (also known as subsurface light transport) happens when photons penetrate a translucent surface and scatter within the material until they exit at a different location. A common rendering technique for subsurface scattering is depth-map-based SSS. In this technique, a depth map is generated from the light's point of view, similarly to shadow mapping (cf. [A Practical Shadow Mapping Pipeline](01.11.17.html) for an explanation of that technique). The thickness of the model at a given fragment can then be easily retrieved by sampling the depth map and comparing it to the current fragment depth. While this technique gives good results, choosing a depth bias can become cumbersome, and concave models might show some artifacts.

![A real life example of subsurface scattering in action](sss.png width="350px")

As an experiment and an alternative to depth-map-based SSS, I wanted to give another technique a try using a 3D distance field which, once generated, is particularly useful for a handful of other rendering effects. A 3D distance field is a representation in which, for each point of the extent of the space we work with, we associate a normalized distance to the mesh; a positive distance means we are outside the mesh, and a negative distance means we are within. When storing a 3D distance field in a texture, we usually remap the distance values to the range `[0..1]`, so the interpretation is slightly different: anything in the range `[0..0.5]` is inside the mesh, and `[0.5..1]` is outside.

Here is my header-only library used to generate such a 3D texture: [sdf.h](sdf.h).
The usage of this library is simple: you input a 3D mesh, and the function `unsigned int* sd_texture3d(sd_mesh_t const* mesh, int resx, int resy, int resz)` outputs a 3D texture with the requested resolution.

In the first step of the algorithm, a distance field is generated and stored in a 3D texture. Here is a view of the different slices of the 3D texture distance field of the famous 3D model Suzanne:

Then, when rendering the triangle mesh, each vertex is transformed into UV space. Transforming a vertex into UV space can be done as such:

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ C
vec3 ToUVSpace(vec3 position)
{
    return (position - u_min) / (u_max - u_min);
}
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Where `u_min` and `u_max` are the extents of the bounding box containing the 3D mesh; this essentially remaps the position values to `[0..1]`.

The next step is to step through the texture cube starting from the current fragment UV. The direction used to step through the 3D texture cube is given by the view ray (the normalized vector from the eye to the interpolated fragment position). For each iteration through the 3D texture along the ray, the thickness is accumulated by ray marching in the light direction. To get the light direction, the UV needs to be transformed back into view space.
The distance field, when sampled from the texture, has the meaning mentioned previously:

- if `sampledValue < 0.5`: inside the triangle mesh
- if `sampledValue > 0.5`: outside the triangle mesh

Positive distances (outside the mesh) have no contribution to the thickness, while negative distances (inside the mesh) contribute, so the following can be used:

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ C
sdf = texture(u_texture, uv).r;
sdf = (1.0 - step(0.5, sdf)) * sdf;
thicknessInLightDirection += sdf / float(steps);
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Some results of the final render experiment:

Note that generating 3D distance fields results in a higher memory footprint than the depth-map-based technique.

Putting it all together:

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ C
#define RANGE_CHECK 0 // Optional range check, involves branching

vec3 ToWorldSpace(vec3 uv)
{
    return uv * (u_max - u_min) + u_min;
}

bool UVRangeCheck(vec3 uv)
{
    return uv.x > 1.0 || uv.x < 0.0
        || uv.y > 1.0 || uv.y < 0.0
        || uv.z > 1.0 || uv.z < 0.0;
}

float RaymarchToLight(vec3 uvStart, vec3 lightDir, int steps)
{
    vec3 uv = uvStart;
    float thicknessInLightDirection = 0.0;
    float sdf = 0.0;
    for (int i = 0; i < steps; i++)
    {
        uv = uv + lightDir * (1.0 / float(steps));
#if RANGE_CHECK
        if (UVRangeCheck(uv))
        {
            return thicknessInLightDirection;
        }
#endif
        sdf = texture(u_texture, uv).r;
        sdf = (1.0 - step(0.5, sdf)) * sdf;
        thicknessInLightDirection += sdf / float(steps);
    }
    return thicknessInLightDirection;
}

// uvStart and dir are varyings from the vertex shader, calculated as such:
// f_uv = ToUVSpace(position);
// f_incident = normalize(position - u_eye);
float AccumulateThickness(vec3 uvStart, vec3 dir, int thicknessMarchingSteps, int lightMarchingSteps)
{
    vec3 uv = uvStart;
    float thickness = 0.0;
    for (int i = 0; i < thicknessMarchingSteps; i++)
    {
        uv = uv + dir * (1.0 / float(thicknessMarchingSteps));
#if RANGE_CHECK
        if (UVRangeCheck(uv))
        {
            break;
        }
#endif
        vec3 viewSpacePosition = vec3(u_view *
vec4(ToWorldSpace(uv), 1.0));
        vec3 lightDir = normalize(f_lightPos - viewSpacePosition);
        thickness += RaymarchToLight(uv, lightDir, lightMarchingSteps) / float(thicknessMarchingSteps);
    }
    return 1.0 - clamp(thickness, 0.0, 1.0);
}
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

**References & Further reading**

*[←](../index.html)*