You will build path tracers capable of rendering realistic images with global illumination effects on surfaces. Your implementations will account for both the direct and indirect illumination in a scene.
In this first exercise, you will implement a naive implicit path tracer. Recall the hemispherical form of the rendering equation discussed in class:

$$L(x, \omega) = L_e(x, \omega) + \int_{H^2} f_r(x, \omega', \omega)\, L(r(x, \omega'), -\omega')\, \cos\theta' \,\mathrm{d}\omega',$$

where $r(x, \omega')$ is the ray tracing function that returns the closest visible point from $x$ in direction $\omega'$. This equation can be approximated by a single-sample Monte Carlo integral estimator as:

$$L(x, \omega) \approx L_e(x, \omega) + \frac{f_r(x, \omega', \omega)\, L(r(x, \omega'), -\omega')\, \cos\theta'}{p(\omega')}.$$
The updated basecode includes a new integrator definition in `include/nori/path.h`. This class has several input properties:

- a boolean `isExplicit` defines whether the path tracer will apply explicit direct illumination sampling; if so, the direct illumination sampling strategy is specified in `direct-measure = {solid-angle, area, hemisphere}`,
- the indirect sampling strategy is exposed in `indirect-measure = {hemisphere}`,
- when a measure is set to `hemisphere`, the warp type is specified in `direct-warp = {cosine-hemisphere, uniform-hemisphere}` and `indirect-warp = {cosine-hemisphere, uniform-hemisphere}`; when the `direct-measure` is `area` or `solid-angle`, no additional warping is necessary, and
- the `termination-param` parameter corresponds either to the maximum number of bounces when `termination = max-depth`, or to the Russian roulette termination probability when `termination = russian-roulette`.
<integrator type="path">
<boolean name="isExplicit" value="false"/>
<string name="termination" value="russian-roulette"/>
<float name="termination-param" value="0.2"/>
<string name="direct-measure" value="solid-angle"/>
<string name="indirect-measure" value="hemisphere"/>
<string name="indirect-warp" value="cosine-hemisphere"/>
</integrator>
The path integrator has three additional methods: `PathIntegrator::implicitLi()`, `PathIntegrator::explicitLi()` and `PathIntegrator::stopPath()`. You will implement the implicit and explicit path tracing logic in the first two functions, whereas the `stopPath` function defines the stopping criterion for your path construction. Depending on how you implement your path tracing algorithms (i.e., recursively or in a loop), you may need extra logic/variables to bookkeep, for instance, the current number of vertices in the path you are constructing. Feel free to add any additional methods to the integrator that you may need to structure your particular algorithm.
Implement `PathIntegrator::implicitLi()` for both uniform and cosine-weighted recursive indirect lighting sampling distributions. Each path should bounce `m_terminationParam` times from the eye; clamping the maximum path length will introduce bias in your estimator. Note, for example, that `m_terminationParam = 0` should yield an image where only the pixels overlapping the emitters are non-zero, whereas `m_terminationParam = 1` should generate an image with only direct illumination.
Once you've completed this task, you can test your implementation on `scenes/hw4/cornellbox.xml`. If your (potentially recursive) path construction is implemented correctly, your rendered image should look something like this:
Notice how the colour of the walls bleeds onto the side of the boxes: your first global illumination effect!
The image you just rendered is noisy because we are blindly tracing paths, hoping to eventually hit a light. We can take advantage of the improved importance sampling schemes from Assignment 3 to arrive at a more effective estimator: every time light scatters at a surface point, we will split our estimator to compute both a direct and an indirect illumination estimate. In other words, at every bounce we will sample both a direct and an indirect contribution at the intersection point, while being careful to avoid double counting the same transport contributions (as discussed in class):

$$L(x, \omega) = L_e(x, \omega) + L_{\mathrm{dir}}(x, \omega) + L_{\mathrm{ind}}(x, \omega).$$
This estimator performs explicit direct illumination estimation at every path vertex, and implementing this explicit path tracing algorithm is the goal of this next task.
You can use your `DirectIntegrator::Li()` as a starting point to implement `PathIntegrator::explicitLi()`. The key difference between explicit and implicit path tracing is that direct and indirect lighting contributions are decoupled, meaning that they are sampled separately. To avoid double counting, an indirect ray needs to be re-sampled if it intersects a light (since, otherwise, it would contribute transport that the explicit direct estimate already accounts for).
Implement `PathIntegrator::explicitLi()` where both subtended solid angle and area sampling are allowed for direct lighting sampling, and both uniform and cosine-weighted distributions are allowed for indirect lighting. Below are reference images for different numbers of indirect bounces, rendered at 256 spp, where solid angle sampling is used for direct lighting and cosine-weighted sampling for indirect lighting.
Artificially truncating path lengths at a fixed depth introduces bias. To avoid this problem, you will implement a Russian roulette termination method that probabilistically terminates your path construction. Use the `termination` parameter to branch on this feature, and `m_terminationParam` to query the Russian roulette termination probability.
Your last task is to implement a rectangular area light. There are two approaches you can use.
For the first approach, create a new shape `Rectangle` in `src/rectangle.cpp` and `include/nori/rectangle.h`, and implement all methods inherited from `Shape`. You will have to analytically derive the intersection of a ray with your (clipped) plane. Your rectangle implementation should support the following parameters: `center`, `width`, `height` and the surface `normal`. You can refer to PBRT for more details, if necessary.
Note that you only need to implement `Rectangle::samplePosition()` and `Rectangle::pdfPosition()`, since only area sampling is supported in this case. Simply throw a `NoriException` for subtended solid angle sampling to avoid unimplemented virtual function errors.
For the second approach, begin by familiarizing yourself with the `Mesh` class to see how vertices, faces and normals are stored. Next, add `Mesh::samplePosition()` and `Mesh::pdfPosition()` and implement them.
You may find the `DiscretePDF` class located in `nori/include/dpdf.h` useful for implementing the sampling step. We suggest that you use this class to build a discrete probability distribution that allows you to pick a triangle with probability proportional to its surface area (relative to the entire mesh's surface area). Once a triangle is chosen, you can uniformly sample a barycentric coordinate $(\alpha, \beta, 1 - \alpha - \beta)$ using the mapping

$$(\alpha, \beta) = \left(1 - \sqrt{1 - \xi_1},\ \xi_2 \sqrt{1 - \xi_1}\right),$$

where $\xi_1, \xi_2 \in [0, 1)$ are uniform random variables.
A scene-dependent precomputation is necessary to build the discrete probability distribution; this can be performed in the `Mesh::activate()` function, which is automatically invoked by the XML parser. To add a rectangle to your scene, simply create a unit square in a `.obj` file and attach an area light to it.
Render your final Cornell Box scene with a rectangular ceiling light, using explicit path tracing with Russian Roulette termination. Below is a reference image that was rendered to convergence.
If you implemented the rectangle light using a mesh, you will have to modify the XML file to use your mesh light. Use the center point of the plane to define an appropriate translation and, if you use a unit square mesh object, the width and height can be used to define your non-uniform scaling factors.
Finished? Submit your modified files, along with any new files if you used the first approach for the rectangle light. Render the three final scenes, run the given script, and submit!