Overview

This assignment consists of two parts. In the first, you will implement Monte Carlo sampling techniques that warp an initially uniform distribution into a number of different target distributions. All of the work you do in this part will serve as a building block in later assignments, when we apply Monte Carlo integration to render images. In particular, assignments 4 and 5 will build on this functionality to create material models and light sources suitable for photorealistic rendering.

In the second part, you will implement two very basic rendering algorithms: the first is fully deterministic and renders a scene illuminated by a single point light. The second is a stochastic rendering algorithm known as ambient occlusion, which assumes that the scene is uniformly illuminated from all directions.

As usual, begin by importing the latest base code updates into your repository by running

$ git pull upstream main

If there were any concurrent changes to the same file, you may have to perform a merge (see the git tutorials under "Preliminaries" for more information).

For this homework, we will be using the scenes scenes/pa2/ajax-ao.xml and scenes/pa2/ajax-simple.xml. Both scenes reference a 3D scan of a bust that is fairly large (~500K triangles). Due to its size, the actual mesh is not part of the repository and can be downloaded here.

Part 1: Monte Carlo Sampling (60 pts)

In this exercise you will generate sample points on various domains: the plane, disks, spheres, and hemispheres. The base code has been extended with an interactive visualization and testing tool to make working with point sets as intuitive as possible.

After pulling the latest updates and recompiling, you should see an executable named warptest. Run this executable to launch the interactive warping tool, which lets you visualize the behavior of different warping functions given a range of input point sets (independent, grid, and stratified). Up to now, we have only discussed uniform random variables, which correspond to the "independent" type; you need not concern yourself with the others for now.

Part 1 is split into several subsections; in each case, you are asked to implement a distribution function and a matching sample warping scheme. It is crucial that the two are consistent with each other (i.e. that warped samples have exactly the distribution described by the density function); significant errors can arise in Monte Carlo integration if inconsistent warpings are used. The provided warptest tool implements a \(\chi^2\) test to verify that this consistency requirement is indeed satisfied.

The input point set (stratified samples passed through a "no-op" warp function)
This point set passed the test for uniformity.
A more interesting case that you will implement (with a grid visualization of the mapping)
This warping passed the tests as well.

Implement the missing functions in class Warp found in src/warp.cpp. This class consists of various warp methods that take as input a 2D point \((s, t) \in [0, 1) \times [0, 1) \) and return the warped 2D (or 3D) point in the new domain. Each method is accompanied by another method that returns the probability density with which a sample was picked. Our default implementations all throw an exception, which produces an error message in the graphical user interface of warptest. The slides on the course website provide a number of useful recipes for warping samples and computing the densities, and the PBRT textbook also contains considerable information on this topic that you should feel free to use.
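As a concrete example of such a pair, here is a self-contained sketch of a uniform disk warping via the polar mapping, together with its density. Note that this uses plain floats rather than the actual nori::Point2f types, so the signatures are illustrative, not the real Warp interface:

```cpp
#include <cassert>
#include <cmath>
#include <utility>

constexpr float kPi = 3.14159265358979323846f;

// Polar mapping of [0,1)^2 to the unit disk. The real methods in
// src/warp.cpp operate on nori::Point2f and may be expected to use a
// different parameterization.
std::pair<float, float> squareToUniformDisk(float s, float t) {
    float r = std::sqrt(s);       // inverts the CDF of the radial density 2r
    float phi = 2.0f * kPi * t;   // the angle is uniform in [0, 2*pi)
    return { r * std::cos(phi), r * std::sin(phi) };
}

float squareToUniformDiskPdf(float x, float y) {
    // A uniform density on a disk of area pi is the constant 1/pi inside
    // the disk and 0 outside; the chi^2 test checks exactly this pairing.
    return (x * x + y * y <= 1.0f) ? 1.0f / kPi : 0.0f;
}
```

The key point is that the pdf method must describe exactly the distribution produced by the warp, or the \(\chi^2\) test will fail.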

Pass the \(\chi^2\) test for each of the following sampling techniques and include corresponding screenshots in your report:

Part 2: Two simple rendering algorithms (40 points)

In this part of the homework, you'll implement two basic rendering algorithms that set the stage for the fancier methods investigated later in the course. For now, both methods assume that the object is composed of a simple white diffuse material that reflects light uniformly in all directions.

The Ajax bust illuminated by a point light source.
The Ajax bust rendered using Ambient Occlusion.

Part 2.1: Point lights (20 points)

The updated base code includes a new scene scenes/pa2/ajax-simple.xml that instantiates a (currently nonexistent) integrator/rendering algorithm named simple, which simulates a single point light source located at the 3D position given by the parameter position and emitting an amount of energy given by the parameter energy.

<!-- An excerpt from scenes/pa2/ajax-simple.xml: -->
<integrator type="simple">
    <point name="position" value="-20, 40, 20"/>
    <color name="energy" value="3.76e4, 3.76e4, 3.76e4"/>
</integrator>

Your first task will be to create a new Integrator that accepts these parameters. This should be fairly reminiscent of the normal integrator from Assignment 1. Take a look at the PropertyList class, which should be used to extract the two parameters.
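A minimal sketch of the parameter-extraction step, assuming PropertyList accessors named getPoint and getColor (check the actual header for the exact names). The tiny stand-in types below exist only to make the snippet self-contained; in the base code you would derive from Integrator and use the real classes:

```cpp
#include <array>
#include <cassert>
#include <map>
#include <string>

// Stand-ins for nori::Point3f, nori::Color3f and nori::PropertyList.
using Point3f = std::array<float, 3>;
using Color3f = std::array<float, 3>;

struct PropertyList {
    std::map<std::string, Point3f> points;
    std::map<std::string, Color3f> colors;
    Point3f getPoint(const std::string &name) const { return points.at(name); }
    Color3f getColor(const std::string &name) const { return colors.at(name); }
};

// The constructor only needs to stash the two parameters declared in the
// XML excerpt above; the rendering method would then use them for shading.
class SimpleIntegrator {
public:
    SimpleIntegrator(const PropertyList &props)
        : m_position(props.getPoint("position")),
          m_energy(props.getColor("energy")) {}

    Point3f m_position;
    Color3f m_energy;
};
```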

Let \(\mathbf{p}\) and \(\Phi\) denote the position and energy of the light source, and suppose that \(\mathbf{x}\) is the point being rendered. Then this integrator should compute the quantity

\[ L(\mathbf{x})=\frac{\Phi}{4\pi^2} \frac{\mathrm{max}(0, \cos\theta)}{\|\mathbf{x}-\mathbf{p}\|^2} V(\mathbf{x}\leftrightarrow\mathbf{p}) \]

where \(\theta\) is the angle between the direction from \(\mathbf{x}\) to \(\mathbf{p}\) and the shading surface normal (available in Intersection::shFrame::n) at \(\mathbf{x}\) and

\[ V(\mathbf{x}\leftrightarrow\mathbf{p}):=\begin{cases} 1,&\text{if $\mathbf{x}$ and $\mathbf{p}$ are mutually visible}\\ 0,&\text{otherwise} \end{cases} \]

is the visibility function, which can be implemented using a shadow ray query. Intersecting a shadow ray against the scene is generally cheaper since it suffices to check whether an intersection exists rather than having to find the closest one.
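Putting the formula together, a numerical sketch of the per-channel computation might look as follows. This uses plain-float vectors instead of the nori types, the function name is made up for illustration, and the result of the shadow-ray query is passed in as a flag since the actual visibility test needs access to the full scene:

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>

constexpr float kPi = 3.14159265358979323846f;

// Plain-float stand-in for a 3D vector.
struct Vec3 { float x, y, z; };

float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Evaluates L(x) for one color channel of the energy Phi.
float pointLightRadiance(float phi, Vec3 x, Vec3 p, Vec3 n, bool visible) {
    if (!visible)
        return 0.0f;                                  // V(x <-> p) = 0
    Vec3 d { p.x - x.x, p.y - x.y, p.z - x.z };       // direction from x to p
    float dist2 = dot(d, d);                          // ||x - p||^2
    float invLen = 1.0f / std::sqrt(dist2);
    Vec3 w { d.x * invLen, d.y * invLen, d.z * invLen };
    float cosTheta = std::max(0.0f, dot(w, n));       // clamped cosine term
    return phi / (4.0f * kPi * kPi) * cosTheta / dist2;
}
```

Note the max(0, ·) clamp: points whose shading normal faces away from the light receive no illumination, just like occluded points.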

Implement the simple integrator according to this specification and render the scene scenes/pa2/ajax-simple.xml. Include a comparison against our reference image in your report.

Part 2.2: Ambient occlusion (20 points)

Ambient occlusion is a rendering technique that assumes that a (diffuse) surface receives uniform illumination from all directions (similar to the conditions inside a light box), and that visibility is the only effect that matters. Some surface positions will receive less light than others since they are occluded, hence they will look darker. Formally, the quantity computed by ambient occlusion is defined as

\[ L(\mathbf{x})=\int_{H^2(\mathbf{x})}V(\mathbf{x}, \mathbf{x}+\alpha\omega_i)\,\frac{\cos\theta}{\pi}\,\mathrm{d}\omega_i \]

which is an integral over the upper hemisphere centered at the point \(\mathbf{x}\). The variable \(\theta\) refers to the angle between the direction \(\omega_i\) and the shading normal at \(\mathbf{x}\). The ad-hoc variable \(\alpha\) adjusts how far-reaching the effects of occlusion are.

Note that this situation—sampling points on the hemisphere with a cosine weight—exactly corresponds to one of the warping functions you implemented in part 1, specifically squareToCosineHemisphere. Use this function to sample a point on the hemisphere and then check for visibility using a shadow ray query. You can assume that occlusion is a global effect (i.e. \(\alpha=\infty\)).
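A useful consequence of this choice (this is standard importance sampling, not something specific to the base code): with samples \(\omega_k\) drawn from squareToCosineHemisphere, the density \(p(\omega_k)=\cos\theta_k/\pi\) cancels the integrand exactly, and the Monte Carlo estimator reduces to an average of binary visibility tests:

\[ L(\mathbf{x})\approx\frac{1}{N}\sum_{k=1}^{N}\frac{V(\mathbf{x}, \mathbf{x}+\alpha\omega_k)\,\cos\theta_k/\pi}{p(\omega_k)}=\frac{1}{N}\sum_{k=1}^{N}V(\mathbf{x}, \mathbf{x}+\alpha\omega_k) \]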

One potential gotcha is that the samples produced by squareToCosineHemisphere lie in the reference hemisphere and need to be oriented according to the surface at \(\mathbf{x}\). Take a look at the Frame class, which is intended to facilitate this.
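To illustrate what the Frame class does, here is a self-contained sketch that builds an orthonormal basis around the shading normal and rotates a locally sampled direction into world space. The basis-construction details may differ from the actual Frame implementation; any orthonormal pair perpendicular to the normal works for an isotropic hemisphere:

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };

Vec3 cross(Vec3 a, Vec3 b) {
    return { a.y * b.z - a.z * b.y,
             a.z * b.x - a.x * b.z,
             a.x * b.y - a.y * b.x };
}

struct Frame {
    Vec3 s, t, n; // tangent, bitangent, normal
    explicit Frame(Vec3 normal) : n(normal) {
        // Pick the more numerically stable of two tangent candidates.
        if (std::fabs(n.x) > std::fabs(n.y)) {
            float inv = 1.0f / std::sqrt(n.x * n.x + n.z * n.z);
            t = { n.z * inv, 0.0f, -n.x * inv };
        } else {
            float inv = 1.0f / std::sqrt(n.y * n.y + n.z * n.z);
            t = { 0.0f, n.z * inv, -n.y * inv };
        }
        s = cross(t, n);
    }
    // Express a vector given in local (s, t, n) coordinates in world space.
    Vec3 toWorld(Vec3 v) const {
        return { s.x * v.x + t.x * v.y + n.x * v.z,
                 s.y * v.x + t.y * v.y + n.y * v.z,
                 s.z * v.x + t.z * v.y + n.z * v.z };
    }
};
```

In the base code, the corresponding one-liner is simply to pass the hemisphere sample through the toWorld method of the intersection's shading frame.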

Implement the ambient occlusion (ao) integrator and render the scene scenes/pa2/ajax-ao.xml. Include a comparison against our reference image in your report.