We used <= as the comparison operator in the bounding-box iteration instead of <, so that we could capture the highest valid x and y values.
The sample buffer is indexed as (y * num_columns + x) * num_super_samples, with each pixel subdivided into square_root(num_super_samples) columns and rows of sub-pixels. We needed to index each supersample square and run the point-in-triangle test at the center of that sub-pixel. We then wrote each sub-pixel's value to the sample buffer if it lies inside the triangle we are sampling. If the sub-pixel is not in the triangle, we do not write out its value, as white (the default color) is already in the sample buffer.
This image shows a triangle with red, green, and blue vertices. The interior colors are smoothly interpolated using barycentric coordinates.
Barycentric coordinates are a way to describe the position of a point inside a triangle by expressing it as a weighted combination of the triangle's three vertices.
Each point inside the triangle is assigned three weights (usually called α, β, and γ), which tell us how close the point is to each of the triangle's corners. These weights always add up to 1.
For example, a point exactly at a vertex has weight 1 for that vertex and 0 for the other two, while the triangle's centroid has weights (1/3, 1/3, 1/3).
This is very useful in computer graphics because we can use these weights to smoothly interpolate values (like color, texture coordinates, or lighting) across the triangle.
Pixel sampling is the process of determining the final color of a pixel on screen by looking up color data from a texture image. In texture mapping, screen-space triangles are mapped to texture-space coordinates, and pixel sampling helps decide which color from the texture should be used at each pixel.
To implement pixel sampling for texture mapping, we updated the rasterize_textured_triangle function
to compute barycentric coordinates for each pixel covered by the triangle.
For every pixel center (x + 0.5, y + 0.5), we calculated the barycentric weights (α, β, γ) and used
them to interpolate the UV coordinates from the triangle's vertices.
To prepare for proper texture sampling, we also computed the UV coordinates at the neighboring positions (x + 1, y) and (x, y + 1), which allowed us to estimate the partial derivatives ∂u/∂x, ∂v/∂x, ∂u/∂y, and ∂v/∂y.
These are passed into the SampleParams struct, along with the current UV and the selected pixel and
level sampling modes (psm and lsm), which the GUI can toggle.
Then we passed the SampleParams to the tex.sample() function. Inside the texture class,
we implemented both nearest-neighbor and bilinear sampling:
In sample_nearest, we multiplied the UV coordinates by the mip level's texture size and clamped them to the texture boundaries before retrieving the nearest texel. In sample_bilinear, we found the four surrounding texels and applied bilinear interpolation based on the fractional parts of the scaled UVs.
This setup allows flexible switching between sampling modes and supports antialiasing via mipmap level
selection, which we handle in later parts of the assignment (e.g., implementing get_level and L_LINEAR
interpolation).
[Screenshots]
Bilinear sampling performs significantly better when neighboring texel values change sharply, for example at hard edges or fine high-frequency detail in the texture.
This is because nearest sampling may pick discontinuous texel values, resulting in visible artifacts, while bilinear blends neighboring texels to produce a more consistent appearance.
Level sampling is the process of choosing which mipmap level to use when looking up a texture during rendering. Mipmaps are precomputed, downscaled versions of a texture that help improve performance and reduce aliasing when the texture is viewed at smaller sizes on screen.
Instead of always sampling from the original high-resolution texture (level 0), we select a lower-resolution mipmap level depending on how much the texture is being minified. This helps prevent visual artifacts such as shimmering and moire patterns.
To implement level sampling, we modified the rasterize_textured_triangle function
to compute screen-space derivatives of the texture coordinates. For each pixel inside the triangle, we calculated
the interpolated (u, v) using barycentric coordinates.
To estimate texture coordinate changes, we computed the coordinates at neighboring points: (x+1, y)
and (x, y+1). These were used to generate p_dx_uv and p_dy_uv. We then
filled the SampleParams struct with the current coordinates and their derivatives, and passed it to
tex.sample(sp).
Inside Texture::sample, we handled three level sampling modes: L_ZERO always samples from the full-resolution level 0; L_NEAREST calls get_level, rounds the result to the nearest integer, and samples from that level; and L_LINEAR samples the two levels adjacent to the continuous estimate and linearly interpolates between them.
In get_level, we computed the norm of the derivatives in UV space, scaled them by the base level's
width and height, and used log2 of the maximum value to estimate the mipmap level. We clamped the
result to ensure it remains non-negative.
This implementation helps reduce aliasing when textures are minified and provides smoother transitions between mipmap levels.
In texture mapping and image filtering, various techniques are used to balance performance, memory, and image quality. The table below compares six such techniques:
| Technique | Speed | Memory Usage | Antialiasing Power |
|---|---|---|---|
| Pixel Sampling (Nearest / Bilinear) | Very fast | Low | Low to Moderate |
| Level Sampling (L_ZERO / L_NEAREST / L_LINEAR) | Fast (L_ZERO) to Moderate (L_LINEAR) | Moderate (due to mipmaps) | Moderate to High |
| Anisotropic Filtering | Slower than mipmaps and bilinear filtering | Higher (requires directional sampling) | Very High (especially for oblique surfaces) |
| Summed Area Tables (SATs) | Fast lookup after preprocessing | High (stores summed values for every texel) | Very High (precise rectangular averaging) |
| Trilinear Filtering | Moderate | Moderate (uses two mip levels) | High (smoother LOD transitions) |
Each technique serves a different purpose depending on the texture distortion, viewing angle, and performance requirements. For example, anisotropic filtering is ideal for extreme angles, while summed area tables are powerful for large blur kernels and rectangular averaging.
These 8 images demonstrate the visual results of different combinations of pixel sampling and level sampling strategies:
[8 screenshots]
We implemented anisotropic filtering or summed area tables.