CS129 / Project 5 / HDR

Example picture with lighting issues.

Example graph mapping exposure time to pixel value.

The goal of HDR imaging is to accurately display a scene whose range of light values is too large for a camera to capture in a single shot. Often, when taking a picture, some area of the image comes out almost entirely white or entirely black because there is too much or too little light there. Adjusting the exposure time can capture a reasonable amount of light in that spot, but there is frequently no single exposure time that captures good light levels for the whole scene. High dynamic range images intelligently combine pictures of the same scene taken at multiple exposure times.

These images are produced by estimating the true "radiance" value at each pixel and displaying that value. Since the differently exposed images are all taken of the same static scene, we assume that each pixel value is a function of the exposure time multiplied by the scene radiance at that pixel. To recover the radiances, we therefore need to recover the function that maps that product to a pixel value, and then invert it. On the right is an example of such a function in graph form. Note that we use the logarithm of the exposure time, because the function is easier to solve for in the log domain.
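To make this concrete, here is a minimal sketch (in Python/NumPy, an assumption; the project itself may use other tools) of how a log radiance map can be recovered once the response curve is known. Under the model above, g(Z) = ln(E) + ln(t) for pixel value Z, radiance E, and exposure time t, so each exposure gives an estimate g(Z) - ln(t) of ln(E), and the estimates are averaged with weights that favor mid-range pixel values. The response curve g is assumed to have been solved for already (e.g. by least squares over a sample of pixels); the hat-shaped weighting is one common choice, not necessarily the one used in this project.

import numpy as np

def recover_log_radiance(images, exposure_times, g):
    """Recover a log-radiance map from differently exposed images.

    images         : list of uint8 grayscale arrays of the same scene and size
    exposure_times : exposure time in seconds for each image
    g              : response curve, g[z] = log exposure producing pixel value z (length 256)
    """
    log_t = np.log(np.asarray(exposure_times, dtype=np.float64))

    # Hat-shaped weights: trust mid-range pixel values more than
    # nearly black or nearly saturated ones.
    z = np.arange(256)
    w = np.minimum(z, 255 - z).astype(np.float64)

    num = np.zeros(images[0].shape, dtype=np.float64)
    den = np.zeros(images[0].shape, dtype=np.float64)
    for img, lt in zip(images, log_t):
        wi = w[img]                     # per-pixel weight from pixel value
        num += wi * (g[img] - lt)       # weighted estimate of ln(radiance)
        den += wi
    return num / np.maximum(den, 1e-8)  # ln(E) at each pixel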

Since the raw radiance maps turn out to be very dark, we apply a global tone mapping to brighten the image overall. We also implemented a local tone mapping algorithm, which effectively tone maps a blurred version of the image and then adds the details back in. This procedure makes use of bilateral filtering, a way of blurring an image that smooths similar regions while preserving edges as much as possible. To locally tone map the image, we apply a bilateral filter to a grayscale version of the image, then extract the details by subtracting the blurred image from the original. We then rescale the blurred version to compress its contrast. Lastly, we add the detail and color back into the image for the finished product. Below are examples of a "detail" image and a bilaterally blurred image, along with an example result; a sketch of the procedure follows the examples.


Example bilaterally blurred image.

Example detail image.

Example result.
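As a rough illustration of the two tone mapping passes described above, here is a minimal sketch in Python/NumPy (an assumption; the project code likely uses its own bilateral filter, whereas OpenCV's bilateralFilter stands in for it here). The filter parameters, contrast target, and gamma value are illustrative, not the settings used for the results below.

import numpy as np
import cv2  # used here only for its bilateral filter

def global_tone_map(hdr, gamma=2.2):
    """Simple global tone map: normalize the radiance map and gamma-brighten it."""
    x = hdr / hdr.max()
    return np.clip(x ** (1.0 / gamma) * 255, 0, 255).astype(np.uint8)

def durand_tone_map(hdr, contrast=5.0):
    """Local tone mapping in the spirit of bilateral-filter (Durand-style) decomposition.

    hdr : float32 HxWx3 radiance map (linear, strictly positive)
    Returns an 8-bit displayable image.
    """
    # Grayscale intensity and per-channel color ratios.
    intensity = hdr.mean(axis=2) + 1e-8
    color = hdr / intensity[..., None]

    # Bilateral filter the log intensity: blurs smooth regions, keeps edges.
    log_i = np.log(intensity).astype(np.float32)
    base = cv2.bilateralFilter(log_i, 9, 0.4, 8)

    # Detail layer = everything the blur removed.
    detail = log_i - base

    # Compress the contrast of the base layer only, then recombine.
    scale = contrast / (base.max() - base.min())
    out_log = scale * (base - base.max()) + detail  # compressed base + full detail
    out = color * np.exp(out_log)[..., None]        # restore color ratios

    return np.clip(out ** (1 / 2.2) * 255, 0, 255).astype(np.uint8)  # gamma for display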

To some extent, I feel that the local tone mapping flattened the lighting in the image too much; the high dynamic range images with global scaling to brighten them looked better and had a more reasonable amount of contrast. However, I also occasionally had problems with the coloring of the image in my local tone mapping algorithm, and I suspect there may still be some bugs that I did not catch, though it seems strange that only some of the images show color issues. Lastly, it is worth noting that there are some alignment issues between the exposure images; unfortunately, I did not add anything to my algorithm to correct for that.

Results Key

HDR Radiance Map | Global Tone Mapping | Durand Local Tone Mapping

Results
