Humans, by nature, can gather detail from scenes with extremely varied illumination and light patterns. Cameras, on the other hand, do not have such a large dynamic range. To reproduce the vivid detail our eyes can see in a photograph, we must take multiple exposures and combine them into a single high-dynamic-range photograph.
There are two parts to creating an HDR image: the radiance map must be obtained, and then that radiance map needs to be converted into a viewable image. To accomplish the first task, we recovered the inverse of the function mapping exposure to pixel value. For the second, we used global and local tone mapping to create a displayable image.
The pixel value Zij (pixel i in image j) is the result of a function of an unknown scene radiance and a known exposure: Zij = f(Ei Δtj), where Ei is the unknown radiance. We didn't solve for f directly, but rather for g = ln(f⁻¹). This g function maps pixel values (0–255) to log exposure: g(Zij) = ln(Ei) + ln(Δtj). Now, if you've been following along, you'll see that we have four quantities: two knowns (Zij and ln(Δtj)) and two unknowns (g and ln(Ei)). However, the scene is static, so we know that Ei remains constant throughout the image sequence, which makes the system solvable.
An additional consideration we took into account was keeping g smooth by adding a penalty according to the magnitude of the second derivative, approximated as g''(x) = g(x−1) − 2g(x) + g(x+1).
In order to use the robust linear solvers Matlab already has in place, we simply expressed all of these constraints as one matrix system of equations for Matlab to solve in the least-squares sense:
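The least-squares setup can be sketched as follows (shown here in Python/NumPy rather than Matlab; the function name, the weighting-free formulation, and the default smoothness weight are assumptions, not the exact implementation):

```python
# Sketch of recovering the response curve g as one linear least-squares system:
# data rows enforce g(Z_ij) - ln(E_i) = ln(dt_j), one row pins the curve's
# scale, and the remaining rows penalize the second derivative of g.
import numpy as np

def solve_response(Z, log_dt, lam=10.0, n_levels=256):
    """Z: (n_pixels, n_images) int pixel values; log_dt: (n_images,) ln(dt_j).
    Returns (g, ln_E): the log inverse response and per-pixel log radiances."""
    n_px, n_im = Z.shape
    n_rows = n_px * n_im + 1 + (n_levels - 2)
    A = np.zeros((n_rows, n_levels + n_px))
    b = np.zeros(n_rows)

    k = 0
    # Data-fitting equations: g(Z_ij) - ln(E_i) = ln(dt_j)
    for i in range(n_px):
        for j in range(n_im):
            A[k, Z[i, j]] = 1.0
            A[k, n_levels + i] = -1.0
            b[k] = log_dt[j]
            k += 1
    # Fix the arbitrary scale of the solution: g(middle level) = 0
    A[k, n_levels // 2] = 1.0
    k += 1
    # Smoothness penalty: lam * (g(x-1) - 2 g(x) + g(x+1)) = 0
    for x in range(1, n_levels - 1):
        A[k, x - 1] = lam
        A[k, x] = -2.0 * lam
        A[k, x + 1] = lam
        k += 1

    v, *_ = np.linalg.lstsq(A, b, rcond=None)
    return v[:n_levels], v[n_levels:]
```

In Matlab the same system is solved with the backslash operator (`v = A\b`), which also computes the least-squares solution for an overdetermined system.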
In order to display the image, we need to implement a local tone-mapping algorithm. We used a version of the algorithm presented in Durand 2002. The steps are as follows:
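The overall pipeline can be sketched as below (a minimal Python version; the function name, the simple mean-based luminance, and the target contrast value are assumptions). The smoothing filter is passed in as a callable so the bilateral filter described next can be plugged in:

```python
# Sketch of the Durand 2002 tone-mapping pipeline: split log intensity into a
# base layer (smoothed) and a detail layer, compress only the base layer's
# contrast, then recombine and restore color.
import numpy as np

def tone_map(hdr, smooth, target_contrast=5.0, eps=1e-6):
    """hdr: (H, W, 3) linear radiance map; smooth: callable on a 2-D array."""
    intensity = hdr.mean(axis=2) + eps            # simple luminance estimate
    chroma = hdr / intensity[..., None]           # per-channel color ratios
    log_I = np.log(intensity)
    base = smooth(log_I)                          # large-scale variations
    detail = log_I - base                         # fine detail, kept as-is
    # Scale the base layer so its log range matches the target contrast
    scale = np.log(target_contrast) / max(base.max() - base.min(), eps)
    out_log = (base - base.max()) * scale + detail  # max of base maps to 0
    return np.clip(chroma * np.exp(out_log)[..., None], 0.0, 1.0)
```

Because only the base layer is compressed, local detail survives even when the scene's overall dynamic range is squeezed into the display's range.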
The bilateral filter was the tricky part. Using a simple Gaussian filter causes halos near large intensity differences. To counter this, we applied a penalty on the intensity difference: we start with a spatial Gaussian, then multiply by a penalty Gaussian on the intensity difference. This was done by creating patches of the image around each pixel. The spatial Gaussian was computed over the patch, and the penalty Gaussian was calculated from the intensity difference between the rest of the patch and the center pixel.
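A minimal sketch of that patch-based filter (parameter names and default sigmas are assumptions; a real implementation would use the speedups from Durand 2002 rather than this direct loop):

```python
# Patch-based bilateral filter: for each pixel, weight its neighborhood by a
# spatial Gaussian times a "penalty" (range) Gaussian on the intensity
# difference from the center pixel, so smoothing never crosses strong edges.
import numpy as np

def bilateral_filter(img, sigma_s=2.0, sigma_r=0.4):
    radius = int(3 * sigma_s)
    # Spatial Gaussian over the patch offsets, computed once
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(xs**2 + ys**2) / (2 * sigma_s**2))
    padded = np.pad(img, radius, mode='edge')
    out = np.empty_like(img)
    H, W = img.shape
    for i in range(H):
        for j in range(W):
            patch = padded[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            # Penalty Gaussian on intensity difference from the center pixel
            rng = np.exp(-(patch - img[i, j])**2 / (2 * sigma_r**2))
            w = spatial * rng
            out[i, j] = (w * patch).sum() / w.sum()
    return out
```

The range term is what prevents halos: pixels on the far side of a strong edge get near-zero weight, so the base layer stays sharp at those boundaries.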