Example of an HDR image
Cameras are incapable of capturing the full dynamic range of real-world scenes: a single photo is taken at one fixed exposure, so very bright and very dark regions get clipped. To produce an image that captures a large dynamic range, we can combine information from several photos of the same scene taken at different exposures.
Our goal is to recover the unknown scene radiance value at each pixel in the image. What each photo actually records is a pixel value produced by the exposure (radiance times shutter time) at that pixel, pushed through the camera's response curve. To recover the true radiance, I first solve for g = ln(f-inverse), which maps pixel values to log exposure values. I then use g, together with the known shutter times, to compute the HDR radiance map.
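As a rough sketch of the second step, suppose g has already been solved for one color channel and the shutter times are known. The names Zs, g, lnDt, and the triangle weight below are placeholders of this sketch rather than details stated above; each pixel's log radiance is taken as a weighted average of g(Z) - ln(shutter time) over the exposures, trusting mid-range pixel values the most.

```matlab
% Sketch: recover the log-radiance map for one channel, assuming g is known.
% Zs   : H x W x N stack of pixel values (uint8, one exposure per slice)
% g    : 256 x 1 vector, g(z+1) = log exposure for pixel value z
% lnDt : 1 x N vector of log shutter times
w = @(z) min(z, 255 - z) + 1;            % triangle weight: trust mid-range values

[H, W, N] = size(Zs);
num = zeros(H, W);
den = zeros(H, W);
for j = 1:N
    Z  = double(Zs(:, :, j));
    wj = w(Z);
    num = num + wj .* (g(Z + 1) - lnDt(j));   % weighted estimate of ln(radiance)
    den = den + wj;
end
lnE = num ./ den;                         % log radiance map
E   = exp(lnE);                           % HDR radiance values for this channel
```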
Once I have recovered the radiance map, we need to map its values to a displayable output image. I use three different methods: a simple global scale, Durand's local tone-mapping method, and another global mapping method. Here is a side-by-side comparison of the three.

For Durand's algorithm, I start from the input radiance map and convert it to an intensity map using MATLAB's rgb2gray. I then apply a bilateral filter that I implemented (brute-force) to the intensity and extract the detail layer (the difference between the intensity and its filtered version). I reduce the contrast of the filtered base layer, add the saved detail back, and finally reapply the color to the image.
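Below is a minimal sketch of that Durand-style pipeline as described above. The helper name bilateralFilter stands in for the brute-force filter I mention, and the log10 intensity domain, the luminance weights, and the targetRange value are assumptions of this sketch rather than details spelled out in the writeup.

```matlab
% Sketch of the tone-mapping pipeline described above (assumed helper names).
% hdr            : H x W x 3 radiance map from the previous step
% bilateralFilter: stand-in for the brute-force bilateral filter
I = 0.299*hdr(:,:,1) + 0.587*hdr(:,:,2) + 0.114*hdr(:,:,3);  % rgb2gray-style intensity
chroma = hdr ./ max(I, eps);              % per-channel color ratios to restore later
logI   = log10(max(I, eps));              % work in the log domain (assumed)

base   = bilateralFilter(logI);           % large-scale (base) layer
detail = logI - base;                     % saved detail layer

targetRange = 5;                          % desired log10 contrast (assumed value)
s = targetRange / (max(base(:)) - min(base(:)));
logOut = base * s + detail;               % compress base, keep detail untouched
logOut = logOut - max(logOut(:));         % anchor the brightest pixel at 1

out = chroma .* (10 .^ logOut);           % reapply color
out = min(max(out, 0), 1);                % clamp to displayable range
```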