In this lab, we're going to go outside into the night and try to take low-noise photographs in the dark. As we know, without much signal (light), the noise floor of the imaging system becomes visible and our signal-to-noise ratio drops. Our task for this lab is to write scripts that help overcome this and capture more attractive images. Many of these experiments will require very long exposures, which can themselves accumulate noise or be affected by moving objects (both positively and negatively). With computation, we can sample light from the scene in ways that reduce noise given what we know about how the camera's sensor responds, e.g., by aligning and averaging across images. We have five example tasks that you could complete to take interesting, less noisy images, but in general this lab is open ended and you are free to explore any direction.
To set the tone, please read this article from Google's computational photography team member Florian Kainz about how to take low-light photographs with cheaper camera sensors.
Everyone loves light trails because our visual cortex has a special neuron that fires on sci-fi imagery. Rotate your camera to track an object over time during a long exposure. How might we use sampling, software, and a fast-framerate camera to better create these images?
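One possible software approach (a sketch, not the only answer): shoot a fast-framerate burst and composite it into a single long-exposure-style image. A "lighten" (per-pixel max) blend keeps the brightest value each pixel ever saw, so a moving light leaves a trail while the static background stays at its dark ambient level; a plain average mimics a true long exposure instead. A minimal NumPy sketch, assuming frames are float arrays in [0, 1]:

```python
import numpy as np

def light_trail(frames, mode="max"):
    """Composite a burst of frames into one long-exposure-style image.

    mode="max"  : 'lighten' blend -- keeps the brightest value per pixel,
                  so moving lights leave trails without overexposing.
    mode="mean" : plain average -- mimics a true long exposure (fainter
                  trails, but reduced noise in static regions).
    """
    stack = np.stack([np.asarray(f, dtype=np.float64) for f in frames])
    if mode == "max":
        return stack.max(axis=0)
    return stack.mean(axis=0)

# Toy demo: a bright point moving diagonally across dark, noisy frames.
rng = np.random.default_rng(0)
frames = []
for t in range(10):
    f = 0.02 * rng.random((64, 64))   # dark background with sensor noise
    f[20 + t, 10 + 2 * t] = 1.0       # the moving light source
    frames.append(f)

trail = light_trail(frames, mode="max")
```

Any single frame contains one bright pixel; the max composite contains all ten, forming the trail.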
We saw how to use the cross-bilateral filter in Lab 3 to denoise a flash/no-flash pair of images. However, the bilateral filter is slow. While there are methods to speed this up (as we saw in the HDR project), there are also alternative approaches. One such approach is the guided filter by He et al., which you can read about here (along with its MATLAB implementation here). A Python implementation by 'lisabug' is here, plus other examples of how to use the guided filter (in C++) for flash/no-flash denoising here.
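For orientation, the guided filter itself is only a few lines of box filtering. Below is a minimal grayscale sketch (our own, not the course's reference code) following He et al.'s formulation, where `r` is the box-filter radius and `eps` the regularization; for flash/no-flash denoising, the sharp flash image would be the guide `I` and the noisy no-flash image the input `p`:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def guided_filter(I, p, r=8, eps=1e-3):
    """Edge-preserving smoothing of p, guided by I (He et al.).

    I, p : 2-D float arrays (the guide and the image to be filtered).
    r    : box-filter radius (window size is 2r+1).
    eps  : regularization; larger values smooth more aggressively.
    """
    box = lambda x: uniform_filter(x, size=2 * r + 1, mode="reflect")
    mean_I, mean_p = box(I), box(p)
    cov_Ip = box(I * p) - mean_I * mean_p
    var_I = box(I * I) - mean_I ** 2
    a = cov_Ip / (var_I + eps)   # per-window linear coefficient
    b = mean_p - a * mean_I
    # Average the coefficients over overlapping windows, then apply
    # the local linear model q = a*I + b.
    return box(a) * I + box(b)
```

Every step is a box filter, so the cost per pixel is constant in `r`, unlike the bilateral filter. For a flash/no-flash pair, a call might look like `q = guided_filter(flash_gray, noflash_gray, r=8, eps=1e-3)`.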
Capturing an image with low exposure but a long shutter time allows us to artificially introduce new light into the scene to change its appearance. One example is writing with light. Here's an example we saw from our classmate Ishaan earlier in the course.
Having read the Google experiments by Florian Kainz, how might we write software to help us take a similar approach?
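One possible sketch of Kainz's recipe: shoot a burst of short, noisy exposures, align them, and average, so that zero-mean sensor noise shrinks by roughly the square root of the number of frames. The version below assumes only integer-pixel translation between frames (estimated by FFT phase correlation); a real handheld burst would also need rotation and subpixel handling:

```python
import numpy as np

def estimate_shift(ref, img):
    """Integer (dy, dx) that rolls `img` into alignment with `ref`,
    estimated by phase correlation (peak of the cross-power spectrum)."""
    R = np.fft.fft2(ref) * np.conj(np.fft.fft2(img))
    R /= np.abs(R) + 1e-12          # keep only the phase difference
    corr = np.fft.ifft2(R).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = corr.shape
    # Wrap peak indices to signed shifts.
    if dy > h // 2:
        dy -= h
    if dx > w // 2:
        dx -= w
    return dy, dx

def align_and_average(frames):
    """Average a burst after aligning each frame to the first one."""
    ref = np.asarray(frames[0], dtype=np.float64)
    acc = ref.copy()
    for f in frames[1:]:
        f = np.asarray(f, dtype=np.float64)
        dy, dx = estimate_shift(ref, f)
        acc += np.roll(f, (dy, dx), axis=(0, 1))
    return acc / len(frames)
```

The averaging is why a burst of short exposures can beat one long exposure: each frame is too short to blur from hand shake, and the software alignment does the job a tripod would.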
As part of our equipment pool, we have one very long lens for our Canon DSLRs—we can use it to take a photograph of something very far away at night. Experiment with the equipment during the afternoon lab, and see what kinds of problems you will face in taking a long-exposure image. What kind of software would we need to correct for these errors?
We will upload our photos to Google Drive to share them more easily. Name the files with your group members' last names. We will show the submissions at the beginning of the next class.
Here's the page: https://drive.google.com/drive/u/0/folders/1Wy0XordRJHWKsoeWIVlQVgtr5x-s9bsI (accessible with your Brown Google account).
For completion, please upload your code, input/result images, and any notes of interest as a PDF to Gradescope. Please use writeup.tex for your submission.
This lab was developed by the 1290 course staff.