CS 143 / Project 2 / Local Feature Matching

Feature matching of Notre Dame.

This project is about matching features between two different pictures of the same object. Take a look at the picture to the right of this text: the circles are the result of feature detection, and the green ones correspond to correct matches.

Algorithm

We used a simplified version of the SIFT pipeline here; it mainly consists of three steps.

  1. Feature point detection;
  2. Feature description;
  3. Feature matching.

Feature point detection

I simply followed the instructions in the handout and made several small adjustments to make the project run faster.

  1. Instead of calculating the gradient directly, I used a horizontal edge filter and a vertical edge filter to get the Ix and Iy matrices of the original picture. This method is much more efficient.
  2. I normalized the Harris cornerness matrix to make it easier to choose the threshold value for the binary picture.
  3. I used BWCONNCOMP and COLFILT to keep only the local maxima of cornerness. Before non-maximum suppression, detection returns thousands of feature points, which is not very efficient. After non-maximum suppression, I kept only about one tenth as many points from each image; about half of them were matched, and almost 65% of those matches were correct. This makes all three steps much more computationally efficient.

    Image after processing with BWCONNCOMP and COLFILT; a local maximum point is then taken in every white patch.

  4. In order to suppress the effect of the picture edge, I zero out a border within the image to simply make it black:
            % Zero out a border of width cutoff_frequency*2+4 on all four
            % sides so no feature points are detected near the image edge.
            for i=1:cutoff_frequency*2+4
                BW(i, :) = 0;                     % top rows
                BW(size(image,1)-i+1, :) = 0;     % bottom rows
                BW(:, i) = 0;                     % left columns
                BW(:, size(image,2)-i+1) = 0;     % right columns
            end
        

Feature description

Several adjustments:

  1. To make the gradient more accurate and to eliminate noise, I filtered (blurred) the image first.
  2. To make the computation more efficient, instead of calculating the gradient directly I used a horizontal edge filter and a vertical edge filter to get the Ix and Iy matrices of the original picture.
  3. I used the function atan() to get the angle of the gradient. Since atan() only returns values in [-PI/2, PI/2], I used a little trick to recover the full range. I then generate a histogram by dividing the full 2*PI of directions into eight bins; every point falls into the bin closest to its gradient direction and contributes its gradient magnitude. For example, if the direction at a point is PI/6 and the magnitude is 1.5, it contributes 1.5 to the [0, PI/4) bin.
  4. I applied the "normalize -> threshold -> normalize again" step. It improved the project from 50% good matches to 60% good matches.
  5. The feature width is set to 52 here to get a more accurate and sufficient description of the feature.
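The orientation-binning step above can be sketched like this. It is a minimal sketch under assumptions: Ix and Iy are the gradient matrices from step 2, and the sign check on Ix is one way to implement the "little trick" (the actual trick used may differ).

```matlab
% Gradient angle in [0, 2*pi): atan(Iy./Ix) only covers [-pi/2, pi/2],
% so check the sign of Ix to recover the full circle.
theta = atan(Iy ./ Ix);
theta(Ix < 0) = theta(Ix < 0) + pi;   % left half-plane
theta = mod(theta, 2*pi);             % wrap into [0, 2*pi)

mag = sqrt(Ix.^2 + Iy.^2);

% Eight orientation bins of width pi/4; each pixel votes with its
% gradient magnitude. A direction of pi/6 lands in bin 1, i.e. [0, pi/4).
bin = floor(theta / (pi/4)) + 1;      % bin index in 1..8
hist8 = zeros(1, 8);
for b = 1:8
    hist8(b) = sum(mag(bin == b));
end
```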

Feature matching

Several adjustments:

  1. I set the NNDR ratio threshold to 0.75.
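The NNDR (nearest neighbor distance ratio) test can be sketched as follows. This is a minimal sketch under assumptions: features are stored as rows of feats1 and feats2 (these names are hypothetical), and PDIST2 is a Statistics Toolbox function.

```matlab
% For each feature in image 1, find its two nearest neighbors in image 2
% and accept the match only if the nearest is clearly closer than the
% second nearest (distance ratio below 0.75).
dists = pdist2(feats1, feats2);         % all pairwise distances
[sorted, idx] = sort(dists, 2);         % sort each row ascending
ratio = sorted(:, 1) ./ sorted(:, 2);   % nearest / second-nearest
good = ratio < 0.75;
matches = [find(good), idx(good, 1)];   % [index in 1, index in 2]
confidences = 1 - ratio(good);          % lower ratio = more confident
```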

Results in a table

Notre Dame

Got 237 feature points from the left image and 319 feature points from the right image;

71 matches in the end, 50 of which are correct (70% good matches).

House

Got 41 feature points from the left image and 43 feature points from the right image;

20 matches in the end, 15 of which are correct, judged by eye (75% good matches).

  1. To achieve the best visual effect, I used a different set of parameters for each pair of images.
  2. Because I didn't make the features scale- and orientation-invariant, some other pairs of images are not matched well, so I didn't include those bad matches on this page.