CS 143 / Project 2 / Local Feature Matching

The goal of this project is to create a local feature matching algorithm. The project consists of three parts:

  1. Interest point detection
  2. Local feature description
  3. Feature matching

For interest point detection, a Harris corner detector is implemented. It detects interest points that are later used to describe and match features.

For local feature description, a SIFT-like local feature descriptor is implemented. It generates a descriptor for each interest point detected in part 1.

For feature matching, the "nearest neighbor distance ratio test" method is implemented. It matches the local features generated in part 2, so that the program can find candidate matching point pairs between two or more similar images.

Program Description

Interest point detection (get_interest_points.m)

In this part, a Harris corner detector is implemented. First, the cornerness function is computed from the image derivatives smoothed with a Gaussian filter; then the function COLFILT() is used to perform non-maximum suppression. This yields a list of interest points.
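The two steps above can be sketched in Python with NumPy/SciPy (the function name and parameter values here are illustrative, not the actual MATLAB implementation, which uses colfilt for the suppression step):

```python
import numpy as np
from scipy import ndimage

def harris_corners(img, sigma=1.0, alpha=0.04, threshold=0.01, nms_size=5):
    """Sketch of Harris corner detection: cornerness from image
    derivatives and a Gaussian window, then non-maximum suppression."""
    # Image gradients
    Ix = ndimage.sobel(img, axis=1, mode='constant')
    Iy = ndimage.sobel(img, axis=0, mode='constant')
    # Second-moment matrix entries, smoothed with a Gaussian
    Sxx = ndimage.gaussian_filter(Ix * Ix, sigma)
    Syy = ndimage.gaussian_filter(Iy * Iy, sigma)
    Sxy = ndimage.gaussian_filter(Ix * Iy, sigma)
    # Harris cornerness: det(M) - alpha * trace(M)^2
    R = Sxx * Syy - Sxy ** 2 - alpha * (Sxx + Syy) ** 2
    # Non-maximum suppression: keep pixels equal to their local maximum
    local_max = ndimage.maximum_filter(R, size=nms_size)
    mask = (R == local_max) & (R > threshold * R.max())
    ys, xs = np.nonzero(mask)
    return xs, ys
```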

Local feature description (get_features.m)

In this part a SIFT-like local feature descriptor is implemented. First, using each interest point from part 1 as a center, we form a 4*4 grid of cells, each feature_width/4 pixels wide. Second, for each cell we apply 8 filters in 8 orientations to build a histogram of oriented gradients, giving 4*4*8 = 128 dimensions as the feature vector. Finally, each feature is normalized to unit length.

Part of the code that generates the histograms of oriented gradients:


    % Cell patch around (cex, cey)
    patch = image(cey-feature_width/4:cey, cex-feature_width/4:cex);

    % Eight 3x3 derivative filters, one per orientation
    rightfilter     = [ 1 0 -1;  2 0 -2;  1 0 -1];
    leftfilter      = [-1 0  1; -2 0  2; -1 0  1];
    upfilter        = rightfilter';
    downfilter      = leftfilter';
    rightdownfilter = [ 0 -1 -2;  1 0 -1;  2  1  0];
    rightupfilter   = [ 2  1  0;  1 0 -1;  0 -1 -2];
    leftdownfilter  = [-2 -1  0; -1 0  1;  0  1  2];
    leftupfilter    = [ 0  1  2; -1 0  1; -2 -1  0];

    % Filter the patch in each orientation
    right     = imfilter(patch, rightfilter,     'same');
    left      = imfilter(patch, leftfilter,      'same');
    up        = imfilter(patch, upfilter,        'same');
    down      = imfilter(patch, downfilter,      'same');
    rightdown = imfilter(patch, rightdownfilter, 'same');
    rightup   = imfilter(patch, rightupfilter,   'same');
    leftdown  = imfilter(patch, leftdownfilter,  'same');
    leftup    = imfilter(patch, leftupfilter,    'same');

    % Sum the positive responses to form the 8-bin histogram
    p1 = sum(right(right > 0));
    p2 = sum(left(left > 0));
    p3 = sum(up(up > 0));
    p4 = sum(down(down > 0));
    p5 = sum(rightdown(rightdown > 0));
    p6 = sum(rightup(rightup > 0));
    p7 = sum(leftdown(leftdown > 0));
    p8 = sum(leftup(leftup > 0));

    % Normalize to unit length and clip large bins
    list = [p1 p2 p3 p4 p5 p6 p7 p8];
    list = list ./ norm(list);
    list(list > 0.4) = 0.4;
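The snippet above computes the 8-bin histogram for a single cell. As a sketch of how the full 128-dimensional descriptor is assembled over the 4*4 grid, here is a Python/NumPy version (the function name is mine; ndimage.correlate stands in for MATLAB's imfilter, and the 0.4 clipping value follows the snippet):

```python
import numpy as np
from scipy import ndimage

# Eight 3x3 derivative kernels, mirroring the eight MATLAB filters above
base = np.array([[1., 0., -1.],
                 [2., 0., -2.],
                 [1., 0., -1.]])
diag = np.array([[0., -1., -2.],
                 [1.,  0., -1.],
                 [2.,  1.,  0.]])
KERNELS = [base, -base, base.T, -base.T,   # right, left, up, down
           diag, -np.fliplr(diag),         # rightdown, rightup
           np.fliplr(diag), diag.T]        # leftdown, leftup

def sift_like_descriptor(img, x, y, feature_width=16):
    """4x4 grid of cells around (x, y); per cell, an 8-bin histogram of
    positive filter responses; 4*4*8 = 128 dimensions total."""
    half = feature_width // 2
    cell = feature_width // 4
    window = img[y - half:y + half, x - half:x + half]
    desc = []
    for cy in range(4):
        for cx in range(4):
            patch = window[cy * cell:(cy + 1) * cell,
                           cx * cell:(cx + 1) * cell]
            for k in KERNELS:
                # correlate() matches imfilter's default correlation
                resp = ndimage.correlate(patch, k, mode='constant')
                desc.append(resp[resp > 0].sum())
    desc = np.asarray(desc)
    n = np.linalg.norm(desc)
    if n > 0:
        desc = desc / n          # normalize to unit length
    return np.minimum(desc, 0.4)  # clip large bins, as in the snippet
```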

Feature matching (match_features.m)

For feature matching, the "nearest neighbor distance ratio test" method is implemented. First, a double "for" loop computes the Euclidean distance between each feature from image1 and each feature from image2; then the ratio of the shortest distance to the second shortest distance is taken. If 1 - ratio is larger than the threshold (0.2 in this program), the pair is output as a good match.

Part of the code that performs the ratio test:


    for i = 1:num_features1
        dis = realmax;        % nearest-neighbor distance
        secdis = realmax;     % second-nearest distance
        index = 0;
        d1 = features1(i,:);
        for j = 1:num_features2
            d2 = features2(j,:);
            d = sqrt(sum((d1 - d2).^2));
            if d <= dis
                secdis = dis;
                dis = d;
                index = j;
            elseif d < secdis
                secdis = d;
            end
        end
        c = 1 - dis/secdis;   % confidence from the distance ratio
        if c >= 0.2           % ratio-test threshold
            matches = [matches; i index];
            confidences = [confidences; c];
        end
    end
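The double loop above grows expensive as the number of features increases. A vectorized sketch in Python/NumPy (function name and structure are mine; it assumes image2 contributes at least two distinct features, and keeps the 0.2 threshold from the writeup) computes all pairwise distances at once:

```python
import numpy as np

def match_nndr(features1, features2, threshold=0.2):
    """Vectorized nearest-neighbor distance ratio test."""
    # Pairwise squared distances via (a - b)^2 = a^2 - 2ab + b^2
    d2 = (np.sum(features1 ** 2, axis=1)[:, None]
          - 2 * features1 @ features2.T
          + np.sum(features2 ** 2, axis=1)[None, :])
    d = np.sqrt(np.maximum(d2, 0))
    # Two smallest distances per feature in image1
    idx = np.argsort(d, axis=1)
    rows = np.arange(len(d))
    nearest = idx[:, 0]
    dis = d[rows, nearest]
    secdis = d[rows, idx[:, 1]]
    # Confidence = 1 - ratio; keep pairs above the threshold
    conf = 1 - dis / secdis
    keep = conf >= threshold
    matches = np.column_stack([np.nonzero(keep)[0], nearest[keep]])
    return matches, conf[keep]
```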

Results

Result: 86 good matches and 17 bad matches in total. Using normalized patches instead of the SIFT-like descriptor decreases accuracy by about 10-20%. When the program is run on additional test cases, the accuracy is lower, since a different set of thresholds would be needed to fit those cases. The quality of the image pair (e.g., large differences in size or color) also affects the matching result.

Additional Examples

(All the original images are from Internet)

Since the evaluation function only works for the particular provided image pair, the point evaluations shown in the images below are not accurate.

Bugs and Disadvantages

There are some areas where the program could be improved. First, the accuracy is somewhat low (83%); improving the feature descriptor should help. The program also runs a little slowly (about 2 minutes for 900 interest points).