CS 143 / Project 2 / Local Feature Matching

72 good matches, 5 bad matches, 93.5% good ratio.

First of all, special thanks to a TA of the Computer Vision course (sorry, I do not know your name): thank you so much for offering great help even when you were not on duty. I followed the handout and finished all of the requirements. To push the assignment further, I also tried extra methods such as non-maximum suppression and three different ways of computing features. For the Notre Dame images I obtained 72 good matches and 5 bad matches, a 93.5% good-match ratio. Since points with higher "confidence" have a larger probability of producing a "good match", my code lets you select any rank of the sorted confidences; here I kept the top 99%.

  1. 93.5% good matches.
  2. Some of the "bad matches" are in fact not that bad.
  3. Sort the "confidences" and select any rank of them you want.

Notices

  1. In order to reduce noise, I tried filtering the image with different Gaussians, applied different numbers of times. The results differ slightly from one another; in the hand-in code I use only a single Gaussian pre-filter.
  2. In order to avoid noise caused by the image borders, I discard interest points that lie very close to the edges of the image.
  3. I found a set of functions for "non-maximal suppression for features/corners" in P. D. Kovesi, MATLAB and Octave Functions for Computer Vision and Image Processing, which is openly available online. However, I preferred my own code, so I only cite that resource and use my own non-maximum-suppression method based on "colfilt".
  4. I tried three ways of implementing get_features. The first opens a feature window around each interest point, then computes the gradients inside it and clusters them. The second applies a "sobel" filter at 8 different orientations to get 8 directional gradients per pixel, then uses them as the 8 orientation clusters of the feature. In the end I chose the third way, which is much faster and more accurate. With the first method, pixels covered by more than one feature window get computed more than once, and it also takes time to pick them out. The second method in fact yields only four distinct clusters, because gradients at opposite orientations are simply negatives of each other.
  5. I found that the higher the confidence, the higher the probability that a point gives a good match, so I sorted the point confidences and added a parameter that lets you keep any fraction of them. In the hand-in code I keep the top 99% of the points whose confidence exceeds 0.35.
  6. Example code, highlighting (1) sorting, rearranging, and selecting any rank of "confidences", and (2) clustering gradients to build the features quickly.

    
    %example code
    %sort the pairwise distances for each feature in image 1 (ascending along rows)
    [distance_s, index_s] = sort(distance, 2, 'ascend');
    matches(:, 1) = 1:num_features1;
    matches(:, 2) = index_s(:, 1);
    
    %confidence = 1 - (nearest distance / second-nearest distance), the ratio test
    ratio = distance_s(:, 1) ./ distance_s(:, 2);
    confidences = 1 - ratio;
    
    %select points with confidence larger than 0.35
    pick = confidences > 0.35;
    
    %select any rank of confidences you want: the higher the confidence,
    %the higher the probability the point gives a good match, so keep the top ones
    [~, kk] = sort(confidences, 'descend');
    %you can change the 0.99 here to any fraction no greater than 1
    keep = kk(1:ceil(sum(pick)*0.99));
    matches = matches(keep, :);
    confidences = confidences(keep);
    
    
    
    %split the gradient magnitudes into 8 orientation clusters using bin edges bin1..bin9
    cluster1 = ge(gradientxy, bin1).*lt(gradientxy,bin2).*magnitudexy;
    cluster2 = ge(gradientxy, bin2).*lt(gradientxy,bin3).*magnitudexy;
    cluster3 = ge(gradientxy, bin3).*lt(gradientxy,bin4).*magnitudexy;
    cluster4 = ge(gradientxy, bin4).*lt(gradientxy,bin5).*magnitudexy;
    cluster5 = ge(gradientxy, bin5).*lt(gradientxy,bin6).*magnitudexy;
    cluster6 = ge(gradientxy, bin6).*lt(gradientxy,bin7).*magnitudexy;
    cluster7 = ge(gradientxy, bin7).*lt(gradientxy,bin8).*magnitudexy;
    cluster8 = ge(gradientxy, b8).*lt(gradientxy,bin9).*magnitudexy;
    
    %initialize the features
    features = zeros(size(x,1), 128);
    
    n = length(x);
    
    %initialize the eight parts corresponding to each part of the features
    r1 = zeros(n,16);
    r2 = zeros(n,16);
    r3 = zeros(n,16);
    r4 = zeros(n,16);
    r5 = zeros(n,16);
    r6 = zeros(n,16);
    r7 = zeros(n,16);
    r8 = zeros(n,16);
    
    %get the features: sum each orientation cluster over a 4x4 grid of cells,
    %each cell being feature_width/4 pixels on a side
    vec = [1, (1+feature_width/4), (1+feature_width/2), (1+feature_width/4*3)];
    for ii = 1:n
        for i = 1:4
            for j = 1:4
                r1(ii,4*i+j-4) = sum(sum(cluster1(y(ii)-feature_width/2+vec(j):y(ii)-feature_width/2+vec(j)+(feature_width/4-1), x(ii)-feature_width/2+vec(i):x(ii)-feature_width/2+vec(i)+(feature_width/4-1))));
                r2(ii,4*i+j-4) = sum(sum(cluster2(y(ii)-feature_width/2+vec(j):y(ii)-feature_width/2+vec(j)+(feature_width/4-1), x(ii)-feature_width/2+vec(i):x(ii)-feature_width/2+vec(i)+(feature_width/4-1))));
                r3(ii,4*i+j-4) = sum(sum(cluster3(y(ii)-feature_width/2+vec(j):y(ii)-feature_width/2+vec(j)+(feature_width/4-1), x(ii)-feature_width/2+vec(i):x(ii)-feature_width/2+vec(i)+(feature_width/4-1))));
                r4(ii,4*i+j-4) = sum(sum(cluster4(y(ii)-feature_width/2+vec(j):y(ii)-feature_width/2+vec(j)+(feature_width/4-1), x(ii)-feature_width/2+vec(i):x(ii)-feature_width/2+vec(i)+(feature_width/4-1))));
                r5(ii,4*i+j-4) = sum(sum(cluster5(y(ii)-feature_width/2+vec(j):y(ii)-feature_width/2+vec(j)+(feature_width/4-1), x(ii)-feature_width/2+vec(i):x(ii)-feature_width/2+vec(i)+(feature_width/4-1))));
                r6(ii,4*i+j-4) = sum(sum(cluster6(y(ii)-feature_width/2+vec(j):y(ii)-feature_width/2+vec(j)+(feature_width/4-1), x(ii)-feature_width/2+vec(i):x(ii)-feature_width/2+vec(i)+(feature_width/4-1))));
                r7(ii,4*i+j-4) = sum(sum(cluster7(y(ii)-feature_width/2+vec(j):y(ii)-feature_width/2+vec(j)+(feature_width/4-1), x(ii)-feature_width/2+vec(i):x(ii)-feature_width/2+vec(i)+(feature_width/4-1))));
                r8(ii,4*i+j-4) = sum(sum(cluster8(y(ii)-feature_width/2+vec(j):y(ii)-feature_width/2+vec(j)+(feature_width/4-1), x(ii)-feature_width/2+vec(i):x(ii)-feature_width/2+vec(i)+(feature_width/4-1))));
            end
        end
    end
    
    features = [r1  r2   r3   r4   r5   r6   r7   r8];
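
The Gaussian pre-filtering described in notice 1 can be re-sketched in Python (NumPy/SciPy rather than the hand-in MATLAB code; the function name `pre_smooth` and the sigma values are assumptions for illustration):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def pre_smooth(image, sigma=1.0):
    """Suppress pixel noise with a single Gaussian pre-filter.

    sigma is an assumed value; the hand-in code uses one fixed Gaussian.
    """
    return gaussian_filter(image.astype(np.float64), sigma=sigma)

noisy = np.random.default_rng(0).random((64, 64))
smooth = pre_smooth(noisy, sigma=2.0)
# smoothing spreads each pixel over its neighbors, so the noise variance drops
assert smooth.var() < noisy.var()
```

Filtering with different Gaussians different numbers of times (as the notice describes) amounts to varying `sigma`; repeated Gaussian filtering is itself equivalent to one wider Gaussian.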
    
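
Notices 2 and 3 (border exclusion plus non-maximum suppression) can be sketched together. This is a Python approximation using `scipy.ndimage.maximum_filter` in place of MATLAB's `colfilt`; the `radius` and `border` values are assumptions, not the hand-in parameters:

```python
import numpy as np
from scipy.ndimage import maximum_filter

def non_maximum_suppression(response, radius=3, border=16):
    """Keep only local maxima of the corner response, and zero out a band
    near the image edges to avoid border noise (notice 2)."""
    # a pixel survives only if it equals the maximum in its neighborhood
    local_max = maximum_filter(response, size=2 * radius + 1)
    keep = (response == local_max) & (response > 0)
    # discard interest points too close to the image borders
    keep[:border, :] = False
    keep[-border:, :] = False
    keep[:, :border] = False
    keep[:, -border:] = False
    return keep

r = np.zeros((40, 40))
r[20, 20] = 5.0   # interior peak: survives
r[5, 5] = 7.0     # strong, but too close to the border: suppressed
mask = non_maximum_suppression(r, radius=3, border=16)
assert mask[20, 20] and not mask[5, 5]
```

`colfilt` with a max over sliding columns plays the same role as `maximum_filter` here: both compare each response to the maximum of its neighborhood.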
    
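
The confidence-ranking idea from notice 5 (and the first half of the MATLAB example) can also be re-sketched in NumPy. The function name `match_with_confidence` is hypothetical; the 0.35 threshold and 0.99 fraction mirror the values used above:

```python
import numpy as np

def match_with_confidence(distance, threshold=0.35, frac=0.99):
    """Ratio-test matching: confidence = 1 - d1/d2, then keep the top
    `frac` of the matches whose confidence exceeds `threshold`."""
    order = np.argsort(distance, axis=1)
    d_sorted = np.take_along_axis(distance, order, axis=1)
    # match each feature in image 1 to its nearest neighbor in image 2
    matches = np.column_stack([np.arange(distance.shape[0]), order[:, 0]])
    confidences = 1.0 - d_sorted[:, 0] / d_sorted[:, 1]
    # keep the top fraction of the confident points, highest confidence first
    n_keep = int(np.ceil((confidences > threshold).sum() * frac))
    top = np.argsort(-confidences)[:n_keep]
    return matches[top], confidences[top]

# toy distance matrix: 3 features in image 1 vs 3 features in image 2
d = np.array([[0.1, 0.9, 0.8],
              [0.5, 0.6, 0.7],
              [0.2, 0.4, 0.9]])
m, c = match_with_confidence(d)
assert np.all(c[:-1] >= c[1:])  # returned in descending confidence order
```

As in the MATLAB version, sorting descending and truncating is what makes "take any rank of confidences" a one-parameter choice.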

    Results table: 72 good matches, 5 bad matches, 93.5% good ratio.
