CS 143 / Project 2 / Local Feature Matching

The goal of this assignment is to create a local feature matching algorithm using techniques described in Szeliski chapter 4.1. The suggested pipeline is a simplified version of the famous SIFT pipeline. The matching pipeline is intended to work for instance-level matching -- multiple views of the same physical scene. The work was divided into 3 steps (a sketch of how they fit together follows the list):

  1. Corner detection and interest points
  2. Feature extraction
  3. Matching points
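
The sketch below shows one way the three steps could be wired together. The function and file names (get_interest_points, get_features, match_features, image1.jpg, image2.jpg) are illustrative placeholders for the stages described in this report, not the actual code.


% Illustrative pipeline sketch -- the helper names are hypothetical.
image1 = im2single(rgb2gray(imread('image1.jpg')));
image2 = im2single(rgb2gray(imread('image2.jpg')));
feature_width = 16;   % width of the SIFT-like feature window

[x1, y1] = get_interest_points(image1, feature_width);          % step 1: Harris corners
[x2, y2] = get_interest_points(image2, feature_width);
features1 = get_features(image1, x1, y1, feature_width);        % step 2: descriptors
features2 = get_features(image2, x2, y2, feature_width);
[matches, confidences] = match_features(features1, features2);  % step 3: matching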

Harris corner detector

The following code shows how the Harris corner detector was implemented.


% Horizontal and vertical derivative filters (Prewitt-style) and a
% Gaussian used to smooth the gradient products.
dx = [-1 0 1; -1 0 1; -1 0 1];
dy = [1 1 1; 0 0 0; -1 -1 -1];
sigma = 2;
g = fspecial('gaussian',4*sigma,sigma);

Ix = imfilter(image,dx,'same');
Iy = imfilter(image,dy,'same');
Ix2 = Ix.^2;
Iy2 = Iy.^2;
Ixy = Ix.*Iy;

gIx2 = imfilter(Ix2,g);
gIy2 = imfilter(Iy2,g);
gIxy = imfilter(Ixy,g);
sizes = size(image);

% Harris response R = det(H) - k*trace(H)^2 at every pixel, with k = 0.04.
R = zeros(sizes(1), sizes(2));
for i=1:sizes(1)
    for j=1:sizes(2)
        H = [gIx2(i,j) gIxy(i,j); gIxy(i,j) gIy2(i,j)];
        R(i,j) = det(H) - 0.04*(trace(H)^2);
    end
end

% Keep only the strongest responses: use the 100th largest score as the
% detection threshold (avoid shadowing the built-in function "min").
Rlist = sort(R(:)');
thresh = Rlist(size(Rlist,2) - 100);

% Plot the detected corners and record their coordinates
% (x holds row indices, y holds column indices).
x = [];
y = [];
imshow(image);
hold on;
for i=1:sizes(1)
    for j=1:sizes(2)
        if R(i,j) > thresh
            plot(j,i,'o');
            x = [x i];
            y = [y j];
        end
    end
end

Features

The simplified SIFT-like descriptor works as follows.

For each interest point, compute a histogram of gradient directions in each 4x4 cell of the window around the point. After normalization, the cell histograms are summed and a feature matrix of size (number of feature points x 8) is created. The 8 dimensions of the histogram are up, down, right, left, up-left, up-right, down-left and down-right. The following code performs this step:


% One 8-bin histogram per interest point (x and y are row vectors,
% so the number of points is size(x,2), not size(x,1)).
features = zeros(size(x,2),8);


xorig = x;
yorig = y;


% Directional derivative filters for the 8 histogram bins.
dright = [-1 0 1; -1 0 1; -1 0 1];
dleft = [1 0 -1; 1 0 -1; 1 0 -1];
dup = dleft';
ddown  = dright';
dupright = [0 1 1; -1 0 1; -1 -1 0];
dupleft = [1 1 0; 1 0 -1; 0 -1 -1];
ddownright = [-1 -1 0; -1 0 1; 0 1 1];
ddownleft = dupright';



for i=1:size(x,2)
    % Clamp the point so the feature window stays inside the image.
    while x(i) <= feature_width/2
        x(i) = x(i) + 1;
    end
    while x(i) >= (size(image,1) - feature_width/2)
        x(i) = x(i) - 1;
    end

    while y(i) <= feature_width/2
        y(i) = y(i) + 1;
    end
    while y(i) >= (size(image,2) - feature_width/2)
        y(i) = y(i) - 1;
    end

    % Top-left corner of the feature window around the point.
    l = x(i) - feature_width/2;
    c = y(i) - feature_width/2;

    featr = [];
    f = [];
    % Walk over the window in 4x4 cells; each cell contributes a
    % normalized 8-bin histogram of directional filter responses.
    while l < x(i) + feature_width/2
        while c < y(i) + feature_width/2
            featr = [0 0 0 0 0 0 0 0];
            featr(1) = featr(1) + sum(sum(imfilter(image(l:l+3,c:c+3),dup)));
            featr(2) = featr(2) + sum(sum(imfilter(image(l:l+3,c:c+3),ddown)));
            featr(3) = featr(3) + sum(sum(imfilter(image(l:l+3,c:c+3),dright)));
            featr(4) = featr(4) + sum(sum(imfilter(image(l:l+3,c:c+3),dleft)));
            featr(5) = featr(5) + sum(sum(imfilter(image(l:l+3,c:c+3),dupright)));
            featr(6) = featr(6) + sum(sum(imfilter(image(l:l+3,c:c+3),dupleft)));
            featr(7) = featr(7) + sum(sum(imfilter(image(l:l+3,c:c+3),ddownright)));
            featr(8) = featr(8) + sum(sum(imfilter(image(l:l+3,c:c+3),ddownleft)));

            f = [f; featr/norm(featr)];
            featr = [];
            c = c+4;
        end
        l = l+4;
        c = y(i) - feature_width/2;
    end
    % The final descriptor is the sum of the cell histograms.
    features(i,:) = sum(f);
    f = [];
end


Matching

The matching part was not completed in this project. The idea was, for each interest point in one image, to check all interest points in the other image and compare their descriptors to find the best match.
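
As a hedged sketch only, the code below shows one way the matching could be done, using nearest-neighbor search with the ratio test from the SIFT paper. It assumes features1 and features2 are the (number of points x 8) descriptor matrices built above; the 0.8 threshold is a conventional choice, not something tuned for this project.


% Sketch of nearest-neighbor matching with the ratio test (not part of
% the implemented project). features1 and features2 are assumed to be the
% descriptor matrices produced by the feature step above.
matches = [];
confidences = [];
for i = 1:size(features1,1)
    % Euclidean distance from descriptor i to every descriptor in image 2.
    d = sqrt(sum((features2 - repmat(features1(i,:), size(features2,1), 1)).^2, 2));
    [dsorted, idx] = sort(d);
    ratio = dsorted(1) / dsorted(2);   % nearest vs. second-nearest distance
    if ratio < 0.8                     % accept only unambiguous matches
        matches = [matches; i idx(1)];
        confidences = [confidences; 1 - ratio];
    end
end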

Results

We can see empirically that the same clusters of interest points were found in both pictures.

More results of the corner detector