Interest points detected with Harris Corner Detection.
For the first part of the algorithm, I implemented the basic Harris corner detector to find interest points. I used derivative-of-Gaussian filters to compute the horizontal and vertical gradient images, Ix and Iy, then squared each of those images and multiplied Ix by Iy to produce the terms used in the Harris cornerness function:
har = (Ix2.*Iy2)-(Ixy.^2) - (alpha*(Ix2 + Iy2).^2);
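The gradient and cornerness computation can be sketched in Python with SciPy; the sigma and alpha values here are illustrative defaults, not necessarily the ones used in the original MATLAB code:

```python
import numpy as np
from scipy import ndimage

def harris_response(image, sigma=1.0, alpha=0.06):
    """Harris cornerness map (sigma and alpha are illustrative defaults)."""
    # Derivative-of-Gaussian gradients: first-order derivative along one axis.
    Ix = ndimage.gaussian_filter(image, sigma, order=(0, 1))
    Iy = ndimage.gaussian_filter(image, sigma, order=(1, 0))
    # Smoothed squared gradients and cross term (second-moment matrix entries).
    Ix2 = ndimage.gaussian_filter(Ix * Ix, sigma)
    Iy2 = ndimage.gaussian_filter(Iy * Iy, sigma)
    Ixy = ndimage.gaussian_filter(Ix * Iy, sigma)
    # har = det(M) - alpha * trace(M)^2, matching the MATLAB line above.
    return Ix2 * Iy2 - Ixy ** 2 - alpha * (Ix2 + Iy2) ** 2
```

Flat regions of the image produce a response near zero, while corners produce large positive values.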
I then thresholded the resulting cornerness values and used bwconncomp for local non-maximum suppression: with bwconncomp and cellfun, I applied the max function to each connected region of above-threshold corner values, so that each region contributes a single interest point rather than a cluster of neighboring pixels. Finally, I suppressed a border around the image equal to the width of the eventual features, both so that later window computations would not index outside the image and to eliminate spurious interest points at the image edges.
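A Python sketch of this suppression step, using `scipy.ndimage.label` as the analogue of bwconncomp (the threshold and border values are caller-supplied, as in the write-up):

```python
import numpy as np
from scipy import ndimage

def suppress_to_maxima(har, threshold, border):
    """Keep one peak per connected above-threshold region; drop border points.

    Mirrors the bwconncomp/cellfun step: label connected components of the
    thresholded cornerness map, then keep only each component's maximum.
    """
    mask = har > threshold
    labels, n = ndimage.label(mask)
    if n == 0:
        return np.empty(0, int), np.empty(0, int)
    # Position of the maximum cornerness within each labeled component.
    peaks = ndimage.maximum_position(har, labels, index=range(1, n + 1))
    ys, xs = np.array(peaks, dtype=int).T
    # Edge suppression: discard points within `border` pixels of the edge.
    h, w = har.shape
    keep = (ys >= border) & (ys < h - border) & (xs >= border) & (xs < w - border)
    return ys[keep], xs[keep]
```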
For the descriptor stage, I decided to use Sobel filters to compute the image gradients; this simple filter was easy to rotate. I computed the gradient images in each direction and took a block around each keypoint within each gradient image. I weighted these blocks with a Gaussian, then used mat2cell to split each gradient block into a 4 x 4 array of cells and computed the descriptor vector by summing the values in each cell.
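A minimal Python sketch of this descriptor, assuming a 16-pixel window; the Gaussian width and the final normalization are illustrative choices, not details stated in the write-up:

```python
import numpy as np
from scipy import ndimage

def simple_descriptor(image, y, x, width=16):
    """Gradient-cell descriptor around keypoint (y, x); width assumed to be 16."""
    # Sobel gradients in each direction.
    gx = ndimage.sobel(image, axis=1)
    gy = ndimage.sobel(image, axis=0)
    # Gaussian weighting window over the patch (illustrative sigma).
    coords = np.arange(width) - (width - 1) / 2
    g1d = np.exp(-coords ** 2 / (2 * (width / 2) ** 2))
    w2d = np.outer(g1d, g1d)
    half, cell = width // 2, width // 4
    parts = []
    for grad in (gx, gy):
        patch = grad[y - half:y + half, x - half:x + half] * w2d
        # Sum each cell of a 4x4 grid (the analogue of the mat2cell step).
        cells = patch.reshape(4, cell, 4, cell).sum(axis=(1, 3))
        parts.append(cells.ravel())
    d = np.concatenate(parts)
    # Normalization added here for matching; an assumption beyond the write-up.
    return d / (np.linalg.norm(d) + 1e-12)
```

Each gradient image contributes a 4 x 4 grid of sums, giving a 32-dimensional descriptor per keypoint.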
For matching, I implemented a simple nearest-neighbor distance ratio (NNDR) test on the Euclidean distances between all pairs of interest point descriptors.
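The NNDR test can be sketched as follows; the 0.8 ratio threshold is a common illustrative default, not necessarily the value used here:

```python
import numpy as np

def match_nndr(desc1, desc2, ratio=0.8):
    """NNDR matching: accept a match only if the nearest neighbor is much
    closer than the second nearest (ratio threshold is illustrative)."""
    # Pairwise Euclidean distances between the two descriptor sets.
    d = np.linalg.norm(desc1[:, None, :] - desc2[None, :, :], axis=2)
    order = np.argsort(d, axis=1)
    nearest, second = order[:, 0], order[:, 1]
    rows = np.arange(len(desc1))
    nndr = d[rows, nearest] / (d[rows, second] + 1e-12)
    matches = [(i, int(nearest[i])) for i in rows if nndr[i] < ratio]
    # Most-confident (lowest-ratio) matches first.
    matches.sort(key=lambda m: nndr[m[0]])
    return matches
```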
This pipeline achieved 73% accuracy on the given Notre Dame image pair.