3 votes

I wrote an application which detects keypoints, computes their descriptors, and matches them with the BruteForce matcher in OpenCV. That works like a charm.

But how is the distance in the match objects computed?

For example: I'm using SIFT and get a descriptor vector with 128 float values per keypoint. During matching, that descriptor is compared with, say, 10 other descriptors of the same vector size. Now I get the "best match" with a distance of 0.723.

Is this the average of the individual Euclidean distances between the floats of one vector and the other? I just want to understand how this single value is computed.


1 Answer

2 votes

By default, according to the OpenCV docs, BFMatcher uses the L2 norm.

C++: BFMatcher::BFMatcher(int normType=NORM_L2, bool crossCheck=false )

Parameters: 
normType – One of NORM_L1, NORM_L2, NORM_HAMMING, NORM_HAMMING2. 
L1 and L2 norms are preferable choices for SIFT and SURF descriptors ...

See: http://docs.opencv.org/modules/features2d/doc/common_interfaces_of_descriptor_matchers.html?highlight=bruteforcematcher#bruteforcematcher

So the distance you see is the L2 (Euclidean) norm of the difference between the two 128-dimensional descriptor vectors, not an average of per-component distances. The best match is the candidate feature vector with the lowest such distance among all the others.
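Here is a minimal sketch (not part of the original answer, assuming OpenCV 3/4 headers and random toy descriptors) that recomputes the reported match distance by hand with cv::norm and NORM_L2, so you can verify that BFMatcher's distance is indeed the plain Euclidean norm over all 128 components:

    // Verify that match.distance == Euclidean (L2) norm of the descriptor difference.
    #include <opencv2/core.hpp>
    #include <opencv2/features2d.hpp>
    #include <iostream>

    int main()
    {
        // Toy data: 1 query descriptor vs. 10 train descriptors, 128 floats each,
        // filled with random values purely for illustration.
        cv::Mat query(1, 128, CV_32F), train(10, 128, CV_32F);
        cv::randu(query, 0.0f, 1.0f);
        cv::randu(train, 0.0f, 1.0f);

        cv::BFMatcher matcher(cv::NORM_L2);   // default norm, suited to SIFT/SURF
        std::vector<cv::DMatch> matches;
        matcher.match(query, train, matches); // one best match per query descriptor

        const cv::DMatch& best = matches[0];

        // Recompute the distance by hand: L2 norm of the difference vector.
        double manual = cv::norm(query.row(best.queryIdx),
                                 train.row(best.trainIdx),
                                 cv::NORM_L2);

        std::cout << "BFMatcher distance: " << best.distance << "\n"
                  << "cv::norm distance : " << manual << std::endl;
        // Both values should agree up to float rounding.
        return 0;
    }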