I wrote an application which detects keypoints, computes their descriptors and matches them with the BruteForce matcher in OpenCV. That works like a charm.
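For context, this is roughly what my pipeline looks like — a minimal sketch, assuming OpenCV 4.x (where SIFT lives in the main features2d module); the image paths are placeholders:

```cpp
#include <opencv2/core.hpp>
#include <opencv2/features2d.hpp>
#include <opencv2/imgcodecs.hpp>
#include <iostream>
#include <vector>

int main()
{
    // Load the two images to compare (paths are placeholders).
    cv::Mat img1 = cv::imread("scene1.png", cv::IMREAD_GRAYSCALE);
    cv::Mat img2 = cv::imread("scene2.png", cv::IMREAD_GRAYSCALE);

    // Detect keypoints and compute the 128-float SIFT descriptors.
    cv::Ptr<cv::SIFT> sift = cv::SIFT::create();
    std::vector<cv::KeyPoint> kp1, kp2;
    cv::Mat desc1, desc2;
    sift->detectAndCompute(img1, cv::noArray(), kp1, desc1);
    sift->detectAndCompute(img2, cv::noArray(), kp2, desc2);

    // Brute-force matcher with the L2 norm (the default for float descriptors).
    cv::BFMatcher matcher(cv::NORM_L2);
    std::vector<cv::DMatch> matches;
    matcher.match(desc1, desc2, matches);

    // Each DMatch carries the distance value my question is about.
    for (const cv::DMatch& m : matches)
        std::cout << m.queryIdx << " -> " << m.trainIdx
                  << "  distance: " << m.distance << std::endl;
    return 0;
}
```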
But: how is the distance in the match objects (DMatch::distance) computed?
For example: I'm using SIFT and get a descriptor vector with 128 float values per keypoint. During matching, that keypoint is compared with, say, 10 other descriptors of the same vector size. Now I get the "best match" with a distance of 0.723.
Is this the average of the element-wise Euclidean distances between the floats of one vector and the other? I just want to understand how this single value is computed.
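To make the question concrete, this is the kind of check I have in mind — a sketch reusing desc1, desc2 and matches from the snippet above; the plain L2 norm here is just my guess at what the matcher might be doing, which I would like to confirm:

```cpp
// Is DMatch::distance simply the Euclidean (L2) distance between
// the two 128-float descriptor vectors, or some average over them?
const cv::DMatch& m = matches[0];                 // pick any match
cv::Mat a = desc1.row(m.queryIdx);                // 1x128 query descriptor
cv::Mat b = desc2.row(m.trainIdx);                // 1x128 train descriptor
double manual = cv::norm(a, b, cv::NORM_L2);      // sqrt of sum of squared diffs
std::cout << "DMatch::distance = " << m.distance
          << ", manual L2 = " << manual << std::endl;
```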