I am using the following code from the descriptor_extractor_matcher.cpp sample to compute the descriptors of img1 (Mat descriptors01), write them to disk and load them back (Mat descriptors1). (Same steps for the keypoints; that code is much the same, a sketch follows the snippet below ...)
Ptr<DescriptorExtractor> descriptorExtractor = DescriptorExtractor::create( argv[2] );
...
Mat descriptors01;
descriptorExtractor->compute( img1, keypoints1, descriptors01 ); // compute descriptors
FileStorage storage("test.yml", FileStorage::WRITE); // save it to disk
storage << "blub" << descriptors01;
storage.release();
Mat descriptors1;
FileStorage storage1("test.yml", FileStorage::READ); // load it again
storage1["blub"] >> descriptors1;
storage1.release();
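For completeness, the keypoint part follows the same pattern (a minimal sketch only; the file name "keypoints.yml" and the node name "keypoints1" are placeholders, and it relies on the write()/read() overloads that features2d provides for vector<KeyPoint>):
// Sketch: store and reload keypoints1 analogously to the descriptors.
FileStorage kpStore("keypoints.yml", FileStorage::WRITE);
write(kpStore, "keypoints1", keypoints1);        // overload for vector<KeyPoint>
kpStore.release();
vector<KeyPoint> loadedKeypoints1;
FileStorage kpStore1("keypoints.yml", FileStorage::READ);
read(kpStore1["keypoints1"], loadedKeypoints1);  // overload for vector<KeyPoint>
kpStore1.release();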
The keypoints & descriptors for image 2 are computed and used without saving and loading.
For the matching I use only the loaded data (keypoints & descriptors) for image 1, so for the descriptors that means descriptors1.
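For context, the matching step then looks roughly like this (a sketch only; "BruteForce" is just an example matcherType, the sample takes it from argv, and descriptors2 is the directly computed descriptor matrix for image 2):
// Sketch: match the loaded descriptors1 against the freshly computed descriptors2.
Ptr<DescriptorMatcher> descriptorMatcher = DescriptorMatcher::create("BruteForce");
vector<DMatch> matches;
descriptorMatcher->match(descriptors1, descriptors2, matches);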
Now here is the thing: if I compare the cases
A) using the code above for computing, storing and loading;
B) using only the loaded data (without computing and storing it again);
I get different matching results, as you can see in the pictures, for the keypoints as well as for the matched descriptors. I would have expected no differences... What am I missing here? Do I have to compare two images directly, or is it impossible to compare an image against a stored set of keypoints and its descriptors?
Of course I'm using the same values for [detectorType] [descriptorType] [matcherType] [matcherFilterType] [image1] [image2] [ransacReprojThreshold], by the way ;)
Thanks a lot!
UPDATE:
It seems the issue depends on the descriptor type: working with loaded descriptors works for SIFT and SURF, but not for ORB and others. Images: results with the different descriptors for cases A and B:
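To narrow this down, one check that might help is verifying whether the FileStorage round-trip is actually lossless for the ORB descriptors, i.e. whether the reloaded Mat has the same type, size and contents as the original (a sketch):
// Sketch: compare the original and reloaded descriptor matrices element-wise.
CV_Assert(descriptors01.type() == descriptors1.type()); // ORB gives CV_8U, SIFT/SURF give CV_32F
CV_Assert(descriptors01.size() == descriptors1.size());
bool identical = countNonZero(descriptors01 != descriptors1) == 0;
cout << "round-trip lossless: " << boolalpha << identical << endl;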