
I am trying to match two images using ORB. Using the brute-force matcher results in random matches, and applying Lowe's ratio condition does not find any correct match!

I suspect the problem is the line if m.distance < 0.7: in my code. I read that m.distance is the distance between the two descriptors returned from the BF matcher, but the values are too large and none of them meet the condition. How do I remove the outliers then?

What I've tried:

  • sorting the list by distance: matches = sorted(matches, key=lambda x: x.distance)
  • using SIFT, which works great!

Sorting does not help in finding good matches, and SIFT is patented; I want ORB.

My code:

import cv2
import matplotlib.pyplot as plt

# finding feature points on products
image = productName + ".jpg"
path = "./images/products/"+image
img1 = cv2.imread(path,cv2.IMREAD_GRAYSCALE) 
orb = cv2.ORB_create()
kps1, des1 = orb.detectAndCompute(img1, None)
result=cv2.drawKeypoints(img1,kps1, img1,flags=cv2.DRAW_MATCHES_FLAGS_DRAW_RICH_KEYPOINTS )
plt.figure(num=None, figsize=(20, 12), dpi=80, facecolor='w', edgecolor='k')
plt.imshow(result)

# finding feature points on shelves
image = shelfName+'_raf.jpg'
path = "./images/shelves/"+image
img2 = cv2.imread(path,cv2.IMREAD_GRAYSCALE) 
orb = cv2.ORB_create()
kps2, des2 = orb.detectAndCompute(img2, None)
result=cv2.drawKeypoints(img2,kps2, img2,flags=cv2.DRAW_MATCHES_FLAGS_DRAW_RICH_KEYPOINTS)
plt.figure(num=None, figsize=(20, 12), dpi=80, facecolor='w', edgecolor='k')
plt.imshow(result)

bf = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=False)
matches = bf.match(des1, des2)

# store all the good matches as per Lowe's ratio test.
good = []
for m in matches:
    if m.distance < 0.7:
        good.append(m)
        
match_img = cv2.drawMatches(img1, kps1, img2, kps2, good, None, flags=2)

plt.figure(num=None, figsize=(20, 12), dpi=80, facecolor='w', edgecolor='k')
plt.imshow(match_img, 'gray')
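
For reference, the raw distances can be printed to see their scale; with NORM_HAMMING they are integer bit counts (0 to 256 for ORB's default 256-bit descriptors), not values between 0 and 1. A minimal diagnostic, reusing matches from above:

# Hamming distances between binary ORB descriptors are integer bit counts,
# so printing their range shows why comparing them to 0.7 filters everything out
distances = sorted(m.distance for m in matches)
print("min distance:", distances[0], "max distance:", distances[-1])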


If I remove Lowe's condition, I just get the random matches described above.

Comments:

  • Christoph Rackwitz: That is not Lowe's ratio test. The code has crucial errors in at least two places related to this. Where did you get this code from?
  • Hasan Ali: I got it from this link.
  • Christoph Rackwitz: Yes, that's why the code in that question also doesn't work. Discard the brute-force matcher; you need "knn" matching with k=2, otherwise Lowe's ratio test can't be done. Please read the other responses, or take your code from the samples directory within OpenCV.
  • fmw42: What do the circles represent?
  • Hasan Ali: The circles are feature points, or keypoints.

1 Answer


The code attempts to use Lowe's ratio test (see original SIFT paper).

This requires, for every descriptor, the two closest matches.

The code should read:

matches = bf.knnMatch(des1, des2, k=2)  # knnMatch is crucial
good = []
for (m1, m2) in matches:  # for every descriptor, take the two closest matches
    if m1.distance < 0.7 * m2.distance:  # best match has to be this much closer than the second best
        good.append(m1)

Further, I would highly recommend the FLANN matcher; it's faster than the brute-force matcher.
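
For binary descriptors such as ORB, FLANN needs an LSH index rather than the default KD-tree. A minimal sketch, with parameter values taken from the OpenCV feature-matching tutorial (tune them for your data) and des1/des2 from the question:

FLANN_INDEX_LSH = 6  # LSH index works on binary descriptors such as ORB
index_params = dict(algorithm=FLANN_INDEX_LSH,
                    table_number=6,
                    key_size=12,
                    multi_probe_level=1)
search_params = dict(checks=50)

flann = cv2.FlannBasedMatcher(index_params, search_params)
knn_matches = flann.knnMatch(des1, des2, k=2)

good = []
for pair in knn_matches:
    # LSH can occasionally return fewer than k neighbours, so guard the unpack
    if len(pair) == 2 and pair[0].distance < 0.7 * pair[1].distance:
        good.append(pair[0])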

Look at the OpenCV tutorials or the samples directory in OpenCV's source (samples/python/find_obj.py) for code that works.