
I am trying to calculate the distance of an object from the camera using the code below. The approach is to compute the disparity between the left and right images, which comes out as a matrix. Once I have the disparity, I use the formula Depth = Baseline * focal length / Disparity.

Two questions here:

  1. Why is the depth coming out negative?
  2. How do I get the actual distance of the object from the camera using the depth matrix? The depth matrix gives a distance for each pixel, which is not what I want; I want the distance of the object itself in meters.

    import numpy as np
    import sys
    import cv2
    from matplotlib import pyplot as plt
    %matplotlib inline

    imgL = cv2.imread('C:/Users/Akash Jain/Documents/ZED/LeftDepth/left000005.png', 0)
    imgR = cv2.imread('C:/Users/Akash Jain/Documents/ZED/RightDepth/right000005.png', 0)

    stereo = cv2.StereoBM_create(numDisparities=16, blockSize=5)
    disparity = stereo.compute(imgL, imgR)
    D = 0.12*0.70*672/disparity
    print("disparity", disparity)
    print("Depth", D)
    plt.imshow(disparity, 'gray')
    plt.show()

[images: program output and the computed disparity map]

Comments:

Not sure why it is not taking it as code. – Akash

Is your camera setup parallel and are the images rectified? – Paul92

1 Answer


The simple Depth = Baseline * focal length / Disparity formula is valid only when the camera axes are parallel. Assuming you have good matches, toed-in (converging-axes) cameras will produce negative disparities in some parts of the image plane.
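One practical note on applying that formula: OpenCV's StereoBM returns fixed-point disparities (the true disparity multiplied by 16, stored as int16, with invalid matches marked by negative values), so dividing by the raw compute() output will also give negative or nonsensical depths. Here is a minimal sketch of the conversion, reusing the baseline (0.12 m) and focal length (672 px) from your code, with a purely hypothetical region of interest standing in for the object:

    import numpy as np
    import cv2

    imgL = cv2.imread('left.png', 0)    # stand-ins for your rectified pair
    imgR = cv2.imread('right.png', 0)

    stereo = cv2.StereoBM_create(numDisparities=16, blockSize=5)

    # StereoBM stores disparity * 16 as int16; divide by 16 to get pixels.
    disp = stereo.compute(imgL, imgR).astype(np.float32) / 16.0

    B = 0.12   # baseline in meters (from your code)
    f = 672.0  # focal length in pixels (from your code)

    # Only positive disparities are valid; mask the rest before dividing.
    depth = np.zeros_like(disp)
    depth[disp > 0] = B * f / disp[disp > 0]

    # For a single "object distance", aggregate depth over the object's
    # pixels, e.g. the median inside a bounding box (placeholder values).
    x, y, w, h = 100, 100, 50, 50
    roi = depth[y:y+h, x:x+w]
    print("object distance (m):", np.median(roi[roi > 0]))

This also bears on your second question: a depth map gives a distance per pixel, so a single object distance has to be aggregated from the pixels belonging to the object (the median is robust to stray mismatches).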

However, the disparity image you posted looks low-quality too, which makes me suspect you also have bad matches: given your scene, is it reasonable for the disparity to swing so wildly in the lower-middle portion?
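If the pair is rectified and the matches are still noisy, a matcher with stronger smoothness constraints plus speckle filtering often cleans up that kind of wild swing. Here is a sketch using cv2.StereoSGBM_create, reusing imgL and imgR from your script; the parameter values are only generic starting points, not tuned to your scene:

    import cv2

    block = 5
    stereo = cv2.StereoSGBM_create(
        minDisparity=0,
        numDisparities=64,       # must be a multiple of 16
        blockSize=block,
        P1=8 * block * block,    # penalty for small disparity changes
        P2=32 * block * block,   # penalty for large disparity jumps
        disp12MaxDiff=1,         # left-right consistency check
        uniquenessRatio=10,
        speckleWindowSize=100,   # remove disparity blobs smaller than this
        speckleRange=2,
    )
    # SGBM also returns fixed-point disparities scaled by 16.
    disparity = stereo.compute(imgL, imgR).astype('float32') / 16.0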