I'm trying to compute a disparity map in C++ with OpenCV 3.1. I use the StereoSGBM algorithm and need to recognize both far and very close objects, so I set MinDisparity to -16 and MaxDisparity to 160.
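For reference, here is a minimal sketch of the kind of setup I'm describing (the input file names and the blockSize of 5 are placeholders; all other parameters are left at their defaults):

```cpp
#include <opencv2/core.hpp>
#include <opencv2/calib3d.hpp>
#include <opencv2/imgcodecs.hpp>

int main()
{
    // Rectified stereo pair (file names are placeholders).
    cv::Mat left  = cv::imread("left.png",  cv::IMREAD_GRAYSCALE);
    cv::Mat right = cv::imread("right.png", cv::IMREAD_GRAYSCALE);

    // minDisparity = -16, numDisparities = 160 (must be a multiple of 16),
    // blockSize = 5; the remaining SGBM parameters keep their defaults.
    cv::Ptr<cv::StereoSGBM> sgbm = cv::StereoSGBM::create(-16, 160, 5);

    cv::Mat disparity;  // CV_16S, fixed-point with 4 fractional bits
    sgbm->compute(left, right, disparity);

    // Scale to 8-bit just for visualization.
    cv::Mat disparity8u;
    disparity.convertTo(disparity8u, CV_8U, 255.0 / (160 * 16.0));
    cv::imwrite("disparity.png", disparity8u);
    return 0;
}
```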
The camera is now correctly calibrated, but the resulting disparity map is cut off on the left, and the width of the cut depends on the MaxDisparity setting.
I can understand why this happens for close objects: pixels visible in one image are simply not present in the other. But it also happens with farther objects. In that case the object is fully visible in both camera images, yet it is missing from the resulting disparity map.
Look at this picture. Why is my hand not visible in the result?
Is there any solution to this problem, i.e. a way to compute the disparity map for the whole visible area even with a high MaxDisparity setting?