I am trying to calculate the disparity (pixel difference) between the left and right images. Assume the images are rectified (row aligned) and the principal point is known, denoted cx, cy (note that the principal point is different for the left and right cameras).
Say a pixel x_l in the left image has a corresponding pixel x_r in the right image. If we use an image coordinate system whose origin is at (cx, cy), then the disparity of the two pixels is simply: disp = x_l - x_r
However, if the image coordinate system has its origin at the top left of the image (i.e. as defined in OpenCV), do I need to account for this offset (cx) when calculating the disparity? Or can I still use disp = x_l - x_r to get the correct disparity?
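To make the question concrete, here is a minimal sketch of the two candidates I am deciding between. The function names and the numeric values are made up for illustration; the second candidate is my assumption of what "considering the offset" would look like, not a confirmed answer.

```python
# u_l, u_r: pixel column coordinates with the origin at the top-left corner
# (OpenCV convention); cx_l, cx_r: principal point x-coordinates of the
# left and right cameras.

def disparity_ignoring_offsets(u_l, u_r):
    # Candidate 1: plain pixel difference in top-left-origin coordinates.
    return u_l - u_r

def disparity_with_offsets(u_l, u_r, cx_l, cx_r):
    # Candidate 2: shift each coordinate into a principal-point-centered
    # frame first, i.e. (u_l - cx_l) - (u_r - cx_r).
    return (u_l - cx_l) - (u_r - cx_r)

# Example values (made up for illustration):
u_l, u_r = 420.0, 395.0
cx_l, cx_r = 320.5, 318.0
print(disparity_ignoring_offsets(u_l, u_r))          # 25.0
print(disparity_with_offsets(u_l, u_r, cx_l, cx_r))  # 22.5
```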
It may seem like a silly question, but I am genuinely confused. Thanks.