
I am trying to construct a 3D point cloud and measure real-world sizes and distances of objects using a stereo camera. The cameras are stereo calibrated, and I compute 3D points from the disparity map using the reprojection matrix Q.

My problem is that the calculated sizes change depending on the distance from the cameras. I calculate the distance between two 3D points on the object; it should stay constant, but as the object gets closer to the camera, the measured distance increases.

Am I missing something? The 3D coordinates should be in camera coordinates, not pixel coordinates, so this result seems inaccurate to me. Any idea?
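For context, here is a minimal sketch of the pipeline described above, assuming OpenCV's Python API; the measure_size helper and the block-matcher settings are illustrative placeholders, not the actual code from the question:

```python
import numpy as np
import cv2

def measure_size(left_rect, right_rect, Q, px1, px2):
    """Sketch: 3D distance between two pixels of a rectified stereo pair.

    left_rect, right_rect : rectified grayscale images
    Q                     : 4x4 reprojection matrix from cv2.stereoRectify
    px1, px2              : (u, v) pixel coordinates of the two object points
    """
    # Matcher parameters are placeholders; tune them for your scene.
    matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128, blockSize=5)

    # StereoSGBM returns fixed-point disparities scaled by 16.
    disparity = matcher.compute(left_rect, right_rect).astype(np.float32) / 16.0

    # points_3d[v, u] = (X, Y, Z) in the left camera frame, in the same units
    # as the translation vector used during calibration (e.g. mm).
    points_3d = cv2.reprojectImageTo3D(disparity, Q)

    (u1, v1), (u2, v2) = px1, px2
    p1 = points_3d[v1, u1]
    p2 = points_3d[v2, u2]

    # Euclidean distance between the two reconstructed points.
    return float(np.linalg.norm(p1 - p2))
```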

Comment from alkasm: Welcome to Stack Overflow! Please post the specifics of your question with a minimal, complete, and verifiable example. Others need to be able to reproduce your error exactly, preferably just by copying and pasting the code, but please give just enough code to reproduce the error. Help others help you!

1 Answer


You didn't mention how far apart your cameras are, i.e. the baseline. If the baseline is very small compared with the distance to the point you are measuring, a slight inaccuracy in the disparity measurement can lead to a big difference in the computed distance.

One way to check whether this is the problem is to test with only lateral movement of the camera, so the depth to the object stays the same while its image position changes; see the sketch below.
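To make that concrete: for a rectified pair, depth is Z = f·B/d (focal length in pixels times baseline, divided by disparity), so a disparity error Δd produces a depth error of roughly Z²/(f·B)·Δd, which grows quickly with distance when the baseline B is small. A small sketch with hypothetical numbers (the focal length, baseline, and half-pixel disparity error below are assumed for illustration, not taken from the question):

```python
def depth_error(f_px, baseline_m, depth_m, disparity_error_px=0.5):
    """Approximate depth error: Z = f*B/d, so dZ ~= Z**2 / (f*B) * d_disp."""
    return depth_m ** 2 / (f_px * baseline_m) * disparity_error_px

# Hypothetical calibration values; substitute your own.
f_px = 800.0        # focal length in pixels
baseline_m = 0.06   # 6 cm baseline

for depth_m in (0.5, 1.0, 2.0, 4.0):
    err_cm = depth_error(f_px, baseline_m, depth_m) * 100
    print(f"depth {depth_m:.1f} m -> ~{err_cm:.1f} cm error per 0.5 px disparity error")
```

With these numbers the same half-pixel disparity error costs a few millimetres at 0.5 m but well over 10 cm at 4 m, which is why measured sizes drift as the object's distance changes when the baseline is short relative to the depth.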