
I've been going back and forth between my OpenCV and OpenGL components and I'm not sure which of the two should correct for this.

Using OpenCV camera calibration yields fx and fy with an aspect ratio of approximately 1, which would correspond to a square image. My calibration output:

...
image_width: 640
image_height: 480
board_width: 6
board_height: 9
square_size: 5.
flags: 0
camera_matrix: !!opencv-matrix
   rows: 3
   cols: 3
   dt: d
   data: [ 6.6244874649105122e+02, 0., 3.4060477796553954e+02, 
        0., 6.4821741696313484e+02, 2.5815418044786418e+02, 
        0., 0., 1. ]
distortion_coefficients: !!opencv-matrix
   rows: 5
   cols: 1
   dt: d
   data: [ -1.1832005538154940e-01, 1.2254891816651683e+00, 3.8133468645674677e-02, 1.3073747832019200e-02, -3.9497162757084490e+00 ]

However, my images are 640x480, as you can see in the calibration output.

When I detect the checkerboard in the frame and redraw it with OpenGL, what OpenGL renders is correct in height but stretched in the width direction, so it doesn't line up with the checkerboard where it really is. Multiplying the fx component of the calibration by 480/640 certainly fixes it, but I don't know if that is the way to go.

How do I correct for this? Scale the calibration values with the image size, or do something in OpenGL to fix it?

Edit

There is the distinction between capturing and displaying. I capture images with a smartphone that was calibrated, and that smartphone spits out images of 640x480.

For finding the chessboard I use the intrinsics as shown above, that were calibrated.

Then, no matter which aspect ratio I give to OpenGL (fx/fy, 640/480, or fx/fy * 640/480), it is wrong: the chessboard that OpenGL projects back is stretched in the width direction.

The only way it looks exactly right in OpenGL is if I use fx=640, fy=480 for finding the chessboard, and that is wrong as well because then I am completely ignoring the camera intrinsics.

Edit2

I mean no matter how I set the aspect ratio that I pass to gluPerspective, it doesn't come out right.

gluPerspective( /* field of view in degree */ m_fovy,
   /* aspect ratio */ 1.0/m_aspect_ratio,
   /* Z near */ 1.0, /* Z far */ 1000.0);

What I've tried for values of m_aspect_ratio:

  • Output aspect ratio of OpenCV's calibrationMatrixValues
  • fx/fy
  • 640/480
  • fx/fy * 640/480
  • output of calibrationMatrixValues * 640/480

All seem to botch the width. Note that the origin of the chessboard in my screenshot is the topmost inner corner in the image: it is placed correctly, and so is the bottommost inner corner. It's a scaling problem.

Edit3

It was something really, really stupid: I was setting the aspect ratio for OpenGL like so:

gluPerspective( /* field of view in degree */ m_fovy,
    /* aspect ratio */ 1.0/m_aspect_ratio,
    /* Z near */ 1.0, /* Z far */ 1000.0);

and setting

m_aspect_ratio = viewportpixelheight / viewportpixelwidth;

not realizing that viewportpixelheight and viewportpixelwidth are integers, so I was doing integer division, which yielded either 0 or (when swapping them) 1.


2 Answers


The camera calibration matrix maps world coordinates to image coordinates. OpenGL maps image coordinates to screen coordinates. If your problem is in how the image is being displayed you should handle it in OpenGL.

I'm sorry for the confusion. What I was trying to do is make a distinction between capturing and displaying the image. You are correct that the aspect ratio of the physical camera can be calculated from the focal lengths. Those focal lengths, and that aspect ratio, are fixed by the hardware of the camera.

Once you have captured the image, though, you are free to display it at any aspect ratio you choose with OpenGL. You can crop and stretch the image all you like to change the aspect ratio of what is displayed; it is not a given that the aspect ratio of the camera matches the aspect ratio of your screen.

OpenCV calculates the camera calibration matrix from direct raw measurements of the physical camera. If we assume those are correct and constant (both of which seem reasonable if there is no zoom), then any further changes to aspect ratio are the responsibility of OpenGL. When I said fx and fy do not determine the aspect ratio, I was referring to the displayed aspect ratio, which was not very clear at all; I'm sorry.

Also, I should mention that the reason you can calculate the aspect ratio from the focal lengths is that focal length is expressed in units of pixels, and those units can be different on the x and y axes. A brief explanation can be found here

The best explanation I have found of focal lengths in the camera matrix is in the section on Camera intrinsics of Computer Vision Algorithms and Applications. In the pdf it starts on page 72, page 50 in the book.


An aspect ratio of 1.0 doesn't indicate a square image, it indicates square pixels (which explains why it's almost always 1.0; non-square pixels are relatively rare except in cameras that capture in anamorphic format). The camera matrix contains no information about absolute image dimensions or physical camera dimensions.

On the other hand, gluPerspective's aspect ratio does represent image dimensions. It's important not to confuse one aspect ratio with the other.

If you want to use gluPerspective, it is possible, but you should understand that gluPerspective doesn't let you model all of the intrinsic camera parameters (namely axis skew and principal point offset). I describe how to set the aspect ratio and fovy correctly in this article.

However, I strongly recommend using either glFrustum (which allows a non-zero principal point offset) or glLoadMatrix directly (which also allows non-zero axis skew). Both approaches are explained in this article.