Good day, dear community,
I am performing a stereo calibration of two cameras. As I understand it, stereo calibration essentially estimates the rotation R and translation t between the two cameras; the intrinsic parameters of both cameras are already known. In a first step I capture about 50 images simultaneously with both cameras and then use them for the stereo calibration. My results also roughly make sense: I know the physical distance between the cameras, and it matches the magnitude of the output t.
However, when I, for example, vary the distance between the cameras and take new pictures, I sometimes get an error message such as:
(expected: ‘nimages == (int)imagePoints2.total()’), where
‘nimages’ is 27
must be equal to
‘(int)imagePoints2.total()’ is 50
My guess is that the checkerboard corners are sometimes not detected in the images of one particular camera, so fewer object points/image points are available for that camera. My code is essentially identical to this one: https://stackoverflow.com/questions/28222763/stereo-calibration-opencv-python-and-disparity-map.
My assumption was strengthened when I printed out the number of object points/image points for each camera: the counts differ between the two cameras.
Can anyone give me a tip on how to make sure I get the same number of object points/image points for both cameras? Or does anyone know this problem in general and has a solution? I would be very grateful.