Hi,
I am trying to calibrate a StereoLabs ZED 2 camera and I am getting inconsistent results.
ZED cameras come factory-calibrated, and here are the calibration parameters from their SDK:
I am trying to calibrate the camera myself (because I want to do a robot-camera calibration), and my problem is that I cannot get results similar to the ZED SDK calibration.
I have used three checkerboards of three different sizes (link to images of the checkerboards and my setup: IMG_8871.jpg - Google Drive).
Here are the links to the images I took for calibration:
For the small pattern: calibration small pattern - Google Drive
For the medium pattern: calibration medium pattern - Google Drive
For the large pattern: calibration large pattern - Google Drive
The problem is that I get a different intrinsic matrix for each, and none of them is similar to the ZED SDK result.
Small pattern:
array([[1.18941333e+03, 0.00000000e+00, 1.11280623e+03],
       [0.00000000e+00, 1.23577099e+03, 6.53167171e+02],
       [0.00000000e+00, 0.00000000e+00, 1.00000000e+00]])
Medium pattern:
array([[1.07054440e+03, 0.00000000e+00, 1.11565267e+03],
       [0.00000000e+00, 1.07011710e+03, 6.28861026e+02],
       [0.00000000e+00, 0.00000000e+00, 1.00000000e+00]])
Large pattern:
array([[1.01906390e+03, 0.00000000e+00, 1.12143074e+03],
       [0.00000000e+00, 1.01388780e+03, 5.92256064e+02],
       [0.00000000e+00, 0.00000000e+00, 1.00000000e+00]])
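As a sanity check on how different these really are, I convert each fx into a horizontal field of view. This is only a rough sketch: the 2208-pixel image width is my assumption (the 2K left-sensor mode, guessed from the principal points landing near (1104, 621)), and the fx values are copied from the matrices above.

import numpy as np

# Assumed image width (2K left-sensor mode) -- inferred from the principal
# points above sitting near (1104, 621); adjust if your resolution differs.
IMAGE_WIDTH = 2208

# fx values copied from the three calibration results above
fx_values = {"small": 1189.41, "medium": 1070.54, "large": 1019.06}

for name, fx in fx_values.items():
    # Pinhole model: horizontal FOV = 2 * atan((width / 2) / fx)
    hfov = 2 * np.degrees(np.arctan2(IMAGE_WIDTH / 2, fx))
    print("{} pattern: fx = {:.1f} -> horizontal FOV = {:.1f} deg".format(name, fx, hfov))

Even a few degrees of FOV disagreement between three calibrations of the same lens is a large difference.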
In all cases, the reprojection error (the first return value of cv2.calibrateCamera()) is less than 0.6, and the output of the following code is also always less than 0.08.
import cv2

# objpoints: 3D board points, imgpoints: detected corners, gray: a grayscale image
ret, mtx, dist, rvecs, tvecs = cv2.calibrateCamera(
    objpoints, imgpoints, gray.shape[::-1], None, None)

mean_error = 0
for i in range(len(objpoints)):
    # Reproject the board points and measure the average per-corner L2 error
    imgpoints2, _ = cv2.projectPoints(objpoints[i], rvecs[i], tvecs[i], mtx, dist)
    error = cv2.norm(imgpoints[i], imgpoints2, cv2.NORM_L2) / len(imgpoints2)
    mean_error += error
print("total error: {}".format(mean_error / len(objpoints)))
So each function seems to be doing its job well, but the overall result is still inaccurate.
So here are my questions:
- These matrices are actually very different, and I would expect that a well-done calibration gives consistent results. Yes?
- Even though the errors are very low, the calibration is still far from perfect, yes? Does this mean that the calibration algorithm is doing a good job on the images provided, but the images themselves are not enough to make the calibration robust? (Note: the robot has only 4 DOF and can only rotate about its Z axis, so the images I can collect with the robot are limited; that is why I added several images taken without the robot, from very different angles. See the coverage sketch after these questions.)
- What should I do to make the calibration as robust as possible?
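Related to the last two questions, here is how I check whether my views actually cover the whole sensor. It is a rough sketch: imgpoints is the list of detected corner arrays I already pass to cv2.calibrateCamera, and the 2208 x 1242 image size is the same assumption as above.

import numpy as np
import matplotlib.pyplot as plt

# imgpoints: list of (N, 1, 2) corner arrays, as passed to cv2.calibrateCamera
all_corners = np.vstack([pts.reshape(-1, 2) for pts in imgpoints])

plt.figure(figsize=(8, 4.5))
plt.scatter(all_corners[:, 0], all_corners[:, 1], s=2)
plt.xlim(0, 2208)   # assumed image width
plt.ylim(1242, 0)   # assumed image height; flipped to match image coordinates
plt.title("Detected corner coverage over all calibration images")
plt.show()

Regions with no corners (especially near the image borders, which constrain the distortion coefficients the most) would explain unstable intrinsics.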
Note: I have discussed the big picture of what I want to do, which is robot-camera calibration, in this post: Eye-to-hand calibration for a 4 DOF robot arm