stereoCalibrate vs calibrateCamera reprojection error

I’m performing a stereo camera calibration where I first calibrate the left and right cameras individually, and then I perform the stereo calibration with the intrinsic parameters fixed (CALIB_FIX_INTRINSIC).
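
For context, the pipeline looks roughly like this (a simplified sketch; `objpoints`, `imgpointsL`, `imgpointsR`, and `image_size` are placeholders for my detected pattern points and image size):

```python
import cv2 as cv

# Intrinsic calibration of each camera; the "Extended" variant also
# returns the per-view RMS reprojection errors (perViewErrors)
retL, K1, d1, rvecsL, tvecsL, _, _, perViewL = cv.calibrateCameraExtended(
    objpoints, imgpointsL, image_size, None, None)
retR, K2, d2, rvecsR, tvecsR, _, _, perViewR = cv.calibrateCameraExtended(
    objpoints, imgpointsR, image_size, None, None)

# Stereo calibration with both sets of intrinsics kept fixed
retS, K1, d1, K2, d2, R, T, E, F = cv.stereoCalibrate(
    objpoints, imgpointsL, imgpointsR, K1, d1, K2, d2, image_size,
    flags=cv.CALIB_FIX_INTRINSIC)
```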

What I have found is that the reprojection errors returned by the two methods are slightly different, which raised some questions. (I use perViewErrors instead of the global RMS error.)

This is one example of the per-view reprojection errors I get for a calibration:

calibrateCamera() left reprojection error: [0.10 0.22 0.14 0.15 0.21 0.14 0.13 0.16 0.15 0.14 0.19 0.23 0.12]

calibrateCamera() right reprojection error: [0.10 0.24 0.13 0.14 0.22 0.14 0.13 0.18 0.17 0.17 0.25 0.24 0.12]

stereoCalibrate() left reprojection error: [0.14 0.26 0.17 0.23 0.25 0.23 0.18 0.30 0.17 0.18 0.23 0.23 0.13]

stereoCalibrate() right reprojection error: [0.14 0.28 0.15 0.22 0.25 0.23 0.17 0.29 0.18 0.20 0.28 0.24 0.13]

Are the reprojection errors computed the same way by both methods? After some experiments I reached the conclusion that both methods simply reproject the 3D points using the intrinsic parameters of each camera, and the only difference is how the rvecs and tvecs are computed during the optimization: stereoCalibrate() couples the two cameras’ per-view poses through the fixed relative pose (R, T), so it cannot fit each view independently, which would explain the small differences.
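
For reference, this is roughly how I compute a per-view error myself (a minimal sketch; it assumes perViewErrors is the per-view RMS reprojection error, as the OpenCV docs state):

```python
import numpy as np
import cv2 as cv

def per_view_error(objpoints, imgpoints, K, dist, rvecs, tvecs):
    """RMS reprojection error for each pattern view."""
    errors = []
    for obj, img, rvec, tvec in zip(objpoints, imgpoints, rvecs, tvecs):
        proj, _ = cv.projectPoints(obj, rvec, tvec, K, dist)
        residuals = img.reshape(-1, 2) - proj.reshape(-1, 2)
        errors.append(np.sqrt(np.mean(np.sum(residuals**2, axis=1))))
    return np.array(errors)
```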

What I found strange is that if I compute the reprojection error myself, using the rvecs and tvecs obtained with cv.solvePnP(), I obtain exactly the same results as calibrateCamera(), which makes me think that maybe stereoCalibrate() has some other difference I’m not aware of.
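
Concretely, the check looks like this (again a sketch, reusing the helper above):

```python
# Re-estimate each view's pose with solvePnP, then reproject with it
rvecs_pnp, tvecs_pnp = [], []
for obj, img in zip(objpoints, imgpointsL):
    ok, rvec, tvec = cv.solvePnP(obj, img, K1, d1)
    rvecs_pnp.append(rvec)
    tvecs_pnp.append(tvec)

left_err_pnp = per_view_error(objpoints, imgpointsL, K1, d1, rvecs_pnp, tvecs_pnp)
```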

Left reprojection error with solvePnP: [0.10 0.22 0.14 0.15 0.21 0.14 0.13 0.16 0.15 0.14 0.19 0.23 0.12]

Right reprojection error with solvePnP: [0.10 0.24 0.13 0.14 0.22 0.14 0.13 0.18 0.17 0.17 0.25 0.24 0.12]

If the extrinsic parameters do not take part in this computation, is there another metric I can use to assess the quality of the stereo calibration and ensure accurate depth results?

Any help will be appreciated.

Thanks,