[Camera calibration] goodPerViewErrors and perViewErrors are not identical in the test_cameracalibration code

Hi all,

recently I have been looking into the details of camera calibration. I got a calibration result with the calibrateCamera function, which also returns perViewErrors. To validate this error for one view, which is around 0.1 pixels, I tried to use the projectPoints function to get the distorted image points from the object points, but the pixel difference between the obtained image points and the original image points is around 1~2 pixels, which does not match the perViewError.
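For reference, the comparison I ran can be sketched in plain Python. This is only a toy stand-in, not the real code: it uses a pinhole model with a translation only and no distortion (the real projectPoints applies rotation and the distortion coefficients), and the intrinsics fx, fy, cx, cy and translation t are made-up values for illustration. It shows how per-point offsets relate to the RMS value that perViewErrors reports:

```python
import math

def project_pinhole(obj_pts, fx, fy, cx, cy, t):
    """Toy stand-in for cv2.projectPoints: pinhole model, translation only,
    no rotation, no distortion -- just to illustrate the comparison."""
    pts = []
    for (X, Y, Z) in obj_pts:
        Xc, Yc, Zc = X + t[0], Y + t[1], Z + t[2]  # camera-frame coordinates
        pts.append((fx * Xc / Zc + cx, fy * Yc / Zc + cy))
    return pts

def rms_error(projected, measured):
    """Per-view RMS reprojection error, as in perViewErrors."""
    s = sum((px - mx) ** 2 + (py - my) ** 2
            for (px, py), (mx, my) in zip(projected, measured))
    return math.sqrt(s / len(projected))

obj = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
proj = project_pinhole(obj, 800.0, 800.0, 320.0, 240.0, (0.0, 0.0, 5.0))

# If every measured point is off by 0.1 px in x, the per-view RMS is 0.1 px,
# so an RMS of 0.1 cannot coexist with per-point differences of 1~2 px.
measured = [(x + 0.1, y) for (x, y) in proj]
print(rms_error(proj, measured))
```

One thing worth double-checking when running this kind of comparison against the real calibration output is that the rvec/tvec of a given view are paired with that same view's image points.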

I have reviewed the source code and the calibration test code, and it seems the way we calculate goodPerViewErrors and perViewErrors is the same (both use the cvProjectPoints2Internal function).

The way we calculate goodPerViewErrors is:
Step 1: calculate the reprojected image points with cvProjectPoints2Internal
Step 2: calculate the x, y differences between the points from step 1 and the points from the calibration image
Step 3: sum all the x^2 + y^2 terms
Step 4: calculate sqrt(sum / num_points)

sqrt( (imageMeanDx + imageMeanDy) / (etalonSize.width * etalonSize.height));
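The steps above can be sketched in plain Python (a simplified sketch: the point lists stand in for the cvProjectPoints2Internal output, and the variable names mirror the test code, where imageMeanDx/imageMeanDy appear to accumulate sums of squared residuals despite the "Mean" in the name):

```python
import math

def good_per_view_error(projected, measured):
    """RMS reprojection error for one view, following the four steps above."""
    sum_dx2 = 0.0  # corresponds to imageMeanDx: sum of squared x-residuals
    sum_dy2 = 0.0  # corresponds to imageMeanDy: sum of squared y-residuals
    for (px, py), (mx, my) in zip(projected, measured):
        sum_dx2 += (px - mx) ** 2
        sum_dy2 += (py - my) ** 2
    n = len(projected)  # plays the role of etalonSize.width * etalonSize.height
    return math.sqrt((sum_dx2 + sum_dy2) / n)

projected = [(10.1, 20.0), (30.0, 40.2)]
measured = [(10.0, 20.0), (30.0, 40.0)]
print(good_per_view_error(projected, measured))
```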

The way we calculate perViewErrors, in cvCalibrateCamera2Internal, is:
Step 1: calculate reprojected image points with cvProjectPoints2Internal
Step 2:
cvSub( &_mp, &_mi, &_err);
double viewErr = norm(_err, NORM_L2SQR);
perViewErrors->data.db[i] = std::sqrt(viewErr / ni);
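Mirrored in plain Python, this computation looks as follows (a sketch, with point lists standing in for the _mp and _mi matrices; NORM_L2SQR is the sum of the squared components of the residual vector, so summing dx^2 + dy^2 per point gives the same total):

```python
import math

def per_view_error(projected, measured):
    """Mirror of the quoted cvCalibrateCamera2Internal steps:
    _err = _mp - _mi; viewErr = norm(_err, NORM_L2SQR); sqrt(viewErr / ni)."""
    view_err = 0.0
    for (px, py), (mx, my) in zip(projected, measured):
        view_err += (px - mx) ** 2 + (py - my) ** 2  # norm(_err, NORM_L2SQR)
    ni = len(projected)  # number of points in this view
    return math.sqrt(view_err / ni)
```

Written out this way, the only difference from the test-code formula is notation: accumulating the squared x- and y-residuals separately and then adding them, versus taking the squared L2 norm of the stacked residual vector, is the same sum.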

So can I say there is no difference between the two computations?

I have one issue and one question, as follows:

  1. If the perViewErrors are based on the same projection algorithm, why do we use them again as a reference in the test code?
  2. What is the root cause of the different results?

Thanks so much