Hello,

Any suggestions would be appreciated.

You can calculate the reprojection error using the estimated pose of the object. Simply use `projectPoints()` and calculate the RMS error between the projected points and the points used in pose estimation. A good pixel error is usually < 1 px, but it depends on many factors.

Thanks, Filip. I already tried the heart button, but I still want to say thanks.

Hello Filip.

I found a few examples that generate random 3D points for use with `projectPoints()`.

`projectPoints()` gives a set of 2D points. How can we calculate the error between these points and the points used in pose estimation?

You can compare the two sets of 2D projections by calculating the RMS error. The per-point error can be defined as the distance between the two corresponding points. The C++ code would look like this:

```
float rmsError = 0;
for (size_t i = 0; i < projPts.size(); i++)
{
    float distError = distance(projPts[i], calcPts[i]);
    rmsError += distError * distError; // square of the distance error
}
rmsError /= projPts.size(); // mean of the squared errors
rmsError = sqrt(rmsError);  // root of the mean
```

Here `projPts` is the set of points used for pose estimation, `calcPts` is the set of points calculated from the estimated pose, and `distance()` is a custom function that calculates the distance between two points.

Let me add to @FilibBaas' solution:

`distance()` could be `double d = cv::norm(p1 - p2); // built-in L2 distance`

Note that it unfortunately does *not* work with `cv::norm(p1, p2);`

Thanks very much, first of all.

Let me add some details to my question. Suppose R and t are given by solvePnP from a set of 3D points (Pc) in the camera coordinate system and a set of 2D points that are said to be the projections of those 3D points in another image, e.g. found via keypoint matching.

Then we have to compute the error between the 2D points used for pose estimation and another set of 2D points computed via R, t, and Pw (the 3D points of Pc in the world coordinate system).

What confuses me is simply the relationship between Pw and Pc:

Pc = R Pw + t

We have to know Pw in order to get the first set of 2D points. So my question is actually: how can we get this Pw?

R.inv() Pc - t must not be the right formula. Maybe there is something wrong with my understanding of solvePnP.

Here is the code computing R and t:

slambook/pose_estimation_3d2d.cpp at master · gaoxiang12/slambook · GitHub.

It uses Pc for solvePnP, if I understood it correctly.

Maybe I should ask my question this way:

Is the objectPoints (first parameter) of solvePnP the same as the objectPoints InputArray (first parameter) of projectPoints()?

For the usage of the two functions, see OpenCV: Camera Calibration and 3D Reconstruction.

If it is, I got the difference with:

```
Point2d d0 = vP2dOutput2[i] - vP2dTest[i];
RMS += sqrt(d0.x * d0.x + d0.y * d0.y); // note: accumulates plain distances, so the printed value is a running sum
cout << "2d points projected and distance: " << vP2dOutput2[i] << "|" << vP2dTest[i] << " , " << RMS << endl;
```

and after the loop:

```
cout << " -- RMS: " << RMS / vP2dOutput2.size() << endl;
```

```
2d points projected and distance: [380.2, 175.053]|[370.365, 167.1] , 12.6479
2d points projected and distance: [345.136, 222.826]|[309.618, 204.208] , 52.7496
2d points projected and distance: [349.549, 179.601]|[386.705, 196.455] , 93.5488
2d points projected and distance: [84.9355, 354.452]|[52.2203, 375.842] , 132.636
2d points projected and distance: [349.162, 215.112]|[312.49, 191.881] , 176.046
2d points projected and distance: [214.79, 253.643]|[230.302, 276.441] , 203.622
2d points projected and distance: [381.356, 173.009]|[372.559, 164.99] , 215.526
2d points projected and distance: [352.36, 213.28]|[323.445, 199.333] , 247.629
2d points projected and distance: [352.302, 210.801]|[318.484, 189.511] , 287.591
2d points projected and distance: [379.568, 178.538]|[379.988, 175.167] , 290.987
2d points projected and distance: [348.936, 180.147]|[385.069, 197.167] , 330.928
2d points projected and distance: [249.769, 237.273]|[317.253, 298.58] , 422.102
2d points projected and distance: [339.088, 225.744]|[283.822, 197.415] , 484.205
2d points projected and distance: [333.557, 201.172]|[353.182, 211.721] , 506.485
2d points projected and distance: [299.258, 206.846]|[352.271, 238.927] , 568.449
2d points projected and distance: [296.908, 206.954]|[340.367, 235.93] , 620.682
2d points projected and distance: [326.667, 211.693]|[346.301, 227.018] , 645.589
2d points projected and distance: [376.644, 185.988]|[381.575, 188.825] , 651.278
2d points projected and distance: [347.921, 181.534]|[384.583, 197.653] , 691.327
2d points projected and distance: [338.236, 226.869]|[289.758, 197.396] , 748.061
2d points projected and distance: [257.187, 253.406]|[262.875, 268.743] , 764.418
2d points projected and distance: [260.5, 247.457]|[268.813, 261.587] , 780.813
2d points projected and distance: [339.063, 225.934]|[289.758, 197.396] , 837.781
 -- RMS: 36.4253
```

(the last value is the accumulated sum divided by the number of points)

So this is a very bad result.

Maybe I should ask my question this way:

Is the objectPoints (first parameter) of solvePnP the same as the objectPoints InputArray (first parameter) of projectPoints()?

Yes, `objectPoints` are the object points relative to any part of the object. There is no need to recalculate them into camera coordinates! In `solvePnP()` you use `objectPoints` and the result is the object's pose relative to the camera.

In `projectPoints()` you use the same set of `objectPoints` together with `rvec` and `tvec` (the estimated pose of the object). You can understand it as the inverse function of `solvePnP()`.

If the reprojection error is high, check whether the object points are defined correctly (measure them) and also make sure you have a good intrinsic calibration of the camera.

This makes sense to me. Thanks, Filip.