Feed known translation vector to SolvePnP

I’m not sure exactly what you mean by “2-5% error with SolvePnP’s translation vector”, but the question that comes to mind is: how much does a 2-5% translation error actually affect your attitude value? Is it a 0.01 degree difference, and how does an error of that magnitude propagate to the values you are calculating? Maybe you have done that analysis already and it’s significant enough to be worth addressing. I just know that I’m prone to chasing out every last speck of error, and sometimes I need to zoom back out and look for more fertile ground (for example, image processing improvements to get better feature localization).
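If you want to put a rough number on that before chasing it, here is a sketch (entirely my own; the intrinsics, target geometry, pose, and noise level are made-up placeholders, so substitute your real values) that projects a synthetic target with a known pose, adds pixel noise, and re-runs solvePnP to see how the translation error and the attitude error move together:

```cpp
#include <opencv2/calib3d.hpp>
#include <opencv2/core.hpp>
#include <cstdio>
#include <vector>

int main()
{
    // Made-up intrinsics and a non-planar 8-point target (a box).
    cv::Matx33d K(800, 0, 320,
                  0, 800, 240,
                  0,   0,   1);
    std::vector<cv::Point3d> obj = {
        {-1, -1, 0}, {1, -1, 0}, {1, 1, 0}, {-1, 1, 0},
        {-1, -1, 1}, {1, -1, 1}, {1, 1, 1}, {-1, 1, 1}};
    cv::Vec3d rvecTrue(0.10, -0.20, 0.05), tvecTrue(0.5, -0.3, 10.0);

    std::vector<cv::Point2d> img;
    cv::projectPoints(obj, rvecTrue, tvecTrue, K, cv::noArray(), img);

    cv::RNG rng(12345);
    for (int trial = 0; trial < 5; ++trial) {
        std::vector<cv::Point2d> noisy = img;
        for (auto& p : noisy) {               // ~0.5 px feature noise
            p.x += rng.gaussian(0.5);
            p.y += rng.gaussian(0.5);
        }

        cv::Mat rvec, tvec;
        cv::solvePnP(obj, noisy, K, cv::noArray(), rvec, tvec);

        // Translation error as a percentage of range.
        double tErrPct = cv::norm(tvec, cv::Mat(tvecTrue), cv::NORM_L2)
                         / cv::norm(tvecTrue) * 100.0;

        // Attitude error as the angle of the relative rotation.
        cv::Mat Rt, Re, dr;
        cv::Rodrigues(cv::Mat(rvecTrue), Rt);
        cv::Rodrigues(rvec, Re);
        cv::Rodrigues(Re.t() * Rt, dr);
        double rErrDeg = cv::norm(dr) * 180.0 / CV_PI;

        std::printf("trial %d: tvec err %.3f%%, attitude err %.4f deg\n",
                    trial, tErrPct, rErrDeg);
    }
    return 0;
}
```

If the attitude error stays well below what your application cares about at realistic noise levels, the 2-5% translation error may not be worth fixing.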

Having said all that, I’d probably be trying to do the same thing if I already had high-quality data for the translation vector. (Again, be mindful of any offset between the GPS-derived tvec and your camera’s nodal point, and account for it if you can.)
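For what it’s worth, the usual way to account for that offset is a lever-arm correction. The sketch below is purely illustrative: it assumes you know the nodal-point offset in the vehicle/body frame and have a body-to-world rotation from an AHRS or similar (both are my assumptions, not something from your setup):

```cpp
#include <opencv2/core.hpp>

// Camera centre in world coordinates from the GPS antenna position
// plus the lever arm to the camera nodal point. All frames and the
// function name are assumptions for illustration only.
cv::Vec3d cameraCentreWorld(const cv::Vec3d& gpsWorld,    // antenna, world frame
                            const cv::Matx33d& Rbw,       // body -> world (e.g. AHRS)
                            const cv::Vec3d& leverBody)   // nodal point - antenna, body frame
{
    return gpsWorld + Rbw * leverBody;
}
```

One wrinkle: OpenCV’s tvec lives in the camera frame (x_cam = R*x_world + t, so t = -R*C for camera centre C in world coordinates), which means converting a world-frame camera centre into a fixed tvec involves the very rotation you are solving for. Depending on your geometry you may need to iterate, or parameterize around the camera centre instead.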

I think you could probably do what you want by looking at calib3d/src/calibration.cpp. In my version the good stuff starts at line 1145:

// nparams = 6 (3 rotation + 3 translation), nerrs = count*2 (an x and y residual per point)
CvLevMarq solver( 6, count*2, cvTermCriteria(CV_TERMCRIT_EPS+CV_TERMCRIT_ITER,max_iter,FLT_EPSILON), true);

I think you could change the 6 to 3 (the number of parameters to solve for), make the _param mat 3x1 as well, and then call cvProjectPoints2 with the tvec that you pass in (plus whatever other bookkeeping changes are necessary).

If you are willing to do that (it might be a little tedious to track down how the parameters get shadowed), you should be able to get what you want.
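If patching calibration.cpp feels too invasive, the same idea can be sketched at the API level instead: hold the translation fixed and minimize reprojection error over just the three rotation parameters. The helper below is my own illustration, a plain Gauss-Newton loop with a finite-difference Jacobian rather than the library’s CvLevMarq, so treat the function name, step size, and tolerances as placeholders:

```cpp
#include <opencv2/calib3d.hpp>
#include <opencv2/core.hpp>
#include <vector>

// Rotation-only PnP refinement with a fixed translation.
cv::Vec3d refineRotationFixedT(const std::vector<cv::Point3d>& obj,
                               const std::vector<cv::Point2d>& img,
                               const cv::Matx33d& K,
                               const cv::Vec3d& tvecFixed,
                               cv::Vec3d rvec,            // initial guess
                               int maxIter = 30)
{
    const int n = static_cast<int>(obj.size());

    // Stacked reprojection residuals (2n x 1) for a given rotation.
    auto residual = [&](const cv::Vec3d& r) {
        std::vector<cv::Point2d> proj;
        cv::projectPoints(obj, r, tvecFixed, K, cv::noArray(), proj);
        cv::Mat e(2 * n, 1, CV_64F);
        for (int i = 0; i < n; ++i) {
            e.at<double>(2 * i)     = proj[i].x - img[i].x;
            e.at<double>(2 * i + 1) = proj[i].y - img[i].y;
        }
        return e;
    };

    for (int it = 0; it < maxIter; ++it) {
        cv::Mat e = residual(rvec);

        // Numeric Jacobian (2n x 3) over the 3 rotation parameters only;
        // this is the "change the 6 to 3" idea done outside the library.
        cv::Mat J(2 * n, 3, CV_64F);
        const double h = 1e-6;
        for (int k = 0; k < 3; ++k) {
            cv::Vec3d rp = rvec, rm = rvec;
            rp[k] += h;
            rm[k] -= h;
            cv::Mat d = (residual(rp) - residual(rm)) / (2.0 * h);
            d.copyTo(J.col(k));
        }

        // Gauss-Newton step: least-squares solution of J * dr = -e.
        cv::Mat dr;
        cv::solve(J, -e, dr, cv::DECOMP_SVD);
        rvec += cv::Vec3d(dr.at<double>(0), dr.at<double>(1), dr.at<double>(2));
        if (cv::norm(dr) < 1e-10)
            break;
    }
    return rvec;
}
```

If the numeric Jacobian is too slow, cv::projectPoints (cvProjectPoints2 in the C API) can return the analytic Jacobian, and its first three columns are exactly the derivatives with respect to the rotation vector, which is what the library’s own solver uses.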

Good luck and let us know your results.
