So I have a robot with a camera on it. The camera is tilted downwards towards the ground by 5 degrees. Using two consecutive frames, I want to calculate the robot's motion as follows.
I use the SIFT feature extractor implemented in OpenCV to get features, then match them with FLANN. After that I use findEssentialMat() to calculate the rotation and translation.
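For reference, here is roughly what that part of my pipeline looks like (a minimal sketch: the file names and the intrinsic matrix K are placeholders, and I decompose the essential matrix with recoverPose()):

```python
import cv2
import numpy as np

# Two consecutive frames (placeholder file names)
img1 = cv2.imread("frame1.png", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("frame2.png", cv2.IMREAD_GRAYSCALE)

# SIFT features
sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)

# FLANN matching with a KD-tree index and Lowe's ratio test
flann = cv2.FlannBasedMatcher({"algorithm": 1, "trees": 5}, {"checks": 50})
matches = flann.knnMatch(des1, des2, k=2)
good = [m[0] for m in matches
        if len(m) == 2 and m[0].distance < 0.7 * m[1].distance]

pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
pts2 = np.float32([kp2[m.trainIdx].pt for m in good])

# Camera intrinsics (placeholder values, mine come from calibration)
K = np.array([[700.0,   0.0, 320.0],
              [  0.0, 700.0, 240.0],
              [  0.0,   0.0,   1.0]])

# Essential matrix, then rotation and translation
E, mask = cv2.findEssentialMat(pts1, pts2, K,
                               method=cv2.RANSAC, prob=0.999, threshold=1.0)
_, R, t, mask = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
# Note: t from recoverPose() is only defined up to scale.
```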
My question now is: do I need to account for the tilt of the camera to get the correct translation, since the distance between two features is not correct with a tilted camera?
Like in this picture:
In my mind, I need to calculate a = b / cos(tilt) to get the real distance my robot traveled in the x-direction (written out in the snippet below). Movement in y and z, as well as the rotation, should not be affected, right?
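Written out as code, this is what I mean (b = 1.0 here is just a placeholder for whatever distance I read off the estimated translation):

```python
import numpy as np

tilt = np.deg2rad(5.0)   # downward tilt of the camera
b = 1.0                  # placeholder for the distance from the estimated translation
a = b / np.cos(tilt)     # my proposed correction for the x-direction
print(a)                 # ~1.0038, so at 5 degrees the correction factor is tiny
```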
Is b the translation vector OpenCV calculates?
Or how do I get the correct translation?
Or am I overthinking this and misunderstanding what the essential matrix actually calculates?
Is there a parameter in the OpenCV functions that I need to set?
Maybe I am missing the forest for the trees.
Thanks for the help!