How is it possible for OpenCV to do Epipolar Geometry calculations without the focal length of a camera?

Hi there OpenCV community.

The more I learn about epipolar geometry, the more confused I get about this.

In epipolar geometry you need to know the distance ‘f’ in metric units to be able to obtain the 3D coordinates of the detected pattern points, triangulate features in stereo vision, etc.

Now, how is this possible if all OpenCV calculates is an fx and fy in pixels, and we are not providing the sensor size (given by manufacturers), nor the manufacturer's focal length (in metric units)?

Some help would be highly welcome.


Could someone throw some light here? :slight_smile:

Quoting the great @spmallick, CEO of OpenCV:

  1. Intrinsic parameters of the camera. As mentioned before, in this problem the camera is assumed to be calibrated. In other words, you need to know the focal length of the camera, the optical center in the image and the radial distortion parameters. So you need to calibrate your camera. Of course, for the lazy dudes and dudettes among us, this is too much work. Can I supply a hack? Of course, I can! We are already in approximation land by not using an accurate 3D model. We can approximate the optical center by the center of the image, approximate the focal length by the width of the image in pixels and assume that radial distortion does not exist. Boom! You did not even have to get up from your couch!

https://learnopencv.com/head-pose-estimation-using-opencv-and-dlib/
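For what it's worth, here is a minimal sketch of that hack in Python/NumPy. The 1280x720 frame size is a made-up placeholder, not calibrated data:

```python
import numpy as np

# Hypothetical frame size; substitute your camera's resolution.
width, height = 1280, 720

# The quoted hack: focal length ~ image width (in pixels),
# optical center ~ image center, zero radial distortion.
fx = fy = float(width)
cx, cy = width / 2.0, height / 2.0

K = np.array([[fx, 0.0, cx],
              [0.0, fy, cy],
              [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)  # assume no lens distortion
print(K)
```

Note the approximation uses the full image width for the focal length, not half of it; only the optical center is placed at half the width and height.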

If I remember correctly this can be simplified to:
Fx = Fy = Frame_width
Cx = Frame_width/2
Cy = Frame_height/2

Correct me if I’m mistaken.

Doing that will increase the error :frowning:

Is this the only way to go?

Besides providing the manufacturer's focal length (and trusting it)?

Thanks @Yochanan_Scharf

EDIT: Doing that also does not provide metric units for epipolar geometry calculations.

The proper way is to do camera calibration, which is not easy but is well documented.
https://learnopencv.com/camera-calibration-using-opencv/
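As a rough illustration of what that tutorial walks through, here is a hedged OpenCV sketch; the board size, square size, and image folder are assumptions you would replace with your own:

```python
import glob
import cv2
import numpy as np

# Assumed checkerboard: 9x6 inner corners, 24 mm squares.
pattern_size = (9, 6)
square_size = 0.024  # square edge in metres; adjust to your print

# 3D corner positions on the board plane (Z = 0), scaled to metres.
objp = np.zeros((pattern_size[0] * pattern_size[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern_size[0], 0:pattern_size[1]].T.reshape(-1, 2)
objp *= square_size

obj_points, img_points = [], []
for fname in glob.glob("calib_images/*.png"):  # hypothetical image folder
    gray = cv2.imread(fname, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern_size)
    if found:
        obj_points.append(objp)
        img_points.append(corners)

# Needs at least one successful detection for `gray` to be defined.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
print("RMS reprojection error:", rms)
print("fx, fy (pixels):", K[0, 0], K[1, 1])
```

Note that fx and fy come out in pixels regardless of the unit of square_size; that unit only scales the extrinsic translations.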

In the past I have used a desktop application by BoofCV.
Tutorial Camera Calibration - BoofCV

The tutorial will guide you step by step.
It involves:

  1. Downloading their app.
  2. Printing a checkerboard and sticking it on a hard surface, which is fun.
  3. Following the calibration process.

Remember: this has to be done only once (for each camera).
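Since it only has to be done once, it is worth persisting the result. A small sketch, with placeholder values standing in for a real calibration and a hypothetical file name:

```python
import cv2
import numpy as np

# Placeholder intrinsics standing in for a real calibration result.
K = np.array([[800.0, 0.0, 640.0],
              [0.0, 800.0, 360.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros((1, 5))

# Persist the one-time calibration (hypothetical file name).
fs = cv2.FileStorage("camera_calib.yml", cv2.FILE_STORAGE_WRITE)
fs.write("K", K)
fs.write("dist", dist)
fs.release()

# Later sessions reload it instead of recalibrating.
fs = cv2.FileStorage("camera_calib.yml", cv2.FILE_STORAGE_READ)
K = fs.getNode("K").mat()
dist = fs.getNode("dist").mat()
fs.release()
```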

Thank you.

My post already mentions this:

Now, how is this possible if all OpenCV calculates is an fx and fy in pixels, and we are not providing the sensor size (given by manufacturers), nor the manufacturer's focal length (in metric units)?

I.e., I am aware of the calibration process and of each term obtained from it. Which brings me back to the same point: fx and fy are not in metric units after calibration, so how does OpenCV get the focal length in metric units needed for epipolar geometry calculations?

@DoDoM You can have a look at my reply to the other thread where you asked the same question: Disparity calculations and camera centers - #4 by lpea. To summarize, only the ratio between the focal length and the pixel size is important, therefore it is sufficient to estimate the focal length “in pixels”.
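To make that ratio argument concrete, here is a small numeric sketch; the focal length, pixel pitch, baseline, and disparity are all made-up values:

```python
# Only the ratio f / pixel_size matters.
f_mm = 4.0             # hypothetical manufacturer focal length
pixel_size_mm = 0.002  # hypothetical 2 micrometre pixel pitch
fx = f_mm / pixel_size_mm  # = 2000 "pixels": what calibration estimates directly

# Stereo triangulation: Z = fx * B / disparity.
# fx is in pixels and the disparity is in pixels, so they cancel;
# Z inherits the unit of the baseline B.
baseline_m = 0.12      # hypothetical baseline in metres
disparity_px = 48.0
Z = fx * baseline_m / disparity_px
print(Z)  # 5.0 -> metres, without knowing f or the pixel size separately
```

The pixel unit cancels against the disparity, so the depth comes out in the unit of the baseline; no metric focal length is ever needed.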

Thanks a lot @lpea

Nevertheless, I'm asking about a monocular camera. The goal is to draw epilines (thus variable Z). And my main concern was which f to use, since I need to know the camera-to-image-plane distance.

Would it be possible to apply the same principle to obtain the side lengths and angles of the triangle formed by the camera, the image plane, and an image pixel (x, y)? Using fx for the pixel x coordinate and fy for the y coordinate, and then combining the angles via the hypotenuses?
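If it helps, here is how that would look per axis, with hypothetical intrinsics; note the two per-axis angles are not simply summed, since forming the back-projected ray direction combines them in one step:

```python
import numpy as np

# Hypothetical calibrated intrinsics (pixels) and a pixel of interest.
fx, fy, cx, cy = 1000.0, 1005.0, 640.0, 360.0
u, v = 900.0, 500.0

# Angle of the viewing ray w.r.t. the optical axis, per axis:
angle_x = np.arctan((u - cx) / fx)   # fx pairs with the x coordinate
angle_y = np.arctan((v - cy) / fy)   # fy pairs with the y coordinate

# Equivalently, the full back-projected ray direction (up to scale):
ray = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])
ray /= np.linalg.norm(ray)
print(np.degrees(angle_x), np.degrees(angle_y), ray)
```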

Getting closer, thanks a lot!

where did you find that idea, even? it's all in pixels, to my knowledge.

@berak thank you so much.

I am not sure from where, but let's sum up:

OpenCV uses fx with a pixel's x coordinate on the image and the optical center cx.

All of them in pixel units, and it does the same with fy, y, and cy in order to do the calculations, like for example minimizing the reprojection errors of the different solvePnP methods.
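In case a concrete form helps, this is my understanding of that pairing written out (hypothetical numbers, distortion ignored):

```python
import numpy as np

# Hypothetical intrinsics and a 3D point in the camera frame.
fx, fy, cx, cy = 1000.0, 1005.0, 640.0, 360.0
X, Y, Z = 0.3, -0.1, 2.0

# Pinhole projection as OpenCV's model uses it (no distortion):
# first divide by depth (this is the image plane at distance 1),
# then scale by fx, fy and shift by cx, cy into pixel coordinates.
x_n, y_n = X / Z, Y / Z   # normalized image coordinates, f = 1
u = fx * x_n + cx         # fx pairs with x and cx
v = fy * y_n + cy         # fy pairs with y and cy
print(u, v)
```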

At which distance from the camera origin is OpenCV placing the image plane? At a distance fx (in pixels? that would be confusing) for the calculations on the x axis, and at a distance fy for the calculations on the y axis?

I am also finding other sources that mention normalizing it to a distance f = 1.
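That matches OpenCV's own normalized coordinates: cv2.undistortPoints, when called without a P matrix, maps pixels onto exactly that f = 1 plane. A sketch with hypothetical intrinsics:

```python
import cv2
import numpy as np

K = np.array([[1000.0, 0.0, 640.0],
              [0.0, 1005.0, 360.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)

pts = np.array([[[900.0, 500.0]]])  # pixel coordinates, shape (N, 1, 2)
# With no `P` argument, undistortPoints returns *normalized* coordinates,
# i.e. pixel coordinates mapped onto the plane at distance f = 1.
norm = cv2.undistortPoints(pts, K, dist)
print(norm)  # [[(900-640)/1000, (500-360)/1005]] = [[0.26, 0.1393...]]
```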