How to correct distortion in a stitched dual-camera image before estimating a homography

I am trying to undistort an image produced by stitching the views of two cameras. Unfortunately, I do not have access to the cameras, and I would like to know the best way to correct the image so that the field lines below become straight.

The next step I am trying to do is to compute the homography with respect to the field. The only guidance I have found is this: the transformation between the field plane as seen in the image and the template field plane is a linear (projective) transformation of the plane.


that’s a fisheye lens. you’ll need to work with the fisheye model. OpenCV has a bunch of routines for that.

if you can’t wave a calibration pattern in front of the camera, you’ll have to guess the parameters and see if they produce a sensible result. you can start by picking corresponding points on the field and in the picture. that’s equivalent to calibration (but not nearly enough data for a good calibration).

Thank you very much for the comment, crackwitz. My doubt is this: if the stitching was done from two cameras that each have fisheye distortion, I couldn’t apply fisheye undistortion to the stitched image, because its distortion wouldn’t match the fisheye model exactly, right?

Exactly, you cannot do that. The best practice would be to undistort both images first and then stitch the undistorted images. That way you should get a linearized image with no further undistortion needed.

I have never done that, but it should work.