Hello,
I am trying to obtain an accurate 2D coordinate system from captured images. For now I am working in a simulator, where a perspective vision sensor points perpendicular to the ground.
To that end, I performed camera calibration using a black plane with 12 evenly spaced white circles. I took several images, changing the camera angle while always keeping the plane in the same position in the simulation. An image is presented below.
I built a matrix with the real-world coordinates of the circles and used cv2.HoughCircles() to obtain the circle coordinates in each image. I then matched the detected circles to their corresponding real-world circles.
Finally, I used newcameramtx, roi = cv2.getOptimalNewCameraMatrix(), mapx, mapy = cv2.initUndistortRectifyMap(), and dst = cv2.remap() to obtain the undistorted image.
However, the re-projection error is about 0.38 px, and the original and undistorted images are barely different.
Any idea what I could be doing wrong, or what additional steps I should take?
Thank you