Hello,
I am aiming to measure objects within a plane.
So starting with this:
I am able to transform it to this:
(For testing purposes I used an (arbitrary) quadrilateral with known corner points.)
So now I am able to measure angles and distances in this plane. (At least the process of measuring is simplified; I guess one could also measure in the original image with enough effort and maths.)
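One way to avoid measuring in the warped image at all is to map individual point coordinates through the homography and measure on the resulting float coordinates. Here is a minimal NumPy sketch of that idea; the matrix `H` below is a made-up placeholder (in OpenCV you would typically get it from `cv.getPerspectiveTransform` or `cv.findHomography`, and `cv.perspectiveTransform` does the same point mapping):

```python
import numpy as np

def apply_homography(H, pts):
    """Map Nx2 pixel coordinates through a 3x3 homography; returns float coords."""
    pts = np.asarray(pts, dtype=float)
    homog = np.hstack([pts, np.ones((len(pts), 1))])  # to homogeneous coordinates
    mapped = homog @ H.T
    return mapped[:, :2] / mapped[:, 2:3]             # perspective divide by w

# Hypothetical homography (placeholder values, not from a real calibration):
H = np.array([[1.2,  0.1,  -30.0],
              [0.05, 1.1,  -10.0],
              [1e-4, 2e-4,   1.0]])

a, b = apply_homography(H, [[102, 506], [130, 520]])
length = np.linalg.norm(a - b)  # distance in rectified-plane units, sub-pixel
```

Since no image pixels are resampled here, interpolation plays no role in the measurement itself; only the accuracy of the homography matters.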
Back to my topic: the “cv.warpPerspective” function interpolates pixel values. (Strictly speaking it works backwards: for each integer pixel of the output image it applies the inverse homography, which yields non-integer (float) coordinates in the source image. So e.g. output pixel (102, 506) maps back to (300.5, 736.9), and the value there has to be interpolated from the surrounding “neighbors”.)
Now my question(s):
(1) Since the picture gets “stretched” more in some regions than in others, I am afraid that this interpolation introduces a measurement error?
(2) Does anyone have experience with how this influences the measuring?
(3) Does it introduce errors at all?
(4) And which interpolation method would be the most exact/preferred and why?
(5) Which resolution should I choose for the final picture?
My guess would be that the maximum error is one “square” pixel-region, because that's the “uncertainty” of the warping: (123, 234) → (100.6, 300.2) could land at 100 or 101 and 300 or 301. Might that be true?
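That guess can be checked numerically for the rounding part alone: if an exact warped coordinate is snapped to the nearest pixel center, each coordinate is off by at most 0.5 px, so a point is off by at most √(0.5² + 0.5²) ≈ 0.707 px. A quick sketch (this models only grid quantization, not interpolation blur):

```python
import numpy as np

rng = np.random.default_rng(0)
exact = rng.uniform(0, 1000, size=(10_000, 2))  # exact (float) warped coordinates
rounded = np.round(exact)                        # where the pixel grid puts them
err = np.linalg.norm(exact - rounded, axis=1)    # per-point quantization error
err.max()  # bounded by sqrt(0.5) ≈ 0.707 px
```

Note a distance between two points combines two such point errors, so in the worst case it can be off by roughly twice that, still on the order of one pixel in the *output* image; how many millimetres that corresponds to depends on the output resolution you chose, which is presumably why question (5) matters.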