I stitched two images together in scans mode. To better understand the process, I based my work on stitching_detailed.cpp. My question is how to map points from the original images to the stitched image, and vice versa. Additional information for reproducing the issue is provided below.
I pass the following arguments when I run stitching_detailed.cpp (the configuration is intended to simulate scans mode, since the default, as I understand it, is panorama).
With the program below, I map points between the original images and the stitched image.
cv::Point2f p(300, 300); // an example point in the stitched image
cv::Mat K;
cameras[0].K().convertTo(K, CV_32F); // intrinsics of the first image
cv::Point2f p2 = warper->warpPointBackward(p, K, cameras[0].R); // map the point back to the first input image
p2 = cv::Point2f(p2.x - corners[0].x, p2.y - corners[0].y); // compensate for the warped image's top-left corner
However, the result is not what I expected: the points do not correspond between the original image and the stitched image. These images are attached below for your review.
Your command-line output says "affine" everywhere, not "rotation".
I would expect the values it calculates to require affine warping, since that is what it was told to use.
Look at the switch above that. The pointer's static type is one thing, the "interface"; what it actually points to (its dynamic type) matters more.
I have never touched this module and am just rubber-ducking, so I could be very wrong here. Ultimately it's a good idea to make your issue reproducible, i.e. post a minimal reproducible example.
Thanks, but I am confused: this derived class contains no H member, so how can we pass H to the function? It appears that cameras[i].R plays the role of H in the code.
Since I am new to the forum, I cannot attach multiple images to reproduce the issue.
I have provided a simple example of the warping issue. The main problem seems to occur when I use the warpPoint function (possibly incorrectly). The code is available at the following link