Stereo: getting the 3D position in the left camera frame


I’m working on a stereo setup.
I use getOptimalNewCameraMatrix followed by stereoRectify and initUndistortRectifyMap to get my undistortion/rectification maps.

Then, on the undistorted images, I match some features and compute the disparity manually.
So for each feature I have the pixel coordinates (x,y) in the left rectified/undistorted image and the disparity.

Now the question is: how do I get the 3D position in the original left camera frame?

So far, this is what I have found:
cv::perspectiveTransform(kpts_xleft_yleft_disparity, results3D, Q);
where kpts_xleft_yleft_disparity is a vector of Point3d containing the pixel coordinates (x,y) in the left rectified image and the disparity, and Q is the 4×4 disparity-to-depth mapping matrix from stereoRectify.
The question is: in which coordinate system is the result expressed? That of the original left camera? Of the rectified/undistorted left camera? Or something else?

triangulatePoints(left_proj_mtx, right_proj_mtx, pixel_coordinates_of_feature_in_left_img, pixel_coordinates_of_feature_in_right_img, results4D);

However, the OpenCV 3.4 documentation for triangulatePoints says: "If the projection matrices from stereoRectify are used, then the returned points are represented in the first camera's rectified coordinate system." So, because my projection matrices come from stereoRectify, I end up with coordinates in the wrong coordinate system (the rectified left camera frame instead of the original left camera frame). Is there a way to recover the coordinates in the original camera frame from this? (NB: in the OpenCV 4.2 docs the warning that the result is in the rectified frame is no longer present.)

Any other ideas?

Thanks a lot in advance