Question about cv::calibrateHandEye

I have a camera mounted on a gimbal, and I need to find the rvec & tvec between the camera and the gimbal.

So I rotated the gimbal, took several pictures of a chessboard, and recorded the corresponding IMU data (Euler angles) for each frame. But I can't get the numbers to match the CAD drawing of the gimbal.

Here is the main part of my code.

  1. First I use cv::calibrateCamera to get std::vector<cv::Mat> rvecs & std::vector<cv::Mat> tvecs. (The reprojection error is 0.488197.)
const auto error = calibrateCamera(
    object_points, 
    image_points, 
    images[0].size(), 
    cameraMatrix, 
    distCoeffs, 
    rvecs, 
    tvecs);
  2. Then I convert my IMU data from Euler angles to a rotation matrix with the following code and push the result into std::vector<cv::Mat> r_gripper2base. (I call inverse() on the Quaternionf here because I think the quaternion computed from the IMU rotation might mean "base2gripper"?)
// Half-angles for the yaw (Z), pitch (Y), roll (X) convention
const float cy = cos(yaw * 0.5f);
const float sy = sin(yaw * 0.5f);
const float cp = cos(pitch * 0.5f);
const float sp = sin(pitch * 0.5f);
const float cr = cos(roll * 0.5f);
const float sr = sin(roll * 0.5f);

Eigen::Quaternionf quaternion(
    cy * cp * cr + sy * sp * sr,   // w
    cy * cp * sr - sy * sp * cr,   // x
    sy * cp * sr + cy * sp * cr,   // y
    sy * cp * cr - cy * sp * sr    // z
);

// Invert (see the note above), then convert the Eigen matrix to cv::Mat.
// cv::eigen2cv requires #include <opencv2/core/eigen.hpp>.
const Eigen::Matrix3f r_eigen = quaternion.inverse().toRotationMatrix();
cv::Mat rmat;
cv::eigen2cv(r_eigen, rmat);  // 3x3, CV_32F
  3. Since I am ignoring the translation between the base and the gripper, I let each t_gripper2base entry be cv::Mat::zeros(3, 1, CV_32F).

  4. Finally, I use cv::calibrateHandEye to get the result (a consolidated sketch of how the inputs are assembled follows the code below).

calibrateHandEye(
    r_gripper2base,  // std::vector<cv::Mat> from step 2
    t_gripper2base,  // std::vector<cv::Mat> from step 3 (all zeros)
    rvecs,           // output from cv::calibrateCamera
    tvecs,           // output from cv::calibrateCamera
    r_gripper2cam, 
    t_gripper2cam);
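
To make the data flow explicit, here is a consolidated sketch of how the inputs above are assembled, assuming one IMU pose per chessboard image (euler_angles and eulerToRotationMatrix() are placeholder names for the per-frame IMU readings and the conversion from step 2):

// Sketch only: assemble one gripper2base pose and one target2cam pose per image.
std::vector<cv::Mat> r_gripper2base, t_gripper2base;
std::vector<cv::Mat> r_target2cam, t_target2cam;

for (size_t i = 0; i < images.size(); ++i) {
    // Step 2: IMU Euler angles -> rotation of the gimbal w.r.t. the base.
    r_gripper2base.push_back(eulerToRotationMatrix(euler_angles[i]));
    // Step 3: the base-to-gripper translation is ignored.
    t_gripper2base.push_back(cv::Mat::zeros(3, 1, CV_32F));

    // cv::calibrateCamera returns board-to-camera transforms, so rvecs/tvecs
    // can serve as target2cam. calibrateHandEye accepts 3x1 rotation vectors
    // or 3x3 matrices; here the rvec is expanded with cv::Rodrigues.
    cv::Mat R;
    cv::Rodrigues(rvecs[i], R);
    r_target2cam.push_back(R);
    t_target2cam.push_back(tvecs[i]);
}

cv::Mat r_gripper2cam, t_gripper2cam;
cv::calibrateHandEye(
    r_gripper2base, t_gripper2base,
    r_target2cam, t_target2cam,
    r_gripper2cam, t_gripper2cam);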

But the result I get is far off from the ground truth. With further testing, I found that passing different values to the method parameter of cv::calibrateHandEye returns very different results:

calibrateHandEye(
    r_gripper2base,
    t_gripper2base,
    rvecs,
    tvecs,
    r_gripper2cam,
    t_gripper2cam, // [3228.091028917876; -993.606961589237; 2952.829007343935]
    cv::CALIB_HAND_EYE_PARK);  
calibrateHandEye(
    r_gripper2base,
    t_gripper2base,
    rvecs,
    tvecs,
    r_gripper2cam,
    t_gripper2cam, // [255.7012843960561; 610.620177710132; 713.9109132572202]
    cv::CALIB_HAND_EYE_TSAI);  
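
For what it's worth, the same comparison can be written as a loop over all the method flags OpenCV provides; this is just a sanity-check sketch (assuming C++17 for the structured bindings), since on consistent inputs the methods would be expected to agree at least roughly:

// Run every hand-eye method on the same inputs and print the translation
// estimates; wildly different answers usually point at the input poses
// rather than at the solver.
const std::vector<std::pair<cv::HandEyeCalibrationMethod, std::string>> methods = {
    {cv::CALIB_HAND_EYE_TSAI,       "TSAI"},
    {cv::CALIB_HAND_EYE_PARK,       "PARK"},
    {cv::CALIB_HAND_EYE_HORAUD,     "HORAUD"},
    {cv::CALIB_HAND_EYE_ANDREFF,    "ANDREFF"},
    {cv::CALIB_HAND_EYE_DANIILIDIS, "DANIILIDIS"}};

for (const auto& [method, name] : methods) {
    cv::Mat R, t;
    cv::calibrateHandEye(r_gripper2base, t_gripper2base, rvecs, tvecs, R, t, method);
    std::cout << name << ": t = " << t << std::endl;
}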

I think I must have missed something, but I don't have the knowledge to figure out what it is. Any suggestions would be appreciated!


Here are two example images I took with the camera (combined side by side).

And here are all the files needed to reproduce the result: Archive.zip

that is only correct if you are SURE the gyro_data Mat contains FP32 data, not FP64 or other data. the type argument MUST match the Mat's runtime type.

Yes, I am sure this is a valid operation, since when capturing I store the data into .xml files using cv::FileStorage as FP32.
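
For example, the runtime type can be double-checked when reading the data back ("data.xml" and "gyro_data" are placeholder names for the actual file and node):

// Quick check that the Mat read back from the FileStorage really is CV_32F.
cv::FileStorage fs("data.xml", cv::FileStorage::READ);
cv::Mat gyro_data;
fs["gyro_data"] >> gyro_data;
CV_Assert(gyro_data.type() == CV_32F);  // fails if the data was written as FP64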

all fine.

I can’t run your code, so my “analysis” is superficial and I’m left with mentioning the usual issues. I also haven’t used the hand-eye calibration stuff.

another usual issue is the understanding of rvec. it's an axis-angle encoding: the magnitude is the angle, the direction is the axis. if needed, cv::Rodrigues converts it to a 3x3 rotation matrix (and back).
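
a minimal round trip:

// rvec (3x1 axis-angle) -> 3x3 rotation matrix, and back again
cv::Mat rvec = (cv::Mat_<double>(3, 1) << 0.0, 0.0, CV_PI / 2);  // 90 degrees about Z
cv::Mat R;
cv::Rodrigues(rvec, R);        // axis-angle -> rotation matrix
cv::Mat rvec_back;
cv::Rodrigues(R, rvec_back);   // rotation matrix -> axis-angle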