I want to obtain the Hand-to-Eye relationship matrix for a 4-DOF arm. I assume there is no rotation around the X and Y axes and no translation along the Z axis. I've tried representing my arm's poses as (x, y, 0, 0, 0, r) and, after using solvePnP to determine the chessboard-to-camera coordinates, I set z, w, and p to 0 (while keeping the camera as level as possible to minimize errors). I expected to retrieve the Hand-to-Eye relationship as (x, y, 0, 0, 0, r), but when I fed the data into the calibrateHandEye function, I couldn't get correct results. Is there a simpler way to solve this, or any suggestions on what I might be doing wrong?
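For reference, here is a minimal Python/OpenCV sketch of the kind of data plumbing described above: packing planar (x, y, r) robot poses and chessboard-to-camera poses into the rotation/translation lists that cv2.calibrateHandEye expects. All numeric values are made-up placeholders, the "true" transforms are only used to synthesize consistent stand-in data, and in a real setup R_target2cam/t_target2cam would come from cv2.solvePnP followed by cv2.Rodrigues on actual chessboard detections; this is not the poster's code.

```python
# Minimal sketch (placeholder data, not the poster's actual code).
import numpy as np
import cv2

def rz(r):
    """Rotation matrix for a rotation of r radians about the Z axis."""
    c, s = np.cos(r), np.sin(r)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

def make_T(R, t):
    """Assemble a 4x4 homogeneous transform from R (3x3) and t (3,)."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Placeholder robot poses (x, y, r): translation in the XY plane, rotation about Z only.
robot_poses = [(0.30, 0.10, 0.0), (0.35, 0.12, 0.4), (0.28, 0.20, -0.3),
               (0.40, 0.05, 0.8), (0.32, 0.15, -0.6)]

# Assumed "true" camera-to-gripper and target-to-base transforms, used here only to
# synthesize consistent target-to-camera poses in place of real solvePnP output.
T_cam2gripper_true = make_T(rz(0.1), np.array([0.05, 0.02, 0.00]))
T_target2base = make_T(rz(0.0), np.array([0.50, 0.00, 0.00]))

R_gripper2base, t_gripper2base = [], []
R_target2cam, t_target2cam = [], []
for x, y, r in robot_poses:
    T_gripper2base = make_T(rz(r), np.array([x, y, 0.0]))
    R_gripper2base.append(T_gripper2base[:3, :3])
    t_gripper2base.append(T_gripper2base[:3, 3].reshape(3, 1))

    # Chain: target2base == gripper2base * cam2gripper * target2cam,
    # so target2cam = inv(cam2gripper) * inv(gripper2base) * target2base.
    T_target2cam = (np.linalg.inv(T_cam2gripper_true)
                    @ np.linalg.inv(T_gripper2base)
                    @ T_target2base)
    R_target2cam.append(T_target2cam[:3, :3])
    t_target2cam.append(T_target2cam[:3, 3].reshape(3, 1))

# Note: calibrateHandEye solves the eye-in-hand case (camera mounted on the gripper);
# for an eye-to-hand setup the gripper poses must be inverted (base->gripper) first.
R_cam2gripper, t_cam2gripper = cv2.calibrateHandEye(
    R_gripper2base, t_gripper2base, R_target2cam, t_target2cam,
    method=cv2.CALIB_HAND_EYE_TSAI)
print("R_cam2gripper:\n", R_cam2gripper)
print("t_cam2gripper:\n", t_cam2gripper)
```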
Hi,
I wonder if you ever solved this problem. I am currently facing the same issue, described here: Eye-to-hand calibration for a 4 DOF robot arm