Essential Matrix Calculation

I am trying to find the rotation between two camera images. I manually selected matching points between the images and undistorted them to obtain direction vectors (it is a fisheye camera, so I work with undistorted bearing vectors). When I feed these undistorted direction vectors, in the correct order, into the OpenCV function, it returns incorrect rotation values. To verify my logic, I reproduced the process with the synthetic scene below. I follow exactly the same approach, but even when I select 50 points I get incorrect results, likely due to reprojection error.

import numpy as np
import cv2
def euler2RotationMat(roll_deg, pitch_deg, yaw_deg):
    roll = np.radians(roll_deg)
    pitch = np.radians(pitch_deg)
    yaw = np.radians(yaw_deg)

    # Compute the per-axis rotation matrices
    R_x = np.array([[1, 0, 0],
                    [0, np.cos(roll), -np.sin(roll)],
                    [0, np.sin(roll), np.cos(roll)]])

    R_y = np.array([[np.cos(pitch), 0, np.sin(pitch)],
                    [0, 1, 0],
                    [-np.sin(pitch), 0, np.cos(pitch)]])

    R_z = np.array([[np.cos(yaw), -np.sin(yaw), 0],
                    [np.sin(yaw), np.cos(yaw), 0],
                    [0, 0, 1]])

    # Compute the combined rotation matrix
    R = R_z @ R_y @ R_x  # matrix product (in z, y, x order)

    return R
def RotationMat2euler(R):
    pitch = np.arcsin(-R[2, 0])

    if np.abs(np.cos(pitch)) > 1e-6:  # avoid division near gimbal lock
        roll = np.arctan2(R[2, 1] / np.cos(pitch), R[2, 2] / np.cos(pitch))
        yaw = np.arctan2(R[1, 0] / np.cos(pitch), R[0, 0] / np.cos(pitch))
    else:
        roll = 0
        yaw = np.arctan2(-R[0, 1], R[1, 1])

    return np.degrees(np.array([roll, pitch, yaw]))

def normalize(vectors):
    # Normalize along the last axis so this works for a single vector
    # and for an (N, 3) array alike (the original took the norm of the
    # whole array, which is wrong for a batch of vectors)
    norms = np.linalg.norm(vectors, axis=-1, keepdims=True)
    return vectors / norms


# Cameras
C1 = np.array([0, 0, 0])
C2 = np.array([3, 2, 1])

# Points
P1 = np.array([1, 11, 3])
P2 = np.array([2, -1, 4])
P3 = np.array([-1, 1, 2])
P4 = np.array([4, 7, 2])
P5 = np.array([11, 3, 8])
P6 = np.array([2, 4, 9])
P7 = np.array([-2, 11, 5])
P8 = np.array([8, 1, 2])

points = np.array([P1, P2, P3, P4, P5, P6, P7, P8])

direction_vectors_C1 = normalize(points - C1)

# Camera Rotation
c2_camera_rotation = euler2RotationMat(3,12,9)
direction_vectors_C2 = np.zeros([len(points),3])

# Rotate Camera ( inverse rotation to points)
for i in range(0, len(points)):
    direction_vectors_C2[i] = normalize(c2_camera_rotation.T @ (points[i] - C2))

scaled_vectors_C1 = direction_vectors_C1 / direction_vectors_C1[:, 2][:, np.newaxis]  # (x/z, y/z, 1)
scaled_vectors_C2 = direction_vectors_C2 / direction_vectors_C2[:, 2][:, np.newaxis]  # (x/z, y/z, 1)

xy_vectors_C1 = scaled_vectors_C1[:, :2]
xy_vectors_C2 = scaled_vectors_C2[:, :2]


# Points are already normalized image coordinates, so the default focal=1.0,
# pp=(0, 0) apply; note that the RANSAC threshold is then in normalized
# (unitless) coordinates, not pixels.
E, _ = cv2.findEssentialMat(xy_vectors_C2, xy_vectors_C1, method=cv2.RANSAC, prob=0.999, threshold=1.0)
_, R, t, _ = cv2.recoverPose(E, xy_vectors_C2, xy_vectors_C1)

print("Translation: ", normalize(C2 - C1))
print("Calculated Translation:", t.T)

print("Rotation: ",RotationMat2euler(c2_camera_rotation))
print("Calculated Rotation: ", RotationMat2euler(R))

What I mean is: in the synthetic scene above, the result is correct when I use the error-free direction vectors, but as soon as a small amount of error is introduced, the results change significantly. The issue is not RANSAC or point matching, because, as mentioned, I selected the points manually and the result was still wrong. I believe small reprojection errors are the cause. How can I make the estimation more robust, or what else should I try?

The code above is just a similar example to illustrate my logic.
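Independent of `recoverPose`, one sanity check for this kind of setup is the epipolar constraint x1^T E x2 ≈ 0 for every correspondence. A sketch with made-up, consistent data (the pose values are assumed; E is built as [t]x R):

```python
import numpy as np

def skew(v):
    # Cross-product matrix [v]x
    return np.array([[0, -v[2], v[1]],
                     [v[2], 0, -v[0]],
                     [-v[1], v[0], 0]])

rng = np.random.default_rng(1)

# Toy pose (assumed values): rotation about z plus a translation
a = np.radians(9)
R = np.array([[np.cos(a), -np.sin(a), 0],
              [np.sin(a),  np.cos(a), 0],
              [0, 0, 1]])
t = np.array([3.0, 2.0, 1.0])
E = skew(t) @ R  # essential matrix for this pose

# Consistent correspondences: p1 = R p2 + t
p2 = rng.uniform(1, 10, (8, 3))
p1 = (R @ p2.T).T + t

h1 = p1 / p1[:, 2:3]  # homogeneous normalized points (x, y, 1)
h2 = p2 / p2[:, 2:3]

# Residual per correspondence: h1_i^T E h2_i (should be ~0 for clean data)
residuals = np.einsum('ij,jk,ik->i', h1, E, h2)
print(np.abs(residuals).max())
```

If these residuals are already large for your real, manually matched points, the problem is in the bearing vectors (calibration or undistortion), not in the pose recovery step.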

Thank you for your help.