Hi there!
The point of the project
I’m currently trying to create a suitable infrared LED position tracking system for my project.
As the base layer I’m using SimpleBlobDetector to detect the 4 bright LEDs of the marker in an under-exposed video feed. Then I have a points_3D matrix which represents the physical dimensions of my marker and the relative distances between the LEDs, to use in solvePnP.
Code
The simplified code looks something like this; assume the camera capture, blob detector and some other parameters are predefined:
import cv2
import numpy as np

# The matrix whose rows represent each LED's relative position as (X, Y, Z) offsets
points_3D = np.array([
    (0, 0, 0),      # Blob Zero
    (0, -4, 0),     # Blob 1 is 4 units UP from zero
    (-2.2, -4, 0),  # Blob 2 is 4 units UP and 2.2 units LEFT
    (-7.4, -4, 0),  # Blob 3 is 4 units UP and 7.4 units LEFT
], dtype="double")
# A helper matrix to draw the marker position and rotation (as XYZ axes), used by a drawing function (a rough sketch of it follows the snippet)
axis = np.float32([[3, 0, 0], [0, 3, 0], [0, 0, -3]]).reshape(-1, 3)
# Detecting the LED lights as blobs; this part works fine
keypoints = detector.detect(prepeared_frame)
# Converting to simple points array
points_2D = np.array([point.pt for point in keypoints], dtype="double")
# Only solving PnP when all 4 points are available
if len(points_2D) == 4:
    success, rotation_vector, translation_vector, _ = cv2.solvePnPRansac(
        points_3D,
        points_2D,
        calib_matrix,
        dist_coeffs,
        flags=cv2.SOLVEPNP_P3P)  # Also played around with other methods, but that's not the problem here

    # If a solution was found: project the marker's XYZ axes and display them
    if success:
        points, jacobian = cv2.projectPoints(axis, rotation_vector,
                                             translation_vector,
                                             calib_matrix, dist_coeffs)
        # Draw everything
        drawAxes(frame, points_2D, points)
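The drawAxes function itself isn't important for the question; a minimal sketch of what it does (draw a line from the first detected blob to each projected axis endpoint) would be roughly:

def drawAxes(frame, points_2D, axis_points):
    # Axes origin = the first detected blob (Blob Zero in points_3D)
    origin = tuple(int(v) for v in points_2D[0])
    # X axis in red, Y in green, Z in blue (BGR)
    colors = [(0, 0, 255), (0, 255, 0), (255, 0, 0)]
    for end, color in zip(axis_points.reshape(-1, 2), colors):
        cv2.line(frame, origin, tuple(int(v) for v in end), color, 2)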
Some results
The solvePnP function only works fine when the points_2D order (i.e. the blob order) strictly matches the expected points_3D order. Here I have all the points numbered and detected in the valid order, as defined in points_3D (compare the points_3D definition in the code with the picture and you'll see what I mean). As long as the order is valid, the axes match reality. You can see the marker is a kind of reversed ‘L’ shape containing 4 LEDs.
See the 1st screenshot
As soon as the points_2D (blob) order doesn't match the points_3D order, it fails to compute the proper axes; you can see the lines don't even match the marker plane.
See the 2nd screenshot
Question
Is there any possible solution (hardware, e.g. the marker shape, or software) which would help solvePnP work with any blob order? Some way for it to decide which point is which? Or is there an algorithmic way of sorting the blobs into the order solvePnP expects?
Something tells me that might be impossible, since you can probably rotate any given group of 3D points in such a way that more than one of them produces the same 2D projection onto the image plane… Or maybe I just need more LEDs…?
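To illustrate what I mean by "algorithmic sorting": the only brute-force idea I have so far is to try every permutation of the 4 detected blobs and keep the assignment with the lowest reprojection error. A rough, untested sketch (solve_pnp_any_order is just a hypothetical helper name, and this probably breaks down for ambiguous poses):

from itertools import permutations

def solve_pnp_any_order(points_3D, points_2D, calib_matrix, dist_coeffs):
    best = None
    for perm in permutations(range(4)):
        candidate_2D = points_2D[list(perm)]
        success, rvec, tvec = cv2.solvePnP(
            points_3D, candidate_2D, calib_matrix, dist_coeffs,
            flags=cv2.SOLVEPNP_P3P)
        if not success:
            continue
        # Reprojection error of this particular blob-to-LED assignment
        projected, _ = cv2.projectPoints(points_3D, rvec, tvec,
                                         calib_matrix, dist_coeffs)
        error = np.linalg.norm(projected.reshape(-1, 2) - candidate_2D)
        if best is None or error < best[0]:
            best = (error, rvec, tvec)
    return best  # (error, rotation_vector, translation_vector) or None

With only 4 points there are just 24 permutations, so speed isn't my concern; I'm more worried that several assignments could give similarly low errors.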