I need to combine the SIFT and ORB descriptors of an image.
As you know, SIFT descriptors are 128-dimensional and ORB descriptors are 32-dimensional.
A possible way I have thought of is:

1. Reshape the SIFT descriptors to 32-length. For instance, reshape a (135, 128) descriptor array into a (540, 32) one.
2. Concatenate the result with the ORB descriptors (since at that point both have 32-length).
import cv2
import numpy as np

sift = cv2.SIFT_create()
orb = cv2.ORB_create()

sift_kp, sift_desc = sift.detectAndCompute(img, None)
# split each 128-length SIFT descriptor into four 32-length rows
new_sift_desc = sift_desc.reshape(int(128 / 32) * sift_desc.shape[0], 32)
orb_kp, orb_img_descriptor = orb.detectAndCompute(img, None)
all_descriptors = np.concatenate((new_sift_desc, orb_img_descriptor), axis=0)
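As a sanity check, the reshape step can be verified with plain NumPy alone (random data stands in for real SIFT output here; the (135, 128) shape is just the example from above):

```python
import numpy as np

# stand-in for sift_desc: 135 SIFT descriptors, 128-dimensional each (fake data)
sift_desc = np.random.rand(135, 128).astype(np.float32)

# split every 128-length descriptor into four consecutive 32-length rows
new_sift_desc = sift_desc.reshape(sift_desc.shape[0] * (128 // 32), 32)
print(new_sift_desc.shape)  # (540, 32)
```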
After combining the descriptors, the idea is to use
all_descriptors in order to perform feature matching against another image.
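The matching step I have in mind could be sketched as a brute-force nearest-neighbour search in plain NumPy (the descriptor arrays and their sizes here are hypothetical stand-ins for the combined `all_descriptors` of the two images):

```python
import numpy as np

rng = np.random.default_rng(0)
# hypothetical combined descriptors for image A and image B
desc_a = rng.random((640, 32)).astype(np.float32)
desc_b = rng.random((660, 32)).astype(np.float32)

# pairwise squared Euclidean distances between every A-row and every B-row
d2 = ((desc_a[:, None, :] - desc_b[None, :, :]) ** 2).sum(axis=2)

# index of the nearest descriptor in B for each descriptor in A
matches = d2.argmin(axis=1)
print(matches.shape)  # (640,)
```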
The problem I find with this approach is that binary descriptors (like ORB) and classical ones (like SIFT) use different types of distance (Hamming vs Euclidean).
Thus, I don't know whether it is possible to convert SIFT descriptors into binary ones (or vice versa), or whether there is a better way to combine these descriptors.