ORB: Low matches on low contrast images


I have been experimenting with feature matching with ORB on a wide variety of images. I am somewhat surprised to find the algorithm failing on a certain subset of images, namely those with blue backgrounds. These failures are surprising because I would expect a performance decrease but not a total failure, especially considering that not every letter in my source image has a color that contrasts poorly with the background.

28 keypoints, 1 match.

Increasing the brightness of the source background helps somewhat, getting 47 keypoints and 16 matches, but what is still surprising is that none of the letters besides the g get matched. I have to describe this in words because this forum will not let me link more than one image as a new member.

For the record, a blue source background on a blue target background also fails, with only four matches, all around the corners.

Red source background on white target background gives 216 keypoints and 63 matches, more in line with what I was expecting. Surprisingly, the red e matches fine despite the similarly red background.

White source background on white target background gives 874 keypoints in the source image and 527 matches. This is the ideal I am trying to reach.

I know that preprocessing steps such as sharpening kernels and histogram equalization can improve ORB’s performance. But I will not be able to easily detect low-contrast images ahead of time, and I do not want to naively apply contrast corrections to all my images for fear of increasing the false positive rate, so assume the images are fed into ORB as they are.

I initially assumed this was a problem with colors being too close to each other to be distinguishable by ORB, but it looks like not every background color fails equally. This makes me wonder why blue stood out as being especially bad.

So what I want to know is where the problem is. Is it:

  • A problem with low contrast images in general
  • A problem with a specific color scheme not being registered by ORB, like blue on blue
  • A problem with something else I haven’t thought of

Any advice is appreciated.

All my images are non-aliased 24-bit RGB. Also, the parameters of my ORB instance are:
orb = cv.ORB_create(nfeatures=100000, scaleFactor=1.1, edgeThreshold=5)

does ORB see color?

if it doesn’t, it sees grayscale.
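for reference, the standard RGB→gray conversion (cvtColor uses the Rec.601 weights) gives blue the smallest weight, which already hints at why blue backgrounds fare worst: after conversion, pure blue sits very close to black. quick check in plain python:

```python
# Rec.601 luma weights, as used by cv.cvtColor for RGB -> GRAY:
#   Y = 0.299 R + 0.587 G + 0.114 B
def luma(r, g, b):
    return round(0.299 * r + 0.587 * g + 0.114 * b)

print(luma(0, 0, 255))  # pure blue -> 29 (nearly black)
print(luma(255, 0, 0))  # pure red  -> 76
print(luma(0, 0, 0))    # black     -> 0
```

so black letters on a pure blue background end up only ~29 gray levels apart, while the same letters on red get more than twice the separation.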

is it robust/invariant to inversion? that is what you have here: the local appearance is inverted when your background is darker than the letter in one picture, but brighter in the other.

this behavior can be explained by its design. you would have to read the relevant paper(s) by its author(s). feature descriptors are designed to be robust/invariant to various perturbations. intensity changes are commonly handled. inversions aren’t, because they rarely occur in natural images.
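a toy sketch of the inversion point, assuming a BRIEF-style descriptor where each bit is a pairwise intensity comparison (ORB’s real test pattern is learned and steered, but the principle is the same): inverting the patch flips every comparison, so the descriptor becomes its bitwise complement and the hamming distance is maximal.

```python
# toy BRIEF-style descriptor: each bit compares two pixel intensities.
# (illustration only; pairs and patch values are made up.)
patch = [10, 200, 50, 120, 90, 240, 30, 170]
pairs = [(0, 1), (2, 3), (4, 5), (6, 7)]  # arbitrary test pairs

def describe(p):
    return [1 if p[a] < p[b] else 0 for a, b in pairs]

inverted = [255 - v for v in patch]  # intensity inversion

d1 = describe(patch)
d2 = describe(inverted)
hamming = sum(a != b for a, b in zip(d1, d2))
print(d1, d2, hamming)  # every bit flips -> distance equals descriptor length
```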

if that doesn’t solve your questions, check the implementation.

have you tried other descriptors? how do they fare? which of those are color-aware?

Yes, ORB only sees grayscale. In my case, this:


I hadn’t put too much thought into the effects of grayscaling until I read your post, but I think this pretty much shows the root of the problem.


bummer. the docs don’t mention whether any of the available descriptors can do color.

there are variants of some descriptors that incorporate color. I’ve heard of CSIFT. not sure if that’s actually suitable.

if this is important to you, you have options:

  • strip the background color (set to black or sth)
  • convert color space so you get better discrimination (possibly apply PCA)
  • run a sobel or sth on the RGB variant, then collapse that to single channel (sum/max/cvtColor)