Stereo block matching on int16 images

I am interested in performing stereo block matching on 16-bit images, but cv::StereoMatcher::compute() currently only accepts 8-bit images. Does anyone have an idea of the level of effort required to support this, or what changes would be needed? Of course we can scale the 16-bit images down to 8-bit, but it would be great to use the full dynamic range for our application.

I think the kernel itself should be trivial enough: it's just a matching cost, correlation or sum of absolute/squared differences or something like that. Making it all look elegant may be a lot of work, though. If you've seen OpenCV code, you know what I'm worried about.
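For illustration, a brute-force version of such a kernel on 16-bit input could look like the sketch below. This is a hypothetical helper, not the actual OpenCV implementation: plain SAD, no prefiltering, no subpixel refinement, no validity checks. The point is just that nothing in the cost computation itself cares about the bit depth.

```cpp
#include <opencv2/core.hpp>
#include <climits>
#include <cstdlib>

// Brute-force SAD block matching on 16-bit rectified inputs.
// Returns integer disparities in a CV_16SC1 map (-1 where unmatched).
cv::Mat sadDisparity16U(const cv::Mat& left, const cv::Mat& right,
                        int numDisparities, int blockSize)
{
    CV_Assert(left.type() == CV_16UC1 && right.type() == CV_16UC1);
    CV_Assert(left.size() == right.size() && blockSize % 2 == 1);

    const int r = blockSize / 2;
    cv::Mat disp(left.size(), CV_16SC1, cv::Scalar(-1));

    for (int y = r; y < left.rows - r; y++)
    {
        for (int x = r; x < left.cols - r; x++)
        {
            long bestCost = LONG_MAX;
            int bestD = -1;
            // Scan candidate disparities along the epipolar line.
            for (int d = 0; d < numDisparities && x - d >= r; d++)
            {
                long cost = 0;
                for (int dy = -r; dy <= r; dy++)
                    for (int dx = -r; dx <= r; dx++)
                        cost += std::abs(
                            (long)left.at<ushort>(y + dy, x + dx) -
                            (long)right.at<ushort>(y + dy, x - d + dx));
                if (cost < bestCost) { bestCost = cost; bestD = d; }
            }
            disp.at<short>(y, x) = (short)bestD;
        }
    }
    return disp;
}
```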

Perhaps tone-map the data, or just high-pass it, before converting down.
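For example (a sketch; the sigma is an arbitrary guess to tune), subtracting a heavy Gaussian blur removes the low-frequency brightness component that tends to eat most of the 16-bit range, so the residual survives the drop to 8 bits better:

```cpp
#include <opencv2/imgproc.hpp>

// High-pass a 16-bit image, then rescale the residual into 8 bits.
cv::Mat highpassTo8U(const cv::Mat& src16u)
{
    cv::Mat f, blur, hp, out;
    src16u.convertTo(f, CV_32F);
    cv::GaussianBlur(f, blur, cv::Size(0, 0), 15);  // sigma is a placeholder
    hp = f - blur;                                  // high-pass residual
    cv::normalize(hp, out, 0, 255, cv::NORM_MINMAX, CV_8U);
    return out;
}
```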

I was hoping it could be as simple as letting cv::Mats of type CV_16UC1 past the initial input type checking. It sounds like you are talking about the OpenCL kernels? I have not worked with OpenCL before, and I am not sure whether we would need that, or SIMD, for our application at this point.
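For reference, this is the rejection we are running into; a minimal reproduction (a sketch, and the exact error text may vary by OpenCV version):

```cpp
#include <opencv2/calib3d.hpp>
#include <iostream>

int main()
{
    cv::Mat left(480, 640, CV_16UC1, cv::Scalar(0));
    cv::Mat right(480, 640, CV_16UC1, cv::Scalar(0));
    cv::Mat disp;
    auto bm = cv::StereoBM::create(64, 21);
    try {
        bm->compute(left, right, disp);  // expected to fail the type check
    } catch (const cv::Exception& e) {
        std::cout << "rejected as expected: " << e.what() << std::endl;
    }
}
```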

We’re definitely also looking into ways to normalize to an 8-bit range so we can use the existing implementation, but it seems we will lose at least some useful amount of dynamic range in the process.
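Something along these lines is what we have in mind for the 8-bit fallback (a sketch; NORM_MINMAX and the StereoBM parameters here are placeholder choices):

```cpp
#include <opencv2/calib3d.hpp>
#include <opencv2/core.hpp>

// Squeeze the 16-bit pair into 8 bits and run the existing matcher.
cv::Mat matchVia8Bit(const cv::Mat& left16, const cv::Mat& right16)
{
    cv::Mat left8, right8, disp;
    // NORM_MINMAX stretches each image's own range; a fixed divide
    // (convertTo with scale 1/256.) would keep the two frames on the
    // same intensity scale instead, which may matter for matching.
    cv::normalize(left16,  left8,  0, 255, cv::NORM_MINMAX, CV_8U);
    cv::normalize(right16, right8, 0, 255, cv::NORM_MINMAX, CV_8U);
    auto bm = cv::StereoBM::create(64, 21);
    bm->compute(left8, right8, disp);  // disp is CV_16S, fixed-point x16
    return disp;
}
```

One caveat with per-image min/max normalization: the left and right frames get stretched independently, so a fixed scale factor may be the safer default when brightness must stay comparable between the two cameras.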