False positive in Template Matching

I am performing template matching in small regions to validate whether a template is present or not.

My problem is that, with evidently different images, I am getting a very good match score (0.95).

I would like to know how to validate the match to figure out whether it is a false positive, and whether there is any way to prevent this situation.

how do you calculate that? matchTemplate has multiple scoring options. details please.

Thanks for your answer.

I am using the normed options (cv.TM_SQDIFF_NORMED, cv.TM_CCOEFF_NORMED, cv.TM_CCORR_NORMED) with very similar results for all of them (considering that the minimum is representative of the match for SQDIFF)

0.95 is not necessarily a good matching score. how do you determine what’s a good score and what isn’t?

Oh, I am considering everything over 0.92 to be a good match.

What I am doing is:

1- Compute template matching between source and template
2- Get MaxVal and MaxLoc from the resulting image (I am getting values between -1 and 1)
3- If MaxVal > 0.95, then I have a match.

How should I determine what’s a good score and what isn’t?

if those two pictures you posted match with 0.95 or better, but you don’t want them to match, that pair’s actual matching score should be a lower bound on the threshold you should use.

find some pictures where your eyes say it’s barely a match, check how well that scores. that’s the threshold I would pick. I suspect a proper match would score quite a bit closer to 1.0

the situation might even require you to express the score in terms of difference/dissimilarity instead of similarity, meaning by how much they differ… because you might need to say 1% or 0.1% or 0.01%, and expressing that as 99%, 99.9%, 99.99% is just silly.

In most cases, template matching on edges gives better results.
Here your template is mostly white, which will give you a good correlation almost everywhere on your test image. (Same thing for SQDIFF.)
Could you provide more images with the expected result?

I am getting a lot of patch images from a big image to use as “templates”. Do you mean that I am choosing a region as a template that is not suitable? How can I determine that a template (the one in the image I showed, for example) is not a good candidate?

I tried a simple method to discard this kind of template: checking how many matches I get over a certain threshold (0.9, for example) in my result image. I noticed that for this kind of image I get a complete band of high values. The problem is that I am also discarding good matches with this method. I can’t find the trade-off to determine which images are good candidates for templates and which are not.

Sorry, I was confused by my phone screen (I only saw half of the template image).
It’s normal to have a band of high values considering the template and the image: a lot of patches have a white half on the left and a dark half on the right, so the correlation will be very high.
If you are trying to detect small white spikes, maybe you should reduce the size of your pattern images.

Check this example of using pattern matching on object edges.

You can use connectedComponentsWithStats to fix a threshold and find possible locations. A surface (area) weighted by correlation should also be tested:

#include "opencv2/opencv.hpp"

#include <iostream>
using namespace cv;
using namespace std;

static void AddSlider(String sliderName, String windowName, int minSlider, int maxSlider, int valDefault, int *valSlider, void(*f)(int, void *), void *r)
{
    createTrackbar(sliderName, windowName, valSlider, 1, f, r);
    setTrackbarMin(sliderName, windowName, minSlider);
    setTrackbarMax(sliderName, windowName, maxSlider);
    setTrackbarPos(sliderName, windowName, valDefault);
}

struct SliderData {
    Mat img;
    int thresh;
};

static void UpdateThreshImage(int, void *r)
{
    SliderData *p = (SliderData*)r;
    Mat dst, labels, stats, centroids;

    // Threshold the correlation map, then look for compact blobs.
    threshold(p->img, dst, p->thresh, 255, THRESH_BINARY);

    connectedComponentsWithStats(dst, labels, stats, centroids, 8);
    if (centroids.rows < 10)
    {
        cout << "**********************************************************************************\n";
        cout << " thresh = " << p->thresh / 255.0 << "\n";
        for (int i = 0; i < centroids.rows; i++)
        {
            int x = (int)centroids.at<double>(i, 0);
            int y = (int)centroids.at<double>(i, 1);
            if (dst.at<uchar>(Point(x, y)) == 255)
            {
                cout << "(x, y) = (" << centroids.at<double>(i, 0) << ",";
                cout << centroids.at<double>(i, 1) << ") surface = ";
                cout << stats.at<int>(i, CC_STAT_AREA) << "\n";
            }
        }
        cout << "----------------------------------------------------------------------------------\n";
    }
    imshow("Max corr", dst);
}

int main(void)
{
    Mat imgSrc = imread("source.jpg", IMREAD_GRAYSCALE);
    Mat imgTmp = imread("template.jpg", IMREAD_GRAYSCALE);
    Mat result;

    imshow("Original", imgSrc);
    imshow("Template", imgTmp);

    Mat rNorm;
    SliderData ps;
    ps.thresh = 0;
    cv::matchTemplate(imgSrc, imgTmp, result, TM_CCOEFF_NORMED);
    // Rescale the correlation map to [0, 1], then to 8 bits for thresholding.
    normalize(result, rNorm, 1, 0, NORM_MINMAX);
    rNorm.convertTo(ps.img, CV_8U, 255);
    imshow("matchTemplate", ps.img);
    AddSlider("Level", "matchTemplate", 200, 255, ps.thresh, &ps.thresh, UpdateThreshImage, &ps);
    int code = 0;
    while (code != 27)
        code = waitKey(50);
    return 0;
}

please post more actual data. we can’t help if you’re not being forthcoming.