Identifying maximums from polar points

Hi,

To preface I’m using OpenCV for Unity (C#).

I’m trying to find the corners of a jigsaw piece. To do this, I get the contour of the piece, convert each point on the contour to polar coordinates, and then plot those values.
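
For reference, the conversion itself is roughly this (a sketch rather than my exact code; the helper name is just for illustration, and the using directive may differ depending on the plugin version):

using System;
using System.Collections.Generic;
using System.Linq;
using OpenCVForUnity.CoreModule; // Point with double x, y (namespace varies by plugin version)

public static class PolarConversion
{
    // Convert a closed contour to (angle, radius) points around its centroid,
    // so that the corners of the piece show up as peaks in the plot.
    public static List<Point> ToPolar(List<Point> contour)
    {
        // Use the centroid of the contour as the pole.
        double cx = contour.Average(p => p.x);
        double cy = contour.Average(p => p.y);

        List<Point> polar = new List<Point>(contour.Count);
        foreach (Point p in contour)
        {
            double dx = p.x - cx;
            double dy = p.y - cy;
            // x of the plotted point is the angle, y is the distance from the centroid.
            polar.Add(new Point(Math.Atan2(dy, dx), Math.Sqrt(dx * dx + dy * dy)));
        }
        return polar;
    }
}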

I’m looking to identify the corners, or more specifically, the “sharp” maximums of the graph. I currently have a list of Points: I select local maximums, then take the previous and next point around each one and use the cross product to estimate how “sharp” that point is. Sadly this isn’t working well. Below is a generated polar plot, with green dots denoting the “selected corners”, which are incorrect.
Polar plot: [image attachment]

Here’s my function; apologies that it’s in C#, but the principle is the same:

public List<int> FindCorners(List<Point> values)
{
    List<int> peaks = new List<int>();
    double current;
    IEnumerable<double> range;

    int checksOnEachSide = rangeOfPeaks / 2;
    for (int i = 0; i < values.Count; i++)
    {
        current = values[i].y;
        range = values.Select(v => v.y);

        // Restrict the range to a window of rangeOfPeaks values centred on i.
        if (i > checksOnEachSide)
        {
            range = range.Skip(i - checksOnEachSide);
        }
        range = range.Take(rangeOfPeaks);

        // Keep the point only if it is the maximum of its window.
        if ((range.Count() > 0) && (current == range.Max()))
        {
            // Average the cross-product magnitude over a few neighbouring
            // points to estimate how "sharp" the peak is.
            double mag = 0;
            int halfAverageRange = averageRange / 2;
            for (int j = -halfAverageRange; j < halfAverageRange; j++)
            {
                mag += GetMagnitude(values, i + j, 1);
            }

            double average = mag / averageRange;

            Debug.Log(values[i].y + " -> " + average);

            peaks.Add(i);
        }
    }

    return peaks;
}

GetMagnitude returns the magnitude of the cross product; it’s included here for clarity:

public double GetMagnitude(List<Point> pointList, int centerIndex, int depth)
{
    if (centerIndex < 0 || centerIndex > pointList.Count - 1)
    {
        return 1;
    }

    // Wrap the previous index around to the end of the contour if needed.
    int prevPointIndex = centerIndex - depth;
    if (prevPointIndex < 0)
    {
        prevPointIndex = centerIndex - depth + pointList.Count;
        if (prevPointIndex < 0)
            Debug.LogWarning("Depth is too high, may result in unfavourable results");
    }

    // Wrap the next index around to the start of the contour if needed.
    int nextPointIndex = centerIndex + depth;
    if (nextPointIndex > pointList.Count - 1)
    {
        nextPointIndex = centerIndex + depth - pointList.Count;
        if (nextPointIndex > pointList.Count - 1)
            Debug.LogWarning("Depth is too high, may result in unfavourable results");
    }

    Point prev = pointList[prevPointIndex];
    Point crrt = pointList[centerIndex];
    Point next = pointList[nextPointIndex];

    if (prev == crrt)
        prev = new Point(0, 0);
    if (next == crrt)
        next = new Point(0, 0);

    // Stretch the x (angle) axis to match the plotted graph, then take the
    // cross product of the two edge vectors around the point.
    Vector3 rhs = new Vector3((float)((next - crrt).x * polarPlotStretchFactor), (float)(next - crrt).y);
    Vector3 lhs = new Vector3((float)((prev - crrt).x * polarPlotStretchFactor), (float)(prev - crrt).y);

    Vector3 cross = Vector3.Cross(lhs, rhs);
    return cross.magnitude;
}

Does anyone have any insight into how I could do this effectively? I’ve seen some posts on Stack Overflow about this problem, but sadly none (as far as I know) go into detail about how to achieve accurate corner selection on polar plots.

Thanks a lot.

Fixed: I was returning a list of maximums but not the magnitude of the cross product, so when the corners were selected after running the function, they were chosen by decreasing index rather than by decreasing magnitude.

I also switched to using the gradient instead of the cross product, which gave me more accurate results.
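
In case it helps anyone else, the corrected selection looks roughly like the sketch below (illustrative names, not my exact code): each local maximum keeps a sharpness score based on the change in gradient around it, and the corners are then taken by decreasing score rather than by decreasing index.

using System.Collections.Generic;
using System.Linq;
using OpenCVForUnity.CoreModule; // Point with double x, y (namespace varies by plugin version)

public static class CornerSelection
{
    // Approximate change in gradient at index i: a sharp peak has a steep
    // rise before it and a steep fall after it, so the difference is large.
    static double GradientChange(List<Point> values, int i, int depth)
    {
        int prev = (i - depth + values.Count) % values.Count; // wrap around the contour
        int next = (i + depth) % values.Count;

        double slopeIn = (values[i].y - values[prev].y) / depth;
        double slopeOut = (values[next].y - values[i].y) / depth;
        return slopeIn - slopeOut;
    }

    // Score each candidate peak, then pick the sharpest ones.
    public static List<int> SelectCorners(List<Point> values, List<int> peakIndices,
                                          int depth = 3, int cornerCount = 4)
    {
        return peakIndices
            .Select(i => new { Index = i, Score = GradientChange(values, i, depth) })
            .OrderByDescending(c => c.Score) // by sharpness, not by index
            .Take(cornerCount)
            .Select(c => c.Index)
            .ToList();
    }
}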