Convert CV_32SC2 to CV_8UC3

I’m actually using C# (the Unity port), but I find Python code easiest to translate.

Basically, I just want to convert a point vector to a mat, warpAffine said mat, then change its type from CV_32SC2 to CV_8UC3 (required for converting to a texture that I can display).

I’ve tried

mat.convertTo(mat, CvType.CV_8UC3);


mat.convertTo(mat, CvType.CV_8UC3, 255.0);


mat.convertTo(mat, CvType.CV_8UC3, 1.0 / 255.0);


mat.convertTo(mat, CvType.CV_8UC3, (255.0 / 4294967296.0));

Could be an XY problem, anyone know how to do this?

in that case, converting to a 3 channel, uchar mat does not make any sense.

you probably want to draw the points into an (existing or blank) image, and display that.

tell us, where do you get those points from ?

Thanks for your reply, I think you’re right about that, and it turns out that this was an XY problem.

My actual problem isn’t displaying the image but actually having a Mat that represents the warpAffine in a meaningful way.

Take this example:

        List<MatOfPoint> matOfPoints = new List<MatOfPoint>();
        Mat hierarchy = new Mat(mat.height(), mat.width(), CvType.CV_8UC3);

        // find contours of whole image
        Imgproc.findContours(chromaKeyed, matOfPoints, hierarchy, Imgproc.RETR_TREE, Imgproc.CHAIN_APPROX_SIMPLE);

        // loop each contour
        for (int i = 0; i < matOfPoints.Count; i++)
        {
            MatOfPoint points = matOfPoints[i];

            // create convex hull around piece
            MatOfInt hullInt = new MatOfInt();
            Imgproc.convexHull(points, hullInt);

            List<Point> pointMatList = points.toList();
            List<int> hullIntList = hullInt.toList();
            List<Point> hullPointList = new List<Point>();

            // add each point along the hull to a list for later use
            for (int j = 0; j < hullIntList.Count; j++)
            {
                hullPointList.Add(pointMatList[hullIntList[j]]);
            }
        }

Here I select a specific contour from an image, then generate a convex hull around said contour, the next step is to add each Point along the hull to a list for use later.

Now, my issue is warpAffine’ing that list of Points into something meaningful, so I convert the List to a Mat, and warpAffine it, except the next step is to crop the Mat based on the values that I get back from the warpAffine. See below:

        // rotate the hull points
        Mat hullPoints = Converters.vector_Point_to_Mat(piece.hullPoints);
        Imgproc.warpAffine(hullPoints, hullPoints, rotationMatrix, new Size(hullPoints.width(), hullPoints.height()));

        // find bounding rect with new hull points
        OpenCVForUnity.CoreModule.Rect rect = Imgproc.boundingRect(hullPoints);

        Imgproc.rectangle(hullPoints, rect, new Scalar(255, 255, 255, 255));

        Mat croppedHullPoints = new Mat(hullPoints, rect);

Except that I get an error on the cropping part with this error message:

Mat::n_1Mat__JIIII() : OpenCV(4.6.0-dev) C:\Users\satoo\Desktop\opencv\modules\core\src\matrix.cpp:767: error: (-215:Assertion failed) 0 <= _rowRange.start && _rowRange.start <= _rowRange.end && _rowRange.end <= m.rows in function ‘cv::Mat::Mat’

Which looks to be caused by an oversized Rect, but I got the points for the Rect from the hullPoints, so how is that possible?

Well, after investigation, the image produced after converting the points to a Mat is this:
(it’s a solid white line, hard to see on this background)
Which is nothing like what the points are supposed to represent:
(can’t upload a second image, it’s a square shape with a convex hull surrounding it, basically nothing like the first image)
This is presumably a problem with going from CV_8UC3 to CV_32SC2, no?

I think this is a problem with Converters.vector_Point_to_Mat(), is that the correct method to use?
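For context, here’s what I understand that conversion to produce, sketched in Python with made-up points (numpy standing in for the Mat):

```python
import numpy as np

# made-up hull points (x, y)
points = [(5, 5), (30, 10), (20, 40)]

# the Python analogue of what vector_Point_to_Mat gives back:
# N rows, 1 column, 2 channels (a CV_32SC2 Mat)
pts_mat = np.array(points, dtype=np.int32).reshape(-1, 1, 2)

rows, cols, channels = pts_mat.shape
```

So warpAffine sees an N×1 two-channel “image” of coordinate values, not a picture of the hull.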


first, apologies for our overeager spambot here …

please be more explicit. what’s “meaningful” here ?

warpAffine() transforms images, not points. might be simply the wrong function.

do you simply want to rotate your hull points ?
how do you derive the rotation matrix there, and what is its shape? (is it a homogeneous transform?)

That’s alright, rather an overeager one than another that doesn’t work :slight_smile:

Ah gotcha.
I have an image where I derive all this point information, but that image gets rotated, so I need to rotate the point information along with it.

The rotation matrix is derived like so:

Mat rotationMatrix = Imgproc.getRotationMatrix2D(minAreaRect.center, (-90f + minAreaRect.angle), 1);

Where minAreaRect is a RotatedRect surrounding the contours of the original image.

            MatOfPoint2f points = new MatOfPoint2f(Converters.vector_Point2f_to_Mat(allPiecePoints));

            RotatedRect minAreaRect = Imgproc.minAreaRect(points);

have a look at transform() instead of the warpAffine().

(and, idk, eventually you have to make homogeneous 3d points (add a 1 for z) from your 2d hull points)

I think that transform could work, but it won’t work with a CV_32SC2 mat, since it messes up all of the points.

Converters.vector_Point_to_Mat() is most certainly the issue here. Reading up on it, it’ll take an input of points and then convert them into a mat with just 1 column, which is not what I need. As such, I created a method to convert it myself, like so:

public Mat PointListToMat(List<Point> points, int rows, int cols)
{
        Mat mat = new Mat(rows, cols, CvType.CV_8UC3);
        for (int i = 0; i < rows; i++)
        {
            for (int j = 0; j < cols; j++)
            {
                if (points.Contains(new Point(i, j)))
                    mat.put(i, j, 255, 255, 255);
                else
                    mat.put(i, j, 0, 0, 0);
            }
        }
        return mat;
}

I’m really not sure if that’s how I’m supposed to do it, but it does work.

That code will give me an accurate points mat and I can warpAffine it, but cropping it still fails as the Rect returned by boundingRect on the hullPoints is still an empty set when applied to the Mat.

Do you think I would benefit from doing all of the rotation and cropping maths myself? Given a rotation matrix, I don’t see how it could be too difficult to apply it to each point in a list. Maybe? If you think that’s advantageous, do you have any tips for where to start with that?

Failing that, I think I’ll go back to square one and rethink my design, frankly there’s a couple of things that need updating I just really wanted to see my current method through.

in what way? show us some data. inputs, expected, actual.

Okay sure, I’ll try my best to explain why I don’t think it will work.

I’m creating a program to solve a jigsaw puzzle. It takes in an input image from a camera of all of the pieces laid out flat with no overlaps, corrects any perspective issues, then identifies each piece using inRange to create a mask and findContours to identify each piece’s edges.

Using the contour, I create a convex hull and then find convexity defects. These convexity defects are used to identify ‘tabs’ and ‘blanks’ in each piece of the jigsaw. Once I have the points that make up the tabs and blanks, I subtract them from the original contour list temporarily while I create a minAreaRect around the remaining parts of the piece, so that I can effectively rotate it to a 0 degree bearing from North.

This is where I’m at: the piece is rotated, but any information previously obtained is not (the convex hull, contours, tabs and blanks information, etc.), so I’m just hoping to rotate them along with the image. Everything I just explained needs to happen in that order, because I need those tabs and blanks to correct the rotation of the piece to a 0 degree bearing, so rotating the image first is not an option.

My issue is that I store the convex hull Points in a list (or Vector, whatever you prefer), and converting that into a Mat for rotation using warpAffine or transform gives a 32SC2 Mat, which from my testing, is not representative of the same ‘space’ as the image, I.E. the points don’t line up how they should.

Here’s an imgur album with a bit of everything that happens; maybe it’ll be easier to understand that way:

I decided to completely restart like I mentioned in the previous post, and crop from the beginning, but still no luck unfortunately. My main trouble at the moment is rotating the final image into a Mat that is “just the right size” for the rotation, I think I could do this if I can figure out how to rotate the convex hull, so back where I started really.

         // rotate the hull points
        Core.transform(convexHull, convexHull, rotationMatrix);

That’s how I currently rotate the hull points, but it returns a solid white line when I display it on the screen (presumably because it’s not an image) (see last imgur picture)

My question still remains, how do I rotate these points with the image?

I’m an idiot, this is working totally fine; my method for displaying it was wrong. I’ve successfully managed to rotate the hull points, woo! Sadly, that wasn’t my final problem: I’m now trying to warpAffine the image into a Mat that “perfectly” fits it, but I’m having some troubles. Images that don’t require much rotation work fine, but others that need more rotation have offset issues where the region of interest isn’t wholly within the image. See below:

Here’s the code I’m using for that:

    public void RotatePiece(Piece piece, Mat rotationMatrix)
    {
        // rotate all hull points
        Mat hullPoints = Converters.vector_Point_to_Mat(piece.hullPoints);
        Core.transform(hullPoints, hullPoints, rotationMatrix);

        Converters.Mat_to_vector_Point(hullPoints, piece.hullPoints);

        // create a bounding rect around the hull points so we know how big to make the destination image
        OpenCVForUnity.CoreModule.Rect rect = Imgproc.boundingRect(hullPoints);

        // rotate the image preview with correct width and height (Mat takes rows, cols)
        Mat mat = new Mat(rect.height, rect.width, CvType.CV_8UC4);

        Imgproc.warpAffine(piece.pieceMat, mat, rotationMatrix, new Size(rect.width, rect.height), Imgproc.INTER_NEAREST, Core.BORDER_REPLICATE);

        ProcessSprites.instance.AddProcessImage(mat, "PieceRotationCorrector");
    }

I tried following this post but it’s not quite working like it probably should, see below:

Any ideas?