The data is 2D; it comes from a pair of profiling sensors.
The scan data has a slight clockwise rotation. I have no control over how the pieces are handled and presented to the sensors, so I have to assume they will always be in slightly different positions and at slightly different angles.
I manipulated the images in GIMP and found an approximate scale factor of 0.96 (that is, the scanned image is about 4% smaller than the reference image, based on measuring the outside distance between the top and bottom tines; it looks like a fork to me, so I'm calling them tines, though I'm not sure what the part actually is).
So, first: there does appear to be a scale difference between the two, so if you aren't expecting one, you might want to sort that out first.
I just copied and pasted the scanned image on top of the reference image and manually aligned them by rotating the pasted layer. I found that the scanned image was rotated about 0.7 degrees clockwise relative to the reference image.
So how to automate this? If it's for just this one item, I'd probably do something ad hoc: find all the lines that are within roughly ±5 degrees of horizontal, compute an average angle for the two groups, then take the difference from the reference. That's your rotation. I'd probably use the front points of the tines as a reference for translation.
There is probably a much better / more robust / more general way to approach it, but I'd start with what I described above and see how well I could make it work.
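The angle-averaging step above can be sketched in plain NumPy. In practice the segments would come from something like `cv2.HoughLinesP` on an edge image; the synthetic segments below are made up just to exercise the math.

```python
import numpy as np

def mean_line_angle(segments, tol_deg=5.0):
    """Average angle (degrees) of the segments that lie within
    +/- tol_deg of horizontal. segments: (N, 4) array of x1, y1, x2, y2."""
    seg = np.asarray(segments, dtype=float)
    dx = seg[:, 2] - seg[:, 0]
    dy = seg[:, 3] - seg[:, 1]
    ang = np.degrees(np.arctan2(dy, dx))
    # fold into (-90, 90] so the direction a segment was traced doesn't matter
    ang = (ang + 90.0) % 180.0 - 90.0
    return ang[np.abs(ang) <= tol_deg].mean()

# synthetic check: reference segments are horizontal, "scanned" ones
# are the same segments rotated by 0.7 degrees
ref = np.array([[0, 0, 100, 0], [0, 50, 100, 50]], dtype=float)
theta = np.radians(0.7)
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
scan = (ref.reshape(-1, 2) @ R.T).reshape(-1, 4)

rotation = mean_line_angle(scan) - mean_line_angle(ref)  # ~0.7 degrees
```

The folding step matters with real Hough output, since a segment reported right-to-left would otherwise contribute an angle near ±180 degrees and wreck the average.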
Iterative Closest Point (ICP). Your data is points; treat it as such.
Or else chamfer matching, which involves a distance transform. That should work tolerably here.
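The chamfer idea in one small sketch: score a candidate pose by the average distance from each transformed scan point to the nearest reference point, and search for the pose that minimizes it. This demo computes nearest-point distances directly; in a real pipeline you would rasterize the reference edges and precompute a distance transform (e.g. `cv2.distanceTransform`) so each lookup is O(1). All the shapes and names here are invented for the demo.

```python
import numpy as np

def chamfer_cost(pts, ref):
    """Mean distance from each point to its nearest reference point."""
    d2 = ((pts[:, None, :] - ref[None, :, :]) ** 2).sum(-1)
    return np.sqrt(d2.min(axis=1)).mean()

def rotate(pts, deg, center):
    a = np.radians(deg)
    R = np.array([[np.cos(a), -np.sin(a)],
                  [np.sin(a),  np.cos(a)]])
    return (pts - center) @ R.T + center

# reference "tines": two horizontal edges, sampled every pixel
xs = np.arange(5.0, 56.0)
ref = np.concatenate([np.stack([xs, np.full_like(xs, 10.0)], axis=1),
                      np.stack([xs, np.full_like(xs, 50.0)], axis=1)])
center = np.array([30.0, 30.0])

# "scan": the same edges rotated 2 degrees
scan = rotate(ref, 2.0, center)

# brute-force search over candidate rotations; the minimum-cost
# angle is the one that undoes the scan's rotation
angles = np.arange(-5.0, 6.0, 1.0)
costs = [chamfer_cost(rotate(scan, a, center), ref) for a in angles]
best = angles[int(np.argmin(costs))]  # -2.0: rotating back by 2 deg aligns them
```

The same search extends to translation (and scale) by adding those to the candidate grid, which is exactly where the precomputed distance transform pays off.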
If you need an initialization for ICP, you could compute local descriptors on your point set (or sequence…). Orientation (a linear fit to the local neighborhood) and/or curvature might describe features of this data. Or just drag it into place manually.
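For small point sets, rigid ICP fits in a few dozen lines. Below is a minimal NumPy sketch with brute-force nearest neighbors and a Kabsch (SVD) fit; real data would want a KD-tree (e.g. `scipy.spatial.cKDTree`) for the matching step, and the ~4% scale handled first, either by pre-scaling or by fitting a similarity transform instead of a rigid one. All function names here are made up for the sketch.

```python
import numpy as np

def best_rigid(src, dst):
    """Least-squares rotation + translation mapping src onto dst (Kabsch)."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:   # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, cd - R @ cs

def icp(src, dst, iters=50):
    """Match each src point to its nearest dst point, solve for the
    rigid transform, apply it, repeat. Returns the accumulated R, t."""
    cur = src.copy()
    R_total, t_total = np.eye(2), np.zeros(2)
    for _ in range(iters):
        d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
        matched = dst[d2.argmin(axis=1)]
        R, t = best_rigid(cur, matched)
        cur = cur @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total

# synthetic check: rotate/translate a point cloud, then recover the transform
rng = np.random.default_rng(0)
ref = rng.uniform(0, 100, (200, 2))
theta = np.radians(0.7)
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
scan = ref @ R_true.T + np.array([1.0, -0.5])

R_est, t_est = icp(scan, ref)          # maps scan back onto ref
angle = np.degrees(np.arctan2(R_est[1, 0], R_est[0, 0]))  # ~ -0.7
```

Note the recovered angle is the *inverse* of the applied rotation, since we asked for the transform that takes the scan back to the reference; that convention choice trips people up more often than the SVD does.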