# How to find how much the arrow is rotated?

Hi guys,

I am trying to figure out how much an arrow is rotated in an image.
this is the arrow:

and here's the image that contains the rotated arrow:

I can locate the arrow by using template matching.

however, I would like to know how much the arrow is rotated.
if the arrow is pointing north, the rotation is 0 degrees.
if the arrow is pointing south, the rotation is pi (180) degrees.
if the arrow is pointing east, the rotation is pi/2 (90) degrees.
if the arrow is pointing west, the rotation is 3/2 pi (270) degrees.

the arrow can point in any direction, not just the four I mentioned.
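For reference, the convention described above (north = 0°, increasing clockwise) is a compass heading. A minimal Python sketch of that mapping, assuming image coordinates with +x pointing right and +y pointing down (the function name is mine, not from any library):

```python
import math

def heading_from_vector(dx, dy):
    """Compass heading in degrees of direction (dx, dy) in image coordinates,
    where +x is right (east) and +y is down (south).
    North (up) = 0, east = 90, south = 180, west = 270."""
    # atan2(dx, -dy): north is the direction (0, -1) in image coordinates
    return math.degrees(math.atan2(dx, -dy)) % 360.0

print(round(heading_from_vector(0, -1), 1))  # 0.0   (north)
print(round(heading_from_vector(1, 0), 1))   # 90.0  (east)
print(round(heading_from_vector(0, 1), 1))   # 180.0 (south)
print(round(heading_from_vector(-1, 0), 1))  # 270.0 (west)
```

Once you know which way the arrow's tip points relative to its center, this turns that vector into the heading you want.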

I am new to OpenCV and need a bit of guidance to point me in the right direction.
I've tried template matching and scaled template matching; I can get the location of the arrow even when it's rotated in the picture, but I am not sure how to get the rotation.

please note the image containing the rotated arrow can be high or low resolution, depending on the setup. the image I posted is from a very low-resolution setup.

here's the bit of code I use to locate the template (the arrow). I use a C# wrapper of OpenCV (Emgu CV), but I believe you'll understand what I am doing here.

// run normalized cross-correlation between the scene and the template
CvInvoke.MatchTemplate(imgScene, template, imgout, Emgu.CV.CvEnum.TemplateMatchingType.CcorrNormed);

double minVal = 0.0;
double maxVal = 0.0;
Point minLoc = new Point();
Point maxLoc = new Point();

// for CcorrNormed the best match is the maximum of the response map
CvInvoke.MinMaxLoc(imgout, ref minVal, ref maxVal, ref minLoc, ref maxLoc);

// draw the matched region onto the scene in green
Rectangle r = new Rectangle(maxLoc, template.Size);
CvInvoke.Rectangle(imgScene, r, new MCvScalar(0, 255, 0));

thanks very much in advance.

P

this can probably be solved with a filter bank that contains many rotated variants of the arrow. steps of perhaps 5-20 degrees should suffice.

you get a result like matchTemplate, but for every instance in the filter bank.

filter2D or matchTemplate would be used.
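To illustrate what such a filter bank is, here's a minimal, dependency-free numpy sketch. `rotate_nn` is my own stand-in for what `cv2.getRotationMatrix2D` + `cv2.warpAffine` would do in a real pipeline:

```python
import numpy as np

def rotate_nn(img, deg):
    """Rotate img by deg degrees about its center, nearest-neighbor.
    Positive angle rotates counter-clockwise on screen, matching
    cv2.getRotationMatrix2D's convention. Stand-in for warpAffine."""
    h, w = img.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    t = np.deg2rad(deg)
    ys, xs = np.mgrid[0:h, 0:w]
    # inverse mapping: for each output pixel, find the source pixel
    sx = np.cos(t) * (xs - cx) - np.sin(t) * (ys - cy) + cx
    sy = np.sin(t) * (xs - cx) + np.cos(t) * (ys - cy) + cy
    sx = np.rint(sx).astype(int)
    sy = np.rint(sy).astype(int)
    ok = (sx >= 0) & (sx < w) & (sy >= 0) & (sy < h)
    out = np.zeros_like(img)
    out[ok] = img[sy[ok], sx[ok]]
    return out

# a vertical bar as a stand-in "arrow" template
template = np.zeros((15, 15), np.float32)
template[2:13, 7] = 1.0

# the filter bank: one rotated copy of the template per candidate angle
bank = {a: rotate_nn(template, a) for a in range(0, 360, 10)}
print(len(bank))  # 36
```

Each entry of `bank` is then matched against the scene; the angle whose filter responds best is your rotation estimate.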

so you cannot use plain template matching with a single template
(or you would have to rotate the image a few degrees for each attempt)

also, do you absolutely have to use a triangular arrow? that's a bad choice; it has 120° rotation ambiguities

I have a proof of concept.

• center the template on the eye of the arrow
• create rotated filter instances (getRotationMatrix2D, warpAffine). that’s a “filter bank”.
• matchTemplate with TM_SQDIFF for every filter in the bank (yes that’s expensive, takes half a second). best results are minima (least difference).
• non-maximum suppression (using dilate in case of maxima, or erode in case of minima, and then equality comparison)
• also thresholding on a manually chosen threshold
• for each extremum, get index (thus angle) of filter with best response (np.argmin)
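The steps above can be sketched end-to-end in numpy. This toy of mine uses only four 90° rotations via `np.rot90` as the bank, and a naive SSD loop in place of `cv2.matchTemplate` with `TM_SQDIFF`; it skips the NMS and thresholding steps since the toy scene holds a single arrow:

```python
import numpy as np

def ssd_map(scene, tmpl):
    """Sum-of-squared-differences response map (what TM_SQDIFF computes);
    smaller values mean better matches."""
    H, W = scene.shape
    h, w = tmpl.shape
    out = np.empty((H - h + 1, W - w + 1))
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            d = scene[y:y + h, x:x + w] - tmpl
            out[y, x] = (d * d).sum()
    return out

# toy template: an L-shape, so each 90-degree rotation looks different
tmpl = np.zeros((7, 7), np.float32)
tmpl[1:6, 3] = 1.0
tmpl[1, 3:6] = 1.0

# toy scene: the template rotated 180 degrees, pasted at row 10, col 5
scene = np.zeros((30, 30), np.float32)
scene[10:17, 5:12] = np.rot90(tmpl, 2)

# the "filter bank": just 4 angles here, via np.rot90 (counter-clockwise);
# finer steps would come from cv2.getRotationMatrix2D + cv2.warpAffine
bank = {a: np.rot90(tmpl, a // 90) for a in (0, 90, 180, 270)}

# best (smallest) SSD per bank entry, then pick the winning angle
scores = {a: ssd_map(scene, f).min() for a, f in bank.items()}
best_angle = min(scores, key=scores.get)
resp = ssd_map(scene, bank[best_angle])
loc = tuple(int(i) for i in np.unravel_index(resp.argmin(), resp.shape))
print(best_angle, loc)  # 180 (10, 5)
```

In the real pipeline the per-angle response maps are additionally thresholded and non-maximum-suppressed so multiple arrows can be found at once.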

I get a heading of 306°, or 54° counter-clockwise, for your picture. testing 1° steps took me 4-5 seconds (10° steps take half a second). this could probably be done quicker, with some tricks.

convolution/correlation (filter2D) turned out to be bad at this, even though it's fast. these "signals" contain DC components, which always throw simple multiplication off. TM_SQDIFF uses the sum of squared differences, which models the situation a lot better.
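A tiny 1-D illustration of that DC problem (my own toy example, not from the original post): a bright featureless patch out-scores the true match under plain correlation, but not under SSD.

```python
import numpy as np

tmpl = np.array([0., 1., 0., 1., 0.])

# a 1-D "scene": the template at offset 3, plus a bright flat (DC) patch
sig = np.zeros(16)
sig[3:8] = tmpl
sig[10:15] = 5.0

def corr(sig, tmpl):
    # plain correlation, as filter2D would compute (higher = "better")
    return np.array([(sig[i:i + 5] * tmpl).sum() for i in range(len(sig) - 4)])

def ssd(sig, tmpl):
    # sum of squared differences, as TM_SQDIFF computes (lower = better)
    return np.array([((sig[i:i + 5] - tmpl) ** 2).sum() for i in range(len(sig) - 4)])

print(corr(sig, tmpl).argmax())  # lands in the bright patch, not at 3
print(ssd(sig, tmpl).argmin())   # 3 -- the true match
```

Normalized variants (like `TM_CCOEFF_NORMED`) also mitigate this, but SSD models the "least difference" criterion directly.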

it's doable. I'll let you puzzle over the details of this; the list above should suffice.

visible here:

what other choice do I have here? I'm using template matching, and even when the arrow is rotated I can still get its position accurately. I assume template matching is doing something smart by comparing the original arrow (pointing north) with the source image I have.

ideally i would like to return the result at the 100-millisecond level; even 0.5 sec is too slow for what i'm trying to do …

I would recommend hooking into the game’s process and extracting the original position values from its process memory directly.

as I said, with tricks, processing time can be reduced severely. do not expect anyone here to do the work for you for free.
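One such trick (my assumption, not necessarily what was meant here): a coarse-to-fine angle search that scans in 10° steps and then refines around the winner in 1° steps, so roughly 36 + 21 score evaluations instead of 360:

```python
import numpy as np

def coarse_to_fine(score, coarse=10, fine=1):
    """Minimize score(angle) over 0..359 with a coarse scan followed by a
    local refinement, instead of evaluating every degree. Assumes score is
    roughly unimodal near the optimum (reasonable for SSD against a clean
    filter bank)."""
    angles = np.arange(0, 360, coarse)
    best = min(angles, key=score)                       # 36 evaluations
    around = np.arange(best - coarse, best + coarse + 1, fine) % 360
    return min(around, key=score)                       # ~21 more

# toy score function with its minimum at 306 degrees (circular distance);
# in the real pipeline score(a) would be the minimum of the SSD map for
# the bank filter rotated by a degrees
true = 306
score = lambda a: min(abs(a - true), 360 - abs(a - true))
print(coarse_to_fine(score))  # 306
```

Other common speedups: run the search on a downscaled image first, or crop the scene to the region template matching already localized before testing angles.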