I believe there is a bug in cv2.resize in conjunction with INTER_AREA interpolation, but I wanted to ask here first in case I missed something in the docs. My question: is this a bug, or where is this behavior documented?
Resizing a 3x1 image to 1x2 should produce two pixels, each equal to the average of the three input pixels, but that does not seem to be the case.
Example
3x1 input image: [[0.1, 0.2, 0.3]]
Resized to 1x1: [[(0.1 + 0.2 + 0.3) / 3]] = [[0.2]] (works as expected)
Expected result when resizing to 1x2:
[[0.2],
[0.2]]
Actual result when resizing to 1x2:
[[0.16666667]
[0.16666667]]
Code example
import numpy as np
import cv2
src = np.float64([[0.1, 0.2, 0.3]])  # 3 wide, 1 high
print("Resized to 1x1:\n")
print(cv2.resize(src, (1, 1), interpolation=cv2.INTER_AREA))
print("\nResized to 1x2:\n")
print(cv2.resize(src, (1, 2), interpolation=cv2.INTER_AREA))
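For comparison, here is a small follow-up sketch of my own (using the same src as above) that builds the result I would expect by hand, i.e. both output pixels set to the plain average of the three input pixels, and prints it next to what cv2.resize returns:
import numpy as np
import cv2
src = np.float64([[0.1, 0.2, 0.3]])
# Expected: every output pixel equals the average of all three input pixels.
expected = np.full((2, 1), src.mean())  # [[0.2], [0.2]]
actual = cv2.resize(src, (1, 2), interpolation=cv2.INTER_AREA)
print("expected:\n", expected)
print("actual:\n", actual)
print("difference:\n", actual - expected)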
You squeeze a 1x3-shaped matrix (3 wide, 1 high) into a 2x1-shaped matrix (1 wide, 2 high).
That is an unusual thing to do, but okay.
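To make the shape bookkeeping explicit (a small sketch of my own): cv2.resize takes dsize as (width, height), while a NumPy array's .shape is (rows, columns), i.e. (height, width):
import numpy as np
import cv2
src = np.float64([[0.1, 0.2, 0.3]])
print(src.shape)  # (1, 3): 1 row, 3 columns -> 3 wide, 1 high
dst = cv2.resize(src, (1, 2), interpolation=cv2.INTER_AREA)  # dsize = (width=1, height=2)
print(dst.shape)  # (2, 1): 2 rows, 1 column -> 1 wide, 2 high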
Both pixels of the result should be equal, but their value can be unexpected. I see now why you'd expect it to be 0.2; I would actually think that too.
Be aware that OpenCV uses fixed-point (integer) math in some of its code paths. That can be (barely) noticeable in some instances.
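If you want to check whether integer rounding plays a role here, one way (just a sketch, not something from the original post) is to compare a uint8 version of the image against the float64 one; for floating-point input I would expect OpenCV to stay in floating-point arithmetic:
import numpy as np
import cv2
src_f = np.float64([[0.1, 0.2, 0.3]])
src_u8 = np.uint8([[25, 51, 76]])  # roughly the same values scaled to the 0..255 range
# Any rounding from fixed-point arithmetic would show up in the uint8 result,
# while the float64 result should be free of it.
print(cv2.resize(src_f, (1, 2), interpolation=cv2.INTER_AREA))
print(cv2.resize(src_u8, (1, 2), interpolation=cv2.INTER_AREA))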