INTER_AREA unexpected result

I believe that there is a bug in cv2.resize in conjunction with INTER_AREA interpolation, but I wanted to ask here first just in case I missed something in the docs. My question is: Is this a bug or where is this behavior documented?

Resizing a 3x1 image to 1x2 should produce two pixels where the value of both is equal to the average of the 3 pixels, but that does not seem to be the case.

  • 3x1 input Image: [[0.1, 0.2, 0.3]]
  • Resized to 1x1: [[(0.1 + 0.2 + 0.3) / 3]] = [[0.2]] (works correctly as expected)
  • Expected result when resizing to 1x2: [[0.2], [0.2]] (both pixels equal to the average)

Actual result when resizing to 1x2:

Code example
import numpy as np
import cv2

src = np.float64([[0.1, 0.2, 0.3]])

print("Resized to 1x1:\n")
print(cv2.resize(src, (1, 1), interpolation=cv2.INTER_AREA))

print("\nResized to 1x2:\n")
print(cv2.resize(src, (1, 2), interpolation=cv2.INTER_AREA))
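For comparison, the expected area-averaged result can be computed directly in NumPy. This is a minimal sketch of the behavior I would expect (each of the two output pixels covers the entire 3-pixel input row, so both should be the mean), not a claim about OpenCV's actual implementation:

```python
import numpy as np

src = np.float64([[0.1, 0.2, 0.3]])

# Each output pixel of a 1x2 result spans the full input row, so
# area averaging should yield mean(src) = 0.2 for both pixels.
expected = np.full((2, 1), src.mean())
print(expected)
```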

you squeeze a 1x3-shaped matrix (3 wide, 1 high) into a 1-wide, 2-high matrix (2x1-shaped).

that is a weird thing to do, but okay.

both pixels of the result should be equal, but their exact value can be unexpected. I see now why you’d expect it to be 0.2; I actually would think that too.

be aware that opencv uses fixed point (integer) math. that can be (barely) noticeable in some instances.
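To illustrate what fixed-point weights can do (an assumed 15-bit weight scale for illustration; the exact scale and rounding in OpenCV's integer paths may differ), note that a weight like 1/3 cannot be represented exactly, so the quantized weights no longer sum to exactly one:

```python
import numpy as np

SCALE = 1 << 15                     # assumed fixed-point weight scale
w = round(SCALE / 3)                # 1/3 as a fixed-point weight
print(3 * w, SCALE)                 # 32769 != 32768: weights don't sum to 1

vals = np.array([10, 20, 30], dtype=np.int64)
acc = int((vals * w).sum())
result = (acc + SCALE // 2) >> 15   # rounding shift back to pixel range
print(result)                       # equals the exact mean 20 here, but the
                                    # weight quantization can shift other
                                    # inputs by one least-significant bit
```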

maybe this helps with the details:

I’d expect that OpenCV would use the floating point resize implementation since the input is not uint8 and conversion would be inefficient.

Anyway, this really seems to be a bug, so I’ve opened an issue now :slight_smile: INTER_AREA unexpected result · Issue #21472 · opencv/opencv · GitHub

I haven’t looked into it much at all, but reading this page:

(figure 3, specifically) would lead me to expect 0.133, 0.266 for the pixel values, but I might be misunderstanding something.
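Those numbers correspond to reading the operation as splitting the 3-pixel row into two 1.5-pixel footprints and averaging each one. A quick sketch of that interpretation (my own arithmetic under that assumption, not OpenCV's code):

```python
import numpy as np

src = np.array([0.1, 0.2, 0.3])

# Assumed footprints: output pixel 0 covers input span [0.0, 1.5)
# (all of src[0] plus half of src[1]); output pixel 1 covers
# [1.5, 3.0) (half of src[1] plus all of src[2]).
p0 = (1.0 * src[0] + 0.5 * src[1]) / 1.5
p1 = (0.5 * src[1] + 1.0 * src[2]) / 1.5
print(p0, p1)   # roughly 0.133 and 0.267
```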