I have an image with exactly 12 colors. When I run a median blur, it introduces new colors because it filters each channel independently. If I collapse to grayscale, I risk distinct colors merging into one. Any suggestions, or another library I could use?
yes, the median isn’t calculated on color tuples but on each color plane individually. to achieve this “atomicity” of colors, the “trick” is to use a palette.
in more detail:
kmeans or other clustering, to calculate a palette of the picture
turn picture from BGR to palette values (nearest neighbor lookup in the palette, for every pixel)
Thanks Alejandro, I’m familiar with how blur works. The concept of a MEDIAN blur/filter is that it ideally preserves the pixel values already in a photo. Directly quoting the OpenCV documentation:
Median Filter
The median filter run through each element of the signal (in this case the image) and replace each pixel with the median of its neighboring pixels (located in a square neighborhood around the evaluated pixel).
That said, OpenCV insists on treating each channel separately, thus leading to my issue.
Each channel of a multi-channel image is processed independently.
Thanks crack, that’s exactly the approach I have in mind.
I already have the list of colors/palette in the photo, and I don’t even need to do a nearest neighbor lookup. I can just map the colors to index values out of a list, since I know there are exactly 12 colors. i.e. I can map (50,50,50) to 1, (25,65,80) to 2, etc.
My current issue is implementing this mapping in numpy in a performant manner. Iterating pixel by pixel with a chain of conditionals doesn’t seem very efficient.
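since the palette is known and exact, the mapping can be done without any per-pixel loop by broadcasting the image against the palette. a minimal sketch, assuming a BGR uint8 image whose pixels all come from the palette (the palette here is hypothetical apart from the two colors mentioned above):

```python
import numpy as np

# hypothetical 12-color palette; (50,50,50) and (25,65,80) are from the post
rng = np.random.default_rng(1)
palette = np.vstack([np.array([[50, 50, 50], [25, 65, 80]]),
                     rng.integers(0, 256, (10, 3))]).astype(np.uint8)

# synthetic image built from the palette
img = palette[rng.integers(0, 12, (48, 48))]

# compare every pixel against every palette color in one broadcast:
# (H, W, 1, 3) == (1, 1, 12, 3) -> (H, W, 12, 3); all channels equal -> (H, W, 12)
matches = (img[:, :, None, :] == palette[None, None, :, :]).all(axis=-1)

# argmax over the palette axis gives the index of the matching color
index_img = matches.argmax(axis=-1).astype(np.uint8)

# the mapping inverts exactly: palette[index_img] reconstructs the image
restored = palette[index_img]
```

the intermediate `(H, W, 12, 3)` boolean array costs memory proportional to `36 * H * W` bytes; if that’s a concern for large images, packing each BGR triple into a single integer and using a lookup table is a leaner alternative.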
You are right, I confused median with average in my answer. OpenCV applies the median to each channel separately, which doesn’t preserve the original colors, and that was exactly your original point.