Estimate "true resolution" of an image; a simple approach

I need to estimate the “true resolution” of an image (I don’t have a better
word for it), in order to identify

  • poor optics (blurry, foggy, unsharp)
  • a poor sensor (e.g. noise from a too-small sensor, bad tuning, etc.)
  • “artificial” resizing (upscaling) after capture

I found that there are some rather academic “direct” approaches for this.

Still, I had an idea, and I wonder whether this “poor man’s approach”
might deliver acceptable results.

Take the original image:

  • shrink it with a detail-preserving algorithm (Lanczos etc.)
  • then resize it back to the original size
  • build a pixel-wise diff
  • if more than a certain fraction of pixels differ by more than some
    threshold: the original image has more resolution than the tested
    shrink size. Else less.
  • do interval nesting (binary search) on the resize values
    (rough sketch below)
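Here is a minimal sketch of what I mean, assuming Pillow and NumPy; the
function names and both threshold values are just placeholders I made up,
not anything established:

```python
# Rough sketch of the shrink/restore/diff idea, with interval nesting
# (binary search) on the scale factor. Thresholds are arbitrary guesses.
import numpy as np
from PIL import Image


def diff_fraction(img, scale, pixel_threshold=10):
    """Shrink by `scale`, resize back, and return the fraction of pixels
    whose absolute per-channel difference exceeds `pixel_threshold`."""
    w, h = img.size
    small = img.resize((max(1, round(w * scale)), max(1, round(h * scale))),
                       Image.LANCZOS)
    restored = small.resize((w, h), Image.LANCZOS)
    a = np.asarray(img, dtype=np.int16)       # int16 avoids uint8 underflow
    b = np.asarray(restored, dtype=np.int16)
    return float(np.mean(np.abs(a - b) > pixel_threshold))


def estimate_true_resolution(img, pixel_threshold=10,
                             fraction_threshold=0.01, iterations=8):
    """Binary search for the smallest scale at which the shrink/restore
    round trip still looks (near-)lossless."""
    lo, hi = 0.0, 1.0  # lo: known lossy, hi: known (near-)lossless
    for _ in range(iterations):
        mid = (lo + hi) / 2
        if diff_fraction(img, mid, pixel_threshold) > fraction_threshold:
            lo = mid   # too much detail lost: true resolution is higher
        else:
            hi = mid   # round trip survives: true resolution is at most this
    w, h = img.size
    return round(w * hi), round(h * hi)


img = Image.open("photo.jpg").convert("RGB")
print(estimate_true_resolution(img))
```

The two thresholds (per-pixel difference and fraction of differing pixels)
would obviously need tuning, and compression noise alone might already push
the diff over the threshold.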

Will this do? Or is it complete nonsense?

Maybe it is too slow compared to other algorithms, or depends too much
on particular image characteristics, etc.

TIA :slight_smile: