Compensate micromovement between stills

I am building a photogrammetric head scanner that aims to produce high-resolution normal maps (8k+) from multi-angle lighting. The goal is to capture image sets of at least 8 lighting situations in the shortest possible amount of time. While I am waiting for the hardware to arrive, I’d like to get an overview of the necessary technologies.

The problem:
Multi-angle lighting requires a static subject - people are anything but.
To compensate for the subject's movement between shots, my idea is to track features between images, store the per-pixel deformation as a vector field, and warp the current image to match the reference image based on those stored vectors.

Are there existing projects or resources that already accomplish this? Where could I learn about the necessary techniques?

Thank you for your time