Hi,
I'm using a Python script to receive images from a depth camera, and when I started reading in the values of each pixel I had to normalize them to actually display the image with imshow().
So I used this command to normalize my image:
cv2.normalize(depth_array, depth_array, 0, 1, cv2.NORM_MINMAX)
depth_array is a 2D numpy array that contains the depth value for each pixel.
After working with it for some time I realised that the closest objects were always the darkest, no matter how near they actually were. The reason is that cv2.NORM_MINMAX takes all the values that were read in and normalises them between the minimum and maximum of that particular frame, not a fixed range.
What I want instead is for the images to be normalised from the minimum distance of the depth camera to its maximum distance, so that the fixed range 0.2 m to 8 m gets mapped to 0 to 1.
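To make clearer what I mean, this is a sketch of the behaviour I'm after (assuming the depth values are in metres and that 0.2 and 8 are the camera's range limits):

```python
import numpy as np

DEPTH_MIN = 0.2  # assumed minimum range of the camera in metres
DEPTH_MAX = 8.0  # assumed maximum range of the camera in metres

def normalize_fixed(depth_array):
    # Map the fixed sensor range [0.2, 8] to [0, 1] instead of the
    # per-frame min/max that cv2.NORM_MINMAX uses, so the same distance
    # always gets the same brightness.
    scaled = (depth_array - DEPTH_MIN) / (DEPTH_MAX - DEPTH_MIN)
    # Clip values outside the sensor range so the result stays in [0, 1].
    return np.clip(scaled, 0.0, 1.0)
```

With this, an object at 4.1 m would always come out as 0.5, regardless of what else is in the frame.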
Is there a different way to normalize the image so that it works like I need it to?
I'm really sorry for the bad English, I'm not a native speaker.
Thanks a lot in advance