I am trying to find the linear displacement of a needle by tracking a patterned feature on the needle.
So far I have been able to extract the linear pattern using binary thresholding.
import cv2

# Function to process the frame and isolate the bright pattern
def detect_edges(frame):
    # Convert the frame to grayscale
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Apply Gaussian blur to reduce noise
    blurred = cv2.GaussianBlur(gray, (9, 9), 0)
    # Apply binary thresholding
    _, thresholded = cv2.threshold(blurred, 180, 255, cv2.THRESH_BINARY)
    return thresholded
Then I isolated the pattern inside a manually chosen ROI, and took the intensity values along a vertical line through the middle of that ROI.
import numpy as np

# Function to get the mean pixel intensities along a vertical line
# in the middle of the ROI, averaged in chunks of 'step' rows
def get_pixel_intensity(frame, roi, step):
    # Extract the region of interest (roi = (x, y, w, h))
    roi_frame = frame[roi[1]:roi[1]+roi[3], roi[0]:roi[0]+roi[2]]
    # Get the middle column of the ROI
    middle_column = roi_frame[:, roi_frame.shape[1] // 2]
    # Split the middle column into chunks of size 'step' and average each chunk
    mean_intensities = [np.mean(chunk) for chunk in np.array_split(middle_column, len(middle_column) // step)]
    return mean_intensities
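As a sanity check on the chunk-averaging step, this is what it does on a synthetic middle column (the fake intensity values and the choice of `step` here are made up purely for illustration):

```python
import numpy as np

# Fake middle-column intensities: two bright "teeth" on a dark background
column = np.array([0] * 5 + [255] * 5 + [0] * 5 + [255] * 5, dtype=float)
step = 5

# Same chunking as in get_pixel_intensity: split into len//step chunks,
# then take the mean of each chunk
chunks = np.array_split(column, len(column) // step)
mean_intensities = [float(np.mean(c)) for c in chunks]
print(mean_intensities)  # -> [0.0, 255.0, 0.0, 255.0]
```

Each entry in `mean_intensities` is one chunk of the profile, so a tooth shows up as a run of bright chunks.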
This allowed me to produce an output like the one shown, with each block representing a tooth of the pattern.
But this is where I am struggling: how do I translate this into a vertical displacement? For example, how could I compare two consecutive frames to determine the displacement between them?
There are some issues slightly affecting this: the number of striations representing a tooth in one frame may not match the number in the next frame, and parts of a tooth will eventually move out of frame as the needle moves.
Should I be trying to find something like a midpoint for each tooth to use when comparing adjacent frames?
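To make the question concrete, this is a sketch of the kind of frame-to-frame comparison I am imagining: cross-correlating the intensity profiles of two frames and taking the lag of the correlation peak as the shift. The `estimate_shift` name and the sample profiles are made up for illustration, not working code from my project:

```python
import numpy as np

def estimate_shift(prev_profile, curr_profile):
    """Estimate how far the pattern moved (in chunks) between two
    mean-intensity profiles, using the peak of their cross-correlation."""
    prev = np.asarray(prev_profile, dtype=float)
    curr = np.asarray(curr_profile, dtype=float)
    # Remove the mean so the correlation responds to the pattern,
    # not to overall brightness
    prev -= prev.mean()
    curr -= curr.mean()
    # Full cross-correlation; the lag of the peak is the displacement
    corr = np.correlate(curr, prev, mode="full")
    return int(np.argmax(corr) - (len(prev) - 1))

# Synthetic profiles: two "teeth", then the same pattern moved down 3 chunks
prev = np.zeros(30)
prev[5:8] = prev[15:18] = 255
curr = np.roll(prev, 3)
print(estimate_shift(prev, curr))  # -> 3
```

Would something like this be robust to the issues above (striation counts changing between frames, teeth leaving the frame), or is a per-tooth midpoint comparison the better route?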