A lot of webcam-class cameras support some sort of single shot mode, and that might be a path worth exploring. I’m thinking of triggering a single shot simultaneously on both cameras. You’d have to be ok with (presumably) a lower frame rate, jitter (you might have trouble getting frames exposed at a consistent interval) and some amount of skew between the pseudo-synchronized “streams.” To be clear, I don’t know that this will work at all, but I think it has a good chance, at least for some cameras.
If I were in your shoes, I’d start by doing something like:
Set up single-shot mode on your camera, point it at a high-resolution stopwatch (the stopwatch on my iPhone has been good enough for me in the past), and take successive snapshots with random delays in between.
- Note the system time just prior to triggering a single shot.
- Note the system time just after receiving the frame.
- Note the timestamp field in the buffer that was returned.
- Save the image along with trigger time, return time, and buffer timestamp (I usually encode these in the image filename, but you can get as fancy as you want).
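As a sketch of that logging step - here `camera.grab_single_shot()` is a placeholder for whatever single-shot API your stack actually exposes (V4L2, OpenCV, or a vendor SDK), and the filename scheme is just my habit, not anything standard:

```python
import time

def encode_filename(index, trigger_t, return_t, buffer_ts):
    # Pack all three timing values into the filename so each saved image
    # carries its own timing record.
    return (f"shot_{index:04d}"
            f"_trig{trigger_t:.6f}"
            f"_ret{return_t:.6f}"
            f"_buf{buffer_ts:.6f}.png")

def capture_one(camera, index):
    # Note the system time just before triggering, and again just after
    # the frame comes back; keep the driver's buffer timestamp too.
    trigger_t = time.monotonic()
    frame, buffer_ts = camera.grab_single_shot()  # hypothetical API
    return_t = time.monotonic()
    return frame, encode_filename(index, trigger_t, return_t, buffer_ts)
```

Using a monotonic clock here matters: wall-clock time can jump (NTP, DST) mid-experiment.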
For extra credit, analyze the captured image and extract the current time on the stopwatch.
I’d be looking at how consistent the relationship is between the stopwatch time in the image and when you triggered the single shot. If it’s consistent, then maybe you have a path forward with this. Repeat with 2 cameras, triggering them simultaneously. Or, I guess you could just start with two cameras and compare the images they captured - just be careful to position the cameras the same, so you don’t get discrepancies due to the rolling shutter (which is presumably what your webcam uses).
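One way to quantify that consistency, assuming you’ve already extracted the stopwatch readings from the images (by OCR or by hand - that part is up to you): compute the per-shot offset between trigger time and stopwatch time, then look at its spread. Plain arithmetic, nothing camera-specific:

```python
def offset_stats(trigger_times, stopwatch_times):
    # For each shot, the offset between when we triggered and what the
    # stopwatch in the image shows. If single-shot latency is consistent,
    # the offsets cluster tightly; the spread is your expected jitter.
    offsets = [s - t for t, s in zip(trigger_times, stopwatch_times)]
    mean = sum(offsets) / len(offsets)
    spread = max(offsets) - min(offsets)
    return mean, spread
```

A mean offset is fine (it’s just fixed latency you can subtract out); a large spread is what kills the approach.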
Another option is to run two streams at once and inspect the timestamp values. Restart one of the streams until you get timestamps that are close enough to each other for your needs. Just be ready for drift between the two streams. (Or, instead of fully restarting the stream, you could call VIDIOC_STREAMOFF and then VIDIOC_STREAMON at the right time.)
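A sketch of that restart loop, with `restart_b` standing in for whatever actually stops and restarts the second stream (e.g. via VIDIOC_STREAMOFF/VIDIOC_STREAMON) and returns the timestamp of the first buffer it then delivers - a hypothetical callback, not a real API:

```python
def align_streams(first_ts_a, restart_b, tolerance_s=0.005, max_tries=50):
    # Keep restarting stream B until its first buffer timestamp lands
    # within tolerance of stream A's first buffer timestamp.
    for attempt in range(1, max_tries + 1):
        ts_b = restart_b()
        if abs(first_ts_a - ts_b) <= tolerance_s:
            return ts_b, attempt
    raise RuntimeError("could not align streams within tolerance")
```

Since restart timing is essentially random relative to the other stream, expect the number of attempts to vary run to run; a cap like `max_tries` keeps it from spinning forever.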
One more idea is to run two streams and inspect the timestamps coming off of both streams, then intentionally delay the return of the buffer to one of the streams to try to force them into closer synchronization. To do this you will need to hold on to all of the buffers, and choose when to return the first buffer to the camera. This is similar to the “software genlock” trick that has been used for synchronizing video outputs - that is to say, a hack, but one that might just work.
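A sketch of the delay calculation for that trick. The big assumption - which is camera dependent and worth measuring before you rely on it - is that holding a buffer for d seconds before requeueing it shifts the camera’s subsequent exposures later by d:

```python
def requeue_delay(ts_a, ts_b, frame_interval_s):
    # How long to hold camera B's buffer before giving it back, so B's
    # exposure phase slides onto A's. Both cameras are assumed to run at
    # the same nominal frame interval. We can only ever delay (never
    # advance), so we always push B forward to A's next frame slot.
    skew = (ts_b - ts_a) % frame_interval_s
    return (frame_interval_s - skew) % frame_interval_s
```

Drift between the two cameras’ clocks means the phase error creeps back, so in practice you’d re-measure the skew and re-apply a (smaller) delay periodically rather than doing this once.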
Again, these are all just ideas, and they require varying degrees of control over the cameras. Maybe there is enough support via OpenCV to do the single-shot test, but the other ones will need lower-level access, so you either need to control the camera directly (e.g. using the V4L2 interface) or be willing/able to edit and compile the OpenCV code to get at the bits you care about.