Hi Laurent,

On Sun, 2007-05-13 at 23:14 +0200, Laurent Pinchart wrote:
> Hi Julien,
> 
> 
> On Friday 11 May 2007, julien quelen wrote:
> > Hi all,
> >
> > I am trying to build a synchronized multiple cameras capture.
> > I started with 2 Logitech QuickCam UltraVision USB webcams.
> >
> > I have difficulties synchronizing them.
> >
> > In my tests, I manage to extract the exact timestamp of each captured
> > frame.
> 
> I'm curious, how did you do that ? Which time source did you use ?
> 

On the one hand, in the USB packets we have:
- a PTS field containing the device clock time when the current video
frame capture began;
- an SCR field containing the device clock time sampled at USB packet
emission.

On the other hand, we have the host clock time when we receive a
complete URB.

What I need is the host clock time at which a new video frame capture begins.

To get that, I make the following approximation: for a given URB, the
SCR time of the URB's last packet and the host time at which that URB
was received complete correspond to the same real time.

So for each completed URB, I store the last packet's SCR (Tscr) and the
host time (Thost) obtained with do_gettimeofday.

When uvc receives a URB containing a packet with the video EOF bit set,
I take the packet's PTS (Tpts), take the last stored host time (Thost),
and subtract from it the time (in µs) that separates the last URB
completion from the current video SOF:

Thost(Video SOF) = Thost - ((Tscr - Tpts) / dwClockFrequency * 1000000)


I agree that this calculation starts from an approximation, but I would
expect the time error to be less than 500 µs.

Then I can compare the video SOF times of several USB cameras, and the
results look coherent.

Please let me know your opinion on this calculation; I am open to any
comments!

> > So then the synchronization can be made by software (comparing 
> > timestamps). But I would prefer the cameras to be directly synchronized
> > at the source (shot on each camera is taken at the same time).
> 
> That's an interesting idea. Both cameras have a common time source (the USB 
> SOF tokens), so they could in theory start streaming at the same time. In 
> practice, the feature will have to be implemented in the camera.
> 
> > The only way I found to do that for the moment is starting the 2 cameras
> > at the same time (simultaneous call of VIDIOC_STREAMON for the 2 cams; I
> > use the mmap method with USB isochronous transfers). Frames appear to be
> > more or less synchronized. However, sometimes the 2 cameras start totally
> > out of phase.
> >
> > Is there a way to be more precise in the image capture start?
> 
> I don't think there's currently a way to synchronise acquisition on several 
> cameras. I'll pass the idea to Logitech, they might add the feature to future 
> models.
> 
> Best regards,
> 
> Laurent Pinchart

Best regards,

Julien Quelen

_______________________________________________
Linux-uvc-devel mailing list
[email protected]
https://lists.berlios.de/mailman/listinfo/linux-uvc-devel
