I'm working on a project that involves streaming from multiple webcams.  By and large, things work, but there are some gotchas.

 - my system is a P4 2.8GHz with 1G of memory
 - running Debian with kernel 2.6.17
 - I have 8 Logitech QC Fusions, as well as 6 Logitech QC Messengers, on the system
 - the 8 Fusions are grouped 4-each into two USB 2 hubs
 - I'm capturing the Fusions at 640x480x15fps
 - I wrote a capture program (based on luvcview) that grabs frames from a UVC cam and writes each one, with a timestamp, into a datafile: I run 8 instances of this capture program, one for each of the Fusions
 - the Messengers, being USB 1.1 devices, can't stream, so I snap pics from them round-robin fashion, hitting each cam about once every 3 or 4 seconds
 - I'm also recording audio from each of the 8 Fusions
 - the entire recording system is stable enough to run for 2-3 hours or more (generating 70-80G of audio and video)
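The email doesn't spell out the datafile layout, but a capture program like the one described just needs a trivial record format: timestamp, payload length, then the raw JPEG bytes. As a sketch (the exact layout and function names here are my assumption, not Denis's actual format):

```python
import struct
import time

def write_frame(f, frame_bytes, ts=None):
    """Append one record: 8-byte double timestamp, 4-byte length, JPEG payload."""
    ts = time.time() if ts is None else ts
    f.write(struct.pack("<dI", ts, len(frame_bytes)))
    f.write(frame_bytes)

def read_frames(f):
    """Yield (timestamp, frame_bytes) records until end of file."""
    while True:
        header = f.read(12)
        if len(header) < 12:
            return
        ts, n = struct.unpack("<dI", header)
        yield ts, f.read(n)
```

Because the MJPEG frames need no transcoding, the hot path is just two writes per frame, which matches the low CPU utilization reported below.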

  Because I'm getting MJPEG-encoded frames from the Fusions, the frames can be stored directly without any post-processing, so there is very little CPU overhead in capturing the data.  I can capture 8 640x480x15fps video streams, record 8 audio streams, and snap pics from the Messengers, all on the same system, and CPU utilization is spiky but generally low.

  However, there are gotchas, not all of which I have solutions for:

 - Audio recording from the Fusions has a persistent pop or click in it.  This makes the audio unusable directly, but it's not difficult to remove the clicks with something like Audacity or GNOME Wave Cleaner.

 - While the USB 2 bus can support 8 streaming cams, there seems to be a fair bit of jitter in capture time.  The capture utility I wrote timestamps each frame as it receives it, and while the frames are generally uniformly spaced at 0.066s (1/15s), there are occasional dropouts and periods where frames arrive very quickly one after another.  I haven't done enough testing to know whether this jitter gets worse with more webcams.
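The jitter described above is easy to quantify from the per-frame timestamps the capture utility already records. A minimal sketch (the thresholds and function names are mine; "dropout" and "burst" are judged relative to the nominal 1/15 s spacing):

```python
def frame_intervals(timestamps):
    """Deltas between consecutive frame timestamps (seconds)."""
    return [b - a for a, b in zip(timestamps, timestamps[1:])]

def classify_jitter(timestamps, fps=15.0, tol=0.5):
    """Count dropouts (gap > (1+tol)/fps) and bursts (gap < (1-tol)/fps)."""
    nominal = 1.0 / fps
    dropouts = bursts = 0
    for dt in frame_intervals(timestamps):
        if dt > nominal * (1 + tol):
            dropouts += 1
        elif dt < nominal * (1 - tol):
            bursts += 1
    return dropouts, bursts
```

Running this per camera, and again with more cameras attached, would answer the open question of whether the jitter scales with the number of webcams.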

 - Likely as a consequence of the timing jitter on the USB bus (or maybe a problem with the cams?  or maybe my capture program?), I'm having a lot of trouble synchronizing the video streams from the Fusions with the audio I recorded.  I'm still working on a solution to this one... :(
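One possible starting point for the sync problem, assuming the audio recorder and the frame capturer both stamp against the same wall clock: map any point on the audio timeline to the video frame whose timestamp is closest. A minimal lookup sketch (the function is hypothetical, not from Denis's code):

```python
import bisect

def nearest_frame(frame_times, t):
    """Index of the frame whose timestamp is closest to time t.

    frame_times must be sorted ascending (as capture timestamps are).
    """
    i = bisect.bisect_left(frame_times, t)
    if i == 0:
        return 0
    if i == len(frame_times):
        return len(frame_times) - 1
    return i if frame_times[i] - t < t - frame_times[i - 1] else i - 1
```

This doesn't fix drift between the two clocks, but it at least tolerates the irregular frame spacing instead of assuming a fixed 1/15 s step.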

  Capturing the cameras in YUV format may introduce other bottlenecks because of the increased bandwidth; I haven't tried that.  My 640x480 MJPEG frames come in at around 70k apiece.
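For a rough sense of why YUV would be a much bigger bandwidth load, here is the back-of-the-envelope arithmetic using the ~70k-per-frame figure above (illustrative numbers only, assuming 2 bytes/pixel YUYV):

```python
width, height, fps = 640, 480, 15

# Uncompressed YUYV: 2 bytes per pixel, every frame.
yuyv = width * height * 2 * fps
print(yuyv)   # 9216000 bytes/s -> ~9.2 MB/s per camera

# MJPEG at the observed ~70k per frame.
mjpeg = 70_000 * fps
print(mjpeg)  # 1050000 bytes/s -> ~1.05 MB/s per camera
```

So each camera in YUV would need roughly 9x the bus bandwidth of the MJPEG stream, which is why 8 MJPEG streams fit on the bus comfortably while 8 YUV streams likely would not.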

  Hope this helps!

Denis

On 10/20/06, Pablo dAngelo <[EMAIL PROTECTED]> wrote:
Hi all,

I would like to build an omnidirectional video camera system that can capture a 360° view of a scene, similar to the Ladybug camera by Point Grey ( http://www.ptgrey.com/products/ladybug2/index.asp). Such a system would probably consist of 4 or 5 cameras rotated by 90 or 72 degrees each, and maybe another one pointing upwards. They should all capture images in parallel.

Since it should be quite cheap (but high quality), my only option is to use webcams, based on either USB or FireWire (although the Fire-i, the only affordable FireWire camera I found, seems to be out of production).

I have a single Philips 740K camera, which has nice picture quality and provides a standard M12x0.5 lens screw mount, so I can mount a third-party wide-angle lens. However, it is limited to USB 1.1, which I fear will lead to a very limited framerate.

Therefore I'm thinking about using USB 2 webcams, such as the Logitech Fusion or Pro 5000, to increase the overall framerate I can achieve with this driver. The question now is: what framerate could I expect from 5 UVC USB 2 cameras connected to a single laptop? Or is there a fundamental problem with using that many cameras at the same time?

On
https://lists.berlios.de/pipermail/linux-uvc-devel/2006-March/000365.html,
David Moore noticed:
> Hi, I'm interested in connecting more than one camera at once,
> specifically, a pair of Logitech Quickcam Pro 5000s.
> I only have one at the moment, but when I run it at full uncompressed
> resolution (640x480x30fps YUV), it chooses an ISO endpoint with a packet
> size of 3060 bytes.  This is more than half of the available ISO
> bandwidth. (cat /proc/bus/usb/devices claims it's 60%)
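As a quick sanity check of the numbers in that quote (my arithmetic, assuming USB 2.0 high-speed timing of 8000 microframes per second):

```python
packet = 3060               # bytes reserved per microframe, from the quote
microframes_per_sec = 8000  # USB 2.0 high-speed: one 125 us microframe each

reserved = packet * microframes_per_sec
print(reserved)  # 24480000 bytes/s -> ~24.5 MB/s reserved by one camera

payload = 640 * 480 * 2 * 30  # 640x480 YUYV at 30 fps
print(payload)   # 18432000 bytes/s -> ~18.4 MB/s of actual pixel data
```

The endpoint reserves somewhat more than the raw pixel rate, and at ~24.5 MB/s per camera it is clear why a second uncompressed 640x480x30fps camera would not fit on the same bus.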

Is there a compression method supported by the cameras (for example MJPEG), that would lead to higher compression, thus allowing a higher total framerate?

Btw. Thanks a lot for developing this driver!

ciao
  Pablo

_______________________________________________
Linux-uvc-devel mailing list
[email protected]
https://lists.berlios.de/mailman/listinfo/linux-uvc-devel

