I'm looking at using multiple UVC cameras that likely will not support 
MJPEG, meaning they would stream uncompressed video.  I'd like to 
calculate the maximum frame rate for a given resolution and number of 
cameras on a single USB 2.0 bus, and so I'm trying to understand both 
the calculation itself and the architecture of UVC and the uvcvideo 
driver.
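
Here is the back-of-the-envelope arithmetic I've been using so far, as 
a small C sketch.  The numbers are my own reading of the USB 2.0 spec 
(up to 80% of the high-speed bus reservable for periodic transfers, 
and a per-endpoint cap of 3 x 1024 bytes per 125 us microframe for 
high-bandwidth isochronous), and I'm assuming YUY2 at 2 bytes per 
pixel; corrections welcome:

/*
 * Rough USB 2.0 bandwidth estimate for N uncompressed UVC streams.
 * My assumptions (corrections welcome):
 *  - uncompressed YUY2, i.e. 2 bytes per pixel
 *  - at most 80% of the 480 Mbit/s high-speed bus is available to
 *    periodic (isochronous) transfers, ignoring protocol overhead
 *  - one high-bandwidth isochronous endpoint tops out at 3 x 1024
 *    bytes per 125 us microframe, i.e. ~196.6 Mbit/s per camera
 */
#include <stdio.h>

int main(void)
{
    const double bus_bps = 480e6 * 0.80;        /* usable bus, bit/s */
    const double ep_bps  = 3 * 1024 * 8 * 8000; /* per-endpoint cap  */
    const int width = 640, height = 480;
    const int bytes_per_px = 2;                 /* YUY2 */
    const int n_cameras = 4;

    double frame_bits  = (double)width * height * bytes_per_px * 8;
    double per_cam_bps = bus_bps / n_cameras;

    if (per_cam_bps > ep_bps)
        per_cam_bps = ep_bps;   /* endpoint limit binds first */

    printf("~%.1f fps per camera at %dx%d with %d cameras\n",
           per_cam_bps / frame_bits, width, height, n_cameras);
    return 0;
}

For 4 cameras at 640x480 that works out to roughly 19 fps each, which 
is why the streaming-rate question below matters so much to me.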

I recall reading somewhere (perhaps in the uvccapture source) that the 
uvcvideo driver (or perhaps UVC itself, or perhaps a limitation of 
uvccapture) was not able to capture still images from cameras; instead 
it would put the camera in streaming mode and grab frames as directed. 
If this is true, then even if I only want to grab 1 frame per second 
from each camera (to lower overall USB bandwidth), the bus would still 
need to support the full-frame-rate stream from each camera.  Is this 
true?  Is the streaming frame rate a per-camera feature, so that I 
can't necessarily count on any given UVC camera to allow me to stream 
at only 1 fps on the bus?
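
In case it helps frame the question: my plan was to at least ask each 
camera for a long frame interval through V4L2 before resorting to 
discarding frames on the host, along the lines of the sketch below. 
Whether uvcvideo and a given camera actually honor such a request 
(rather than rounding to the nearest interval the hardware supports) 
is exactly what I'm unsure about; /dev/video0 is just an example path.

/*
 * Sketch: ask a camera for 1 fps via V4L2.  The driver may round
 * timeperframe to the nearest interval the camera supports, so the
 * value it hands back must be checked.
 */
#include <stdio.h>
#include <string.h>
#include <fcntl.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

int main(void)
{
    int fd = open("/dev/video0", O_RDWR);
    if (fd < 0) {
        perror("open");
        return 1;
    }

    struct v4l2_streamparm parm;
    memset(&parm, 0, sizeof(parm));
    parm.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    parm.parm.capture.timeperframe.numerator   = 1;  /* 1 second ...  */
    parm.parm.capture.timeperframe.denominator = 1;  /* ... per frame */

    if (ioctl(fd, VIDIOC_S_PARM, &parm) < 0)
        perror("VIDIOC_S_PARM");
    else
        printf("driver granted %u/%u s per frame\n",
               parm.parm.capture.timeperframe.numerator,
               parm.parm.capture.timeperframe.denominator);

    close(fd);
    return 0;
}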

Some details of the system I'm working with:
  - Camera (TBD) - likely an OmniVision image sensor coupled with an 
OmniVision bridge controller that talks UVC
  - 4x identical cameras
  - 1x USB 2.0 host controller (I don't have the option of using 
multiple host controllers for greater bus bandwidth)

Thanks for any information,

Tim
