On Wed, Nov 24, 2004 at 08:12:10AM +0100, Vojtech Pavlik wrote:
> On Wed, Nov 24, 2004 at 03:40:14AM +0100, Luca Risolia wrote:
> 
> > > The fact that the camera contains an OmniVision sensor and the fact that
> > > it uses the ovcamchip module don't imply that the camera can do YUV in
> > > hardware.
> > 
> > I doubt the USB controller/firmware "knows" whether the video format is RGB
> > or YUV; it just accepts and outputs whatever arrives from the image sensor.
> > Since OV sensors can output YUV, you should obtain unmodified data from the
> > controller as well.
> 
> I'd expect the OV sensor behaves exactly the same in RGB and YUV modes
> regarding output timing. If it's so, then getting YUV data from the
> sensor should indeed work.

OV sensors are smart enough to work in "master mode" by providing timing
signals to the controller such as VSYNC, HSYNC, PCLK (pixel clock), etc.

> > > > So I repeat: V4L1 does not support Bayer SBGGR8, but does support YUV's.
> > > > Since colorspace conversions are not allowed, this means that you will
> > > > have to use native YUV as default format sent by the OV sensor.
> > 
> > > The sensor in this camera doesn't output YUV. It outputs its native
> > > SBGGR8 data.
> > 
> > False. Isn't the sensor one of the OV6x, OV7x or OV8x series?
> > If yes, look at the first page of the OV6x and OV7x datasheets.
> 
> I didn't say "can't". I said "doesn't". Doesn't with the Windows driver,
> with the Mac driver, and at this moment with the Linux driver, too.

Ok, you have the OV sensor datasheets and, AFAIK, you can already
perform i2c I/O operations, so why not program the sensor?
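
Roughly something like this from inside the driver, assuming the sensor
sits behind a normal i2c client; the register address and value below are
just placeholders, the real ones are in the OV6x/OV7x datasheets:

#include <linux/i2c.h>

/* Placeholder register/value: look up the real format-select register
 * in the OV6x/OV7x datasheet before using anything like this. */
#define OV_REG_FMT      0x12    /* hypothetical format control register */
#define OV_FMT_YUV422   0x00    /* hypothetical "YUV 4:2:2 output" value */

static int ov_select_yuv(struct i2c_client *client)
{
        /* a single SMBus byte write switches the output format */
        return i2c_smbus_write_byte_data(client, OV_REG_FMT, OV_FMT_YUV422);
}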

> I agree there is strong evidence that it should be possible. On 30 Hz
> only (no 60 Hz mode, as that's only available in RGB), but it could work
> iff the FX2 chip doesn't notice.

When operating in "slave mode", USB2 controllers can handle the sensor's
digital timings fine.
You could always halve the transfer rate by programming the sensor in QVGA
mode. This way you could give interlaced data to the user for further
postprocessing. V4L2 has a nice interface for interlaced format
negotiation.
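
From the application side the negotiation is just a VIDIOC_S_FMT with the
field type set; a minimal sketch (error handling omitted, geometry values
are only an example, the driver is free to adjust them):

#include <string.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

/* fd is an already open /dev/video* descriptor */
int request_interlaced_yuyv(int fd)
{
        struct v4l2_format fmt;

        memset(&fmt, 0, sizeof(fmt));
        fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        fmt.fmt.pix.width = 640;                     /* example geometry */
        fmt.fmt.pix.height = 480;
        fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_YUYV;
        fmt.fmt.pix.field = V4L2_FIELD_INTERLACED;

        return ioctl(fd, VIDIOC_S_FMT, &fmt);
}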

> > > This is actually false. OV sensors support both YUV and raw RGB _natively_.
> > 
> > > The sensors do. The USB bridge implemented by the Cypress EzUSB FX2 chip
> > > doesn't necessarily. It might, though. But it hasn't been done before.
> > 
> > "Sensors do and FX2 chip does not" has not much sense to me.
> 
> There are a whole bunch of registers that I have no idea what they mean
> on the FX2 chip.
> 
> Some will probably need a different setting if I ever try to resize the
> output to anything other than one byte per pixel, 640-byte wide
> scanlines.
> 
> Unlike BGGR8, YUV422 requires 16-bit pixel values, and that means
> 1280-byte wide scanlines.

Most of the time, controllers only refer to the HSYNC, VSYNC and PCLK input
signals from the video source when storing data packets in the FIFO
block (after performing any windowing or decimation). Once YUV is
programmed in the sensor, just expect 2*actual_nbytes per frame
from the bulk endpoint, without modifying any of the FX2 registers.
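
That is, on the driver side the only thing that changes is the expected
frame length (a sketch, assuming the FX2 just forwards whatever the
sensor clocks out):

/* bytes expected per frame from the bulk endpoint */
static unsigned int expected_frame_bytes(unsigned int width,
                                         unsigned int height,
                                         int yuv422)
{
        /* SBGGR8: 1 byte/pixel; packed YUV 4:2:2: 2 bytes/pixel */
        return width * height * (yuv422 ? 2 : 1);
}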

> I'll try to do it. Yes, it'll be useful because it'll make the camera
> work with a much larger software base. It'll also be nice, since I'll be
> able to remove the Bayer interpolation from the driver while keeping the
> functionality for V4L1. If it works.
> 
> It won't push anyone to fix their apps to support the BGGR8 format
> (color/geometry) anyway.

PWLIB/Gnomemeeting CVS already supports BGGR8, I wrote a patch for XawTV
some time ago, and someone is attempting to develop a low-level "libv4l2"
library including vendor-specific decoders (where CPU optimizations would
be possible) and various colorspace conversions.
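
For reference, a BGGR8 decoder doesn't need to be big either; a minimal
nearest-neighbour sketch (no interpolation, even width/height assumed;
real converters interpolate properly) could look like this:

/* Minimal SBGGR8 -> RGB24 conversion: each 2x2 Bayer block
 *   B G
 *   G R
 * is collapsed into one RGB value and replicated over the block. */
static void bggr8_to_rgb24(const unsigned char *src, unsigned char *dst,
                           unsigned int width, unsigned int height)
{
        unsigned int x, y, i, j;

        for (y = 0; y < height; y += 2)
                for (x = 0; x < width; x += 2) {
                        unsigned char b = src[y * width + x];
                        unsigned char g = (src[y * width + x + 1] +
                                           src[(y + 1) * width + x]) / 2;
                        unsigned char r = src[(y + 1) * width + x + 1];

                        for (i = 0; i < 2; i++)
                                for (j = 0; j < 2; j++) {
                                        unsigned char *p = dst +
                                            3 * ((y + i) * width + x + j);
                                        p[0] = r;
                                        p[1] = g;
                                        p[2] = b;
                                }
                }
}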

> Btw, how do you suggest handling the fact that the image is rotated 180
> degrees as sent directly by the cam?

I would probably play with some FX2 registers to fix this.
