On Wed, 17 May 2000, Alan Cox wrote:

> > It is not -nice- to force -all- clients to understand -any- source format;
> > they already have their hands full handling displays and UI... That's why
> > I expect V4L2 to step in with its modularized drivers for all
> > transformations etc. We are just stuck with V4L for the moment.
> 
> General transformations are not going in kernel space. Period.  If V4L2
> decides to do that then V4L2 is not going to hit the kernel either.

Well, I don't know -where- it will be done, but I read this:

http://millennium.diads.com/bdirks/v4l2fx.htm

"A V4L2 video effects device is a device that can do image effects,
filtering, or combining of two or more images or image streams, for
example video transitions or wipes. Applications send data to be processed
and receive the result data either with read() and write() functions, or
through memory-mapped buffers."

> You are right that the apps have to deal with a lot of formats, wrong to
> think the kernel should be involved. Your modular transformations belong
> in userspace as something like libvideotransform.a.

I don't think that the kernel should be involved at all. I only wanted to
say that currently we have a device-specific (usually some YUV variant)
format converted to RGB24. This is not good, but there isn't much we can
do without breaking lots of clients. Some are already broken, by the way -
Voxilla insists on planar YUV422 (which is rare among native camera
streams). I made a patch to Voxilla (user-space, of course :)

> By putting it in userspace you make it swappable, you make it easily
> updated, you make it expandable for new formats without work, and you
> get to use MMX and SSE. A lot of format conversions you want MMX for -
> you can't get that in kernel space.

That's why we need a common (USB) video device module which:

a) provides what a generic USB driver needs (read/ioctl)
b) does NOT provide what a generic USB driver should not need

> > HOWEVER. Cameras -do not- produce legitimate YUV formats!
> 
> The conversions I've seen so far in the YUV space are byte
> re-ordering. Where there are common YUVlike formats we need to define
> another format type and you want the processing in user space. There
> are simply too many cases where doing it outside of the app is wrong.
> With kiovec based direct I/O the number of cases is vastly increased.

Could be. I have a camera right now that, depending on the mode, produces
YUV420 (packed -and- planar), YUV422, RGB8 and RGB10 - the latter two are
RAW BAYER PATTERNS! (good luck to anyone trying to demosaic those :)

Definitely, all USB video drivers need to be reworked to use a shared
library and to provide one of the standard formats (whichever is easiest
to transform from the native stream).

One future problem worth mentioning: a camera can change its data format
when the video size changes. This is not handled by V4L at all.

I fully understand (and support!) your recommendation to get rid of as
many colorspace transformations in kernel space as possible. But I don't
see how it can be done right now. All cameras produce RGB24, and it works
for most clients. If we change it to YUV420, for example, some clients may
break. Should we attempt this change now? Linus probably won't even
accept such a patch. After 2.4 such work can be done safely, and it will
probably involve a transition to V4L2 as well.

It is also possible that we will need to certify clients for compliance.
Currently, probably only xawtv is flexible enough; most other clients are
one- or two-format applications.

Thanks,
Dmitri


---------------------------------------------------------------------
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]