On Fri, 2018-10-19 at 09:28 +0200, Hans Verkuil wrote:
> On 10/18/2018 06:08 PM, Ezequiel Garcia wrote:
> > Set up a statically-allocated, dummy buffer to
> > be used as flush buffer, which signals
> > an encoding (or decoding) stop.
> > 
> > When the flush buffer is queued to the OUTPUT queue,
> > the driver will send a V4L2_EVENT_EOS event, and
> > mark the CAPTURE buffer with V4L2_BUF_FLAG_LAST.
> 
> I'm confused. What is the current driver doing wrong? It is already
> setting the LAST flag AFAIK.

The driver currently sets V4L2_BUF_FLAG_LAST on the dst
buf if there's no src buf.

IIRC, that alone has no effect, because .device_run never
gets to run (after all, there is no src buf), so the dst
buf is never marked as done and can never be dequeued...
 
> I don't see why a dummy buffer is needed.
> 

... and that is why I borrowed the dummy buffer idea (from another
driver), which seemed an easy way to artificially trigger device_run
so that the dst buf gets marked done and dequeued.

What do you think?

Thanks,
Ezequiel
