Hi lu,

Quite bad, which is why I don't understand it. I've uploaded a side-by-side
comparison here: http://imgur.com/8R07xbS
On the left is 100ms delay + 20% packet loss, and on the right 100ms delay
with 10ms variance. There is no movement in the scene. The screenshot was
taken after receiving one GOP = 25 frames.

Janis


On 18 September 2013 17:23, Luca Barbato <[email protected]> wrote:

> On 18/09/13 16:12, Janis Zemitis wrote:
> > Hi all,
> >
> > I am decoding a real-time stream sent from another machine; everything
> > works flawlessly on the LAN. To test performance, I started simulating
> > various network characteristics using netem on the server side.
> > Artifacts are within the bounds of expectation for corruption,
> > duplication and packet loss, but I get an unusable image (a white
> > overlay over the whole frame) when adding the slightest amount of
> > variance to the delay parameter.
>
> How bad is it compared to the normal packet loss?
>
> lu
>