Hi all, I'm using ffmpeg to encode some video. When I run the command locally I get a very different result than when I run the same command on the same source on a server. Somebody told me that ffmpeg might change the bit rate or some other quality setting based on the environment it's running in; more specifically, that it can detect which processor it has and make quality decisions based on that, and that this might be the reason for the difference. I can't find anything in the documentation about this. Does anybody have insight into this odd behavior?
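For reference, here is a minimal sketch of the kind of check that could rule this out (assuming Python 3 and an ffmpeg build with libx264 on PATH; the file names and settings are placeholders, not my actual command): compare the ffmpeg build on both machines, and pin every quality-related option explicitly so nothing falls back to a machine-dependent default.

    import subprocess

    # Compare the ffmpeg builds first: different versions, or differently
    # configured builds, are a more common cause of differing output than
    # the CPU itself.
    version = subprocess.run(["ffmpeg", "-version"],
                             capture_output=True, text=True)
    print(version.stdout.splitlines()[0])

    # Encode with every quality-related option spelled out explicitly,
    # so no setting is left to a per-machine default.
    # "input.mov" and "output.mp4" are placeholder file names.
    subprocess.run([
        "ffmpeg", "-i", "input.mov",
        "-c:v", "libx264",     # explicit codec rather than the container default
        "-crf", "18",          # explicit quality target
        "-preset", "medium",   # explicit speed/quality trade-off
        "-threads", "1",       # x264 output can vary with the thread count
        "output.mp4",
    ], check=True)

If the printed version lines differ between the two machines, that alone could explain the different results.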
Thanks,
Ron Ganbar
email: [email protected]
tel: +44 (0)7968 007 309 [UK]
+972 (0)54 255 9765 [Israel]
url: http://ronganbar.wordpress.com/
