[Wikitech-l] Video Quality for Derivatives (was Re:w...@home Extension)

2009-08-06 Thread Michael Dale
So I committed ~basic~ derivative code support for oggHandler in r54550 
(more solid support on the way).

Based on input from the w...@home thread, here are the updated target 
qualities expressed via the Firefogg API to ffmpeg2theora.

Also j^ was kind enough to run these settings on some sample input files:
http://firefogg.org/j/encoding_samples/ so you can check them out there.

We want to target 400 wide for the web stream to be consistent with 
archive.org, which encodes mostly to 400x300 (although their 16:9 material 
can be up to 530 wide) ...

I updated the MediaWiki Firefogg integration and the stand-alone encoder 
app with these default transcode settings in r54552 and r54554 (should be 
pushed out to http://firefogg.org/make shortly ... or can be run @home 
with a trunk checkout at:
/js2/mwEmbed/example_usage/Firefogg_Make_Advanced.html)

Anyway, on to the settings:

$wgDerivativeSettings[ WikiAtHome::ENC_SAVE_BANDWITH ] =
    array(
        'maxSize'      => '200',
        'videoBitrate' => '164',
        'audioBitrate' => '32',
        'samplerate'   => '22050',
        'framerate'    => '15',
        'channels'     => '1',
        'noUpscaling'  => 'true'
    );
$wgDerivativeSettings[ WikiAtHome::ENC_WEB_STREAM ] =
    array(
        'maxSize'      => '400',
        'videoBitrate' => '544',
        'audioBitrate' => '96',
        'noUpscaling'  => 'true'
    );
$wgDerivativeSettings[ WikiAtHome::ENC_HQ_STREAM ] =
    array(
        'maxSize'      => '1080',
        'videoQuality' => 6,
        'audioQuality' => 3,
        'noUpscaling'  => 'true'
    );
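As a rough illustration, the ENC_WEB_STREAM profile above would correspond to an ffmpeg2theora invocation along these lines. The flag spellings here are my guess at the mapping, not verified against any particular ffmpeg2theora release; check `ffmpeg2theora --help` on your build.

```python
# Hypothetical mapping of the ENC_WEB_STREAM settings above to an
# ffmpeg2theora command line; flag names are assumptions.
import subprocess

web_stream_cmd = [
    "ffmpeg2theora", "input.avi",
    "--max_size", "400",    # 'maxSize'      => '400'
    "-V", "544",            # 'videoBitrate' => '544' (kbit/s)
    "-A", "96",             # 'audioBitrate' => '96'  (kbit/s)
    "--no-upscaling",       # 'noUpscaling'  => 'true'
    "-o", "web_stream.ogv",
]
# subprocess.check_call(web_stream_cmd)  # uncomment to actually transcode
```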

--michael


Brion Vibber wrote:
 On 8/3/09 9:56 PM, Gregory Maxwell wrote:
 [snip]
   
 Based on 'what other people do' I'd say the low should be in the
 200kbit-300kbit/sec range.  Perhaps taking the high up to a megabit?

 There are also a lot of very short videos on Wikipedia where the whole
 thing could reasonably be buffered prior to playback.


 Something I don't have an answer for is what resolutions to use. The
 low should fit on mobile device screens.
 

 At the moment the defaults we're using for Firefogg uploads are 400px 
 width (e.g. 400x300 or 400x225 for the most common aspect ratios), 
 targeting a 400kbps bitrate. IMO at 400kbps at this size things don't 
 look particularly good; I'd prefer a smaller size/bitrate for 'low' and 
 a higher size/bitrate for medium quality.


 From sources I'm googling up, it looks like YouTube is using 320x240 for 
 low-res, 480x360 h.264 @ 512kbps+128kbps audio for higher quality, with 
 720p h.264 @ 1024kbps+232kbps audio available for some HD videos.

 http://www.squidoo.com/youtubehd

 These seem like pretty reasonable numbers to target; offhand I'm not 
 sure of the bitrates used for the low-res version, but I think that's 
 with older Flash codecs anyway, so not as directly comparable.

 Also, might we want different standard sizes for 4:3 vs 16:9 material?

 Perhaps we should wrangle up some source material and run some test 
 compressions to get a better idea what this'll look like in practice...

   
 Normally I'd suggest setting
 the size based on the content: Low motion detail oriented video should
 get higher resolutions than high motion scenes without important
 details. Doubling the number of derivatives in order to have a large
 and small setting on a per article basis is probably not acceptable.
 :(
 

 Yeah, that's way tougher to deal with... Potentially we could allow some 
 per-file tweaks of bitrates or something, but that might be a world of 
 pain. :)

   
 As an aside, downsampled video needs some makeup sharpening, just as
 downsampled stills do. I'll work on getting something into
 ffmpeg2theora to do this.
 

 Woohoo!

   
 There is also the option of decimating the frame-rate. Going from
 30fps to 15fps can make a decent improvement for bitrate vs visual
 quality but it can make some kinds of video look jerky. (Dropping the
 frame rate would also be helpful for any CPU starved devices)
 

 15fps looks like crap IMO, but yeah for low-bitrate it can help a lot. 
 We may wish to consider that source material may have varying frame 
 rates, most likely to be:

 15fps - crappy low-res stuff found on internet :)
 24fps / 23.98 fps - film-sourced
 25fps - PAL non-interlaced
 30fps / 29.97 fps - NTSC non-interlaced or many computer-generated vids
 50fps - PAL interlaced or PAL-compat HD native
 60fps / 59.94fps - NTSC interlaced or HD native

 And of course those 50 and 60fps items might be encoded with or without 
 interlacing. :)

 Do we want to normalize everything to a standard rate, or maybe just cut 
 50/60 to 25/30?

 (This also loses motion data, but not as badly as decimation to 15fps!)
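Cutting 50/60 to 25/30 is just dropping every other frame, which keeps frame timing intact. A toy sketch, with a plain list standing in for decoded frames:

```python
def halve_framerate(frames):
    """Drop every other frame: 60fps -> 30fps (or 50 -> 25)."""
    return frames[::2]

# six frames at 60fps become three at 30fps
print(halve_framerate([0, 1, 2, 3, 4, 5]))  # [0, 2, 4]
```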

   
 This brings me to an interesting point about instant gratification:
 Ogg was intended from day one to be a streaming format. This has
 pluses and minuses, but one thing we should take 

Re: [Wikitech-l] Video Quality for Derivatives (was Re:w...@home Extension)

2009-08-06 Thread Gregory Maxwell
On Thu, Aug 6, 2009 at 8:00 PM, Michael Dale <md...@wikimedia.org> wrote:
 So I committed ~basic~ derivative code support for oggHandler in r54550
 (more solid support on the way)

 Based on input from the w...@home thread; here are updated target
 qualities expressed via the Firefogg API to ffmpeg2theora

Not using two-pass on the rate-controlled versions?

It's a pretty consistent performance improvement[1], and it eliminates
the first-frame blurry issue that sometimes comes up for talking
heads. (Note that by default two-pass cranks the keyframe interval to
256 and makes the buf-delay infinite, so you'll need to set those to
sane values for streaming.)


[1] For example:
http://people.xiph.org/~maikmerten/plots/bbb-68s/managed/psnr.png

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Video Quality for Derivatives (was Re:w...@home Extension)

2009-08-06 Thread Gregory Maxwell
On Thu, Aug 6, 2009 at 8:17 PM, Gregory Maxwell <gmaxw...@gmail.com> wrote:
 On Thu, Aug 6, 2009 at 8:00 PM, Michael Dale <md...@wikimedia.org> wrote:
 So I committed ~basic~ derivative code support for oggHandler in r54550
 (more solid support on the way)

 Based on input from the w...@home thread; here are updated target
 qualities expressed via the Firefogg API to ffmpeg2theora

 Not using two-pass on the rate-controlled versions?

 It's a pretty consistent performance improvement[1], and it eliminates
 the first-frame blurry issue that sometimes comes up for talking
 heads. (Note that by default two-pass cranks the keyframe interval to
 256 and makes the buf-delay infinite, so you'll need to set those to
 sane values for streaming.)

I see r54562 switching to two-pass, but as-is this will produce files
which are not really streamable (because the streams can and will
burst to 10mbit even though the overall rate is 500kbit or whatever
is requested).

We're going to want to do something like -k 64 --buf-delay=256.
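Putting that suggestion together with the two-pass switch, the invocation would look roughly like the following. The flag spellings follow the message above; verify them against your ffmpeg2theora build, and the bitrates are just the ENC_WEB_STREAM numbers from earlier in the thread.

```python
# Two-pass encode with a streaming-sane keyframe interval and buf-delay,
# per the suggestion above. Flag spellings are assumptions.
stream_safe_cmd = [
    "ffmpeg2theora", "input.avi",
    "--two-pass",
    "-k", "64",           # bring keyint back down from the two-pass default of 256
    "--buf-delay=256",    # bound the rate-control buffer so bursts stay streamable
    "-V", "544",          # video bitrate (kbit/s)
    "-A", "96",           # audio bitrate (kbit/s)
    "-o", "streamable.ogv",
]
```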

I'm not sure what keyframe interval we should be using. Longer
intervals give clearly better compression, with diminishing returns
above 512 or so depending on the content, but worse seeking
granularity during long spans without keyframes.  The ffmpeg2theora
defaults are 64 in one-pass mode and 256 in two-pass mode.

Buf-delay indicates the amount of buffering the stream is targeting.
I.e. For a 30fps stream at 100kbit/sec a buf-delay of 60 means that
the encoder expects that the decoder will have buffered at least
200kbit (25kbyte) of video data before playback starts.

If the buffer runs dry the playback stalls, which is pretty crappy for
the user's experience.  So bigger buf-delays mean either a longer
buffering time before playback or more risk of stalling.

In the above (30,60,100) example the client would require 2 seconds to
fill the buffer if they were transferring at 100kbit/sec, 1 second if
they are transferring at 200kbit/sec. etc.
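The arithmetic in that example can be written out explicitly. A minimal sketch, with illustrative names:

```python
def startup_delay_s(buf_delay_frames, fps, stream_kbps, link_kbps):
    """Seconds a client needs to prefill the decoder buffer.

    buf-delay is expressed in frames, so the buffer window in seconds is
    buf_delay_frames / fps; the data needed to fill it is that window
    times the stream bitrate, fetched at the client's link speed.
    """
    window_s = buf_delay_frames / fps
    buffered_kbit = window_s * stream_kbps   # 60/30 * 100 = 200 kbit (25 kbyte)
    return buffered_kbit / link_kbps

print(startup_delay_s(60, 30, 100, 100))  # 2.0 -- two seconds at 100 kbit/s
print(startup_delay_s(60, 30, 100, 200))  # 1.0 -- one second at 200 kbit/s
```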

The default is the same as the keyframe interval (64) in one pass
mode, and infinite in two-pass mode.  Generally you don't want the
buf-delay to be less than the keyframe interval, as quality tanks
pretty badly at that setting.

Sadly the video tag doesn't currently provide any direct way to
request a minimum buffering. Firefox just takes a guess and every time
it stalls it guesses more. Currently the guesses are pretty bad in my
experience, though this is something we'll hopefully get addressed in
future versions.
