On Wednesday 14 September 2005 18:56, Gregorio Gervasio, Jr. wrote:
> >>>>> Isaac Richards writes:
>
> [nVidia 6xxx and 7xxx cards]
>
> i> Myth uses the 'video texture adapter' by default on these newer boards, and
> i> the only thing missing is support for adjusting the contrast/brightness/etc
> i> through Xv. No tearing, performance is exactly the same as with older
> i> hardware, and there's no colorkey to show up at odd times. =)
>
> i> I have a 6600gt in my dev box, and it has absolutely no problem playing HD
> i> video.
>
>         That's interesting. I have a 6600 and it (mostly the X
> server) is just maxing out an Athlon64 3500+ when playing a 720p video
> on a 1920x1080 display. It plays fine but there's no headroom
> (eg. for >1.0x timestretch). Playing a 1080i video uses less CPU. I
> had been assuming this was due to inadequate Xv support since an older
> card (PCX5300) was using much less CPU for 720p. (Unfortunately, the
> video quality of the older card was really bad at 1920x1080 so I had
> to return it.)
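
(As an aside: if you want to see which knobs a given Xv adaptor actually exposes, e.g. whether the texture adapter on these boards really does lack XV_BRIGHTNESS/XV_CONTRAST, you can walk the port attributes with libXv, which is roughly what xvinfo prints. A rough sketch only, not MythTV code; build with "gcc -o xvattrs xvattrs.c -lXv -lXext -lX11", and the xvattrs name is just something I picked:)

/* List every Xv adaptor and the attributes each one's base port exposes,
 * so you can see whether brightness/contrast controls are offered. */
#include <stdio.h>
#include <X11/Xlib.h>
#include <X11/extensions/Xvlib.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    unsigned int ver, rev, reqbase, evbase, errbase, nadaptors, i;
    XvAdaptorInfo *adaptors;

    if (!dpy || XvQueryExtension(dpy, &ver, &rev, &reqbase,
                                 &evbase, &errbase) != Success) {
        fprintf(stderr, "no Xv extension on this display\n");
        return 1;
    }

    if (XvQueryAdaptors(dpy, DefaultRootWindow(dpy),
                        &nadaptors, &adaptors) != Success)
        return 1;

    for (i = 0; i < nadaptors; i++) {
        int nattrs, j;
        XvAttribute *attrs =
            XvQueryPortAttributes(dpy, adaptors[i].base_id, &nattrs);

        printf("adaptor %u: %s\n", i, adaptors[i].name);
        for (j = 0; j < nattrs; j++)
            printf("  %s  (%d .. %d)\n", attrs[j].name,
                   attrs[j].min_value, attrs[j].max_value);
        if (attrs)
            XFree(attrs);
    }

    XvFreeAdaptorInfo(adaptors);
    XCloseDisplay(dpy);
    return 0;
}

On adaptors that do export them, the same attributes can be changed with XvSetPortAttribute, or from the command line with the xvattr tool if you have it installed.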
If my memory serves correctly, my A64 3500 w/a PCI-Express 6600GT gets pegged more (percentage-wise) on HDTV playback than does my Athlon XP 3200 w/an AGP 6200, with very similar configurations between the two. I'll have to double-check what sort of CPU usage the 3200 is seeing on both 1080i and 720p content now that I've bumped it to FC4; it's pushing a 1080p display with no deint filter...

-- 
Jarod Wilson
[EMAIL PROTECTED]