On Mon, 2005-02-14 at 21:41 -0800, Brad Templeton wrote:
> On Mon, Feb 14, 2005 at 09:46:49PM -0700, Blair wrote:
> > it is definitely a widescreen. 6600GT wants a 1920x1440 to output 1080i
> > and an 800x600 to output 480i. virtual screen size as reported by X for
> > 1080i is 1920x1440
Not to mention, I think these cards have full MPEG decoding built
in, but it's not available from the current Linux driver.
AFAIK anyway...
Brian
On Tuesday 15 February 2005 12:13 am, Blair wrote:
> True. Just keep in mind that nVidia only has designed "initial" ( read
> "basic" ) support for this
True. Just keep in mind that nVidia only has designed "initial" (read
"basic") support for this card in their current Linux drivers (6629). My TV
only recognizes 480i and 1080i, so I use those. I do not currently have
access to any of the deinterlacing features "built in" to the chipset. I
Agreed. I would never want a TVOne for exactly that reason. It is
unnecessary given what the card already does for me, which gets back to the
original point of this spin for getting the card (6600GT) given the
available alternatives to accomplish the same thing without the need to
tweak model
On Monday 14 February 2005 11:15 pm, Blair wrote:
> Just in case;
>
> Hey Brad, It just occurred to me that folks might not realize that with
> this latest generation of nVidia Cards, a new set of TV formats have been
> implemented as options for the TV Out port on the cards. You may now do
> stuff
On Mon, Feb 14, 2005 at 10:15:28PM -0700, Blair wrote:
> Just in case;
>
> Hey Brad, It just occurred to me that folks might not realize that with this
> latest generation of nVidia Cards, a new set of TV formats have been
> implemented as options for the TV Out port on the cards. You may now do
On Mon, Feb 14, 2005 at 09:46:49PM -0700, Blair wrote:
> it is definitely a widescreen. 6600GT wants a 1920x1440 to output 1080i and
> an 800x600 to output 480i. virtual screen size as reported by X for 1080i is
> 1920x1440 and 2306x1024 when I run in dual head mode with the nVidia set to
> out
Just in case:
Hey Brad, It just occurred to me that folks might not realize that with this
latest generation of nVidia Cards, a new set of TV formats have been
implemented as options for the TV Out port on the cards. You may now do stuff
like:
Option "TVStandard" "HD480i" or
Option "TVStanda
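For anyone who wants to try this, here's a rough sketch of the Device section
I'd start from. "HD480i" is straight from the message above; the "HD1080i"
value and the "TVOutFormat" option name are my best recollection of the
nVidia driver README, so verify them against the README that ships with 6629
before trusting them:

```
Section "Device"
    Identifier  "nvidia0"
    Driver      "nvidia"
    # Ask the TV encoder for an HDTV format instead of plain NTSC/PAL.
    # "HD480i" is confirmed above; "HD1080i" is assumed from the README.
    Option      "TVStandard"  "HD1080i"
    # Assumed option name: route the signal over the component (Y/Pb/Pr) jacks.
    Option      "TVOutFormat" "COMPONENT"
EndSection
```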
Oops, got x and y backwards in my last message. Dyslexic speed
typing... sorry.
On Monday 14 February 2005 21:25, Brad Templeton wrote:
> On Mon, Feb 14, 2005 at 08:36:41PM -0700, Blair wrote:
> > 1920x1080, the COMPONENT input on the TV down converts it to the TV's
> > Native resolution (which i
it is definitely a widescreen. 6600GT wants a 1920x1440 to output 1080i and
an 800x600 to output 480i. virtual screen size as reported by X for 1080i is
1920x1440 and 2306x1024 when I run in dual head mode with the nVidia set to
output 480i. I have very little doubt that this is being managed
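To make that resolution pairing concrete, the matching Screen section only
needs those two modes listed. A sketch along these lines (the identifiers are
made up, and it assumes the stock 1920x1440 and 800x600 modelines are already
available to X):

```
Section "Screen"
    Identifier   "TVScreen"
    Device       "nvidia0"
    DefaultDepth 24
    SubSection "Display"
        Depth 24
        # 1920x1440 feeds the 1080i encoder; 800x600 feeds 480i.
        Modes "1920x1440" "800x600"
    EndSubSection
EndSection
```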
While I don't know much about how Myth works internally, as I've only been
using it for about six weeks, I can offer the following observations on this
problem:
1. These Prebuffering pause loops occur for me on both live TV and while
watching recordings.
2. I can often break the loops by pausing
On Mon, Feb 14, 2005 at 08:36:41PM -0700, Blair wrote:
> 1920x1080, the COMPONENT input on the TV down converts it to the TV's Native
> resolution (which is really EDTV at 852x480, it's an old Panasonic
> CT-34WX50). Still, the input expects to see the signal in its native (close
> to native) f
I'm going to start a new thread, since this isn't JUST an XvMC
problem, at least in my case.
On Mon, 14 Feb 2005 20:37:16 -0700, Blammo <[EMAIL PROTECTED]> wrote:
> Ok, couple of things:
>
> 1. The XvMC lockups, at least on my machine, seem to happen with the
> fading of the OSD. 100% of the t
Ok, couple of things:
1. The XvMC lockups, at least on my machine, seem to happen with the
fading of the OSD. 100% of the time.
2. I'm getting lockups from the "prebuffering" even with XvMC off... :(
I'm going to try the audio patch suggested in another thread, and see
what happens.
Oh, it's definitely not a "native" 1920x1440 monitor. But the 1080i spec is
1920x1080, the COMPONENT input on the TV down converts it to the TV's Native
resolution (which is really EDTV at 852x480, it's an old Panasonic
CT-34WX50). Still, the input expects to see the signal in its native (close
On Mon, Feb 14, 2005 at 08:17:26PM -0700, Blair wrote:
> On Monday 14 February 2005 19:53, Brad Templeton wrote:
> I'm just outputting 1920x1440 (a native X modeline) and letting X take care
> of it for me. Haven't been tweaking, though maybe I should? I've just been
> letting the X virtual windo
On Monday 14 February 2005 09:11 pm, Blammo wrote:
> Just found this off the archives... any chance this has something to
> do with our issue?
>
>
> quote
> The biggest XvMC problem I've seen when seeking is caused by the
> fading OSD. I got into the habit for a while of hitting esc quickly
> aft
On Monday 14 February 2005 19:53, Brad Templeton wrote:
> On Mon, Feb 14, 2005 at 07:39:54PM -0700, Blair wrote:
> > I strongly suspect it is the Component out processing and the Component
> > input itself that is reducing the artifacts, I get some of them back if I
> > go to an SVIDEO output on the
Just found this off the archives... any chance this has something to
do with our issue?
quote
The biggest XvMC problem I've seen when seeking is caused by the
fading OSD. I got into the habit for a while of hitting esc quickly
after a seek (which keeps the OSD from fading out) and things worked
On Mon, Feb 14, 2005 at 07:39:54PM -0700, Blair wrote:
> I strongly suspect it is the Component out processing and the Component input
> itself that is reducing the artifacts, I get some of them back if I go to an
> SVIDEO output on the same card. I suppose it's possible (probably likely)
> tha
I strongly suspect it is the Component out processing and the Component input
itself that is reducing the artifacts, I get some of them back if I go to an
SVIDEO output on the same card. I suppose it's possible (probably likely)
that nVidia is doing some additional processing of this signal.
On Mon, Feb 14, 2005 at 06:19:01PM -0700, Blair wrote:
> FYI: nVidia 6600GT
>
> I picked up a 6600GT myself. Primary reasoning was to get the COMPONENT
> output to drive my older HDTV without the need to purchase an external scan
> converter. Cost of the alternative configuration was close eno
FYI: nVidia 6600GT
I picked up a 6600GT myself. Primary reasoning was to get the COMPONENT
output to drive my older HDTV without the need to purchase an external scan
converter. Cost of the alternative configuration was close enough that it
was worth it to me. HDTV Processing is not a signif
I'm having this problem as well, and am wondering what the most useful
debug information to provide would be, to help resolve this problem.
On Mon, 14 Feb 2005 16:10:21 -0800, Brad Templeton
<[EMAIL PROTECTED]> wrote:
> On Mon, Feb 14, 2005 at 10:39:22AM -0600, Neil wrote:
> > Brad Templeton wr
On Mon, Feb 14, 2005 at 10:39:22AM -0600, Neil wrote:
> Brad Templeton writes:
> >
> >On Athlon-3000s people typically see about 80% usage. On P4-3ghz we see
> >more like 55% usage but that number is probably a touch misleading
> >due to hyperthreading.
>
>
> Hi Brad,
>
> Just curious since I
Brad Templeton writes:
On Athlon-3000s people typically see about 80% usage. On P4-3ghz we see
more like 55% usage but that number is probably a touch misleading
due to hyperthreading.
Hi Brad,
Just curious since I bought an Athlon 64 3000+ CPU. I'm also waiting for my
6600 GT nvidia graphics