Re: [mythtv-users] Struggling with Xwindows DVI to HDTV 1080i

2006-01-05 Thread Len Reed

Chris Lynch wrote:


I have 1080i working on my 6600 over DVI->HDMI (I didn't bother with 
component) with the following modeline:


ModeLine "1920x1080i" 74.2 1920 2008 2052 2200 1080 1084 
1089 1125 +hsync +vsync interlace
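For anyone adapting this modeline, the implied rates can be checked from the numbers alone using the standard formula (pixel clock divided by total pixels per frame, with the rate doubled for interlaced modes, since a frame is two fields). A quick sketch in Python, using 74.25 MHz, the exact SMPTE clock that the 74.2 above is presumably rounding:

```python
# Check the horizontal and vertical rates a modeline implies.
# htotal/vtotal are the last numbers of each timing group (2200, 1125).

def refresh_rates(dotclock_mhz, htotal, vtotal, interlaced=False):
    """Return (hsync_khz, field_rate_hz) for modeline timing totals."""
    hsync_khz = dotclock_mhz * 1000.0 / htotal
    rate_hz = dotclock_mhz * 1e6 / (htotal * vtotal)
    if interlaced:
        rate_hz *= 2  # two fields per frame, so the field rate doubles
    return hsync_khz, rate_hz

hsync, field_rate = refresh_rates(74.25, 2200, 1125, interlaced=True)
print(f"hsync = {hsync:.2f} kHz, field rate = {field_rate:.2f} Hz")
# -> hsync = 33.75 kHz, field rate = 60.00 Hz
```

If the computed rates land outside the HorizSync/VertRefresh ranges the TV reports, X will reject the mode.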


Let me know if you need more - it looks great on my set (JVC HD-61FH96).


Interesting...

A couple of requests:
1. What version of nvidia's driver are you using?
2. Could you post your full xorg.conf file?

Thanks, Len
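For comparison while waiting on Chris's file, here is a minimal sketch of the xorg.conf pieces such a setup typically needs, assuming the nvidia binary driver. The identifiers are made up, the sync ranges are placeholders for whatever the set reports, and the ExactModeTimingsDVI option should be verified against the README for the specific driver version:

```
Section "Monitor"
    Identifier  "HDTV"
    HorizSync   30.0 - 70.0     # placeholder; use the TV's reported range
    VertRefresh 59.0 - 61.0     # placeholder
    ModeLine "1920x1080i" 74.25 1920 2008 2052 2200 1080 1084 1089 1125 +hsync +vsync interlace
EndSection

Section "Device"
    Identifier "nv0"
    Driver     "nvidia"
    Option     "ExactModeTimingsDVI" "true"   # check the driver README
EndSection

Section "Screen"
    Identifier   "Screen0"
    Device       "nv0"
    Monitor      "HDTV"
    DefaultDepth 24
    SubSection "Display"
        Modes "1920x1080i"
    EndSubSection
EndSection
```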
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] Struggling with Xwindows DVI to HDTV 1080i

2006-01-03 Thread Chris Lynch
On 1/2/06, Dan Christian <[EMAIL PROTECTED]> wrote:
> Have you actually gotten 1080i to work with component?  Can you send the
> Xorg.conf file?
>
> I tried to get it to work from my 6600GT, but couldn't get a usable
> picture.  The picture would jump up-down by a line on every frame (like
> the interlace was happening right).  Both component and DVI acted the
> same.  My eyes started to water after about 5 minutes.  I haven't found
> anyone with a usable 1080i.
>
> Do you know which drivers worked?  Which list archive were you referring
> to: nvidia or mythTV?

I have 1080i working on my 6600 over DVI->HDMI (I didn't bother with
component) with the following modeline:

    ModeLine "1920x1080i" 74.2 1920 2008 2052 2200 1080 1084 1089 1125 +hsync +vsync interlace
Let me know if you need more - it looks great on my set (JVC HD-61FH96).


Re: [mythtv-users] Struggling with Xwindows DVI to HDTV 1080i

2006-01-02 Thread Dan Christian
On Thursday 29 December 2005 17:52, Steve Adeff wrote:
> the NVidia component output uses an HDTV output chip so it will work. It's
> interlaced over DVI that doesn't since it uses the normal monitor output.
> It's a known bug that NVidia won't fix since relatively few people use it.
> Apparently one of the older drivers (search the list archives) does support
> it, though. Some TVs will accept a 1920x540p signal from the DVI and
> display it as 1080i, so that's worth a try too.

Have you actually gotten 1080i to work with component?  Can you send the 
Xorg.conf file?

I tried to get it to work from my 6600GT, but couldn't get a usable picture.  
The picture would jump up-down by a line on every frame (like the interlace 
was happening right).  Both component and DVI acted the same.  My eyes 
started to water after about 5 minutes.  I haven't found anyone with a usable 
1080i.

Do you know which drivers worked?  Which list archive were you referring to: 
nvidia or mythTV?

Any CRT that takes 1080i should also take 540p.  The only differences are in 
the timing.  Of course, LCD and plasma have to digitize and rescale 
everything, so they might act differently.

Thanks,
-Dan


Re: [mythtv-users] Struggling with Xwindows DVI to HDTV 1080i

2005-12-29 Thread Steve Adeff
On Thursday 29 December 2005 01:16, Len Reed wrote:
> Steve Adeff wrote:
> > On Tuesday 27 December 2005 18:38, Len Reed wrote:
> >>I've got an nvidia 6200 with DVI out connected to my Mitsubishi HDTV
> >>(DVI to HDMI cable).  The TV does 720p and 1080i on HDMI: it's worked
> >>from both the cable box and from a DVD player that does upscaling.
> >>
> >>I can get the TV to recognize that it's getting 1080i input from the
> >>computer.  (The info on the screen says so.)  I can't get it to deal
> >>with 720p for some reason.  I can get the TV to handle lower resolution
> >>SVGA and XGA modes up to 1024x768 fine.
> >>
> >>With 1080i I get what is close to the twm screen, but there are two
> >>problems:
> >>1. The screen is greatly overscanned.  Perhaps 20% is not displayed.
> >>2. The interlacing is off, or at least that's my guess.  Everything is
> >>displayed twice, with one flickering image directly below another.  They
> >>are close: the bar at the top of an xterm has its two images overlapping.
> >>
> >>I've tried every modeline I can find, and have tried two different
> >>modeline calculators, but I can't get the two images to converge.  The
> >>TV seems to be reporting things correctly to X (59-61 Vsync, reasonable
> >>Hsync, etc.)  Telling X to ignore the TV's info doesn't help in any case.
> >>
> >>It seems like it should be easy enough to play with the vertical
> >>blanking interval to fix this, and that I'm close.  But I'm guessing,
> >>and I'm not making progress.  Is there a reasonable way to tweak the
> >>modeline to iterate toward a solution here?
> >>
> >>Details:
> >>Fedora core 4, x86_64
> >>Athlon-64x2 (3800+)
> >>nvidia 6200 card
> >>latest nvidia X driver, compiled on the machine
> >>
> >>Thanks,
> >>Len
> >
> > newer nvidia drivers don't support interlace modes over DVI.
>
> Seriously??  I sure didn't see that in their README.  After my original
> posting, I bought (from somewhere that I can return it) a 6600 card that
> has component video out and it works fine.  (It exhibited exactly the
> same problem with DVI, though.)  The card with HDTV encoding is a
> satisfactory solution if not an ideal one (both technically and in
> cost.)  It sure seems stupid to have the card encode HDTV to have the TV
> turn it back into digital for the DLP display when I should be able to
> do it over DVI.  Certainly the cable box's 1080i DVI is a bit clearer
> than its component video.  Is there any way that the open source (nv)
> driver will work at 1920x1080i to DVI, or is it a waste of more time to even
> try?

the NVidia component output uses an HDTV output chip, so it will work. It's 
interlaced output over DVI that doesn't, since it uses the normal monitor 
output. It's a known bug that NVidia won't fix, since relatively few people 
use it. Apparently one of the older drivers does support it, though (search 
the list archives). Some TVs will accept a 1920x540p signal from the DVI and 
display it as 1080i, so that's worth a try too.
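A hypothetical modeline for that 540p trick, derived by halving the vertical numbers of the standard 1080i timings (562 total lines gives roughly 60 Hz progressive at the same 74.25 MHz clock). The vertical sync width and porches here are guesses and untested:

```
ModeLine "1920x540" 74.25 1920 2008 2052 2200 540 542 547 562 +hsync +vsync
```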


> > Overscan won't change what you see for TV (ie. the overscan is the same
> > whether from your computer or your cablebox), so change the gui overscan
> > settings for your TV to fix the gui. if you want to fix tv video, SVN
> > lets you adjust overscan, or find the service menu info for your tv to
> > lessen its overscan.
>
> OK, I haven't laid mythtv into this mix yet, so I'll worry about it
> then.  The component video from the new card at 720p and 1080i exhibits
> only small overscan, about what I'd want.

newer TVs, especially LCD and rear-projection, have very minor overscan, which 
is nice; it's the tube TVs that exhibit obscenely large overscan. Plus, with TV 
content you're missing with Myth what you'd miss without it, so it's a wash 
really. If you insist on seeing it all, then venture into the service menu, or 
get an ISF guy to your house and pay him to do it.


> While you're listening, Steve, you were complaining about the VIA
> IEEE1394 chipset.  I can't find anything but VIA.  Fry's had a dozen
> cards, all VIA.  Did you have to mail order to get something with the TI
> chip?  Do you have a recommendation?

So far I've had great luck with the TI chipset in my laptop, no problems with 
it. I had a VIA on my Myth motherboard that just refused to work, and from 
what I've seen the people with VIA chipsets aren't having much better luck 
than that. I ended up getting a card from newegg (a Syba with an NEC firewire 
chipset) for that computer, and I'd say I've had a 95% success rate with it 
(just in channel changing; I haven't set up firewire capture yet). When I get 
back from vacation I'm getting another cable box and will have one capturing 
via firewire and one via PVR-150, so we'll see how that goes. My advice right 
now is to stay away from VIA and be willing to spend a few bucks on a quality 
firewire card. I also found that some of the packaged libraries seem to not 
be perfect; I ended up compiling from scratch and my success rate went way 
up, same library version.

>
> Thanks again

Re: [mythtv-users] Struggling with Xwindows DVI to HDTV 1080i

2005-12-28 Thread Len Reed

Steve Adeff wrote:

On Tuesday 27 December 2005 18:38, Len Reed wrote:


I've got an nvidia 6200 with DVI out connected to my Mitsubishi HDTV
(DVI to HDMI cable).  The TV does 720p and 1080i on HDMI: it's worked
from both the cable box and from a DVD player that does upscaling.

I can get the TV to recognize that it's getting 1080i input from the
computer.  (The info on the screen says so.)  I can't get it to deal
with 720p for some reason.  I can get the TV to handle lower resolution
SVGA and XGA modes up to 1024x768 fine.

With 1080i I get what is close to the twm screen, but there are two
problems:
1. The screen is greatly overscanned.  Perhaps 20% is not displayed.
2. The interlacing is off, or at least that's my guess.  Everything is
displayed twice, with one flickering image directly below another.  They
are close: the bar at the top of an xterm has its two images overlapping.

I've tried every modeline I can find, and have tried two different
modeline calculators, but I can't get the two images to converge.  The
TV seems to be reporting things correctly to X (59-61 Vsync, reasonable
Hsync, etc.)  Telling X to ignore the TV's info doesn't help in any case.

It seems like it should be easy enough to play with the vertical
blanking interval to fix this, and that I'm close.  But I'm guessing,
and I'm not making progress.  Is there a reasonable way to tweak the 
modeline to iterate toward a solution here?

Details:
Fedora core 4, x86_64
Athlon-64x2 (3800+)
nvidia 6200 card
latest nvidia X driver, compiled on the machine

Thanks,
Len



newer nvidia drivers don't support interlace modes over DVI.


Seriously??  I sure didn't see that in their README.  After my original 
posting, I bought (from somewhere that I can return it) a 6600 card that 
has component video out and it works fine.  (It exhibited exactly the 
same problem with DVI, though.)  The card with HDTV encoding is a 
satisfactory solution if not an ideal one (both technically and in 
cost.)  It sure seems stupid to have the card encode HDTV to have the TV 
turn it back into digital for the DLP display when I should be able to 
do it over DVI.  Certainly the cable box's 1080i DVI is a bit clearer 
than its component video.  Is there any way that the open source (nv) 
driver will work at 1920x1080i to DVI, or is it a waste of more time to even 
try?


Overscan won't change what you see for TV (i.e. the overscan is the same 
whether from your computer or your cable box), so change the GUI overscan 
settings for your TV to fix the GUI. If you want to fix TV video, SVN lets 
you adjust overscan, or find the service-menu info for your TV to lessen its 
overscan.


OK, I haven't laid mythtv into this mix yet, so I'll worry about it 
then.  The component video from the new card at 720p and 1080i exhibits 
only small overscan, about what I'd want.


While you're listening, Steve, you were complaining about the VIA 
IEEE1394 chipset.  I can't find anything but VIA.  Fry's had a dozen 
cards, all VIA.  Did you have to mail order to get something with the TI 
chip?  Do you have a recommendation?


Thanks again for the help.

Len


Re: [mythtv-users] Struggling with Xwindows DVI to HDTV 1080i

2005-12-27 Thread Steve Adeff
On Tuesday 27 December 2005 18:38, Len Reed wrote:
> I've got an nvidia 6200 with DVI out connected to my Mitsubishi HDTV
> (DVI to HDMI cable).  The TV does 720p and 1080i on HDMI: it's worked
> from both the cable box and from a DVD player that does upscaling.
>
> I can get the TV to recognize that it's getting 1080i input from the
> computer.  (The info on the screen says so.)  I can't get it to deal
> with 720p for some reason.  I can get the TV to handle lower resolution
> SVGA and XGA modes up to 1024x768 fine.
>
> With 1080i I get what is close to the twm screen, but there are two
> problems:
> 1. The screen is greatly overscanned.  Perhaps 20% is not displayed.
> 2. The interlacing is off, or at least that's my guess.  Everything is
> displayed twice, with one flickering image directly below another.  They
> are close: the bar at the top of an xterm has its two images overlapping.
>
> I've tried every modeline I can find, and have tried two different
> modeline calculators, but I can't get the two images to converge.  The
> TV seems to be reporting things correctly to X (59-61 Vsync, reasonable
> Hsync, etc.)  Telling X to ignore the TV's info doesn't help in any case.
>
> It seems like it should be easy enough to play with the vertical
> blanking interval to fix this, and that I'm close.  But I'm guessing,
> and I'm not making progress.  Is there a reasonable way to tweak the
> modeline to iterate toward a solution here?
>
> Details:
> Fedora core 4, x86_64
> Athlon-64x2 (3800+)
> nvidia 6200 card
> latest nvidia X driver, compiled on the machine
>
> Thanks,
> Len

newer nvidia drivers don't support interlace modes over DVI.

Overscan won't change what you see for TV (i.e. the overscan is the same 
whether from your computer or your cable box), so change the GUI overscan 
settings for your TV to fix the GUI. If you want to fix TV video, SVN lets 
you adjust overscan, or find the service-menu info for your TV to lessen its 
overscan.

-- 
Steve


[mythtv-users] Struggling with Xwindows DVI to HDTV 1080i

2005-12-27 Thread Len Reed
I've got an nvidia 6200 with DVI out connected to my Mitsubishi HDTV 
(DVI to HDMI cable).  The TV does 720p and 1080i on HDMI: it's worked 
from both the cable box and from a DVD player that does upscaling.


I can get the TV to recognize that it's getting 1080i input from the 
computer.  (The info on the screen says so.)  I can't get it to deal 
with 720p for some reason.  I can get the TV to handle lower resolution 
SVGA and XGA modes up to 1024x768 fine.


With 1080i I get what is close to the twm screen, but there are two 
problems:

1. The screen is greatly overscanned.  Perhaps 20% is not displayed.
2. The interlacing is off, or at least that's my guess.  Everything is 
displayed twice, with one flickering image directly below another.  They 
are close: the bar at the top of an xterm has its two images overlapping.


I've tried every modeline I can find, and have tried two different 
modeline calculators, but I can't get the two images to converge.  The 
TV seems to be reporting things correctly to X (59-61 Vsync, reasonable 
Hsync, etc.)  Telling X to ignore the TV's info doesn't help in any case.


It seems like it should be easy enough to play with the vertical 
blanking interval to fix this, and that I'm close.  But I'm guessing, 
and I'm not making progress.  Is there a reasonable way to tweak the 
modeline to iterate toward a solution here?
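One way to make the iteration above less tedious than hand-editing is to script the candidates; a rough sketch (not an existing tool, and the helper name is made up): generate a family of modelines with the vertical sync pulse nudged a few lines in each direction, which shifts the picture vertically while leaving everything else fixed.

```python
# Print candidate 1080i modelines with the vertical sync pulse shifted
# by a few lines; the picture should move up/down as the pulse moves.
# Base numbers are the common SMPTE 1080i timings.

def make_modeline(name, clock, h, v, flags="+hsync +vsync interlace"):
    hd, hss, hse, ht = h   # active, sync start, sync end, total
    vd, vss, vse, vt = v
    return (f'ModeLine "{name}" {clock} '
            f'{hd} {hss} {hse} {ht} {vd} {vss} {vse} {vt} {flags}')

H = (1920, 2008, 2052, 2200)
V = (1080, 1084, 1089, 1125)

for shift in range(-2, 3):
    vd, vss, vse, vt = V
    name = f"1080i_s{shift:+d}"
    print(make_modeline(name, 74.25, H, (vd, vss + shift, vse + shift, vt)))
```

Paste the candidates into the Monitor section, list them all in the Display subsection, and cycle through them with Ctrl-Alt-keypad-+/- to see which alignment looks right.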


Details:
Fedora core 4, x86_64
Athlon-64x2 (3800+)
nvidia 6200 card
latest nvidia X driver, compiled on the machine

Thanks,
Len