I've got an nvidia 6200 with DVI out connected to my Mitsubishi HDTV (DVI-to-HDMI cable). The TV does 720p and 1080i over HDMI: both have worked from the cable box and from an upscaling DVD player.

I can get the TV to recognize that it's getting 1080i input from the computer (the on-screen info says so), but I can't get it to accept 720p for some reason. Lower-resolution SVGA and XGA modes up to 1024x768 work fine.

With 1080i I get what is close to the twm screen, but there are two problems:
1. The screen is greatly overscanned; perhaps 20% of the image is not displayed.
2. The interlacing is off, or at least that's my guess: everything is displayed twice, with one flickering image directly below the other. They are close; the title bar of an xterm has its two images overlapping.

I've tried every modeline I can find, and two different modeline calculators, but I can't get the two images to converge. The TV seems to report its capabilities to X correctly (59-61 Hz vertical refresh, a reasonable HorizSync range, etc.). Telling X to ignore the TV's EDID info doesn't help in any case.
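For comparison, the standard broadcast timings (from CEA-861; I haven't verified them on this particular TV, so treat them as a starting point) would look like this in xorg.conf. The interlaced double-image often clears up only when the vertical total is exactly 1125 lines:

```
# Standard CEA-861 timings -- a starting point, not verified on this TV.
# 1080i: 74.25 MHz pixel clock, 2200x1125 total, 60 Hz field rate
ModeLine "1920x1080i" 74.25  1920 2008 2052 2200  1080 1084 1094 1125 Interlace +HSync +VSync
# 720p: 74.25 MHz pixel clock, 1650x750 total, 60 Hz frame rate
ModeLine "1280x720"   74.25  1280 1390 1430 1650   720  725  730  750 +HSync +VSync
```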

It seems like it should be easy enough to play with the blanking intervals to fix this, and I feel like I'm close. But I'm guessing, and I'm not making progress. Is there a reasonable way to tweak the modeline so I can iterate toward a solution?
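One way to iterate without rerunning a calculator each time: keep the pixel clock and the h/v totals fixed (so the refresh rate the TV has locked onto doesn't change) and only slide the sync pulses around inside the blanking intervals. A throwaway sketch of that idea (the function name and numbers are mine, not from any existing tool):

```python
# Hypothetical helper: nudge a modeline's sync pulses to shift the
# picture without changing the totals, so the refresh rate the TV
# accepted stays the same.

def shift_modeline(timings, dx=0, dy=0):
    """timings = (hdisp, hsyncstart, hsyncend, htotal,
                  vdisp, vsyncstart, vsyncend, vtotal).
    Positive dx shifts the picture right, positive dy shifts it down
    (an earlier sync pulse makes the active video land later on screen)."""
    hd, hss, hse, ht, vd, vss, vse, vt = timings
    return (hd, hss - dx, hse - dx, ht, vd, vss - dy, vse - dy, vt)

# Standard CEA 1080i frame timings as the starting point:
base = (1920, 2008, 2052, 2200, 1080, 1084, 1094, 1125)
print(shift_modeline(base, dx=16, dy=2))
# -> (1920, 1992, 2036, 2200, 1080, 1082, 1092, 1125)
```

xvidtune does essentially the same adjustment interactively, if the driver allows it.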

Details:
Fedora Core 4, x86_64
Athlon-64x2 (3800+)
nvidia 6200 card
latest nvidia X driver, compiled on the machine

Thanks,
Len
_______________________________________________
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users
