Dr Andrew C Aitchison wrote:

>I guess from his reply Mark V didn't read this bit. :-(
>If you are prepared to convert the HD15 signal yourself, the TV-out limits
>don't apply; the HD15 signal on most cards should be plenty fast enough
>(I'm typing this on a 6-year-old card that would do
>2048x1536 @ 60/120Hz interlaced if I had a monitor that could cope).

Exactly.  And many HDTVs even have HD15 inputs, so aside from finding 
the correct modeline, no other conversion needs to be done by the 
user.  In my case, the TV was already using its HD15 input for an 
over-the-air HD set-top receiver, so I used a ~$70 HD15-to-YUV 
(component) converter from Audio Authority.

Given my very strong belief in what you expressed - that there is no 
such thing as 'HDTV support', since any five-year-old video card can 
easily output 480p, 720p, and quite probably even 1080i and 540p 
signals - I cannot figure out why I seem to be the only person doing 
this and looking for help.
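
For the record, this is the direction I've been going: Modelines built 
straight from the standard ATSC/CEA timings.  The numbers below are my 
own transcription of those published timings (and the mode names are 
just ones I made up), so double-check them before pointing them at an 
expensive set:

    # 480p (59.94Hz): 27.0MHz dotclock, ~31.5kHz horizontal scan
    Modeline "720x480p"   27.00  720  736  798  858   480  489  495  525 -hsync -vsync
    # 720p (60Hz): 74.25MHz dotclock, 45kHz horizontal scan
    Modeline "1280x720p"  74.25 1280 1390 1430 1650   720  725  730  750 +hsync +vsync
    # 1080i (60Hz fields): 74.25MHz dotclock, 33.75kHz horizontal scan
    Modeline "1920x1080i" 74.25 1920 2008 2052 2200  1080 1084 1094 1125 Interlace +hsync +vsync

These go in the Monitor section of XF86Config (or XF86Config-4), and 
the mode names then get listed on the Modes line of the Display 
subsection of the Screen section.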

>>At 640x480 it provides reasonable output, except that 10-15% of the pixels 
>>at the top and left of the screen are not visible (i.e. off screen).  Now, 
>>the handy video mode timings FAQ suggests that in such a 
>>case you move to a higher dot clock.  Personally I don't want to 
>>endanger my (brother's) HDTV.  Playing around with a multisync monitor, 
>>just jacking up the dotclock number in the modeline also ends up 
>>increasing the vertical refresh, which I'm _guessing_ could be a bad thing on 
>>the HDTV, which I'm _guessing_ is probably only going to be happy at 60Hz.
>
>I've seen widescreen TVs in the UK advertising 100Hz images to reduce 
>flicker, so they must be reprocessing the image somehow.
>I'd _guess_ that most (all?) HDTVs are digital (like most monitors now), 
>and that the logic cuts out the picture if you give them a signal they
>don't like. While I don't take responsibility, I'd be prepared to try 
>modelines out. You can always use <ctrl><alt><keypad +/-> to switch
>from a good mode into one under test, and then back again if things
>don't look good.
>
Yes, I would like to assume that a TV that cost >$5K would have 
enough smarts to reject dangerous input signals.  Contradicting that, 
however, just using the standard TV-out (NTSC) features of the NVIDIA 
card I have run into modes which do not sync, yet which are not 
rejected by any logic - although none of them has ever damaged the TV, 
AFAIK.
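
One way to limit the risk on my side (a sketch only - the ranges are my 
guesses derived from the standard timings of 480p at ~31.5kHz, 1080i at 
33.75kHz and 720p at 45kHz, all around 60Hz vertical, not numbers from 
the set's manual) is to give the server a deliberately tight Monitor 
section, so that XFree86 itself refuses any mode whose sync rates fall 
outside those ranges rather than relying on the TV's own protection:

    Section "Monitor"
        Identifier  "hdtv"
        # allow only the horizontal scan rates used by 480p/1080i/720p
        HorizSync   30.0-46.0
        # assuming the set only wants ~60Hz vertical
        VertRefresh 59.0-61.0
        # the HDTV Modelines would go here as well
    EndSection

With that in place the server should throw out an out-of-range modeline 
at startup instead of ever driving the TV with it, which at least 
narrows the trial-and-error.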

And yes, I know that I could be like a monkey trying all sorts of 
modelines, but I was really, really hoping that I was not the only 
person in the world doing this, and that perhaps someone out there could 
steer me in the right direction, saving me time and failed - potentially 
risky - attempts.

I'm sure it's just a conspiracy by all the video card makers to squelch 
the knowledge that their new 'feature' of 'HDTV support' is nothing but 
a crock of...

Thanks anyway for at least correcting Mark V.  At least now, when the 
next person searches Google, they won't just find two questions posted 
to Xpert with two quick replies telling them it's a TV-out issue.

-dmc


