RE: [mythtv-users] native resolution vs 1080i for HDTV and SD on 1280x768 lcd?

2005-11-10 Thread James C. Dastrup
If I do add an HDTV tuner (planning to use OTA or QAM, not a cable-tv-provided
external box) to the setup, I'll need to use the FX5200. I've searched and
googled for a few days now, and perhaps because my questions are so basic,
I've not found the answers. My questions are: Should I output via DVI at
native resolution? Or should I get an external transcoder box to process the
DVI or VGA out to component 1080i or 720p and let the TV scale to 1280x768?
What about one of the Nvidia 6200 or 6600 cards that have the HDTV component
outputs? They have a fan (a minus), but since these cards are now cheaper than
a transcoder box, they may make more sense if the component outputs work in
Linux. And finally, when you have an SD/HDTV Myth setup, do you watch your SD
content over the 1080i connection and let the video card scale? Or would it be
best to use 720p and do the deinterlacing in MythTV? (I will only be getting a
few HD channels; most of my viewing is still SD.) Or is there another
completely different way to do this that I've just missed? Thanks,

I'm not sure what you mean by native resolutions. Do you mean changing the
resolution depending on what you're watching? Is that possible in Myth? Even
if it were possible, only CRTs support all resolutions natively; all
fixed-pixel displays, such as your 30" LCD, will do their own conversion,
either up or down. I would suggest outputting at whatever your TV supports,
which sounds like it's probably 1280x720.
I watch both SD and HD content using my Radeon 9600 SE's DVI connection
straight to the HDMI input. Resolution is set at 1280x720 (matches my TV) and
all content gets scaled automatically by Myth. All signals look great.
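
FWIW, pinning X at that mode only takes a couple of xorg.conf entries. A rough
sketch (the identifiers are made up, and the modeline is stock output from
"gtf 1280 720 60" -- verify it against your set's documented limits first):

  Section "Monitor"
      Identifier "HDTV"
      # generated with: gtf 1280 720 60
      Modeline "1280x720_60.00"  74.48  1280 1336 1472 1664  720 721 724 746  -HSync +Vsync
  EndSection

  Section "Screen"
      Identifier "Screen0"
      Device     "Videocard0"   # assumes a matching Device section exists
      Monitor    "HDTV"
      DefaultDepth 24
      SubSection "Display"
          Depth 24
          Modes "1280x720_60.00"
      EndSubSection
  EndSection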


Re: [mythtv-users] native resolution vs 1080i for HDTV and SD on 1280x768 lcd?

2005-11-10 Thread mrwester

I'm not sure what you mean by native resolutions. Do you mean changing the
resolution depending on what you're watching? Is that possible in Myth? Even
if it were possible, only CRTs support all resolutions natively; all
fixed-pixel displays, such as your 30" LCD, will do their own conversion,
either up or down. I would suggest outputting at whatever your TV supports,
which sounds like it's probably 1280x720. I watch both SD and HD content using
my Radeon 9600 SE's DVI connection straight to the HDMI input. Resolution is
set at 1280x720 (matches my TV) and all content gets scaled automatically by
Myth. All signals look great.
By native resolution, I mean the actual physical resolution of the LCD.
Searching the list for native resolution suggests to me that running a custom
modeline for 1280x768, which is the native resolution of my panel, is the best
I can do, and therefore the correct choice. I could output 800x600 and let the
TV scale to 1280x768, or I can use a modeline to output 1280x768p before it
gets to the TV and not have the TV do the scaling. Either way, I still need to
deinterlace before sending to the TV via DVI. Alternatively, I could output
1920x1080 interlaced (via a transcoder box or an nvidia 6XXX with the HD
component dongle) and let the TV do the scaling down and deinterlacing. I
could in theory maximize my HDTV performance by using 720p, but then I would
have to deinterlace the SD files from my PVR-x50s. Since I would be watching
more SD than HD content, I guess 1080i is the way to go.
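
For reference, generating that custom modeline is straightforward. A sketch
using the gtf utility that ships with X.org (the numbers below are plain GTF
output; check them against the panel's specs before trusting them):

  # compute GTF timings for the panel's native resolution
  $ gtf 1280 768 60

  # it prints a line like the following, which goes in the
  # Monitor section of xorg.conf:
  Modeline "1280x768_60.00"  80.14  1280 1344 1480 1680  768 769 772 795  -HSync +Vsync

  # then list it first in the Screen section's mode list:
  Modes "1280x768_60.00"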

I suppose what I really need to do is confirm that the stutter and blockiness
I'm seeing when outputting 720x480 interlaced (480i) content, deinterlaced by
mythtv, to my progressive LCD is indeed caused by the deinterlacing process in
mythtv. As it sits, it appears that the TV can handle deinterlacing better
than my mythtv computer.
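
(One way to pin that down: play a recording outside of Myth, with and without
a software deinterlacer, and see whether the artifacts follow the
deinterlacing. The file path below is illustrative; mplayer's pp=lb filter is
a linear blend, the same basic approach as Myth's linear blend option.)

  # no deinterlacing; expect combing on motion but no stutter:
  $ mplayer /var/video/recording.mpg

  # linear-blend deinterlace in software; if the stutter and
  # blockiness show up here too, CPU-side deinterlacing is the
  # likely culprit:
  $ mplayer -vf pp=lb /var/video/recording.mpg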

Can anyone confirm this logic? thanks...

_______________________________________________
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users