Re: [mythtv-users] DVI to Sony HS420 series HDTV

2005-01-23 Thread Paul Miller
On Sunday 23 January 2005 3:53 pm, Brad Templeton wrote:
> Not clear on which situations you have tried.  Almost all DVD
> players and HD set top boxes (cable, satellite) today will have a
> setup option to tell it if your TV is 16:9 or 4:3.   Have you set
> this, and which one works on your TV?   Whatever one works is the
> kind of output you should try to feed from your video card.

I do not have a DVD player or set top box to test with.  I've been 
systematically (or perhaps randomly) trying various modes using 
PowerStrip in Windows.  It appears that my hardware is very picky 
about the total horizontal resolution.  If I'm off by a pixel, the 
television goes blank.  The total vertical resolution is not as 
picky.  (This is with my DVI->HDMI cable.)

> > I also tried using the AIW HDTV output cable (YPrPb) that came
> > with my card.  In Windows, I can output 720p and 1080i on this
> > cable.  It
>
> Alas, you should have done a search.  It's been reported by many
> that the linux drivers don't support the ATI component video
> adapter. (Or you have a cable?)  The real work of generating YPrPb
> is done in the driver, as you probably know most video cards
> normally do RGB, not YPrPb. TVs, unless they have VGA inputs or old
> style (4 or 5 pin) RGB component video, do not take RGB.

I knew the VGA connector was RGB, but I wasn't sure about the other 
connector.  The DVI-I and RGB component video must be independently 
controlled since I can connect my CRT to the DVI-I and HDTV to RGB 
(in YPrPb mode) and they both display correctly.

> But what I am keen to know is, can you get 1080i video to display
> from this card!

There is a setting in Windows and it may be possible with the DVI-I 
connector, but I don't have it working yet.  Here's what I have 
working:

YPrPb (in Windows):
480p (720x480), 720p (1280x720), 1080i (1920x1080i)

DVI/HDMI (tested in Windows with PowerStrip):
480p (640x480) -- slight underscan
480p (720x480) -- overscan
720p (1280x720) -- overscan

The modeline settings (PowerStrip in Windows) are very sensitive...  I 
may need to come up with custom resolutions to eliminate the 
over/underscan problems, assuming my TV will tolerate it.
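
For anyone who wants to experiment with the same numbers, here's a rough sketch (Python; the 720p timing figures are assumed from common CEA-style values, not taken from this TV) of how the refresh rates fall out of a modeline, which shows why a one-pixel change in the horizontal total shifts what the TV sees:

```python
# Hypothetical helper: compute the horizontal and vertical rates an
# X modeline produces.  Timing numbers below are assumed from a
# common 1280x720p60 signal, not verified against the HS420.

def modeline_rates(pixclock_mhz, htotal, vtotal, interlaced=False):
    """Return (horizontal kHz, vertical Hz) for the given totals."""
    hfreq_khz = pixclock_mhz * 1000.0 / htotal
    vfreq_hz = hfreq_khz * 1000.0 / vtotal
    if interlaced:
        vfreq_hz *= 2  # the field rate is twice the frame rate
    return hfreq_khz, vfreq_hz

h, v = modeline_rates(74.25, 1650, 750)    # 1280x720p60 totals
print(round(h, 2), round(v, 2))            # 45.0 60.0
h2, v2 = modeline_rates(74.25, 1651, 750)  # htotal off by one pixel
print(round(h2, 3), round(v2, 3))          # rates drift off-spec
```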

-Paul
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] DVI to Sony HS420 series HDTV

2005-01-23 Thread Brad Templeton
On Sun, Jan 23, 2005 at 03:37:42PM -0600, Paul Miller wrote:
> I did quite a bit of messing around with the television and video card 
> yesterday.  I believe that all HD sources must be in 16:9 format, 
> regardless of what they are outputting to.  So, I think you are 
> partially right with (c), but the television automatically adds the 
> black bars on the top and bottom of the screen.  It must be capable 
> of doing 1440 lines (though it actually displays only 1080) for 
> widescreen videos.  For letterbox videos, the user needs to press 
> zoom to view it in fullscreen, otherwise there'll be black bars on 
> all sides.

Not clear on which situations you have tried.  Almost all DVD players
and HD set top boxes (cable, satellite) today will have a setup option
to tell it if your TV is 16:9 or 4:3.   Have you set this, and which
one works on your TV?   Whatever one works is the kind of output you
should try to feed from your video card.


> 
> I also tried using the AIW HDTV output cable (YPrPb) that came with my 
> card.  In Windows, I can output 720p and 1080i on this cable.  It 

Alas, you should have done a search.  It's been reported by many that
the linux drivers don't support the ATI component video adapter. (Or
you have a cable?)  The real work of generating YPrPb is done in the
driver, as you probably know most video cards normally do RGB, not YPrPb.
TVs, unless they have VGA inputs or old style (4 or 5 pin) RGB component
video, do not take RGB.

> launched, the HDTV is scrambled and nearly solid fluorescent pink.

The scrambling is one problem.  The funny colours are what you should
expect from feeding an RGB signal into a YPrPb input!

But what I am keen to know is, can you get 1080i video to display from
this card!



Re: [mythtv-users] DVI to Sony HS420 series HDTV

2005-01-23 Thread Paul Miller
I did quite a bit of messing around with the television and video card 
yesterday.  I believe that all HD sources must be in 16:9 format, 
regardless of what they are outputting to.  So, I think you are 
partially right with (c), but the television automatically adds the 
black bars on the top and bottom of the screen.  It must be capable 
of doing 1440 lines (though it actually displays only 1080) for 
widescreen videos.  For letterbox videos, the user needs to press 
zoom to view it in fullscreen, otherwise there'll be black bars on 
all sides.

I also tried using the AIW HDTV output cable (YPrPb) that came with my 
card.  In Windows, I can output 720p and 1080i on this cable.  It 
looks very sharp.  Unfortunately, when I play DVDs, the television 
blanks out after the FBI warning, even if the window is not 
maximized.  I suspect that the driver or hardware does this blanking 
whenever a copyrighted signal is being displayed.  On the regular 
component or S-video outputs, no such blanking occurs.  I wonder if 
the DVI-to-HDTV adaptor would do the same.

http://shop.ati.com/product.asp?sku=2538004 (AIW HDTV cable)
http://shop.ati.com/product.asp?sku=2537967 (DVI-to-HDTV adaptor)

In Linux, it's a disaster.  With the AIW HDTV output cable plugged in, 
the video card thinks the HDTV is now the primary display device.  My 
CRT monitor remains in powersave mode until I bring up X.  After X is 
launched, the HDTV is scrambled and nearly solid fluorescent pink.

-Paul

On Saturday 22 January 2005 10:06 pm, Brad Templeton wrote:
> On Sat, Jan 22, 2005 at 08:18:44PM -0600, Paul Miller wrote:
> > I read somewhere that all HD sources use a 16:9 aspect ratio.  My
> > television is 4:3.  Should the DVI signal still be 16:9?
>
> DVI contains no information on aspect ratio, as far as I know, it's
> a digital version of component video.
>
> I have not played with 4:3 HDTVs.  They are rarer (though they
> actually make sense in the transition period when most of your
> content is still 4:3.   During this mixed period you are going to
> "waste" screen real estate one way or another by watching stuff of
> a different aspect ratio on your tube.  If you think what you watch
> will mostly be 4:3, your choice makes sense, though it will make less
>
> I can imagine a few ways this might work
>
> a) The TV pretends to be a 16:9 TV when it sees a signal with 720
> or 1080 lines.   Unfortunately it can't tell with 480 lines what
> you are sending.   Well with DVI it could count pixels, but I have
> not heard of that, since the TVs also have component and mostly use
> it.
>
> b) The TV has a mode on it you put in to tell it when to pretend to
> be a 16:9 TV.   In the pretending mode, it only displays into a
> 16:9 box with letterbox bars done by the TV.
>
> c) The TV is a 4:3 TV.  Your transmitting box is expected to know
> this. Thus you would feed it 1280 x 960, not 1280x720, and 1920 x
> 1440, not 1920 x 1080.   Your box would be putting in the letterbox
> bars.  This is what myth and xvideo will do, it's what you get if
> you run Myth on your computer monitor.
>
> Now C seems most likely.  All HDTV STBs have a menu setting asking
> if the TV is widescreen or 4:3 in them.   Myth doesn't have one per
> se, you have to put fields into xorg.conf and edit the command
> lines myth uses to call mplayer to make it work on a 16:9 TV --
> which you don't have, so you're golden!
>
> What I don't know is if the HDTV STBs know as much about 4:3 based
> HDTVs.  I presume they do, since such sets are rarer but not unknown.
>
> A proper 4:3 HDTV has to have more than 1080 lines of course, it
> needs 1440 to do the full res.  Which I believe the fancy Sony
> does.



Re: [mythtv-users] DVI to Sony HS420 series HDTV

2005-01-22 Thread Brad Templeton
On Sat, Jan 22, 2005 at 08:18:44PM -0600, Paul Miller wrote:
> 
> I read somewhere that all HD sources use a 16:9 aspect ratio.  My 
> television is 4:3.  Should the DVI signal still be 16:9?

DVI contains no information on aspect ratio; as far as I know, it's
a digital version of component video.

I have not played with 4:3 HDTVs.  They are rarer (though they actually
make sense in the transition period when most of your content is still
4:3.   During this mixed period you are going to "waste" screen real
estate one way or another by watching stuff of a different aspect
ratio on your tube.  If you think what you watch will mostly be 4:3,
your choice makes sense, though it will make less sense in a few years.)

I can imagine a few ways this might work:

a) The TV pretends to be a 16:9 TV when it sees a signal with 720 or
1080 lines.   Unfortunately it can't tell with 480 lines what you are
sending.   Well with DVI it could count pixels, but I have not heard
of that, since the TVs also have component and mostly use it.

b) The TV has a mode you can set to tell it when to pretend to be
a 16:9 TV.   In that mode, it only displays into a 16:9 box
with letterbox bars added by the TV.

c) The TV is a 4:3 TV.  Your transmitting box is expected to know this.
Thus you would feed it 1280 x 960, not 1280x720, and 1920 x 1440, not
1920 x 1080.   Your box would be putting in the letterbox bars.  This is
what myth and xvideo will do, it's what you get if you run Myth on
your computer monitor.
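
As a back-of-the-envelope check on (c), the bar sizes fall straight out of the aspect-ratio arithmetic; a small sketch (Python, using the resolutions from the paragraph above):

```python
def letterbox(frame_w, frame_h, aspect_w=16, aspect_h=9):
    """Height of a 16:9 image inside a 4:3 frame, and the size of
    each black bar the transmitting box would add above and below."""
    img_h = frame_w * aspect_h // aspect_w
    bar = (frame_h - img_h) // 2
    return img_h, bar

print(letterbox(1920, 1440))  # (1080, 180): 180-line bars top and bottom
print(letterbox(1280, 960))   # (720, 120)
```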

Now (c) seems most likely.  All HDTV STBs have a menu setting asking
whether the TV is widescreen or 4:3.   Myth doesn't have one per se;
you have to put fields into xorg.conf and edit the command lines myth
uses to call mplayer to make it work on a 16:9 TV -- which you don't have,
so you're golden!

What I don't know is if the HDTV STBs know as much about 4:3 based
HDTVs.  I presume they do, since such sets are rarer but not unknown.

A proper 4:3 HDTV has to have more than 1080 lines of course, it needs
1440 to do the full res.  Which I believe the fancy Sony does.


Re: [mythtv-users] DVI to Sony HS420 series HDTV

2005-01-22 Thread Paul Miller

I read somewhere that all HD sources use a 16:9 aspect ratio.  My 
television is 4:3.  Should the DVI signal still be 16:9?

Here are some of my results:

1080i (1920x1080 interlaced):  no display; does not appear to be 
supported by ATI's binary driver.  The Windows driver has an option 
to "force 1080i" mode, but it does not appear to work with my 
hardware.

720p (1280x720):  image is left justified, truncated on the right, and 
there are large black bands on the top and bottom.  Using the TV's 
zoom function results in the image being cropped on all sides.

480p (720x480):  image appears to be cropped on only the left and 
right sides, but is fullscreen.

ATI Radeon 9700 AIW Pro with ATI's binary driver 8.8.25 (1/17/05) and 
XFree86 4.3.0.  Display is Sony KV-27HS420.

-Paul


Re: [mythtv-users] DVI to Sony HS420 series HDTV

2005-01-22 Thread Joe Barnhart

--- Brad Templeton <[EMAIL PROTECTED]> wrote:

> From what I have seen, most HDTVs really only take
> 1080i, they need
> an interlaced mode.
> 
> What amazes me is that they don't just make them
> take 1080p.  I mean
> there are computer monitors that cost $200 these
> days that can do
> that, the electronics just aren't that expensive any
> more.  90%
> of them are just downscaling to 1280x720p, or even
> EDTV so it doesn't
> make that big difference to them.

Doesn't this take a dual-link DVI connection for a digital
link?  I thought single-link DVI topped out at 1080i/720p,
and greater bandwidth required two DVI links.

> However, unfortunately taking 720p signals, and
> converting them to 1080i
> in your PC and then having the tv convert them back
> to 1280x720 (not
> really i or p) really hurts the quality.

So does converting 720p to 1080i for display on a
native 1080i set.  The display artifacts are just
horrible when the camera pans across, say, a football
field with all those nearly-horizontal lines!






Re: [mythtv-users] DVI to Sony HS420 series HDTV

2005-01-22 Thread Brad Templeton
> To me it looks like the horizontal resolution is provided with an 
> aspect ratio such that the vertical resolution can be determined for 
> each standard timing identification.  Anyhow, the EDID information 
> that I previously posted clearly came from my TV.  ATI's windows 
> driver recognizes it as a LCD, probably because of the DVI-D 
> interface.  I wonder if the mode is reported incorrectly because it 
> is an interlaced mode.  Perhaps I can just double all the vertical 
> settings to obtain 1080i.  hmm...

A lot of TVs report 1920x540p because they don't have an easy way
to report interlaced modes, and since interlaced computer monitors went
out about 15 years ago, some video card drivers don't support
interlaced modes any more; HDTV is bringing them back.
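
Given such a 1920x540 report, one workaround people try is doubling the vertical figures into a candidate interlaced mode; a sketch of the idea (Python; the timing numbers are made up for illustration, not from any real EDID, and the result would still need the Interlace flag in the X config):

```python
# Double every vertical timing figure of a 1920x540 progressive mode
# to get a candidate 1920x1080 interlaced mode.  Numbers are assumed
# for illustration; real ones must come from the EDID or a timing spec.

def double_vertical(name, clock_mhz, h_timings, v_timings):
    v_doubled = tuple(n * 2 for n in v_timings)
    return name.replace("540", "1080"), clock_mhz, h_timings, v_doubled

mode = double_vertical("1920x540", 74.25,
                       (1920, 2008, 2052, 2200),
                       (540, 542, 547, 562))
print(mode)  # ('1920x1080', 74.25, (1920, 2008, 2052, 2200), (1080, 1084, 1094, 1124))
```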

From what I have seen, most HDTVs really only take 1080i, they need
an interlaced mode.

What amazes me is that they don't just make them take 1080p.  I mean
there are computer monitors that cost $200 these days that can do
that, the electronics just aren't that expensive any more.  90%
of them are just downscaling to 1280x720p, or even EDTV, so it doesn't
make that big a difference to them.

I notice that Mitsubishi TVs which have a VGA port on them will take
720p on the VGA port, but not 1080i or 1080p.  At least one I looked at.

Now one reason to use 1080i is that if your signal is 1080i and you send
it as 1080i, the TV knows how to display it and deal with the interlacing
well.  If you send 1080p, your PC would have to do the deinterlacing
itself, but a very simple bob would do the trick, I suspect.
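
For what it's worth, "bob" here just means treating each field as its own frame and line-doubling it; a toy sketch (Python, with scan lines as lists, purely illustrative):

```python
def bob(field):
    """Line-double one field (a list of scan lines) into a full frame."""
    frame = []
    for line in field:
        frame.append(line)
        frame.append(line)  # repeat each field line once
    return frame

# Two field lines become a four-line frame:
print(bob([[1, 1], [2, 2]]))  # [[1, 1], [1, 1], [2, 2], [2, 2]]
```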

Also amazing is that some TVs which have an internal 1280x720 resolution
(or similar) want a 1080i input and can't even take 720p.  This was a
trick to be cheap: the set has to take 1080i or people won't buy it, so
they support only that one mode.

Unfortunately, taking 720p signals, converting them to 1080i
in your PC, and then having the TV convert them back to 1280x720 (not
really i or p) really hurts the quality.


Re: [mythtv-users] DVI to Sony HS420 series HDTV

2005-01-22 Thread Paul Miller
On Saturday 22 January 2005 3:18 pm, Preston Crow wrote:
> On Sat, 2005-01-22 at 16:04, Paul Miller wrote:
> > I have a Sony KV-27HS420 HDTV and an ATI Radeon 9700 AIW Pro, and
> > I'm trying to configure X such that my display works with the
> > DVI/HDMI interface.  I used read-edid | parse-edid to obtain the
> > modeline settings below.  Mode 720x480 works well so long I don't
> > switch to different inputs on the TV.  If I do, the display
> > sometimes becomes partially corrupted.  Mode 1920x540 does not
> > work and it appears that the TV wants to horizontally squash the
> > image, even though it is solid black.  I'm not sure why the EDID
> > information says "1920x540" instead of "1920x1080".  I haven't
> > been able to find any 1080i modelines that work either.  The
> > display's native resolution is 1080i.
>
> Correct me if I'm wrong, but my impression is that EDID information
> is based on a spec that assumes a multisync monitor.  When
> connecting to a TV, it may be impossible for the TV to correctly
> specify the legal modes, as they are separate discrete modes, not a
> continuous range of modes.
>
> http://en.wikipedia.org/wiki/EDID
>
> Yup, if I'm reading that correctly, EDID can't provide a list of
> video modes.

To me it looks like the horizontal resolution is provided with an 
aspect ratio such that the vertical resolution can be determined for 
each standard timing identification.  Anyhow, the EDID information 
that I previously posted clearly came from my TV.  ATI's Windows 
driver recognizes it as an LCD, probably because of the DVI-D 
interface.  I wonder if the mode is reported incorrectly because it 
is an interlaced mode.  Perhaps I can just double all the vertical 
settings to obtain 1080i.  hmm...

-Paul



Re: [mythtv-users] DVI to Sony HS420 series HDTV

2005-01-22 Thread Preston Crow
On Sat, 2005-01-22 at 16:04, Paul Miller wrote:
> I have a Sony KV-27HS420 HDTV and an ATI Radeon 9700 AIW Pro, and I'm 
> trying to configure X such that my display works with the DVI/HDMI 
> interface.  I used read-edid | parse-edid to obtain the modeline 
> settings below.  Mode 720x480 works well so long I don't switch to 
> different inputs on the TV.  If I do, the display sometimes becomes 
> partially corrupted.  Mode 1920x540 does not work and it appears that 
> the TV wants to horizontally squash the image, even though it is 
> solid black.  I'm not sure why the EDID information says "1920x540" 
> instead of "1920x1080".  I haven't been able to find any 1080i 
> modelines that work either.  The display's native resolution is 
> 1080i.

Correct me if I'm wrong, but my impression is that EDID information is
based on a spec that assumes a multisync monitor.  When connecting to a
TV, it may be impossible for the TV to correctly specify the legal
modes, as they are separate discrete modes, not a continuous range of
modes.

http://en.wikipedia.org/wiki/EDID

Yup, if I'm reading that correctly, EDID can't provide a list of video
modes.

--PC

