Re: [mythtv-users] bobdeint as output filter?

2004-12-31 Thread Joe Barnhart

--- Kyle Rose [EMAIL PROTECTED] wrote:
 
 FWIW, switching to using Bob as the deinterlacer set
 in that special
 control has also fixed the weird speed pulsing
 problem I was having
 with the new ffmpeg code.  And it looks *beautiful*.

Wait...  A little clarification here please...

Are you guys saying we need to set Bob in TWO places
to make this work?  I know about the bob setting in
the deinterlace box, but do I also name bobdeint
below in the special filters section?  Is this why
1080i and 720p playback looks so sub-optimal on my
native 1080i HDTV set?

__
Do You Yahoo!?
Tired of spam?  Yahoo! Mail has the best spam protection around 
http://mail.yahoo.com 
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] bobdeint as output filter?

2004-12-31 Thread Kyle Rose
Joe Barnhart [EMAIL PROTECTED] writes:

 Are you guys saying we need to set Bob in TWO places
 to make this work?

No, just the deinterlace box.

 I know about the bob setting in
 the deinterlace box, but do I also name bobdeint
 below in the special filters section?  Is this why
 1080i and 720p playback looks so sub-optimal on my
 native 1080i HDTV set?

I'm confused as to why you're using a deinterlacer for 1080i input on
a native 1080i set.

Kyle
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] bobdeint as output filter?

2004-12-31 Thread Isaac Richards
On Friday 31 December 2004 10:04 am, Joe Barnhart wrote:
 --- Kyle Rose [EMAIL PROTECTED] wrote:
  FWIW, switching to using Bob as the deinterlacer set
  in that special
  control has also fixed the weird speed pulsing
  problem I was having
  with the new ffmpeg code.  And it looks *beautiful*.

 Wait...  A little clarification here please...

 Are you guys saying we need to set Bob in TWO places
 to make this work?

No.

Isaac
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] bobdeint as output filter?

2004-12-31 Thread Joe Barnhart

--- Kyle Rose [EMAIL PROTECTED] wrote:

 I'm confused as to why you're using a deinterlacer
 for 1080i input on
 a native 1080i set.

I'm just shotgunning anything that might make my
HDTV look more HD.  720p material in particular
looks bad on my set.  The scaling is poor and produces
stair-step artifacts that are quite distracting on
material like football games.  1080i is mostly good,
but not as good as a settop OTA receiver box.

I'm using Xv on nVidia with 6111 driver.  My modeline
is 1080i and seems to work judging by the UI (both
myth and window manager).
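For reference, a commonly circulated modeline for 1080i at the standard broadcast timings (74.25MHz pixel clock; treat the exact numbers as an assumption and verify them against your own set):

    # 74.25MHz / (2200 x 1125 total) = 30 frames/sec = 60 fields/sec, interlaced
    ModeLine "1920x1080_60i" 74.25  1920 2008 2052 2200  1080 1084 1094 1125 Interlace +HSync +VSync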



__ 
Do you Yahoo!? 
Yahoo! Mail - Find what you need with new enhanced search.
http://info.mail.yahoo.com/mail_250
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] bobdeint as output filter?

2004-12-31 Thread Kyle Rose
 I'm using Xv on nVidia with 6111 driver.  My modeline
 is 1080i and seems to work judging by the UI (both
 myth and window manager).

Isn't one of the complaints about the nVidia drivers that they don't
support interlaced modes correctly?  With proper interlacing support
in the driver, you should just be able to use Xv scaling of
progressive video and it should look as good as it can on a display of
a different native resolution.

FWIW, I hate interlacing.  Interlacing was a technology devised to
overcome a limitation of one kind of display technology and should
have been deprecated with the advent of DTV.  Sorry to open THAT can
of worms; just my $0.02. :)

Cheers,
Kyle
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


1920x1080 HDTVs? (was Re: [mythtv-users] bobdeint as output filter?)

2004-12-31 Thread Michael T. Dean
While writing the response below, I started to wonder whether there are *any* 
HDTVs available with 1920x1080 pixels.  More info below.  Keeping it 
short for those who just want to get to the point...  :)

On 12/31/2004 11:55 AM, Kyle Rose wrote:
FWIW, I hate interlacing.  Interlacing was a technology devised to
overcome a limitation of one kind of display technology and should
have been deprecated with the advent of DTV.  Sorry to open THAT can
of worms; just my $0.02. :)
 

The limitation: bandwidth.

If interlacing is no longer relevant in the age of digital TV, that 
implies that we now have unlimited bandwidth.  However, looking at ATSC 
high-definition TV, we have two primary modes:  720p (1280x720 at 60 
frames (fr)/sec and 60 fields (fi)/sec) and 1080i (1920x1080 at 
30fr/sec and 60fi/sec).  (Yes, I'm ignoring the 30fr/sec with 30fi/sec and the 
24fr/sec with 24fi/sec progressive modes available for 1080 and 720 
resolutions--not to mention the 12 other formats with lower resolutions.)

So, what is the purpose of 1080i?  Basically, it allows higher resolution 
at approximately the same bandwidth.  720p gives 921,600 pixels and 1080i 
gives 2,073,600 pixels--more than double the pixels of 720p.  However, 
both 720p and 1080i take approximately 3MHz of bandwidth, compared to 6MHz 
for NTSC (HDTV takes less bandwidth because of the compression that's 
possible with the digital signal).  So, if 1080i takes half the 
bandwidth of NTSC, why not make it 1920x1080 at 60fr/sec, i.e. 1080p?  Well, 
the broadcasters feel that the benefits of the progressive format are not 
worth the cost of the bandwidth--i.e. they would rather be able to transmit 
twice the number of channels (= twice as much space for advertisements) in 
the bandwidth they have available.
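To make the pixel arithmetic concrete, here is a quick Python sketch (raw, uncompressed pixel rates only; actual broadcast bandwidth depends on MPEG-2 compression, and the 1080p entry is the hypothetical mode discussed above):

    # Pixel counts and raw pixel rates for the ATSC HD modes discussed above.
    modes = {
        "720p":  (1280, 720,  60),   # 60 full frames/sec
        "1080i": (1920, 1080, 30),   # 30 full frames/sec (60 fields/sec)
        "1080p": (1920, 1080, 60),   # hypothetical progressive mode
    }
    for name, (width, height, frames_per_sec) in modes.items():
        pixels = width * height
        rate = pixels * frames_per_sec
        print(f"{name:5s} {pixels:>9,} pixels/frame {rate:>11,} pixels/sec")

    # Output:
    # 720p    921,600 pixels/frame  55,296,000 pixels/sec
    # 1080i 2,073,600 pixels/frame  62,208,000 pixels/sec
    # 1080p 2,073,600 pixels/frame 124,416,000 pixels/sec

Note that 1080i's raw rate is only about 12% higher than 720p's, while true 1080p would double it--roughly the factor of two the broadcasters are unwilling to pay.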

The trade-off: 1080i yields a much better picture than 720p for 
slow-changing scenes, but it is not ideal for sports or other shows 
composed primarily of fast-motion scenes.  Given that in most 
television shows--dramas, comedies, news, etc.--fast-motion scenes 
are a very small percentage of the show, 1080i allows much better 
overall picture quality.

Of course, since nearly all HDTVs on the market have only 1280x720 
pixels, the quality benefit is chiefly available to those people using a 
computer to output to something other than a TV (i.e. high-resolution 
monitors (such as WUXGA) or, for those with a lot of extra cash lying 
around, a projector with an extremely high optical resolution that can 
fully resolve 1920x1080, like the Runco DTV-1200 
(http://www.runco.com/OP_PA_dtv1200.html , MSRP $44,995.00)).

But, wait!  My TV says it supports 1080i.  It does.  It accepts a 1080i 
signal, deinterlaces it, scales it down to 1280x720 pixels, and displays 
it.  Therefore, the TVs available today completely negate the advantage 
of 1080i (better picture quality) by scaling down to 1280x720 (which can 
even produce a lower-quality image than an unscaled 720p image).

So, are there any real 1920x1080 TVs out there?  I figure if I'm buying 
an HDTV, I'm not wasting money on a 1280x720 one, but I can't find any 
1920x1080 TVs.  Toshiba used to have one 
(http://www.tacp.toshiba.com/televisions/product.asp?model=57HLX82 , MSRP 
$8999.99), but now that they've gone exclusively to Digital Light 
Processing (DLP) (instead of the Liquid Crystal on Silicon (LCOS) they 
used for the 1920x1080 TV), it seems they only offer 1280x720.  I'm not 
willing to spend more than twice what I spent on my car on a projector, 
so the Runco is out of the question.  Anyone know of any others?

It looks to me like I may be sticking with SDTV for several more years...
Mike
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: 1920x1080 HDTVs? (was Re: [mythtv-users] bobdeint as output filter?)

2004-12-31 Thread Dan wolf
Almost every TV you will ever see is a complete rip-off.  You think it
takes 9 big ones to make a damn TV?  Think again.  In addition, all
the specs the salesmen give you are complete bullshit, and the specs
on the TV box are bullshit too.  Hell, the TV I bought SAID it had a
comb filter, but then I found out it doesn't.  I got fricken screwed.

TVs are nothing but lies.


On Fri, 31 Dec 2004 14:22:55 -0500, Michael T. Dean
[EMAIL PROTECTED] wrote:
 [full quote of Michael T. Dean's message snipped]


-- 

I have one Gmail invite left, email me to grab it!
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: 1920x1080 HDTVs? (was Re: [mythtv-users] bobdeint as output filter?)

2004-12-31 Thread Joe Barnhart

--- Michael T. Dean [EMAIL PROTECTED] wrote:

 While writing the below response, I started to
 wonder if there are *any* 
 HDTV's available with 1920x1080 pixels.  More info
 below.

The old tech of CRT supports 1080i.  I have a
Pioneer PRO-610HD set (about 3 years old) and its
native format is 1080i.  Even so, I don't think you
would miss much resolution at 1280x720p on a good LCD
or DLP set.  If you are set on 1920x1080i, you could
wait for one of the next-generation LCOS sets based on
Brillian technology.  (If they survive.  But it looks
encouraging this week.)

http://www.brilliancorp.com



__ 
Do you Yahoo!? 
Yahoo! Mail - Helps protect you from nasty viruses. 
http://promotions.yahoo.com/new_mail
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


[mythtv-users] bobdeint as output filter?

2004-12-30 Thread Kyle Rose
'bobdeint' doesn't work properly for me when specified as an output
filter.

When specified on the custom filters line for a host, it does
nothing: the video is not deinterlaced as far as I can tell.

When specified as a channel-specific output filter, I end up with the
fields stacked, implying that Xv doesn't know how to properly
reconstruct each frame.

Bob works fine when I specify it as the (deprecated?) deinterlacing
algorithm above the custom filters input line.
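
For context on what bob does: it splits each interlaced frame into its
two fields and line-doubles each field back to full height, producing
two output frames per input frame.  A minimal sketch of the idea in
Python (illustrative only, not MythTV's actual filter code):

    # Bob deinterlacing sketch: a frame is a list of scanlines (rows).
    # The even rows form the top field and the odd rows the bottom field;
    # each field is line-doubled to full height, so every interlaced
    # frame yields two progressive output frames.
    def bob_deinterlace(frame):
        top = frame[0::2]       # top field (even scanlines)
        bottom = frame[1::2]    # bottom field (odd scanlines)

        def line_double(field):
            doubled = []
            for row in field:
                doubled.extend([row, row])  # naive line doubling
            return doubled

        return line_double(top), line_double(bottom)

If the line-doubling step never happens (or the consumer misinterprets
the field layout), the two half-height fields end up stacked on top of
each other, which is exactly the symptom described above.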

Cheers,
Kyle
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] bobdeint as output filter?

2004-12-30 Thread Isaac Richards
On Thursday 30 December 2004 06:26 pm, Kyle Rose wrote:
 'bobdeint' doesn't work properly for me when specified as an output
 filter.

 When specified on the custom filters line for a host, it does
 nothing: the video is not deinterlaced as far as I can tell.

The custom filters setting doesn't work properly for any deinterlacing 
algorithm - it's for the non-deinterlacing filters, since all the 
deinterlacing algorithms are available in the deinterlacing settings box 
immediately above.
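
For example, the custom filters line takes a comma-separated list of the
non-deinterlacing filters from the MythTV source's filters/ directory,
something like (filter names here are illustrative; check what your build
actually ships):

    denoise3d,invert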

 When specified as a channel-specific output filter, I end up with the
 fields stacked, implying that Xv doesn't know how to properly
 reconstruct each frame.

Same issue as above.

 Bob works fine when I specify it as the (deprecated?) deinterlacing
 algorithm above the custom filters input line.

Why would it be deprecated?

Isaac
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users