Re: [mythtv-users] Playback vid corruption with 1080i (with pics)

2005-06-30 Thread Glen Dragon
I think I've further narrowed down the problem. After bringing a 
different box downstairs and trying it on the same display, I've learned a 
few things. -- If myth/mplayer outputs a 1080i signal at particular 
resolutions (such as 1680x1050), it distorts the output as described 
below.


-- If other output resolutions are used, it works great; e.g. 1400x1050 
works fine.


The best way I have found to play with this is to leave X at a 
resolution of 1680x1050 and use the appearance options in Myth to try 
different output resolutions. I haven't tried very many yet, since I 
discovered this late last night.
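
For reference, one way to make a handful of resolutions available for this 
kind of testing is to list them in the Display subsection of xorg.conf and 
then check with xrandr which ones X actually accepted (Myth's resolution 
switching goes through XRandR, as far as I know). A minimal sketch -- the 
identifiers and the exact mode list here are assumptions for this setup:

Section "Screen"
    Identifier   "Screen0"
    Device       "nvidia0"
    Monitor      "Monitor0"
    DefaultDepth 24
    SubSection "Display"
        Depth 24
        # native mode first, then fallbacks to try from Myth's appearance settings
        Modes "1680x1050" "1400x1050" "1280x1024" "1024x768"
    EndSubSection
EndSection

# list the modes X accepted, and switch manually for a quick test
$ xrandr
$ xrandr -s 1400x1050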


Now I'm wondering whether the problem is the widescreen (16:10) nature 
of the 1680x1050 resolution, or whether it is simply too high a resolution.


I'd obviously like to run at 1680x1050, since that's the native 
resolution of the LCD, and I'd also want it to be wide, since 
widescreen is obviously better.


What can I do? Does anyone know why I can't use the correct 
resolution? Is there another resolution close to native that might 
work? What can I try?



Quoting Glen Dragon <[EMAIL PROTECTED]>:


Quoting john sturgeon <[EMAIL PROTECTED]>:


On 6/23/05, Doug Larrick <[EMAIL PROTECTED]> wrote:


Deinterlacing is clearly not working in these pictures.  The stripes are
so wide because something (probably your video card) is scaling the
pictures to a resolution less than 1080 lines high.  Double-check you
have deinterlacing enabled, and check the output from 'mythfrontend
--verbose playback'.  Also note that bob deinterlacing is disabled if
you're playing at a speed other than 1x.
-Doug


Doug is right on.  That is classic when resizing non-deinterlaced
content.  You can also see this problem if your filter chain is in the
wrong order, and you resize *before* deinterlacing.  You should always
de-interlace prior to resizing.
John <><


Hmm.. I am using kernel deinterlacing. Or at least that's what I 
specify in the menu. I don't have any other exotic filters or 
anything. I am playing at 1x speed (normally). The display is 
outputting onto an LCD flat panel at 1680x1050, the native resolution 
of the display. I have Xorg set up to do this resolution.


I don't know how I would rearrange the filter chain. Is this 
configurable somewhere? I assume that X is doing the scaling from 
1920x1080 to 1680x1050. Is this correct? If Myth is doing the 
de-interlacing, it should be taking place before the scaling, right?
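
For what it's worth, the ordering question can be tested outside of Myth: 
mplayer applies -vf filters left to right, so deinterlacing before scaling 
is the good case, and reversing them should reproduce the bad one. A rough 
sketch (the 944-line target height and the recording name are just examples 
taken from elsewhere in this thread):

# deinterlace first, then scale -- the order you want
mplayer -vf pp=lb,scale=1680:944 5006_2005061921_2005061922.nuv

# scale first, then deinterlace -- should show the striping if order is the culprit
mplayer -vf scale=1680:944,pp=lb 5006_2005061921_2005061922.nuv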




Re: [mythtv-users] Overwhelmed

2005-06-30 Thread Glen Dragon



What does the MythTV box need to contain to do
720p and 1080i?  Do I need a DVD player, or is that
covered by MythTV?  How does MythTV hook in?


You need a decent processor (2 GHz will give you some
room to grow) and a DVD drive.  MythTV hooks
in to the TV through a good video card.  I like the
nvidia 5200/5700 fanless cards; they do the job
pretty well.



You will need a 3 GHz-class processor to decode HDTV in software. Since 
hardware decoding support is spotty, you will want a processor fast 
enough to give you that option.
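
If you're trying to size a box you already have, a quick (and very rough) 
way to see what you're working with is to pull the CPU model and clock 
from /proc/cpuinfo -- a sanity check, not a benchmark:

$ grep -m1 "model name" /proc/cpuinfo
$ grep -m1 "cpu MHz" /proc/cpuinfo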




Re: [mythtv-users] How to get 1366x768?

2005-06-29 Thread Glen Dragon
FYI: you will need the modeline as described below. You will also need 
to use nvidia driver 7167 (or earlier) rather than 7664; high 
resolutions on LCDs were broken in the 7664 release.
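
For the modeline itself, gtf will generate candidate timings. Note that 
many cards and drivers want the horizontal resolution to be a multiple of 
8, which is why 1360x768 is commonly used in place of 1366x768. A sketch 
(double-check the generated timings against the panel's specs):

# generate a candidate modeline for the panel
$ gtf 1360 768 60

# paste the Modeline that gtf prints into the Monitor section of xorg.conf,
# then add its name to the Modes line in the Display subsection, e.g.:
#   Modes "1360x768_60.00" "1024x768"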


Quoting Paul Woodward <[EMAIL PROTECTED]>:


Sounds like you need a Windows box, PowerStrip, and a custom modeline...
There are plenty of howtos on the web.
Paul

On 6/29/05, JacqueUsi <[EMAIL PROTECTED]> wrote:


Costco is currently carrying a 32" Proview LCD with RGB and HDMI inputs.
I scooped one up and connected it to my Myth box. I was puzzled about why
the image is stretched at 16:9, but then realized that Myth .17, or rather
FC2, is defaulting to 1024x768 resolution. At least that's my theory.
 When I attempt to change the resolution via the GUI, 1366x768 does not
come up as an option.
 Can anyone help me out? My video card is an Nvidia connecting straight to
the LCD panel via an RGB cable.
 Thanks!






Re: [mythtv-users] Playback vid corruption with 1080i (with pics)

2005-06-27 Thread Glen Dragon




john sturgeon wrote:

  Try playing back the .nuv file using mplayer -vf pp=lb (just for a sanity check)

I see the following in the log file:
2005-06-23 20:29:08.607 Image size. dispxoff 0, dispyoff: 53,
dispwoff: 1680, disphoff: 944
2005-06-23 20:29:08.607 Image size. imgx 77, imgy: 43, imgw: 1766, imgh: 994

Are you setting the output size differently from your display
resolution?  If so, then MythTV is probably scaling, and it is
possible that MythTV is indeed scaling prior to deinterlacing (that
would be bad), but a dev would probably be able to comment further.

Well, I tried mplayer and I'm getting the same effect. The log is
attached.
The only thing remotely interesting might be this:

Could not find matching colorspace - retrying with -vf scale...
Opening video filter: [scale]
The selected video_out device is incompatible with this codec.
VDecoder init failed :(

I'm not intentionally setting the output size differently; those fields 
are all set to 0. The only thing I do have set that could be affecting 
anything is the overscan setting. I have both the horizontal and 
vertical overscan set to 4%. I disabled it, and there is no difference.



alidor recordings # mplayer -vf pp=lb 5006_2005061921_2005061922.nuv
MPlayer 1.0pre6-3.4.3 (C) 2000-2004 MPlayer Team
CPU: Advanced Micro Devices  (Family: 8, Stepping: 10)
Detected cache-line size is 64 bytes
Cannot test OS support for SSE, disabling to be safe.

Warning unknown option cache_min at line 144
Warning unknown option cache_prefill at line 147

77 audio & 189 video codecs
Opening joystick device /dev/input/js0
Can't open joystick device /dev/input/js0 : No such file or directory
Can't init input joystick
Setting up LIRC support...
mplayer: could not connect to socket
mplayer: No such file or directory
Failed to open LIRC support.
You will not be able to use your remote control.
Playing 5006_2005061921_2005061922.nuv.
Cache fill:  0.00% (0 bytes)TS file format detected.
DEMUX OPEN, AUDIO_ID: -1, VIDEO_ID: -1, SUBTITLE_ID: -1,
PROBING UP TO 200, PROG: 0
VIDEO MPEG2(pid=17)...AUDIO A52(pid=20) NO SUBS (yet)!  PROGRAM N. 1
Opened TS demuxer, audio: 2000(pid 20), video: 1002(pid 17)...POS=564
VIDEO:  MPEG2  1920x1080  (aspect 3)  29.970 fps  15800.0 kbps (1975.0 kbyte/s)
==
Opening audio decoder: [liba52] AC3 decoding with liba52
No accelerated IMDCT transform found
AC3: 2.0 (stereo)  48000 Hz  384.0 kbit/s
No accelerated resampler found
AUDIO: 48000 Hz, 2 ch, 16 bit (0x10), ratio: 48000->192000 (384.0 kbit)
Selected audio codec: [a52] afm:liba52 (AC3-liba52)
==
vo: X11 running at 1680x1050 with depth 24 and 32 bpp (":0.0" => local display)
Opening video filter: [pp=lb]
==
Opening video decoder: [mpegpes] MPEG 1/2 Video passthrough
VDec: vo config request - 1920 x 1080 (preferred csp: Mpeg PES)
[PP] Using external postprocessing filter, max q = 6.
Could not find matching colorspace - retrying with -vf scale...
Opening video filter: [scale]
The selected video_out device is incompatible with this codec.
VDecoder init failed :(
Opening video decoder: [libmpeg2] MPEG 1/2 Video decoder libmpeg2-v0.4.0b
Selected video codec: [mpeg12] vfm:libmpeg2 (MPEG 1 or 2 (libmpeg2))
==
Checking audio filter chain for 48000Hz/2ch/16bit -> 48000Hz/2ch/16bit...
AF_pre: af format: 2 bps, 2 ch, 48000 hz, little endian signed int 
AF_pre: 48000Hz 2ch Signed 16-bit (Little-Endian)
AO: [oss] 48000Hz 2ch Signed 16-bit (Little-Endian) (2 bps)
Building audio filter chain for 48000Hz/2ch/16bit -> 48000Hz/2ch/16bit...
Starting playback...
VDec: vo config request - 1920 x 1080 (preferred csp: Planar YV12)
[PP] Using external postprocessing filter, max q = 6.
VDec: using Planar YV12 as output csp (no 0)
Movie-Aspect is 1.78:1 - prescaling to correct movie aspect.
VO: [xv] 1920x1080 => 1920x1080 Planar YV12 
aspect: Warning: no suitable new res found!
aspect: Warning: no suitable new res found!
aspect: Warning: no suitable new res found!
aspect: Warning: no suitable new res found!  7/  7 ??% ??% ??,?% 0 0 89%   
A:36701.1 V:36701.1 A-V:  0.051 ct: -0.460 679/679 35% 22%  1.1% 0 0 48%   
Exiting... (Quit)


Re: [mythtv-users] Announce: Gentoo live CVS ebuild updated

2005-06-23 Thread Glen Dragon

Quoting Nick Rosier <[EMAIL PROTECTED]>:


are you working on subversion ebuilds for MythTV?



I will update mine when it becomes necessary, and I'll post them then.



Re: [mythtv-users] Playback vid corruption with 1080i (with pics)

2005-06-23 Thread Glen Dragon

Quoting john sturgeon <[EMAIL PROTECTED]>:


On 6/23/05, Doug Larrick <[EMAIL PROTECTED]> wrote:


Deinterlacing is clearly not working in these pictures.  The stripes are
so wide because something (probably your video card) is scaling the
pictures to a resolution less than 1080 lines high.  Double-check you
have deinterlacing enabled, and check the output from 'mythfrontend
--verbose playback'.  Also note that bob deinterlacing is disabled if
you're playing at a speed other than 1x.
-Doug


Doug is right on.  That is classic when resizing non-deinterlaced
content.  You can also see this problem if your filter chain is in the
wrong order, and you resize *before* deinterlacing.  You should always
de-interlace prior to resizing.
John <><


Hmm.. I am using kernel deinterlacing. Or at least that's what I 
specify in the menu. I don't have any other exotic filters or 
anything. I am playing at 1x speed (normally). The display is 
outputting onto an LCD flat panel at 1680x1050, the native resolution 
of the display. I have Xorg set up to do this resolution.


I don't know how I would rearrange the filter chain. Is this 
configurable somewhere? I assume that X is doing the scaling from 
1920x1080 to 1680x1050. Is this correct? If Myth is doing the 
de-interlacing, it should be taking place before the scaling, right?


I will post the output of -v playback when I get home. Anything else I 
should enable?
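
In case it helps, a convenient way to capture that output is to redirect 
it to a file and then pull out the sizing lines (the log path here is just 
an example):

$ mythfrontend --verbose playback 2>&1 | tee ~/mythfrontend-playback.log
$ grep "Image size" ~/mythfrontend-playback.log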



Isn't the 1920x1080i signal pushing the limits of what DVI can handle?
I presume you have a single-channel DVI.  It almost looks to me like
the set is on the hairy edge of sync with that signal.  The 720p
signals push slightly less information, which could explain why the ABC
broadcasts work.
I don't suppose the set has a VGA input for comparison?  (Some do, like
my old-tech Pioneer RPTV with quaint "CRT" technology.)


I am using a dual-channel DVI cable from Pacific Cable, and I am only 
running at a resolution of 1680x1050. The set does have a VGA input; I 
can try it, though it'd require moving things around.
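
For what it's worth, rough bandwidth numbers suggest a single link should 
already be enough here (approximate figures):

  broadcast 1080i (1920x1080, 60 fields/s)   =  74.25 MHz pixel clock
  1680x1050 @ 60 Hz, standard CVT blanking  ~=  146 MHz
  1680x1050 @ 60 Hz, reduced blanking       ~=  119 MHz
  single-link DVI ceiling                    =  165 MHz

So the 1680x1050 mode fits on a single TMDS link with room to spare; the 
dual-channel cable shouldn't be the limiting factor either way.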



[mythtv-users] Playback vid corruption with 1080i (with pics)

2005-06-23 Thread Glen Dragon
I'm still seeing the problem that I reported a week or so ago. I took 
some pictures to show the effect; hopefully someone knows something. 
Interestingly enough, it also causes the same effect on the OSD.

Pics: http://www.jetcom.org/gallery/dragon-myth
(There is some blurring due to difficulties in taking the pictures, as I 
didn't have the tripod around. It really does look like that, though.)


I've tried different nvidia drivers (7664, 7167, 6629) with no 
difference. I'm not sure what else to try.


--- original post ---

I see weird horizontal striping every half inch or so. There is a 
jagged green area on the left (and covering parts of the picture). It 
sort of makes the picture shimmer. It's semi-watchable if you can 
ignore the effect (i.e. it doesn't obstruct the picture).


This only appears on my NBC broadcasts (and PBS too, I think), which I 
believe means it's on 1080i broadcasts. Fox & ABC are crystal clear; I 
don't get CBS right now. I believe it also appears on upconverted 
material, i.e. the news was the same way last night. I only see it on 
one of my frontends, which happens to be an AMD64 running 64-bit 
Gentoo, so I'm assuming it has something to do with that. I'm also 
assuming it's related to the fact that the signal is interlaced. The 
recording itself is fine, i.e. I can play it upstairs on a different 
frontend.


The machine is an AMD64 3500+ with an nvidia FX5300, connected via DVI 
to a 2005FPW flat panel. I have also had this problem when connected to 
a 50+ inch RP-LCD TV. I have tried various combinations of OpenGL vsync 
and various deinterlacers (kernel/bob). I am not using XvMC (I never 
got it to work on this machine); processor utilization is ~50-60%.


This *has* worked beautifully before; I don't know why or how it has 
changed. At one point I believed it was MMX-related, but now it seems 
to be independent of that. I have tried actively disabling MMX in the 
config; the only thing that accomplishes now is to increase my CPU 
utilization to 90-100%. I've seen this effect using various CVS builds 
on this machine since before 0.18.


