Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-13 Thread Joe Votour
Yeah, all those free-running components screwing
things up.  :)  Then again, my Commodore 64 can't do
MPEG-2, or else I'd have written a MythTV client in
assembly for it already.  :D

Just to add my (so far) success story to this thread,
I decided to take some of the suggestions and see what
would happen.

The first step was rebuilding the MythTV RPMs (from
ATrpms) with OpenGL support.  After about five
rebuilds and much searching of the atrpms-devel
mailing list (thanks to Jarod Wilson for posting his
macros!), I got it working.
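(For anyone else going the same route: the rebuild boils down to roughly the
commands below.  The package version and RPM output path are from memory and
illustrative only; they will differ on your system, and the real trick is
getting the build macros right, per Jarod's post.

  rpmbuild --rebuild mythtv-0.17-*.src.rpm
  rpm -Uvh --force /usr/src/redhat/RPMS/i386/mythtv-frontend-0.17*.rpm \
                   /usr/src/redhat/RPMS/i386/mythtv-backend-0.17*.rpm

After reinstalling the rebuilt frontend/backend packages, restart the
frontend and check the playback log.)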

XvMC is a bust; it doesn't seem to work on my machine
(Athlon64, running in 32-bit mode, nVidia driver on a
5200FX).  I just get a frozen image or massive
stuttering.  OpenGL support on its own helps out
massively (on my system); the image (say, on CNN)
doesn't jitter as much as when using either the DRM or
RTC methods.

Anyway, to conclude, my method:
- MythTV 0.17 (from RPMs, with the frontend/backend
RPMs recompiled)
- Enable OpenGL vsync support (MythTV now shows Video
timing method: SGI OpenGL)
- No XvMC support (doesn't work)
- Use libmpeg2 instead of ffmpeg (text doesn't seem as
fat when using it)
- Enable Bob deinterlacing
- Turn on "Use Video for Timebase"
- Changes to the nvidia-settings-rc file as described
in the thread
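(If you want to confirm which sync method your build ends up with, it shows
up in the playback log; something like the following, where the grep pattern
is just illustrative:

  mythfrontend -v playback 2>&1 | grep -i "timing method"

On this box it now reports "Video timing method: SGI OpenGL" rather than the
RTC or DRM methods.)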

At the moment it's looking pretty good; we'll see
what happens once something with jagged lines (like a
cartoon) comes on, to see whether Bob flickers like mad.

Geez, I really need to stop watching this late-night
TV...  When there are judges called "Extreme Akim" on
the TV, well...  (So, will this combo seem like it
works when I'm fully awake in the morning?)

-- Joe

--- Cory Papenfuss [EMAIL PROTECTED] wrote:
  The key to MythTV (or any program, really) being able to render a
 display without tears or choppiness is really in two things:
 1. Being able to know when the vertical sync is, and,
 2. Being able to react to the vertical sync event in a timely manner

 	There's a 3rd issue here.  Most (all?) linux video cards run with
 a free-running clock.  If you want to avoid beat frequency issues and
 tearing, you really want to have the mpeg stream itself trigger the card
 to send out a new vertical field.  Otherwise, you have the MPEG stream
 running a 29.97Hz field rate, and the video card running at a close, but
 not phase/frequency locked rate of, say, 29.98Hz.  That leaves a 0.01Hz
 beat frequency which can show up as screen tearing that moves slowly.

 	I guess if you've got the VBI, you can do without this by simply
 using the blanking interval time as a time buffer.  If the card runs too
 fast and the MPEG stream doesn't have another frame yet, show the old
 frame again.  If the card runs too slowly and *another* frame is ready
 before the previous has been shown, drop it.  As said before, the
 fundamental problem is that linux is not realtime (hard or even soft).

 -Cory
 *
 * Cory Papenfuss*
 * Electrical Engineering candidate Ph.D. graduate student   *
 * Virginia Polytechnic Institute and State University   *
 *
 
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-13 Thread Sigurd Nes
Joe Votour wrote:
- Enable OpenGL vsync support (MythTV now shows Video
timing method: SGI OpenGL)
Where did you find this option ?
Sigurd
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-13 Thread Will Dormann
Sigurd Nes wrote:
Joe Votour wrote:
- Enable OpenGL vsync support (MythTV now shows Video
timing method: SGI OpenGL)

Where did you find this option ?
It's a compile option in settings.pro
--
-WD
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-13 Thread Joe Votour
It is not available in many of the default MythTV
builds, because it requires linking against an OpenGL
library of some kind.  Also, some time ago, when it
was enabled by default, some people were having
problems with it.

If you're building from source, then you must edit
settings.pro (look for opengl).  If you're using
Axel's ATrpms, then you need to recompile the MythTV
source RPM, and reinstall the new binary RPMs that are
generated, making sure to pass in whatever options
that you need when compiling them (the default options
worked for me).
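(If you're not sure where the switch lives, grepping the top-level
settings.pro for "opengl" will point you at the lines to uncomment; I won't
quote the exact macro names from memory since they vary between releases:

  grep -n -i opengl settings.pro

Uncomment what you find there before running qmake and make if you build
from the tarball.)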

Once you have recompiled MythTV (as long as you stay
with the same source release, you don't need to
recompile the add-on programs), then it will
automatically detect OpenGL support.  There will be
nothing in the GUI that allows you to turn it on or
off.

-- Joe

--- Sigurd Nes [EMAIL PROTECTED] wrote:
 Joe Votour wrote:
  - Enable OpenGL vsync support (MythTV now shows
 Video
  timing method: SGI OpenGL)
 
 Where did you find this option ?
 
 Sigurd
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-11 Thread trev
Hi
I've got an Asus AGP MX440 and am using the composite output, direct into my 
tv.
I run Mythtv on the tv and use my computer on my monitor.
I can't see any problems on the tv screen with bars or anything, although at 
1024x768 the text is a bit unreadable on the desktop, though I expected 
that.
The manual I have does specify "If the correct connector cable is connected, 
S-Video out will generally provide a higher quality output than Composite 
video out."
I can't test this at the moment because although I have an S-Video connector 
at both ends I don't have an S-Video cable and am actually unsure if the 
picture could look any better anyway.
I am not sure if this is strictly on topic, but I can't use the mouse on the 
TV; it doesn't scroll across to the other screen (a separate desktop).  And 
although the keyboard works on that screen when MythTV is started, if I select 
something on the monitor using the mouse, the keyboard switches to that and I 
don't know how to switch it back.

Any help would be appreciated

Best Regards

Trev


On Friday 11 Mar 2005 00:56, cythraul wrote:
 I too have an AGP 440MX manufactured by XFX. Although the display is
 quite as crisp as you can get on s-video, I see some kind of ghostly
 diagonal bars scrolling.

 I don't quite notice them most of the time but in the GUI it is
 _quite_ noticeable.  Enough for people to whom I demonstrate my setup
 to make comments about.

 My friend bought the PCI version of the 440MX from a manufacturer I
 don't remember at the moment. The display is terribly clean. Even the
 flicker we experience in most themes is hardly noticeable.

 Of course the best way would be to get a TV with at least components
 inputs. But for now s-video input is all I can afford.

 So I was wondering if the manufacturer of the actual board might
 affect results people are getting.

 If so, I'd be happy to learn from people's recommendation.

 Thanks,
 cyth

 On Thu, 10 Mar 2005 16:51:14 -0500, Will Dormann [EMAIL PROTECTED] wrote:
  Tom Lichti wrote:
   As an aside, what is the preferred Nvidia card for TV output? I have a
   generic GeForce 4 MX440 and it works alright, although I'm sure it
   could be better. If there is a better card to use, what is it?
 
   From what I gather, the MX4000 is just a die-shrunk version of the
  MX440, so if you follow my instructions you should be able to get pretty
  good results. Is this not the case?
 
 
  -WD
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-11 Thread Will Dormann
trev wrote:
The manual I have does specify "If the correct connector cable is connected, 
S-Video out will generally provide a higher quality output than Composite 
video out."
I can't test this at the moment because although I have an S-Video connector 
at both ends I don't have an S-Video cable and am actually unsure if the 
picture could look any better anyway.

So what's keeping you from trying it?   S-Video reproduces the picture 
more accurately than composite.  It shouldn't have artifacts like dot 
crawl, which composite will likely have.

Check this page for more info:
http://nfg.2y.net/games/ntsc/

-WD
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-11 Thread Andrew Close
On Fri, 11 Mar 2005 10:39:41 +, trev [EMAIL PROTECTED] wrote:


 I am not sure if this is strictly on topic, but I can't use the mouse on the
 TV, it doesn't scroll across to the other screen (a separate desktop) and
 although the keyboard works on that screen when mythtv is started if I select
 something on the monitor using the mouse the keyboard switches to that and I
 don't know how to switch it back.
 
 Any help would be appreciated

Trev,

the mouse is disabled in the Myth screens.  everything is activated by
either keyboard or remote.  this is a feature. ;)

-
a 'whole lotta' GMail Invites available
Please Email me OFF-list only...
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-11 Thread trev
Hi

There is nothing to stop me from trying S-Video except I would need to spend 
some money buying an S-Video to S-Video cable (which I might try at some 
point anyway).
At the moment I am more worried about the X Windows panel appearing at the 
bottom of the TV screen when I start using the computer monitor.
My IR remote control also does not have a huge number of buttons (it came with my 
Adaptec VIDEOh! DVD Media Center) and so I would also like to use the 
keyboard to drive MythTV on the second screen.
I am new to Mythtv and only recently got my Remote working and have not 
completely figured out how to drive it properly.
If I cannot get the keyboard / panel on TV problem sorted I may abandon using 
Mythtv to watch live TV (originally I got the card to just copy home videos 
onto DVDs).
I am currently running KDE 3.2.3 on Mandrake 10.1 Official, and from reading 
lots of different posts / FAQs on the subject of multiple screens and moving 
the mouse between screens, I might try some different window managers to see 
if that helps.  Does anyone have a similar system working with a monitor and 
TV with different screens using KDE?

Best Regards

Trev

On Friday 11 Mar 2005 14:26, Will Dormann wrote:
 trev wrote:
  The manual I have does specify "If the correct connector cable is
  connected, S-Video out will generally provide a higher quality output
  than Composite video out."
  I can't test this at the moment because although I have an S-Video
  connector at both ends I don't have an S-Video cable and am actually
  unsure if the picture could look any better anyway.

 So what's keeping you from trying it?   S-Video reproduces the picture
 more accurately than composite.  It shouldn't have artifacts like dot
 crawl, which composite will likely have.

 Check this page for more info:
 http://nfg.2y.net/games/ntsc/



 -WD
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-11 Thread Tom Lichti
Will Dormann wrote:
Tom Lichti wrote:
As an aside, what is the preferred Nvidia card for TV output? I have 
a generic GeForce 4 MX440 and it works alright, although I'm sure it 
could be better. If there is a better card to use, what is it?

From what I gather, the MX4000 is just a die-shrunk version of the 
MX440, so if you follow my instructions you should be able to get 
pretty good results. Is this not the case?

I used your settings late last night, and I didn't notice an appreciable 
difference. I did notice a lot of 'smearing', for lack of a better term, 
in that when there was quick motion, you could see alternating 
horizontal lines that were out of sync in certain places on the screen, 
most noticeable when there was a large difference between light and dark 
areas of the screen. I used your first posted settings (and I did notice 
that some of the options you described were not in the same sections as 
my XF86Config file for some reason).

I should mention that I can't use XvMC since I get tons of those 
'pre-buffering pause' messages and lots of a/v hiccups with it, that go 
away when I don't use XvMC. I have tried the tweaks that have been 
posted to various messages but nothing made them go away. I have a P3 
1Ghz, dual PVR250's and the aforementioned MX440. With XvMC the CPU 
usage was minimal, now it's a constant 30-40%, but at least watching TV 
is smooth.

I need to go through this thread more thoroughly to see if I missed 
anything important.

Thanks
Tom
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-11 Thread trev
Hi
I sort of solved my problem of getting rid of the KDE panel on the TV and 
switching keyboard focus on non-Xinerama dual-head screens, although it's a bit 
naff:
If, before I start mythfrontend, I use a fantastic program from the website
http://wiki.gentoo-italia.net/index.php/Dual_Monitors
to switch the mouse over, I can minimize the panel at the bottom of the 
screen.
Then after starting mythfrontend, if I want to set the keyboard focus to the 
TV, I run:
xterm -display localhost:0.1
exit out of the xterm, and then I have keyboard focus on the TV.
The website mentioned above did say the mouse-focus program would also work to 
change the keyboard focus, but I found that it didn't appear to on my 
setup.
I hope this helps anyone else running non-xinerama dual head screens.
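(In case the recipe above reads unclearly, the whole workaround is just the
following, assuming the TV is the second screen :0.1 as on my setup, with the
mouse-switching program from the Gentoo wiki page run beforehand:

  mythfrontend &                  # start the frontend as usual
  xterm -display localhost:0.1    # open an xterm on the TV screen...
  exit                            # ...and exit it; keyboard focus stays on the TV

Not pretty, but it works here.)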

Best Regards

Trev

On Friday 11 Mar 2005 15:03, you wrote:
 Hi

 There is nothing to stop me from trying S-Video except I would need to spend
 some money buying an S-Video to S-Video cable (which I might try at some
 point anyway).
 At the moment I am more worried about the XWindows Panel appearing at the
 bottom of the TV screen when I start using the the computer monitor.
 My IR Remote control also does not have a huge number of buttons (Came with
 my Adaptec VIDEOh! DVD Media Center) and so I would also like to use the
 keyboard to driving Mythtv on the second Screen.
 I am new to Mythtv and only recently got my Remote working and have not
 completely figured out how to drive it properly.
 If I cannot get the keyboard / panel on TV problem sorted I may abandon
 using Mythtv to watch live TV (originally I got the card to just copy home
 videos onto DVDs).
 I am currently running KDE 3.2.3 on Mandrake 10.1 Official and from reading
 lots of different posts / faqs on the subject of multiple screens  and
 moving the mouse between screens I might try some different window managers
 to see if that helps.  Does anyone have a similar system working with a
 monitor and TV with different screens using KDE.

 Best Regards

 Trev

 On Friday 11 Mar 2005 14:26, Will Dormann wrote:
  trev wrote:
   The manual I have does specify "If the correct connector cable is
   connected, S-Video out will generally provide a higher quality output
   than Composite video out."
   I can't test this at the moment because although I have an S-Video
   connector at both ends I don't have an S-Video cable and am actually
   unsure if the picture could look any better anyway.
 
  So what's keeping you from trying it?   S-Video reproduces the picture
  more accurately than composite.  It shouldn't have artifacts like dot
  crawl, which composite will likely have.
 
  Check this page for more info:
  http://nfg.2y.net/games/ntsc/
 
 
 
  -WD
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-11 Thread Jeroen Brosens
I received my GF MX4000 today and managed to get it working... except for
vsync :)  Can anybody tell me the one thing I probably forgot to do here?
As you can see in this log (mythfrontend -v playback), the nVidia vsync
method is not used ("nVidiaVideoSync: VBlank ioctl did not work,
unimplemented in this driver?" - oh, I am using the 6629 driver version, btw):

snip

2005-03-12 01:30:42.858 Over/underscan. V: 0, H: 0, XOff: 0, YOff: 0
2005-03-12 01:30:42.864 Using XV port 105
2005-03-12 01:30:42.870 Snapping height to avoid scaling: disphoff 576,
dispyoff: 12
2005-03-12 01:30:42.870 Image size. dispxoff 0, dispyoff: 12, dispwoff:
800, disphoff: 576
2005-03-12 01:30:42.870 Image size. imgx 0, imgy: 0, imgw: 720, imgh: 576
2005-03-12 01:30:43.738 Using deinterlace method bobdeint
2005-03-12 01:30:43.740 Using realtime priority.
2005-03-12 01:30:43.757 Changing from None to WatchingLiveTV
2005-03-12 01:30:43.839 nVidiaVideoSync: VBlank ioctl did not work,
unimplemented in this driver?
2005-03-12 01:30:43.840 DRMVideoSync: Could not open device
/dev/dri/card0, No such file or directory
2005-03-12 01:30:43.840 Set video sync frame interval to 4
2005-03-12 01:30:43.840 Using audio as timebase
2005-03-12 01:30:43.841 Video timing method: RTC

etcetera...

Thanks,
-- Jeroen
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-11 Thread Will Dormann
Jeroen Brosens wrote:
I received my GF MX4000 today and managed to get it working... except for
vsync :) Can anybody tell me the one thing I probably forgot to do here,
as you can see in this log (mythfrontend -v playback) the nVidia vsync
method is not used (nVidiaVideoSync: VBlank ioctl did not work,
unimplemented in this driver? - Oh I am using the 6629 version btw):

Did you compile MythTV with OpenGL support?   You need to manually 
enable this, at least in the tarball version.

nVidiaVideoSync is for the old nvidia drivers...   (4363)
I'm not sure about the DRM method, if there's any way to get that 
working with an nvidia card.

But nowhere in the log is an attempt to use GL Vsync.  Make sure that's 
enabled for your mythtv build.
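(One quick sanity check, on the assumption that a GL-vsync-enabled build links
against libGL:

  ldd `which mythfrontend` | grep -i libgl

If nothing shows up there, the binary almost certainly wasn't built with the
OpenGL option and needs a recompile.)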

--
-WD
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-10 Thread Cory Papenfuss
On Thu, 10 Mar 2005, Thomas Börkel wrote:
HI!
Cory Papenfuss wrote:
I've mentioned what I did in the past, starting with
http://www.gossamer-threads.com/lists/mythtv/users/45910#45910 Basically 
built up the circuit in the AD724 datasheet.  It's a surface-mount chip, 
and circuit layout is a little important (a few MHz bandwidth video 
signal), so I made a PCB.  I've thought of making a slightly nicer one 
(with VGA loop-through, etc.).  If there were enough interest I suppose I 
could finish it off and send off for a limited run of PCBs.
Any chance for a PAL version?
	The AD724 supports PAL encoding as well as NTSC, so it should be 
possible.  If I had SCART, though, I'd rather use that.  It'd be better 
quality than a PAL S-vid, since you could put out RGBHV directly, rather 
than convert to Y/C and modulate on the color subcarrier.

-Cory
*
* Cory Papenfuss*
* Electrical Engineering candidate Ph.D. graduate student   *
* Virginia Polytechnic Institute and State University   *
*
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-10 Thread Cory Papenfuss
The key to MythTV (or any program, really) being able
to render a display without tears or choppiness is
really in two things:
1. Being able to know when the vertical sync is, and,
2. Being able to react to the vertical sync event in a
timely manner
	There's a 3rd issue here.  Most (all?) linux video cards run with 
a free-running clock.  If you want to avoid beat frequency issues and 
tearing, you really want to have the mpeg stream itself trigger the card 
to send out a new vertical field.  Otherwise, you have the MPEG stream 
running a 29.97Hz field rate, and the video card running at a close, but 
not phase/frequency locked rate of, say, 29.98Hz.  That leaves a 0.01Hz 
beat frequency which can show up as screen tearing that moves slowly.

	I guess if you've got the VBI, you can do without this by simply 
using the blanking interval time as a time buffer.  If the card runs too 
fast and the MPEG stream doesn't have another frame yet, show the old 
frame again.  If the card runs too slowly and *another* frame is ready 
before the previous has been shown, drop it.  As said before, the 
fundamental problem is that linux is not realtime (hard or even soft).

-Cory
*
* Cory Papenfuss*
* Electrical Engineering candidate Ph.D. graduate student   *
* Virginia Polytechnic Institute and State University   *
*
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-10 Thread Cory Papenfuss
The AD724 supports PAL encoding as well as NTSC, so it should be possible.
But there is something different that goes beyond the frequency, AFAIK. If I 
connect a PAL TV to an NTSC S-vid, I would get a B/W picture. Or was that 
with composite (cinch)? Or is that AD724 switchable between PAL and NTSC?

	Yes, there are more differences between NTSC and PAL than simply 
the frequencies.  The most obvious ones are that the color subcarrier is 
at a different frequency (3.58MHz/NTSC, 4.43MHz/PAL)... that's why you'll 
get a B/W picture if you connect a PAL TV to an NTSC S-vid.  The horizontal 
scan frequencies are 15.734kHz vs 15.625kHz... most likely close enough to 
lock onto for the luma with sync.  The 50/60Hz is another matter, but from 
what I understand lots of PAL TV's are at least pseudo-capable of viewing 
NTSC video, so the vertical scanning might be able to lock onto a 60Hz 
refresh even if it was designed for 50/100.

	Now, for the chroma on the s-vid, everything is *way* off.  3.58 
is much different from 4.43 no matter how you slice it.  So, it'll not get 
much color info from it.

	To answer your question more directly, though... YES, the AD724 has a PAL 
mode.  Feed it a correct PAL color subcarrier frequency, change one pin, 
feed it PAL RGBHV frequencies, and it goes.  I haven't done it, mind you, 
but it should work fine.

That's right. But not all TVs have RGB-capable scart connectors.
On the other hand, if you would do VGA-SCART with RGB, you would have 
basically this, right?

http://www.sput.nl/hardware/tv-x.html
Someone else here wrote, that RGB is *too* sharp for MPEG2 video, meaning 
that you see the artifacts then.

	Correct.  I haven't noticed artifacts with what I'm doing, but I 
suspect that the chip does some filtering.  I haven't looked at the 
datasheet in awhile.

-Cory
*
* Cory Papenfuss*
* Electrical Engineering candidate Ph.D. graduate student   *
* Virginia Polytechnic Institute and State University   *
*
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-10 Thread Cory Papenfuss
What about RTLinux or RTAI Linux? This would give the realtime performance 
needed to respond to a VBI interrupt; the problem would be that the RTLinux 
code to interact with the graphics card would have to be written from 
scratch. Just a thought, I haven't really been following this thread.

I ran across this... looks interesting.
http://www.cs.yorku.ca/~av/Genlock.htm
	I've talked myself right out of genlocking with my previous post, 
however.  Genlocking is a *hard* realtime requirement, whereas since the 
PVR is fundamentally a timeshifting device, it's more of a *soft* realtime 
device.  That's not to say it's not interesting or a possible solution, 
it's just that it might be overkill.  The easiest way to adjust the time 
and keep everything synchronous would be to adjust the dotclock on the VGA 
card a very small amount.  That would keep the modeline timings the same 
on a per-frame basis, but adjust the actual time.

	Of course, this is all dependent on the specific video card used, 
how adjustable it is, and some notion of a reference clock.  See 
previous posts on linux not being realtime... :)

-Cory
*
* Cory Papenfuss*
* Electrical Engineering candidate Ph.D. graduate student   *
* Virginia Polytechnic Institute and State University   *
*
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-10 Thread Will Dormann
Thomas Börkel wrote:
For those of you running MythTV 0.17, I think the only thing you'll 
want to change is the resolution.   Rather than using the coryntsci 
resolution, try 800x600.  This fixes the bob vertical resolution 
problem I was seeing.

Should this 800x600 resolution be interlaced or not?

Not.   The built-in 800x600 mode should work fine.  The Bob does the 
deinterlacing.
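(In XF86Config terms that just means listing the stock mode for the TV head;
an illustrative fragment, where everything apart from the Modes line depends
on your own Screen/Monitor sections and colour depth:

  SubSection "Display"
      Depth   24
      Modes   "800x600"
  EndSubSection

No Interlace flag and no hand-rolled modeline needed.)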


-DW
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-10 Thread Stephen Williams
 Someone else here wrote, that RGB is *too* sharp for MPEG2 video,
 meaning that you see the artifacts then.
 
 Thomas

I expect it depends on where you got the MPEG2 video from. Using a
VGA-to-SCART converter in the UK with DVB-T I get an amazing image which
rarely has any visible artifacts. But then the DVB-T MPEG stream was
encoded with the intention of it being displayed on a TV with RGB
inputs, so that's hardly surprising.

If your MPEG stream is being encoded by a PVR-xxx card then it may
well be a completely different story, I doubt the quality of the
encoding is comparable.

Steve
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-10 Thread Tom Lichti
John Patrick Poet wrote:
I understand that you are frustrated, but your last sentence is a little
offensive.
The OpenGL vsync/bobdeint combo is awesome.  Ever since Doug implemented that
combination, my video playback has been silky smooth.  I have an nVidia
graphics card.
The reason it may not work as well for non-nVidia users, is that Doug uses
an nVidia card.  Pretty much all the developers use an nVidia card.  This
means that all other video cards are not going to be as well tested or
optimized.
 

As an aside, what is the preferred Nvidia card for TV output? I have a 
generic GeForce 4 MX440 and it works alright, although I'm sure it could 
be better. If there is a better card to use, what is it?

Also, awhile back I bought one of these: 
http://www.tigerdirect.ca/applications/SearchTools/item-details.asp?EdpNo=169832&Sku=G126-1002

And I tried it with Myth, but it wouldn't work. I forget the problem, I 
believe it was something about 'couldn't connect to xv device' or 
something like that.

Would I be better served using that box with the VGA output, or a better 
Nvidia card?

Thanks
Tom
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-10 Thread Will Dormann
Tom Lichti wrote:
As an aside, what is the preferred Nvidia card for TV output? I have a 
generic GeForce 4 MX440 and it works alright, although I'm sure it could 
be better. If there is a better card to use, what is it?
From what I gather, the MX4000 is just a die-shrunk version of the 
MX440, so if you follow my instructions you should be able to get pretty 
good results. Is this not the case?


-WD
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-10 Thread cythraul
I too have an AGP 440MX manufactured by XFX. Although the display is
quite as crisp as you can get on s-video, I see some kind of ghostly
diagonal bars scrolling.

I don't quite notice them most of the time but in the GUI it is
_quite_ noticeable.  Enough for people to whom I demonstrate my setup
to make comments about.

My friend bought the PCI version of the 440MX from a manufacturer I
don't remember at the moment. The display is terribly clean. Even the
flicker we experience in most themes is hardly noticeable.

Of course the best way would be to get a TV with at least component
inputs. But for now s-video input is all I can afford.

So I was wondering if the manufacturer of the actual board might
affect results people are getting.

If so, I'd be happy to learn from people's recommendation.

Thanks,
cyth

On Thu, 10 Mar 2005 16:51:14 -0500, Will Dormann [EMAIL PROTECTED] wrote:
 Tom Lichti wrote:
  As an aside, what is the preferred Nvidia card for TV output? I have a
  generic GeForce 4 MX440 and it works alright, although I'm sure it could
  be better. If there is a better card to use, what is it?
 
  From what I gather, the MX4000 is just a die-shrunk version of the
 MX440, so if you follow my instructions you should be able to get pretty
 good results. Is this not the case?
 
 
 -WD

___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-10 Thread Tom Lichti
Will Dormann wrote:
Tom Lichti wrote:
As an aside, what is the preferred Nvidia card for TV output? I have 
a generic GeForce 4 MX440 and it works alright, although I'm sure it 
could be better. If there is a better card to use, what is it?

From what I gather, the MX4000 is just a die-shrunk version of the 
MX440, so if you follow my instructions you should be able to get 
pretty good results. Is this not the case?
I haven't gotten into it that deeply yet, but I do plan to. I'm curious 
about the converter box I linked. It has component outputs, and my TV 
has component inputs. I did get the video to go to that output, but when 
I tried to run Myth it wouldn't work. I should try it again; that was on 
.16, now I'm on .17.

Tom
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-09 Thread Stephen Williams
Of course, the way to get 'optimal' TV-out from your Nvidia (or other)
card is not to use its TV-out facilities at all and build a VGA-to-SCART
converter for SDTVs (see http://www.sput.nl/hardware/tv-x.html
for example). This is only true of home-built converters, commercial
converters perform scan conversion which lowers the image quality.

Then (with the correct modeline) you _can_ just send the originally
broadcast, interlaced stream straight to your TV. You are also
keeping the RGB signals separate, not mixing them all together as you
do with S-video. No deinterlacing or scaling is required, so you're not
losing image quality there either, or wasting CPU power.
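(For the curious, a modeline in the right ballpark for a 50Hz interlaced PAL
set looks roughly like the one below.  The porch and sync numbers are just one
arithmetically consistent example and will need tuning to your own TV, so
treat it as a sketch rather than gospel:

  ModeLine "720x576PAL" 13.875  720 744 808 888  576 581 586 625  -hsync -vsync interlace

That works out to a 15.625kHz line rate and 50 fields per second, i.e. exactly
what the set expects on its SCART input.)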

Of course, this does rely on you being able to capture the original
interlaced signal (I'm capturing a DVB MPEG stream, no encoder
required). If your encoding hardware has already thrown that
information away and mixed the odd/even fields then you've already
lost the image quality. Using a VGA-to-SCART converter in this case
would still give you better colour definition / sharpness, and if
you're encoding at the native resolution then no scaling is required.

I've built one of these and the result is _much_ better than the
TV-out from my Nvidia card. The image quality is even higher than from
my Sony DVB set-top-box.

Steve
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-09 Thread Thomas Börkel
HI!
Stephen Williams wrote:
Of course, the way to get 'optimal' TV-out from your Nvidia (or other)
card is not to use its TV-out facilities at all and build a VGA-to-SCART
converter for SDTVs (see http://www.sput.nl/hardware/tv-x.html
for example). This is only true of home-built converters, commercial
converters perform scan conversion which lowers the image quality.
Does anyone know if this little hardware can be bought ready-to-use (built) 
anywhere? Maybe even in Germany?

Thanks!
Thomas
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-09 Thread Stephen Williams
 Stephen Williams wrote:
  Of course, the way to get 'optimal' TV-out from your Nvidia (or other)
  card is not to use its TV-out facilities at all and build a VGA-to-SCART
  converter for SDTVs (see http://www.sput.nl/hardware/tv-x.html
  for example). This is only true of home-built converters, commercial
  converters perform scan conversion which lowers the image quality.
 
 Anyone knows, if this little hardware can be bought ready-to-use (built)
 anywhere? Maybe even in Germany?

I had a look around at the time I was first contemplating this route
but wasn't able to find any equivalent system for sale commercially.
The reason is that there is a possibility (well, a theoretical one
anyway) that you can damage your TV through use of out-of-spec
modelines, therefore nobody sells these commercially.

I find that my TV simply won't lock onto the sync signals if you send
an out-of-spec signal. You just end up looking at colourful garbage if
your modeline is wrong (or when you're not in X, i.e. booting). I've
yet to hear of any cases where TVs have been damaged in this way.

Steve
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-09 Thread Brian J. Murrell
On Wed, 2005-03-09 at 12:14 +, Stephen Williams wrote:
 Of course, the way to get 'optimal' TV-out from your Nvidia (or other)
 card is not to use its TV-out facilities at all and build a VGA-to-SCART
 converter for SDTVs (see http://www.sput.nl/hardware/tv-x.html
 for example).

But my TV does not have a SCART connector on it.  I don't think I have
ever seen a TV with a SCART connector.  All I have is the North American
standard connector for cable/antenna, composite, s-video and component.

I do agree with your concept though of treating a TV like a fixed
frequency monitor.

On the issue of hardware encoding and whether it preserves the
interlacing, I do believe the PVR-250 I have in my machine does indeed
preserve the interlacing in the MPEG2 stream it creates.

I do transcode to MPEG4 with Myth though.  Anyone know if that still
preserves the interlaced fields?

b.



___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-09 Thread Cory Papenfuss
I've built one of these and the result is _much_ better than the
TV-out from by Nvidia card. The image quality is even higher than from
my Sony DVB set-top-box.
	Doesn't surprise me.  Even a 20 year old VGA card can do twice the 
bandwidth required of SDTV.  Newer ones are more like 20x (350MHz dotclock 
vs 14Mhz for SDTV).   I've done something similar (although I'm 
using an analog S-vid chip for NTSC).  No scaling done, just build the 
proper modeline and see tvout the way it was meant to be.  Crystal-clear, 
readable terminals, high color fidelity, and unparalleled sharpness for 
SDTV.  Of course it's 30Hz interlaced, but that's something to be lived 
with (since SDTV *IS* interlaced).

-Cory
*
* Cory Papenfuss*
* Electrical Engineering candidate Ph.D. graduate student   *
* Virginia Polytechnic Institute and State University   *
*
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-09 Thread Brian J. Murrell
On Wed, 2005-03-09 at 09:35 -0500, Cory Papenfuss wrote:
 I've done something similar (although I'm 
 using an analog S-vid chip for NTSC).

Don't hold out on us Cory, do tell more.  :-)  I'm assuming you built
something that converts interlaced vga to s-video using some kind of
s-video encoding silicon?

   No scaling done, just build the 
 proper modeline and see tvout the way it was meant to be.

Sounds nice.

b.



___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-09 Thread Will Dormann
Jeroen Brosens wrote:

I think the combination of the low CPU usage of XvMC and the Vsync
provided by OpenGL provides a very good looking picture.  However, if
the Bob would actually take place, I think it might indeed be optimal
as I had originally stated.

 So which version of MythTV are you using, CVS perhaps? I know there were
 some changes to the bobdeint code in Xv and XvMC video output.
I'm running 0.16 right now.   With 0.17 I had too many problems with 
mythbackend crashing.

You are correct.  0.16 has broken Bob for XvMC in that it only displays 
the first field, so it's not actually displaying double the frame rate. 
 In a previous message to this thread I have posted a link to the patch 
that fixes that.   I have manually applied it to my source code and it 
does indeed fix Bob. The only issue with it now is that the OSD only 
displays every other frame, creating a flashing effect.   I tried to 
hack it myself, comparing the videoout_xvmc.cpp file to the latest CVS 
version to determine the possible change, but I've only managed to get 
it to either hang on first displaying the OSD, or display the OSD OK but 
crash mythfrontend when I hit pause.    Maybe if I check the commits 
list I could find the specific fix for the every-other-frame OSD bug 
with XvMC and Bob.    Or if somebody here can point me in the right 
direction, that'd be nice!   :)

But if you're running a post-0.16 CVS version or 0.17, you might not have 
to mess with any of this.

Please let me know how it works out.  I thought the onboard SIS graphics 
on my Pundit was pretty good, but now that I've got the nVidia card 
working properly, the difference is absolutely stunning!  I absolutely 
cannot tell the difference between a recording and the incoming 
broadcast signal.

If I hear from at least one other person with similar results, maybe 
I'll submit my instructions to Jarod or a Wiki or something.

-WD
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-09 Thread Will Dormann
Stephen Williams wrote:
Of course, the way to get 'optimal' TV-out from your Nvidia (or other)
card is not to use its TV-out facilities at all and build a VGA-to-SCART
converter for SDTVs (see http://www.sput.nl/hardware/tv-x.html
for example). 

Yes, this may be true for those of you who have SCART capability.
Maybe I wasn't totally clear, but my instructions are intended for those 
of us in NTSC-land.   SCART isn't an option for us.  I can't see why it 
wouldn't also work for PAL, though.   (Now that I've indicated that 
800x600 is the way to go)

-WD
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-09 Thread Will Dormann
Brian J. Murrell wrote:
On the issue of hardware encoding and whether it preserves the
interlacing, I do believe the PVR-250 I have in my machine does indeed
preserve the interlacing in the MPEG2 stream it creates.
Yes it does.  The problem is, no video card appears to be able to do 
TV-Out while retaining that proper interlacing information.  The instant 
the card does any sort of scaling of the picture (to do overscan, for 
example), the interlacing will no longer match up with the TV scan lines.

Bob deinterlacing makes up for this by converting each field to a frame 
and then sending ~60 frames per second, rather than 60 fields per second. 
 This retains the smoothness of the original recording, without 
requiring fancy output where everything lines up right with the interlacing.

The PVR-350 has dedicated hardware to output the video such that the 
interlacing matches up with the TV's scan lines, as that is what it was 
designed to do.  A video card with TV-out, however, seems to be designed 
to output your computer desktop (or games or whatever) at various 
resolutions and have them all look relatively OK.

I think I've probably used some wrong terminology here, but I think I've 
got the gist of things right.

-WD
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-09 Thread Cory Papenfuss
Don't hold out on us Cory, do tell more.  :-)  I'm assuming you built
something that converts interlaced vga to s-video using some kind of
s-video encoding silicon?
	I've mentioned what I did in the past, starting with
http://www.gossamer-threads.com/lists/mythtv/users/45910#45910 
Basically built up the circuit in the AD724 datasheet.  It's a 
surface-mount chip, and circuit layout is a little important (a few MHz 
bandwidth video signal), so I make a PCB.  I've thought of making a 
slightly nicer one (with VGA loop, though, etc).  If there were enough 
interest I suppose I could finish it off and send off for a limited run of 
PCBs.

  No scaling done, just build the
proper modeline and see tvout the way it was meant to be.
Sounds nice.
	Yeah, except that you get garbage until X comes up.  The VGA on 
bootup is basically 480p, which doesn't exactly work on S-vid.  Aside from 
that detail, it's great.

-Cory
*
* Cory Papenfuss*
* Electrical Engineering candidate Ph.D. graduate student   *
* Virginia Polytechnic Institute and State University   *
*
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-09 Thread Brian J. Murrell
On Wed, 2005-03-09 at 10:15 -0500, Cory Papenfuss wrote:
   I've mentioned what I did in the past, starting with
 http://www.gossamer-threads.com/lists/mythtv/users/45910#45910 
 Basically built up the circuit in the AD724 datasheet.

Seems strange that there are no commercial off-the-shelf products doing
this.  Perhaps there are.  I will have to search this afternoon when I
get a minute.

 It's a 
 surface-mount chip, and circuit layout is a little important (a few MHz 
 bandwidth video signal), so I make a PCB.

Indeed, but this is beyond my abilities.  :-(

 I've thought of making a 
 slightly nicer one (with VGA loop-through, etc.).

What would the loop through provide?

 If there were enough 
 interest I suppose I could finish it off and send off for a limited run of 
 PCBs.

If you can do that you might want to think about filling a possible
market void.  Inventions are the way to get rich.  :-)

   Yeah, except that you get garbage until X comes up.  The VGA on 
 bootup is basically 480p, which doesn't exactly work on S-vid.  Aside from 
 that detail, it's great.

Small detail and to be expected.

b.



___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-09 Thread Big Wave Dave
Looks like something such as...

http://cgi.ebay.com/ws/eBayISAPI.dll?ViewItem&category=21169&item=3879054837&rd=1&ssPageName=WDVW#ebayphotohosting

...might be a good solution.

Dave


On Wed, 09 Mar 2005 11:20:48 -0500, Brian J. Murrell
[EMAIL PROTECTED] wrote:
 On Wed, 2005-03-09 at 10:15 -0500, Cory Papenfuss wrote:
I've mentioned what I did in the past, starting with
  http://www.gossamer-threads.com/lists/mythtv/users/45910#45910
  Basically built up the circuit in the AD724 datasheet.
 
 Seems strange that there is no commercial off the shelf products doing
 this.  Perhaps there is.  I will have to search this afternoon when I
 get a minute.
 
  It's a
  surface-mount chip, and circuit layout is a little important (a few MHz
  bandwidth video signal), so I make a PCB.
 
 Indeed, but this is beyond my abilities.  :-(
 
  I've thought of making a
  slightly nicer one (with VGA loop-through, etc.).
 
 What would the loop through provide?
 
  If there were enough
  interest I suppose I could finish it off and send off for a limited run of
  PCBs.
 
 If you can do that you might want to think about filling a possible
 market void.  Inventions are the way to get rich.  :-)
 
Yeah, except that you get garbage until X comes up.  The VGA on
  bootup is basically 480p, which doesn't exactly work on S-vid.  Aside from
  that detail, it's great.
 
 Small detail and to be expected.
 
 b.
 
 

___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-09 Thread Cory Papenfuss
Could be, but I like Cory's idea better.  The above is a scan converter.
Cory does not convert the scan but has the computer produce the right
timings for television.  His circuit just does coordinate
transformation RGB to YPbPr and NTSC composite modulation (colorburst,
3.58MHz crystal, etc).
This seems a better idea.  Process the signal as little as possible.
	That's what I thought.  I'm happy with the results, but the 
annoyance of getting everything right is definitely beyond the scope of 
what most people are willing to deal with.  It was also pointed out to me 
a few months ago that my circuit wasn't completely accurate.  I use a 
3.58MHz crystal to generate the color subcarrier, but in reality it should 
be line-locked to the horizontal sweep frequency.  That's one of the 
modifications I'd like to do (with all my free time... hahaha!)

-Cory
*
* Cory Papenfuss*
* Electrical Engineering candidate Ph.D. graduate student   *
* Virginia Polytechnic Institute and State University   *
*
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-09 Thread Brian J. Murrell
On Wed, 2005-03-09 at 11:36 -0500, Cory Papenfuss wrote:
   That's what I thought.  I'm happy with the results, but the 
 annoyance of getting everything right is definately beyond the scope of 
 what most people are willing to deal with.

Perhaps.  Certainly not for the out of the box crowd, but that's not
to say that somebody can't fill the gap between a pile of hardware and
an out of the box solution including the transcoder.  Certainly should
not allow for somebody to muck with modelines and such, but a product
could loom here.

 It was also pointed out to me 
 a few months ago that my circuit wasn't completely accurate.  I use a 
 3.58Mhz crystal to generate the color subcarrier, but in reality it should 
 be line-locked to the horizontal sweep frequency.

:-)  If you say so.

 That's one of the 
 modifications I'd like to do (with all my free time... hahaha!)

Hear ya brother.

I meant to ask.  The picture looks properly overscanned?  i.e. you are
not seeing all 7??x48? (forget what res you said you have your modeline
at) pixels right?  Just the ones that would normally show up on a real
broadcast -- missing that certain percentage of those around the border.

b.



___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-09 Thread Joe Votour
Cory,

I've followed your posts on the circuit board that
you've made before - it interested me then, and it
interests me now.

Myself, I'm a software developer, and I've soldered a
grand total of one thing (a custom cable), and that
didn't even quite come out right (mind you, I just
don't have time to practice - it's on my list for
someday though).  As such, the chances of me building
one of these things from the schematics is slim to
none (right now).

Have you given thought to the minimum order that you
require for this, and the cost?  Depending on those
numbers, I'll sign up for one, maybe two (to have one
as a backup).

The no-TV-out-until-X part is no problem for me, since
Fedora will start X fairly early in the boot process
anyway, and I can always hook up S-Video again.

-- Joe

--- Cory Papenfuss [EMAIL PROTECTED] wrote:
  Don't hold out on us Cory, do tell more.  :-)  I'm assuming you built
 something that converts interlaced vga to s-video using some kind of
 s-video encoding silicon?

 	I've mentioned what I did in the past, starting with
 http://www.gossamer-threads.com/lists/mythtv/users/45910#45910
 Basically built up the circuit in the AD724 datasheet.  It's a
 surface-mount chip, and circuit layout is a little important (a few MHz
 bandwidth video signal), so I made a PCB.  I've thought of making a
 slightly nicer one (with VGA loop-through, etc.).  If there were enough
 interest I suppose I could finish it off and send off for a limited run
 of PCBs.

   No scaling done, just build the proper modeline and see tvout the
 way it was meant to be.

  Sounds nice.

 	Yeah, except that you get garbage until X comes up.  The VGA on
 bootup is basically 480p, which doesn't exactly work on S-vid.  Aside
 from that detail, it's great.

 -Cory
 *
 * Cory Papenfuss*
 * Electrical Engineering candidate Ph.D. graduate student   *
 * Virginia Polytechnic Institute and State University   *
 *
 
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-09 Thread Andrew Close
On Wed, 9 Mar 2005 11:30:22 -0500 (EST), Cory Papenfuss
[EMAIL PROTECTED] wrote:
snip
 Small, perhaps.  Irritating, absolutely.  Also, beyond the
 understanding of 99% of the PC-buying public (the MythTV crowd is more
 technically savvy than most).  To most people, the tradeoffs involved are
 too subtle to be appreciated, but rather they'll get pissed when it
 doesn't just work.  They'll get even *more* pissed when they program the
 modeline wrong, let the magic smoke out of their TV or monitor, and get an
 "I told you so" from the person that sold it to them.  Just tell them,
 "Run it at 1024x768, crank the flicker filter up to maximum, and enjoy."

ahhh, 'letting the magic smoke out...'.  that sure brings back
memories of my 'attempted' EE undergrad. ;)

-
a 'whole lotta' GMail Invites available
Please Email me OFF-list only...
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-09 Thread Cory Papenfuss
I meant to ask.  The picture looks properly overscanned?  i.e. you are
not seeing all 7??x48? (forget what res you said you have your modeline
at) pixels right?  Just the ones that would normally show up on a real
broadcast -- missing that certain percentage of those around the border.
	If by properly overscanned you mean not seeing all 720x480, then 
yes, it is.  It's actually pretty easy to change the horizontal 
overscanning by just padding the sides with black, but keeping the active 
video the same.  It's a pretty easy modeline mod.  Changing the vertical 
overscanning is a different story, since you're stuck with the 525 lines 
for NTSC.  If you want it underscanned vertically, you'd need to reduce 
the number of active picture scanlines down from 480.  That's what all the 
tvout over/underscanning adjustments are doing... they just don't tell 
you that in so many words, and do some scaling in the scanline conversion. 
Bottom line is, the TV only shows Y lines vertically, and Y < 480 for 
normal TVs with overscanning.  Since horizontal is defined in analog, 
you can fill in the time for a single sweep with as many dots as you like... 
so long as it takes 1/15734th of a second to do each one.
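	(To put rough numbers on that, taking a 28.636MHz dotclock as an
example: one line is 28,636,000 / 15,734 = ~1820 dots total.  How many of
those you declare as active picture versus black padding on either side is
entirely up to the modeline; the TV only cares that the whole sweep still
takes 1/15734th of a second.)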

-Cory
*
* Cory Papenfuss*
* Electrical Engineering candidate Ph.D. graduate student   *
* Virginia Polytechnic Institute and State University   *
*
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-09 Thread Brian J. Murrell
On Wed, 2005-03-09 at 12:07 -0500, Cory Papenfuss wrote:
   If by properly overscanned you mean not seeing all 720x480, then 
 yes, it is.

Sweet.  I am totally jealous dude.

   It's actually pretty easy to change the horizontal 
 overscanning by just padding the sides with black, but keeping the active 
 video the same.

Indeed.  But I would want it to look exactly like it did on the original
broadcast.

 Changing the vertical 
 overscanning is a different story, since you're stuck with the 525 lines 
 for NTSC.  If you want it underscanned vertically, you'd need to reduce 
 the number of active picture scanlines down from 480.

Yech.  Yeah.  Good thing I am not interested in that.  :-)

 That's what all the 
 tvout over/underscanning adjustments are doing... they just don't tell 
 you that in so many words, and do some scaling in the scanline conversion. 

Blech.

So, when are you going into production with your transcoders?  :-)

Seems ATI sells a DVI-to-component converter for their 8500 and 9xxx cards
for $29.  Perhaps that is the way to go.  That should work with an SDTV
with component video connections, no?  Maybe cheaper than the cheapest
vidcard you would want to buy, and then an additional US$129 for the audio
whatsitcalled thingy I posted about earlier.

b.



___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-09 Thread Cory Papenfuss
Seems ATI sells a DVI-component converter for their 8500 and 9xxx cards
for $29.  Perhaps that is the way to go.  That should work with an SDTV
with component video connections no?  Maybe cheaper than the cheapest
vidcard you would want to buy and then an additional U$129 for the Audio
whatsitcalled thingy I posted about earlier.
	I'm pretty sure that all they're doing is a trick.  Something I've 
thought about doing, as well... just with regular video cards.  The R/G/B 
outputs on the vid cards aren't that special... they can output any data 
you want.  If you get one that can add a composite sync pulse (e.g. 
sync-on-green capable), you'd be set to output component straight off the 
card.  Put Y data (with CSYNC) out the G DAC, and Pb/Pr out the other two. 
Then you'd just need to trick the card into *NOT* doing a colorspace 
rotation.  Something like a phony Xv type (YUY2, UYVY, YV12, I420, etc.)... 
just no transformation.

	I'm pretty sure that's what ATI's doing, and I'm also pretty sure 
it's a winders driver only thing.
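
	To make the "colorspace rotation" concrete: for standard-definition 
video the hardware (or the Xv path) normally applies roughly the BT.601 
transform below before anything reaches the DACs, and the trick described 
above amounts to skipping it and handing Y/Pb/Pr to the card as if they 
were already R/G/B.  A minimal sketch of the transform being bypassed 
(textbook coefficients only; this is not Myth or driver code):

struct YPbPr { double y, pb, pr; };

// BT.601 RGB -> YPbPr, inputs in [0,1].  This is the "rotation" you would
// want the card *not* to perform when feeding it component data directly.
YPbPr rgb_to_ypbpr(double r, double g, double b) {
    YPbPr out;
    out.y  = 0.299 * r + 0.587 * g + 0.114 * b;
    out.pb = 0.564 * (b - out.y);   // = (b - y) / 1.772
    out.pr = 0.713 * (r - out.y);   // = (r - y) / 1.402
    return out;
}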

-Cory
*
* Cory Papenfuss*
* Electrical Engineering candidate Ph.D. graduate student   *
* Virginia Polytechnic Institute and State University   *
*
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-09 Thread Harondel J. Sibble


On 9 Mar 2005 at 10:15, Cory Papenfuss wrote:

 slightly nicer one (with VGA loop, though, etc).  If there were enough 
 interest I suppose I could finish it off and send off for a limited run of
 PCBs.


Or use this service instead

http://www.olimex.com/

Basically, you supply the design specs, pay a small fee, they add it to their 
database, then anyone can order a PCB as needed, in single or multiple 
quantities.  From what I remember the initial PCB will be like US$50 or so.

It's what the folks at OpenEEG use for PCBs

http://openeeg.sourceforge.net/doc/

Pricing from Olimex for the OpenEEG PCBs

http://www.olimex.com/gadgets/openeeg.html
-- 
Harondel J. Sibble 
Sibble Computer Consulting
Creating solutions for the small business and home computer user.
[EMAIL PROTECTED] (use pgp keyid 0x3AD5C11D) http://www.pdscc.com
(604) 739-3709 (voice/fax)  (604) 686-2253 (pager)


___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-09 Thread Jeroen Brosens
Stephen Williams wrote:
 Of course, the way to get 'optimal' TV-out from your Nvidia (or other)
 card is not to use it's TV-out facilities at all and build a VGA -
 SCART converter for SDTVs (see http://www.sput.nl/hardware/tv-x.html
 for example).

Yes, this may be true for those of you who have SCART capability.

Maybe I wasn't totally clear, but my instructions are intended for those
of us in NTSC-land.   SCART isn't an option for us.  I can't see why it
wouldn't also work for PAL, though.   (Now that I've indicated that
800x600 is the way to go)


-WD


I already went through the effort of making a VGA-SCART converter cable
(I can provide schematics and photos if you'd like) hoping that it would
bring me the ultimate in MythTV viewing. But it didn't.

It gives the best colors (separate R/G/B and sync) and sharpness of all
possible methods, but... no smooth motion - on my setup that is. Some of
you may recall my postings about struggling to get either bobdeint working
on my SiS video (ASUS Pundit) or completely bypass the TV-out that can't
handle interlaced video properly.

When I finally got the modeline right for the VGA-SCART cable to work I
was hoping that the video encoded by my PVR-250 (interlaced material)
would play back 1:1 on the TV. Somehow, it gave the same blurry motion
that kerneldeint or linear blend, or even 'no deinterlacing' produces, not
50 fields/sec but 25 frames/sec. Not even close to the perfect display you
get using bobdeint. As an extra, the ultra sharp picture really made the
MPEG artifacts clearly visible, which isn't a pretty sight either, not to
mention not being able to monitor the boot procedure because of the
garbled image.

So the cable wasn't really that hard to make and it only cost me € 5,- but
I found the results rather unsatisfying. See also an earlier post from me
on this: http://www.mail-archive.com/mythtv-dev@mythtv.org/msg03975.html

I gave up after trying *every* combination of driver/cable/modeline/X11
video output device/eating shoes and decided to buy a nVidia card after
Staffan Pettersson persuaded me (thank you):
http://www.mail-archive.com/mythtv-dev@mythtv.org/msg04131.html

The card is due to arrive friday, so I can finally get OpenGL
vsync+XvMC+bobdeint working.

-- Jeroen
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-09 Thread Jeroen Brosens
 I gave up after trying *every* combination of driver/cable/modeline/X11
 video output device/eating shoes and decided to buy a nVidia card after
 Staffan Pettersson persuaded me (thank you):
 http://www.mail-archive.com/mythtv-dev@mythtv.org/msg04131.html

 The card is due to arrive friday, so I can finally get OpenGL
 vsync+XvMC+bobdeint working.

 -- Jeroen


I'd like to add that I'd like to challenge the MythTV dev-people to review
the Xv/XvMC code regarding the handling of vsync while using bobdeint.

One needs hardware with OpenGL support to have a Vsync to get bobdeint
working without going out of sync now and then (horrible jittering occurs)
and that isn't good news for users of a barebone with built-in graphics
that can't support that, like myself (using an ASUS Pundit).

It would wipe out all problems I had to get it working! One lousy bobdeint
filter that just works.

-- Jeroen
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-09 Thread Isaac Richards
On Wednesday 09 March 2005 02:06 pm, Jeroen Brosens wrote:
 I'd like to add that I'd like to challenge the MythTV dev-people to review
 the Xv/XvMC code regarding the handling of vsync while using bobdeint.

 One needs hardware with OpenGL support to have a Vsync to get bobdeint
 working without going out of sync now and then (horrible jittering occurs)
 and that isn't good news for users of a barebone with built-in graphics
 that can't support that, like myself (using an ASUS Pundit).

Well, how do you expect it to know when to flip the buffers, if the video card 
can't tell it accurately?  Magic?

Isaac
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-09 Thread John Patrick Poet

On Wed, 9 Mar 2005, Jeroen Brosens wrote:

  I gave up after trying *every* combination of driver/cable/modeline/X11
  video output device/eating shoes and decided to buy a nVidia card after
  Staffan Pettersson persuaded me (thank you):
  http://www.mail-archive.com/mythtv-dev@mythtv.org/msg04131.html
 
  The card is due to arrive friday, so I can finally get OpenGL
  vsync+XvMC+bobdeint working.
 
  -- Jeroen
 

 I'd like to add that I'd like to challenge the MythTV dev-people to review
 the Xv/XvMC code regarding the handling of vsync while using bobdeint.

 One needs hardware with OpenGL support to have a Vsync to get bobdeint
 working without going out of sync now and then (horrible jittering occurs)
 and that isn't good news for users of a barebone with built-in graphics
 that can't support that, like myself (using an ASUS Pundit).

 It would wipe out all problems I had to get it working! One lousy bobdeint
 filter that just works.

I understand that you are frustrated, but your last sentence is a little
offensive.

The OpenGL vsync/bobdeint combo is awesome.  Ever since Doug implemented that
combination, my video playback has been silky smooth.  I have an nVidia
graphics card.

The reason it may not work as well for non-nVidia users, is that Doug uses
an nVidia card.  Pretty much all the developers use an nVidia card.  This
means that all other video cards are not going to be as well tested or
optimized.

If someone wanted to buy Doug an Asus Pundit, he *might* be willing to take
the time to work on improving bobdeint for non nVidia cards, but he is
unlikely to buy one for himself just for that purpose.

Doug would probably be willing to explain how the bobdeint code works, if
someone without nVidia hardware wanted to work with it.


John
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-09 Thread Robert Johnston
On Wed, 9 Mar 2005 12:34:00 -0700 (MST), John Patrick Poet
[EMAIL PROTECTED] wrote:
 
 On Wed, 9 Mar 2005, Jeroen Brosens wrote:
 
   I gave up after trying *every* combination of driver/cable/modeline/X11
   video output device/eating shoes and decided to buy a nVidia card after
   Staffan Pettersson persuaded me (thank you):
   http://www.mail-archive.com/mythtv-dev@mythtv.org/msg04131.html
  
   The card is due to arrive friday, so I can finally get OpenGL
   vsync+XvMC+bobdeint working.
  
   -- Jeroen
  
 
  I'd like to add that I'd like to challenge the MythTV dev-people to review
  the Xv/XvMC code regarding the handling of vsync while using bobdeint.
 
  One needs hardware with OpenGL support to have a Vsync to get bobdeint
  working without going out of sync now and then (horrible jittering occurs)
  and that isn't good news for users of a barebone with built-in graphics
  that can't support that, like myself (using an ASUS Pundit).
 
  It would wipe out all problems I had to get it working! One lousy bobdeint
  filter that just works.
 
 I understand that you are frustrated, but your last sentence is a little
 offensive.
 
 The OpenGL vsync/bobdeint combo is awsome.  Ever since Doug implemented that
 combination, my video playback has been silky smooth.  I have an nVidia
 graphics card.
 
 The reason it may not work as well for non-nVidia users, is that Doug uses
 an nVidia card.  Pretty much all the developers use an nVidia card.  This
 means that all other video cards are not going to be as well tested or
 optimized.
 
 If someone wanted to buy Doug an Asus Pundit, he *might* be willing to take
 the time to work on improving bobdeint for non nVidia cards, but he is
 unlikely to buy one for himself just for that purpose.
 
 Doug would probably be willing to explain how the bobdeint code works, if
 someone without nVidia hardware wanted to work with it.

If this was the Windows world, I'd suggest using DirectX or something
similar to get the VSync (As even the pundit's drivers would be DX
compatible). And if we were using VESA, we could capture the VBlank
interrupt from the VESA bus. However, this is *nix, and I'm not sure
how different drivers work WRT VBlank. Is there something in the MESA
project we could use? I believe they implement VSync on non-GL
cards...
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-09 Thread Jeroen Brosens
 On Wednesday 09 March 2005 02:06 pm, Jeroen Brosens wrote:
 I'd like to add that I'd like to challenge the MythTV dev-people to
 review
 the Xv/XvMC code regarding the handling of vsync while using bobdeint.

 One needs hardware with OpenGL support to have a Vsync to get bobdeint
 working without going out of sync now and then (horrible jittering
 occurs)
 and that isn't good news for users of a barebone with built-in graphics
 that can't support that, like myself (using an ASUS Pundit).

 Well, how do you expect it to know when to flip the buffers, if the video
 card
 can't tell it accurately?  Magic?

 Isaac



No Isaac, just something other than GL Vsync.  I am not venting my
frustrations on people either, rather just stirring up some new ideas on
this.  After all, this is MythTV, Linux, Open Source... where meeting
challenges is the fun of it all!  I also could have installed Windows
MCE and been 'just a regular user on the safe side', but I want to be able
to participate in meeting the challenges where I can.

What I understand now is that all of the devvers use nVidia; can you agree
that this diminishes compatibility regarding video-related
functionality?  I am not a C++ developer, you know; if I were I would have
tried to fix the problems myself, but I can't.

Now, on topic: am I talking plain nonsense when I ask whether the VBI
device can be used for Vsyncing?  What I know is that it is used for
teletext data and 'walks' in sync with the video fields, so maybe that is
an alternative to using GL vsync.

-- Jeroen
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-09 Thread David
Will Dormann wrote:
If I hear from at least one other person with similar results, maybe 
I'll submit my instructions to Jarod or a Wiki or something. 
err, I put them up here:
 http://www.mythtv.info/moin.cgi/NVidiaMX4000HowTo
I think there's some follow-on I may have missed.
And it needs some 0.16/0.17 caveating
David
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-09 Thread David
Jeroen Brosens wrote:
Stephen Williams wrote:
   

Of course, the way to get 'optimal' TV-out from your Nvidia (or other)
card is not to use it's TV-out facilities at all and build a VGA -
SCART converter for SDTVs (see http://www.sput.nl/hardware/tv-x.html
for example).
 

Yes, this may be true for those of you who have SCART capability.
Maybe I wasn't totally clear, but my instructions are intended for those
of us in NTSC-land.   SCART isn't an option for us.  I can't see why it
wouldn't also work for PAL, though.   (Now that I've indicated that
800x600 is the way to go)
-WD
   


I already went through the effort of making a VGA-SCART converter cable
(I can provide schematics and photo's if you'd like)
Yes please - and it would be excellent to put them on the wiki (you can 
upload images there.)

David
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-09 Thread Cory Papenfuss
It gives the best colors (separate R/G/B and sync) and sharpness of all
possible methods, but... no smooth motion - on my setup that is. Some of
you may recall my postings about struggling to get either bobdeint working
on my SiS video (ASUS Pundit) or completely bypass the TV-out that can't
handle interlaced video properly.
	I used to have the same issue.  Basically, it was a tearing of 
the picture.  Now, I'm using an NVidia card, and I think that VSYNC must 
be working.  I've turned off all deinterlacing filters, and it works 
great.

-Cory
*
* Cory Papenfuss*
* Electrical Engineering candidate Ph.D. graduate student   *
* Virginia Polytechnic Institute and State University   *
*
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-09 Thread Isaac Richards
On Wednesday 09 March 2005 03:33 pm, Jeroen Brosens wrote:
  On Wednesday 09 March 2005 02:06 pm, Jeroen Brosens wrote:
  I'd like to add that I'd like to challenge the MythTV dev-people to
  review
  the Xv/XvMC code regarding the handling of vsync while using bobdeint.
 
  One needs hardware with OpenGL support to have a Vsync to get bobdeint
  working without going out of sync now and then (horrible jittering
  occurs)
  and that isn't good news for users of a barebone with built-in graphics
  that can't support that, like myself (using an ASUS Pundit).
 
  Well, how do you expect it to know when to flip the buffers, if the video
  card
  can't tell it accurately?  Magic?
 
  Isaac

 No Isaac, just something else than GL Vsync. I am not venting my
 frustrations upon people either, rather just stirring up some new ideas on
 this.

But it does - myth also supports getting the vsync info through the DRM 
interface.  What 'new ideas' are you stirring up?  I mean, it's obvious that 
you don't know what you're talking about, but other than that, it seems like 
you're just whinging to me.

 Afterall, this is MythTV, Linux, Open Source... where meeting 
 challenges is the fun of everything! I also could have installed Windows
 MCE and to be 'just a regular user on the safe side' but I want to be able
 to participate in meeting the challenges where I can.

 What I understand now is that all of the devvers use nVidia, can you agree
 that this diminishes compatibility regarding video-related
 functionalities? I am not a C++ developer you know; if I were I would have
 tried to fix the problems myself but I can't.

No, I don't agree that it 'diminishes compatibility' at all.  There's nothing 
nvidia specific in either the drm or opengl vsync methods.  

If the driver for a particular video card doesn't provide certain services, 
such as, oh, providing a method to know when the next vsync will be, there's 
absolutely nothing that I can do about it.

 Now on topic; am I talking plain nonsense when I ask whether the VBI
 device can be used for Vsyncing? What I know is, that it is used for
 teletext data and 'walks' in sync with the video fields so maybe that is
 an alternative for using GL vsync.

Err, the vbi device would be input, not output.

Isaac
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-09 Thread Radek Svoboda
Jeroen Brosens wrote:
I already went through the effort of making a VGA-SCART converter cable
(I can provide schematics and photo's if you'd like) hoping that it would
bring me the ultimate in MythTV viewing. But it didn't.
It gives the best colors (separate R/G/B and sync) and sharpness of all
possible methods, but... no smooth motion - on my setup that is. Some of
you may recall my postings about struggling to get either bobdeint working
on my SiS video (ASUS Pundit) or completely bypass the TV-out that can't
handle interlaced video properly.
When I finally got the modeline right for the VGA-SCART cable to work I
was hoping that the video encoded by my PVR-250 (interlaced material)
would play back 1:1 on the TV. Somehow, it gave the same blurry motion
that kerneldeint or linear blend, or even 'no deinterlacing' produces, not
50 fields/sec but 25 frames/sec. Not even close to the perfect display you
get using bobdeint. As an extra, the ultra sharp picture really made the
MPEG artifacts clearly visible which isn't a pretty sight too, not to
mention not being able to monitor the boot procedure because of the
garbled image.
So the cable wasn't really that hard to make and it only cost me € 5,- but
I found the results rather unsatisfying. See also an earlier post from me
on this: http://www.mail-archive.com/mythtv-dev@mythtv.org/msg03975.html
I gave up after trying *every* combination of driver/cable/modeline/X11
video output device/eating shoes and decided to buy a nVidia card after
Staffan Pettersson persuaded me (thank you):
http://www.mail-archive.com/mythtv-dev@mythtv.org/msg04131.html
The card is due to arrive friday, so I can finally get OpenGL
vsync+XvMC+bobdeint working.
And what about YUV *Progressive* input - is there a way to get it out of 
the VGA connector in a similar way to how TV-out from VGA was done?  My TV 
supports this, and since it is progressive (my TV is a 100 Hz tube), I 
suppose it will solve most of the problems with interlacing/deinterlacing.
Does anybody know the details of the YUV Progressive format?

  Best regards
   Radek Svoboda
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-09 Thread Doug Larrick
John Patrick Poet wrote:
If someone wanted to buy Doug an Asus Pundit, he *might* be willing to take
the time to work on improving bobdeint for non nVidia cards, but he is
unlikely to buy one for himself just for that purpose.
I would not accept any hardware; I barely have time (recently: none at
all) to hack on MythTV for my own purposes.  I certainly would not
accept a low-powered frontend, as all my MythTV-recorded programming is
HDTV.
But really, the hard part of the vsync code is all in the video card
driver.  Your card's driver doesn't provide some sort of method of
getting vertical retrace timing?  Complain to the author or
manufacturer, not here.  Or get coding.  Or spend $30 on a new video
card, which is what most of us have decided to do.
If your driver *does* provide a vertical retrace timing method that Myth
doesn't support, it's  50 lines of code to add a new subclass to
vsync.cpp and plug it in.  This work obviously has to be done by
somebody with the hardware, but there are a half dozen methods there
already, so there's plenty of example code.  And the two methods that
actually sync to hardware are brand-agnostic, one following a Linux
semi-standard (DRM), the other an actual industry standard (SGI OpenGL
vsync).
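For anyone wondering what the DRM method boils down to: it is essentially a
blocking wait on the next vertical blank via the kernel's DRM device node.
A rough sketch (error handling omitted; the device path varies by system,
and none of this is the actual plumbing inside vsync.cpp):

#include <cstring>
#include <fcntl.h>
#include <unistd.h>
#include <xf86drm.h>   // libdrm

// Block until the next vertical blank reported by the DRM device.
void wait_one_vblank(int drm_fd) {
    drmVBlank vbl;
    std::memset(&vbl, 0, sizeof(vbl));
    vbl.request.type = DRM_VBLANK_RELATIVE;  // relative to "now"
    vbl.request.sequence = 1;                // wait for one vblank
    drmWaitVBlank(drm_fd, &vbl);             // returns after the vblank
}

int main() {
    int fd = open("/dev/dri/card0", O_RDWR); // device path is an assumption
    if (fd < 0) return 1;
    wait_one_vblank(fd);
    close(fd);
    return 0;
}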
Doug would probably be willing to explain how the bobdeint code works, if
someone without nVidia hardware wanted to work with it.
Bob deint is in two pieces, as I have written in the past:
1. The filter part rearranges the scan lines so that the top field is in
the top half of the video frame, and the bottom field in the second half.
2a. The video output part tells the video output class to display twice
as many frames as usual (e.g. at 50 Hz rather than 25)
2b. ...and arranges to display first the top half (field) then the
bottom half of the frame (or vice versa if the video has these reversed)
at this higher rate.
Note that nothing here has any relation whatsoever to vertical retrace
sync.  That's by design.  Didn't used to be this way, but now it's
fairly clean and modular.
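Piece 1 above (the scan-line rearrangement) is simple enough to sketch.
This is not Myth's actual filter code, just an illustration of the idea on a
single packed 8-bit plane:

#include <algorithm>
#include <cstdint>

// Rearrange an interlaced frame so the top field fills the top half of the
// buffer and the bottom field fills the bottom half (bob's "piece 1").
void separate_fields(const uint8_t* in, uint8_t* out, int width, int height) {
    const int half = height / 2;
    for (int y = 0; y < height; ++y) {
        const uint8_t* src = in + y * width;
        uint8_t* dst = (y % 2 == 0)
            ? out + (y / 2) * width            // even lines -> top field
            : out + (half + y / 2) * width;    // odd lines  -> bottom field
        std::copy(src, src + width, dst);
    }
}

The output part (pieces 2a/2b) then shows the two halves one after the other
at twice the nominal frame rate.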
I think much of the reason bob looks *worse* than other deinterlacing
methods for some people is because it's putting twice as much strain on
the video output software and hardware, by displaying at twice the
refresh rate.
Bob deint is designed to output to progressive display devices, such as
HDTVs (or EDTVs) or projectors.  The fact that it sometimes looks better
than non-deinterlaced material on non-progressive displays is an
indication of how *($#ed up video display in Linux is in general.  I
think if your hardware had a way of getting the vertical retrace, it
would look better w/o a deinterlacer as well.


___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-09 Thread Cory Papenfuss
Bob deint is designed to output to progressive display devices, such as
HDTVs (or EDTVs) or projectors.  The fact that it sometimes looks better
than non-deinterlaced material on non-progressive displays is an
indication of how *($#ed up video display in Linux is in general.  I
think if your hardware had a way of getting the vertical retrace, it
would look better w/o a deinterlacer as well.
Well said.
-Cory
*
* Cory Papenfuss*
* Electrical Engineering candidate Ph.D. graduate student   *
* Virginia Polytechnic Institute and State University   *
*
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-09 Thread Joe Votour
I'll preface this by saying that I'm not a graphics
guru.  However, I have been doing graphics-related
work off and on for the last ten years (although
nothing too fancy, mostly 2D stuff).

The key to MythTV (or any program, really) being able
to render a display without tears or choppiness is
really in two things:
1. Being able to know when the vertical sync is, and,
2. Being able to react to the vertical sync event in a
timely manner

With regards to (1):
Basically, the way that images are displayed on the
screen, or a monitor (I don't know about projection
units or the like) is that the image is drawn left to
right, top to bottom, via an electron beam sometimes
called the raster.  When the raster goes out of the
viewable area, then you are said to be in the vertical
blanking interval (VBI).  For the purposes of this
discussion, we'll ignore that some TV signals have
Closed Captioning data sent during the VBI on the
input signal.

Generally speaking, when you are within the VBI, you
may update any portion of the screen that you want,
and have no tearing or artifacts (because the raster
isn't updating it).  It is extremely important that
you update your screen fully during the VBI, or, at
the very least, update your screen before it is drawn
by the raster (otherwise, you will see
tearing/jitter).

If your output driver/software doesn't have any way of
knowing when it is within the VBI area, then you can't
hope to accomplish this, and there are very good odds
that you will see tearing or jitter.

As Isaac said in his response, DRM is one method.  The
OpenGL vsync is another.
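
The "SGI OpenGL" method comes from the GLX_SGI_video_sync extension; at its
core it is just a retrace counter you can read and wait on.  Very roughly
(sketch only; it assumes a GLX context is already current, that the
extension is present, and that your glx.h exposes glXGetProcAddress):

#include <GL/glx.h>

typedef int (*GetVideoSyncSGI)(unsigned int*);
typedef int (*WaitVideoSyncSGI)(int, int, unsigned int*);

// Block until the next vertical retrace using GLX_SGI_video_sync.
void wait_for_retrace() {
    static GetVideoSyncSGI getSync = (GetVideoSyncSGI)
        glXGetProcAddress((const GLubyte*)"glXGetVideoSyncSGI");
    static WaitVideoSyncSGI waitSync = (WaitVideoSyncSGI)
        glXGetProcAddress((const GLubyte*)"glXWaitVideoSyncSGI");
    unsigned int count = 0;
    getSync(&count);                              // current retrace count
    waitSync(2, (int)((count + 1) % 2), &count);  // wait until it advances
}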

Now, on with (2):
Knowing when the VBI occurs isn't enough - you must be
able to get control of things while you're still in
the VBI to be completely effective (again, otherwise,
you might be updating the screen while the raster is
drawing it).

The biggest problem that MythTV faces is that Linux is
not a real-time operating system (this problem is not
limited to Linux, Windows suffers from the same fate).
 While the 2.6 kernel has a much improved scheduler
(and I was actually at the Embedded Systems Conference
today, discussing some scheduler related issues my
employer is having on one of their products, with
MontaVista, LynuxWorks and TimeSys to see if any of
them can help us), it is still not real-time (like
vxWorks).  So, although you might know that you're in
the VBI, your task might not have gotten its
time-slice from the kernel yet.  This means that you
have this precious time you should be drawing in,
going to another task/thread, and by the time you get
the time-slice, you might be out of the VBI.
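
One common (if partial) way to improve the odds of getting the CPU while the
VBI is still open is to ask the kernel for a real-time scheduling class.  A
sketch of that general Linux mechanism (it needs root or the right
capability, and nothing in this thread says Myth does exactly this):

#include <cstdio>
#include <sched.h>

// Try to move the calling process into the SCHED_FIFO real-time class so it
// preempts normal time-sharing tasks when the vblank moment arrives.
bool request_realtime(int priority = 1) {
    sched_param param;
    param.sched_priority = priority;
    if (sched_setscheduler(0, SCHED_FIFO, &param) != 0) {
        std::perror("sched_setscheduler");  // usually EPERM without privileges
        return false;
    }
    return true;
}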

Now, back before computers (and their OSes) became so
complicated, I used to be able to get perfect, tear
free displays back on my Commodore 64 with the VIC-II
chip, all in software.  How, you might ask?  Well,
first of all, I wrote my code in 6510 assembly
language, and chained it off of the IRQ (interrupt
request).  Secondly, the VIC-II had a register, which,
when read, would give you the scan line (raster line)
the chip was drawing on, writing to this register
would allow you to trigger an interrupt whenever the
chip hit the desired scan line.

Thus, I was able to have such complicated things as
splitting the screen into five parts, making changes
to part two, while part four was being drawn by the
raster.

Unfortunately, we don't have such control over regular
Linux...

(As an aside, cards like the PVR-350 handle this in
their hardware, as part of the video decoder chip,
that's why they get such a clean image.  I suspect
that XvMC also helps in this regard, but I really
don't know anything about XvMC, so I might be talking
out of my ass on that one).

-- Joe

--- Jeroen Brosens [EMAIL PROTECTED] wrote:
  On Wednesday 09 March 2005 02:06 pm, Jeroen
 Brosens wrote:
  I'd like to add that I'd like to challenge the
 MythTV dev-people to
  review
  the Xv/XvMC code regarding the handling of vsync
 while using bobdeint.
 
  One needs hardware with OpenGL support to have a
 Vsync to get bobdeint
  working without going out of sync now and then
 (horrible jittering
  occurs)
  and that isn't good news for users of a barebone
 with built-in graphics
  that can't support that, like myself (using an
 ASUS Pundit).
 
  Well, how do you expect it to know when to flip
 the buffers, if the video
  card
  can't tell it accurately?  Magic?
 
  Isaac
 
 
 
 No Isaac, just something else than GL Vsync. I am
 not venting my
 frustrations upon people either, rather just
 stirring up some new ideas on
 this. Afterall, this is MythTV, Linux, Open
 Source... where meeting
 challenges is the fun of everything! I also could
 have installed Windows
 MCE and to be 'just a regular user on the safe side'
 but I want to be able
 to participate in meeting the challenges where I
 can.
 
 What I understand now is that all of the devvers use
 nVidia, can you agree
 that this diminishes compatibility regarding
 video-related
 functionalities? I am not a C++ 

Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-09 Thread Thomas Börkel
HI!
Cory Papenfuss wrote:
I've mentioned what I did in the past, starting with
http://www.gossamer-threads.com/lists/mythtv/users/45910#45910 Basically 
built up the circuit in the AD724 datasheet.  It's a surface-mount chip, 
and circuit layout is a little important (a few MHz bandwidth video 
signal), so I make a PCB.  I've thought of making a slightly nicer one 
(with VGA loop, though, etc).  If there were enough interest I suppose I 
could finish it off and send off for a limited run of PCBs.
Any chance for a PAL version?
Thomas
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-09 Thread Thomas Börkel
HI!
Will Dormann wrote:
Will Dormann wrote:
While the combination of settings I originally posted does give 
excellent results, I've recently discovered that that particular 
combination does not actually do Bob Deinterlacing.

Ok, I'm very close to getting this right!
For those of you running MythTV 0.17, I think the only thing you'll want 
to change is the resolution.   Rather than using the coryntsci 
resolution, try 800x600.  This fixes the bob vertical resolution 
problem I was seeing.
Should this 800x600 resolution be interlaced or not?
Thomas
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-08 Thread Will Dormann
Jeroen Brosens wrote:
Therefore, using bob is paramount for a smooth video playback. All other
deinterlacers as well as no deinterlacing (!) can't provide this.
Just a follow-up to this thread.
While the combination of settings I originally posted does give 
excellent results, I've recently discovered that that particular 
combination does not actually do Bob Deinterlacing.

I had a hunch that this was the case, as the non-XvMC settings @ 640x480 
+ Bob enabled did seem to produce a slightly smoother picture than XvMC 
@ coryntsci + Bob.  Although the smooth motion was excellent with the 
former, the OSD was ugly and the Bob didn't always seem to get the 
picture right, especially after a seek.   (Sometimes 1/2 the vertical 
resolution would appear to be lost)

The way I determined this was by zooming in significantly and pausing 
the video during a high motion scene.   This made the mouse-teeth 
appearance of the interlacing effects quite evident.  If the Bob was 
actually taking place, none of the frames should have any interlacing at 
all.  (Since each frame is derived from a single field).   The 
mythfrontend log does indicate that the Bob is taking place:
 2005-03-08 19:20:41 XvMC will use bob deinterlacing
 2005-03-08 19:20:41 Using deinterlace method bobdeint
but visual analysis of the picture shows otherwise.  I'm not sure if 
this is some sort of problem with the drivers, resolution and/or 
rounding errors, or MythTV itself.

I think the combination of the low CPU usage of XvMC and the Vsync 
provided by OpenGL provides a very good looking picture.  However, if 
the Bob would actually take place, I think it might indeed be optimal 
as I had originally stated.

--
-WD
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-08 Thread Will Dormann
Will Dormann wrote:
While the combination of settings I originally posted does give 
excellent results, I've recently discovered that that particular 
combination does not actually do Bob Deinterlacing.
Ok, I'm very close to getting this right!
For those of you running MythTV 0.17, I think the only thing you'll want 
to change is the resolution.   Rather than using the coryntsci 
resolution, try 800x600.  This fixes the bob vertical resolution 
problem I was seeing.

For those of you running MythTV 0.16, try applying this patch first:
http://www.gossamer-threads.com/lists/mythtv/dev/87191
This fixes the problem with Bob and XvMC not displaying the second field.
Also change the resolution to 800x600 to fix the video resolution problem.
Note that 0.16 plus the above patch will result in a flickering (nearly 
unusable) OSD.  I'll have to figure out what other change between then 
and 0.17 fixes that.

But the end result is:
- Double Frame Rate
- No Interlacing artifacts
- Very low CPU usage
- Proper Vsync
Add all of those up, and I'd say it's close to perfect!
--
-WD
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-07 Thread Neil Brideau
This is a great post!  I'm looking forward to trying this.  Thanks for 
sharing the knowledge.

Will Dormann wrote:
I recently added an nVidia graphics card to my Asus Pundit, in hopes 
of improving the TV-out quality.   The onboard SIS chip isn't bad, but 
I figured I could do better.  After tweaking settings for quite a 
bit, here's how I achieved what I believe to be the optimal output.   
I have an SDTV connected to the card via S-Video, FWIW.   Some of 
these settings may be redundant and/or unnecessary, but I'm just 
relaying all the steps I've taken.

1) Install nVidia 6229 drivers
2) Recompile MythTV (0.16) with support for XvMC and OpenGL (via 
settings.pro)
3) Modify your ~/.nvidia-settings-rc with the following values.  You 
may need to run nvidia-settings once to create this file initially.

0/SyncToVBlank=1
0/TVOverScan[TV-0]=125
0/TVFlickerFilter[TV-0]=1
0/TVSaturation[TV-0]=156
0/XVideoOverlaySaturation=4250
0/XVideoOverlayContrast=4096
0/XVideoTextureSyncToVBlank=1
0/XVideoBlitterSyncToVBlank=1
4) Modify the XF86Config file with the following:
Add to Monitor section:
ModeLine "coryntscpi" 28.6 720 760 824 912 480 484 492 525 interlace
Add to Device section:
   Option "NoLogo" "true"
   Option "HWCursor" "true"
   Option "RenderAccel" "true"
Add to Screen section:
   DefaultDepth 24
   Option "TVStandard" "NTSC-M"
   Option "ConnectedMonitor" "TV"
   Option "TVOutFormat" "SVIDEO"
   Option "TVOverScan" "0.8"
Add to Display subsection:
   Modes "coryntscpi"
5) In the MythFrontend Setup screen for TV Playback, enable:
- Deinterlace
- Mode: Bob (2x)
- Use Video for Timebase
- XvMC Playback
That's it.   With these settings, I get great image quality, great 
motion, and great smoothness (no jitter in scrollers across the bottom 
of the screen, for example).  X CPU usage is between 1-2% during 
playback.  I don't see any interlacing artifacts in high-motion 
scenes.  The special modeline is required to get good resolution with 
Bob Deinterlacing.   With the standard modeline and 640x480 
resolution, I seem to lose about 1/2 of my vertical resolution when 
enabling Bob Deinterlacing or XvMC.

You can temporarily add  --verbose playback to the mythfrontend 
command line to troubleshoot your playback.   I see the following in 
mine:

-- This means XvMC is working --
2005-03-05 13:56:48 XvMCSurfaceTypes::find(w 720, h 1, c 1, i 2, m 
0,sw 0, sh 10
5, disp, p= 105, 4800 =p, port, surfNum)
2005-03-05 13:56:48 Trying XvMC port 105
2005-03-05 13:56:48 Found a suitable XvMC surface 0
2005-03-05 13:56:48 Using XV port 105
-- This means XvMC is working --

-- This means Bob Deinterlacing is working --
2005-03-05 13:56:48 XvMC will use bob deinterlacing
2005-03-05 13:56:48 Using deinterlace method bobdeint
-- This means Bob Deinterlacing is working --
-- This means OpenGL Vsync + Video Timebase is working --
2005-03-05 13:56:48 Using video as timebase
2005-03-05 13:56:48 Video timing method: SGI OpenGL
-- This means OpenGL Vsync + Video Timebase is working --
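
A note on step 3 in the instructions quoted above: the X server does not
read ~/.nvidia-settings-rc by itself, so those values only take effect when
the nvidia-settings utility loads them.  If they seem to be ignored after a
restart, running something like the following from your session startup
should (in principle) push them to the running X server without opening the
GUI; check nvidia-settings --help on your version for the exact flag:

nvidia-settings --load-config-only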



___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-07 Thread Brian J. Murrell
On Sat, 2005-03-05 at 14:35 -0500, Will Dormann wrote:
 After tweaking settings for quite a bit, 
 here's how I achieved what I believe to be the optimal output.

Hrm.  "Optimal" meaning "as good as it can get with this card but not
quite ultimate?", or do you believe you have a TV-Out signal that
represents how the content was originally broadcast (i.e. perfectly
interlaced fields)?

 I have 
 an SDTV

Which is just a regular old interlaced 59.94 fields/s TV right?  Is SDTV
Standard Definition TV?

 Add to Monitor section:
 ModeLine coryntscpi 28.6 720 760 824 912 480 484 492 525 interlace

OK.  Seems like you are using the SDTV native resolution, so you are
displaying vertical lines one to one with what was received and
recorded (presumably).

 5) In the MythFrontend Setup screen for TV Playback, enable:
 - Deinterlace
   - Mode: Bob (2x)

Why de-interlace?

Why not send the interlaced recorded signal you recorded back to the TV
in the format that it expects to be played in, interlaced, with each
field being shown for 1/59.94th of a second instead of combined with
another field and only shown for 1/29.97th of a second.

If you de-interlace you are essentially decimating any temporal information (i.e.
movement from one scene to another) down from 60/s to 30/s.

b.



___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-07 Thread Brian J. Murrell
On Mon, 2005-03-07 at 23:20 +0100, Jeroen Brosens wrote:
 
 You probably don't know how bob works too well. Using bobdeint is the
 *only* way to achieve the original field rate of the broadcast, as played
 on a TV. Just outputting frames that are captured by the MPEG2 encoder
 (which are a combination of the even and odd fields!) 1:1 using an
 interlaced modeline leaves you with a setup that plays back at 30
 f(rames)ps, whereas TV signal is 60 f(ields)ps (replace with 25 and 50
 respectively for PAL).

If that is how your video card is dealing with frames sent to it.  It
was explained to me that the G400 will, when programmed into proper
TV-Out mode, take a frame (i.e. the odd and even fields together in
one frame @29.97/s as you explain above) and first display one field and
then the other, each at the proper interval of 59.94 _fields_ per
second.

I don't know what your card does but it sounds like it's wanting a
field per transaction rather than a frame of two fields (which it
will then separate and display one after the other) which is what the
Bob/(Progressive Scan) filter explanation on the below reference page
shows.

I guess this is just another way to skin the same cat.  Whether you send
two fields at once and have the card display them separately or send
them individually is pretty much six of one, half dozen of the other I 
guess, video card transfer overhead not-withstanding.

 Therefore, using bob is paramount for a smooth video playback. All other
 deinterlacers as well as no deinterlacing (!) can't provide this.

It can, as long as the video card understands it is getting an
interlaced frame and that it is to show the frame's two fields
separately.

 Please do read this: http://www.100fps.com/, it offers excellent
 explanations!

b.



___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-07 Thread Will Dormann
Brian J. Murrell wrote:
Hrm.  Optimal meaning as good as it can get with this card but not
quite ultimate?, or do you believe you have a TV-Out signal that
represents how the content was originally broadcast (i.e. perfectly
interlaced fields)?
"optimal" meaning the best quality out of the nVidia card using the 
various combinations of options that I have used.   Not quite as good as 
broadcast quality.  (Or possibly the PVR-350)


Which is just a regular old interlaced 59.94 fields/s TV right?  Is SDTV
Standard Definition TV?
Yes.

Why de-interlace?
Why not send the interlaced recorded signal you recorded back to the TV
in the format that it expects to be played in, interlaced, with each
field being shown for 1/59.94th of a second instead of combined with
another field and only shown for 1/29.97th of a second.
From what I've gathered, the nVidia cards (or pretty much any other 
computer video card) won't retain the original recording's interlacing. 
 Dedicated hardware such as the PVR-350 is required for this.  I 
could be wrong, though, so feel free to correct me.

XvMC + Bob gives me a better resolution picture than XvMC + No 
Deinterlacing.   I'm not sure why this is, but the difference is pretty 
clear.   Interestingly enough if I'm not using XvMC, no deinterlacing 
gives me the better quality picture and Bob gives me something that looks 
like 1/2 the vertical resolution is lost.  (The opposite results).  I 
had a sample video segment that had some text and some diagonal lines. 
In the low vertical resolution mode (either XvMC w/ no deinterlacing 
or non-XvMC with Bob deinterlacing), the diagonal lines would have a 
stairstep appearance.   Also, the text of the OSD would be very blocky. 
 Toggling the deinterlace option would make the picture (and OSD) look 
right again.

I've read that the Bob deinterlacing for XvMC is completely different 
code than the Bob deinterlacing without.  So perhaps that has something 
to do with it.

Now that I think about it, I wonder if it's a field order issue?   I'm 
not totally certain how the Bob deinterlacer in mythtv works, but let's 
say that it pushes the top field down by a half line and the bottom 
field up by a half line.  If the field order is reversed, then it's 
pushing the top field up by a half line and the bottom field down by a 
half line.  (Which could possibly explain the stair-stepping of diagonal 
lines and resolution loss in the OSD)  If so, I wonder if it's possible 
to reverse it somehow?Or then again, maybe I'm just seeing this:
http://mythtv.org/bugs/show_bug.cgi?id=167
(though in my case, it's not a jitter up and down, it's an obvious loss 
of resolution)

--
-WD
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] How I got great quality TV-out on my nVidia MX4000

2005-03-07 Thread Jeroen Brosens
5) In the MythFrontend Setup screen for TV Playback, enable:
- Deinterlace
- Mode: Bob (2x)
Why de-interlace?
Why not send the interlaced recorded signal you recorded back to the TV
 in the format that it expects to be played in, interlaced, with each
 field being shown for 1/59.94th of a second instead of combined with
 another field and only shown for 1/29.97th of a second.
If you de-interlace you are essentially decimating any temporal (i.e.
 movement from one scene to another) down from 60/s to 30/s.
b.
You probably don't know how bob works too well. Using bobdeint is the
*only* way to achieve the original field rate of the broadcast, as played
on a TV. Just outputting frames that are captured by the MPEG2 encoder
(which are a combination of the even and odd fields!) 1:1 using an
interlaced modeline leaves you with a setup that plays back at 30
f(rames)ps, whereas TV signal is 60 f(ields)ps (replace with 25 and 50
respectively for PAL).
Therefore, using bob is paramount for a smooth video playback. All other
deinterlacers as well as no deinterlacing (!) can't provide this.
Please do read this: http://www.100fps.com/, it offers excellent
explanations!

___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users