It is not available in many of the default MythTV
builds, because it requires linking against an OpenGL
library of some kind. Also, some time ago, when it
was enabled by default, some people were having
problems with it.
If you're building from source, then you must edit
settings.pro (look for "o
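For reference, the change amounts to uncommenting one line in settings.pro at the top of the source tree. The exact line differs between MythTV versions, and the define name below is an assumption based on the 0.17-era tree, so treat this as a sketch and search the file for "opengl" to find the real one:

```
# settings.pro (MythTV source root) -- hypothetical 0.17-era fragment.
# Before: the OpenGL vsync define is commented out:
#DEFINES += USING_OPENGL_VSYNC

# After: uncommented, so the build links against an OpenGL library:
DEFINES += USING_OPENGL_VSYNC
```

Rebuild the frontend after the edit; "Video timing method: SGI OpenGL" in the playback log is the sign it took effect.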
Sigurd Nes wrote:
Joe Votour wrote:
- Enable OpenGL vsync support (MythTV now shows "Video
timing method: SGI OpenGL")
Where did you find this option ?
It's a compile option in settings.pro
--
-WD
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users
Joe Votour wrote:
- Enable OpenGL vsync support (MythTV now shows "Video
timing method: SGI OpenGL")
Where did you find this option ?
Sigurd
Yeah, all those free-running components screwing
things up. :) Then again, my Commodore 64 can't do
MPEG-2, or else I'd have written a MythTV client in
assembly for it already. :D
Just to add my (so far) success story to this thread,
I decided to take some of the suggestions and see what
would
I received my GF MX4000 today and managed to get it working... except for
vsync :) Can anybody tell me the one thing I probably forgot to do here,
as you can see in this log (mythfrontend -v playback) the nVidia vsync
method is not used (nVidiaVideoSync: VBlank ioctl did not work,
unimplemented in
Hi
I sort of solved my problem of getting rid of the KDE panel on the TV and
switching keyboard focus on non-xinerama dual head screens although it's a bit
naff:
If before I start mythfrontend I use a fantastic program from the website:
http://wiki.gentoo-italia.net/index.php/Dual_Monitors
to swit
Will Dormann wrote:
Tom Lichti wrote:
As an aside, what is the preferred Nvidia card for TV output? I have
a generic GeForce 4 MX440 and it works alright, although I'm sure it
could be better. If there is a better card to use, what is it?
From what I gather, the MX4000 is just a die-shrunk versi
Hi
There is nothing to stop me from trying S-Video except I would need to spend
some money buying an S-Video to S-Video cable (which I might try at some
point anyway).
At the moment I am more worried about the XWindows Panel appearing at the
bottom of the TV screen when I start using the com
On Fri, 11 Mar 2005 10:39:41 +, trev <[EMAIL PROTECTED]> wrote:
> I am not sure if this is strictly on topic, but I can't use the mouse on the
> TV, it doesn't scroll across to the other screen (a separate desktop) and
> although the keyboard works on that screen when mythtv is started if I s
trev wrote:
The manual I have does specify "If the correct connector cable is connected,
S-Video out will generally provide a higher quality output than Composite
video out."
I can't test this at the moment because although I have an S-Video connector
at both ends I don't have an S-Video cable a
Hi
I've got an Asus AGP MX440 and am using the composite output, direct into my
tv.
I run Mythtv on the tv and use my computer on my monitor.
I can't see any problems on the tv screen with bars or anything, although at
1024x768 the text is a bit unreadable on the desktop, although I expected
tha
I too have an AGP 440MX manufactured by XFX. Although the display is
about as crisp as you can get on s-video, I see some kind of ghostly
diagonal bars scrolling.
I don't quite notice them most of the time but in the GUI it is
_quite_ noticeable… Enough for people to whom I demonstrate my setup,
Tom Lichti wrote:
As an aside, what is the preferred Nvidia card for TV output? I have a
generic GeForce 4 MX440 and it works alright, although I'm sure it could
be better. If there is a better card to use, what is it?
From what I gather, the MX4000 is just a die-shrunk version of the
MX440, so
John Patrick Poet wrote:
I understand that you are frustrated, but your last sentence is a little
offensive.
The OpenGL vsync/bobdeint combo is awesome. Ever since Doug implemented that
combination, my video playback has been silky smooth. I have an nVidia
graphics card.
The reason it may not work
> Someone else here wrote, that RGB is *too* sharp for MPEG2 video,
> meaning that you see the artifacts then.
>
> Thomas
I expect it depends on where you got the MPEG2 video from. Using a VGA
-> SCART converter in the UK with DVB-T I get an amazing image which
rarely has any visible artifacts. B
Thomas Börkel wrote:
For those of you running MythTV 0.17, I think the only thing you'll
want to change is the resolution. Rather than using the "coryntsci"
resolution, try "800x600". This fixes the bob "vertical resolution"
problem I was seeing.
Should this 800x600 resolution be interlaced
What about RTLinux or RTAI Linux? This would give the realtime performance
needed to respond to a VBI interrupt, the problem would be that the RTLinux
code to interact with the graphics card would have to be written from
scratch. Just a thought, I haven't really been following this thread.
The AD724 supports PAL encoding as well as NTSC, so it should be
But there is something different that goes beyond the frequency, AFAIK. If I
connect a PAL TV to an NTSC S-vid, I would get a B/W picture. Or was that
with composite (cinch)? Or is that AD724 switchable between PAL and NTSC?
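Some background on the black-and-white symptom asked about above (the AD724 itself is selectable between the two standards, as noted earlier in the thread): PAL and NTSC use different colour subcarrier frequencies, so a PAL-only chroma decoder cannot lock onto an NTSC colourburst and falls back to displaying luma only. The numbers below are the standard subcarrier frequencies; the sketch just shows how far apart they are.

```python
# Standard colour subcarrier frequencies (published values):
NTSC_SUBCARRIER_HZ = 3_579_545.0     # ~3.58 MHz (the AD724's NTSC crystal)
PAL_SUBCARRIER_HZ = 4_433_618.75     # ~4.43 MHz

# A PAL-only decoder expects ~4.43 MHz; an NTSC burst at ~3.58 MHz is
# simply not decoded, so only the black-and-white luma signal survives.
ratio = PAL_SUBCARRIER_HZ / NTSC_SUBCARRIER_HZ
print(round(ratio, 3))  # the two subcarriers are ~24% apart
```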
The key to MythTV (or any program, really) being able
to render a display without tears or choppiness is
really in two things:
1. Being able to know when the vertical sync is, and,
2. Being able to react to the vertical sync event in a
timely manner
There's a 3rd issue here. Most (all?) linux vid
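The two requirements above can be sketched in a few lines. This is an illustrative simulation, not MythTV code: a real implementation gets requirement 1 from the driver (e.g. the SGI OpenGL video-sync extension or the nVidia VBlank ioctl, both mentioned in this thread) rather than from a fixed clock.

```python
# Self-contained sketch of the two requirements: (1) know when vblank
# occurs, (2) have each field presented before its deadline.  Vblank
# timing is simulated here rather than read from the graphics driver.
FIELD_RATE_HZ = 59.94                    # NTSC field rate
VBLANK_INTERVAL = 1.0 / FIELD_RATE_HZ    # ~16.683 ms between vblanks

def next_vblank(now, t0=0.0, interval=VBLANK_INTERVAL):
    """Predict the first vblank strictly after `now`, given a known
    vblank at t0 and a fixed interval (requirement 1)."""
    periods = int((now - t0) // interval) + 1
    return t0 + periods * interval

def schedule_fields(start, n, interval=VBLANK_INTERVAL):
    """Presentation deadlines for n fields (requirement 2: a field that
    misses its deadline tears or judders)."""
    first = next_vblank(start)
    return [first + i * interval for i in range(n)]

deadlines = schedule_fields(0.001, 3)
print([round(t * 1000, 3) for t in deadlines])  # deadlines in ms
```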
HI!
Cory Papenfuss wrote:
I've mentioned what I did in the past, starting with
http://www.gossamer-threads.com/lists/mythtv/users/45910#45910
Basically built up the circuit in the AD724 datasheet. It's a
surface-mount chip, and circuit layout is a little important (a few
MHz bandwidth video
On Thu, 10 Mar 2005, Thomas Börkel wrote:
HI!
Cory Papenfuss wrote:
I've mentioned what I did in the past, starting with
http://www.gossamer-threads.com/lists/mythtv/users/45910#45910 Basically
built up the circuit in the AD724 datasheet. It's a surface-mount chip,
and circuit l
I'll preface this by saying that I'm not a graphics
guru. However, I have been doing graphics-related
work off and on for the last ten years (although
nothing too fancy, mostly 2D stuff).
The key to MythTV (or any program, really) being able
to render a display without tears or choppiness is
reall
HI!
Will Dormann wrote:
Will Dormann wrote:
While the combination of settings I originally posted does give
excellent results, I've recently discovered that that particular
combination does not actually do Bob Deinterlacing.
Ok, I'm very close to getting this right!
For those of you running Myth
Bob deint is designed to output to progressive display devices, such as
HDTVs (or EDTVs) or projectors. The fact that it sometimes looks better
than non-deinterlaced material on non-progressive displays is an
indication of how *&($#ed up video display in Linux is in general. I
think if your hardw
John Patrick Poet wrote:
If someone wanted to buy Doug an Asus Pundit, he *might* be willing to take
the time to work on improving bobdeint for non nVidia cards, but he is
unlikely to buy one for himself just for that purpose.
I would not accept any hardware; I barely have time (recently: none at
a
Jeroen Brosens wrote:
I already went through the effort of making a VGA->SCART converter cable
(I can provide schematics and photos if you'd like) hoping that it would
bring me the ultimate in MythTV viewing. But it didn't.
It gives the best colors (separate R/G/B and sync) and sharpness of all
po
On Wednesday 09 March 2005 03:33 pm, Jeroen Brosens wrote:
> > On Wednesday 09 March 2005 02:06 pm, Jeroen Brosens wrote:
> >> I'd like to add that I'd like to challenge the MythTV dev-people to
> >> review
> >> the Xv/XvMC code regarding the handling of vsync while using bobdeint.
> >>
> >> One ne
It gives the best colors (separate R/G/B and sync) and sharpness of all
possible methods, but... no smooth motion - on my setup that is. Some of
you may recall my postings about struggling to get either bobdeint working
on my SiS video (ASUS Pundit) or completely bypass the TV-out that can't
handle
Jeroen Brosens wrote:
Stephen Williams wrote:
Of course, the way to get 'optimal' TV-out from your Nvidia (or other)
card is not to use its TV-out facilities at all and build a VGA ->
SCART converter for SDTVs (see http://www.sput.nl/hardware/tv-x.html
for example).
Yes, this may be tr
Will Dormann wrote:
If I hear from at least one other person with similar results, maybe
I'll submit my instructions to Jarod or a Wiki or something.
err, I put them up here:
http://www.mythtv.info/moin.cgi/NVidiaMX4000HowTo
I think there's some followon I may have missed.
And it needs some 0.16
> On Wednesday 09 March 2005 02:06 pm, Jeroen Brosens wrote:
>> I'd like to add that I'd like to challenge the MythTV dev-people to
>> review
>> the Xv/XvMC code regarding the handling of vsync while using bobdeint.
>>
>> One needs hardware with OpenGL support to have a Vsync to get bobdeint
>> wor
On Wed, 9 Mar 2005 12:34:00 -0700 (MST), John Patrick Poet
<[EMAIL PROTECTED]> wrote:
>
> On Wed, 9 Mar 2005, Jeroen Brosens wrote:
>
> > > I gave up after trying *every* combination of driver/cable/modeline/X11
> > > video output device/eating shoes and decided to buy a nVidia card after
> > > S
On Wed, 9 Mar 2005, Jeroen Brosens wrote:
> > I gave up after trying *every* combination of driver/cable/modeline/X11
> > video output device/eating shoes and decided to buy a nVidia card after
> > Staffan Pettersson persuaded me (thank you):
> > http://www.mail-archive.com/mythtv-dev@mythtv.org/
On Wednesday 09 March 2005 02:06 pm, Jeroen Brosens wrote:
> I'd like to add that I'd like to challenge the MythTV dev-people to review
> the Xv/XvMC code regarding the handling of vsync while using bobdeint.
>
> One needs hardware with OpenGL support to have a Vsync to get bobdeint
> working witho
> I gave up after trying *every* combination of driver/cable/modeline/X11
> video output device/eating shoes and decided to buy a nVidia card after
> Staffan Pettersson persuaded me (thank you):
> http://www.mail-archive.com/mythtv-dev@mythtv.org/msg04131.html
>
> The card is due to arrive friday,
>Stephen Williams wrote:
>> Of course, the way to get 'optimal' TV-out from your Nvidia (or other)
>> card is not to use its TV-out facilities at all and build a VGA ->
>> SCART converter for SDTVs (see http://www.sput.nl/hardware/tv-x.html
>> for example).
>
>Yes, this may be true for those of yo
On 9 Mar 2005 at 10:15, Cory Papenfuss wrote:
> slightly nicer one (with VGA loop-through, etc). If there were enough
> interest I suppose I could finish it off and send off for a limited run of
> PCBs.
Or use this service instead
http://www.olimex.com/
Basically, you supply the design spe
Seems ATI sells a DVI->component converter for their 8500 and 9xxx cards
for $29. Perhaps that is the way to go. That should work with an SDTV
with component video connections no? Maybe cheaper than the cheapest
vidcard you would want to buy and then an additional U$129 for the Audio
whatsitcall
On Wed, 2005-03-09 at 12:07 -0500, Cory Papenfuss wrote:
> If by "properly overscanned" you mean not seeing all 720x480, then
> yes, it is.
Sweet. I am totally jealous dude.
> It's actually pretty easy to change the horizontal
> overscanning by just padding the sides with black, but ke
Oh, me three. I would love nothing more than to be done with all of
this "choose the right video card for TV-Out magic" and be able to use
any old video card that has any number of other features I may want (in
reality any video card that can scale horizontally would suffice for
me).
Ah... if onl
I meant to ask. The picture looks properly overscanned? i.e. you are
not seeing all 7??x48? (forget what res you said you have your modeline
at) pixels right? Just the ones that would normally show up on a real
broadcast -- missing that certain percentage of those around the border.
If by "prop
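The "certain percentage around the border" can be put into numbers. The 5% per edge used below is an assumption for illustration; real sets vary quite a bit:

```python
# How much of a 720x480 frame lands on the visible part of an SDTV
# screen, given a per-edge overscan fraction (assumed, not measured).
def visible_region(width, height, overscan_frac=0.05):
    """Pixels lost to overscan on each edge; returns the visible size."""
    lost_x = int(width * overscan_frac)
    lost_y = int(height * overscan_frac)
    return (width - 2 * lost_x, height - 2 * lost_y)

print(visible_region(720, 480))  # -> (648, 432)
```

So with 5% overscan a "properly overscanned" modeline is showing you roughly 648x432 of the 720x480 broadcast, which matches what a real TV does.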
On Wed, 2005-03-09 at 08:44 -0800, Joe Votour wrote:
> Have you given thought to the minimum order that you
> require for this, and the cost? Depending on those
> numbers, I'll sign up for one, maybe two (to have one
> as a backup).
Oh, me three. I would love nothing more than to be done with al
On Wed, 9 Mar 2005 11:30:22 -0500 (EST), Cory Papenfuss
<[EMAIL PROTECTED]> wrote:
> Small, perhaps. Irritating, absolutely. Also, beyond the
> understanding of 99% of the PC-buying public (the MythTV crowd is more
> technically savvy than most). To most people, the tradeoffs involved are
Cory,
I've followed your posts on the circuit board that
you've made before - it interested me then, and it
interests me now.
Myself, I'm a software developer, and I've soldered a
grand total of one thing (a custom cable), and that
didn't even quite come out right (mind you, I just
don't have tim
On Wed, 2005-03-09 at 11:36 -0500, Cory Papenfuss wrote:
> That's what I thought. I'm happy with the results, but the
> annoyance of getting everything right is definitely beyond the scope of
> what most people are willing to deal with.
Perhaps. Certainly not for the "out of the box" cro
Could be but I like Cory's idea better. The above is a scan converter.
Cory does not convert the scan but has the computer produce the right
timings for television. His circuit just does "coordinate
transformation RGB->YPbPr and NTSC composite modulation (colorburst,
3.58MHz crystal, etc)".
Thi
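The "coordinate transformation" step mentioned above is just the standard BT.601 matrix, which chips like the AD724 implement in analog form. A hedged digital sketch of that one step (the NTSC modulation and colorburst are not modelled here):

```python
# BT.601 RGB -> YPbPr conversion, the matrix step an encoder chip such
# as the AD724 performs in the analog domain.
def rgb_to_ypbpr(r, g, b):
    """r, g, b in [0.0, 1.0]; returns (Y, Pb, Pr) per BT.601."""
    y = 0.299 * r + 0.587 * g + 0.114 * b   # luma
    pb = 0.564 * (b - y)                    # scaled blue colour difference
    pr = 0.713 * (r - y)                    # scaled red colour difference
    return y, pb, pr

print(rgb_to_ypbpr(1.0, 1.0, 1.0))  # white: luma ~1.0, colour differences ~0
```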
I've mentioned what I did in the past, starting with
http://www.gossamer-threads.com/lists/mythtv/users/45910#45910
Basically built up the circuit in the AD724 datasheet.
Seems strange that there are no commercial "off the shelf" products doing
this. Perhaps there are. I will have to search
Looks like something such as...
http://cgi.ebay.com/ws/eBayISAPI.dll?ViewItem&category=21169&item=3879054837&rd=1&ssPageName=WDVW#ebayphotohosting
...might be a good solution.
Dave
On Wed, 09 Mar 2005 11:20:48 -0500, Brian J. Murrell
<[EMAIL PROTECTED]> wrote:
> On Wed, 2005-03-09 at 10:15 -05
On Wed, 2005-03-09 at 10:15 -0500, Cory Papenfuss wrote:
> I've mentioned what I did in the past, starting with
> http://www.gossamer-threads.com/lists/mythtv/users/45910#45910
> Basically built up the circuit in the AD724 datasheet.
Seems strange that there are no commercial "off the shelf"
On Wed, 2005-03-09 at 10:14 -0500, Will Dormann wrote:
>
> Yes it does. The problem is, no video card appears to be able to do
> TV-Out while retaining that proper interlacing information.
As it was explained to me, the G400 does.
> The instant
> the card does any sort of scaling of the pictu
Don't hold out on us Cory, do tell more. :-) I'm assuming you built
something that converts interlaced vga to s-video using some kind of
s-video encoding silicon?
I've mentioned what I did in the past, starting with
http://www.gossamer-threads.com/lists/mythtv/users/45910#45910
Basically built
Brian J. Murrell wrote:
On the issue of hardware encoding and whether it preserves the
interlacing, I do believe the PVR-250 I have in my machine does indeed
preserve the interlacing in the MPEG2 stream it creates.
Yes it does. The problem is, no video card appears to be able to do
TV-Out while r
Stephen Williams wrote:
Of course, the way to get 'optimal' TV-out from your Nvidia (or other)
card is not to use its TV-out facilities at all and build a VGA ->
SCART converter for SDTVs (see http://www.sput.nl/hardware/tv-x.html
for example).
Yes, this may be true for those of you who have SCA
Jeroen Brosens wrote:
>
>>I think the combination of the low CPU usage of XvMC and the Vsync
>>provided by OpenGL provides a very good looking picture. However, if
>>the Bob would actually take place, I think it might indeed be "optimal"
>>as I had originally stated.
>>
> So which version of MythT
On Wed, 2005-03-09 at 09:35 -0500, Cory Papenfuss wrote:
> I've done something similar (although I'm
> using an analog S-vid chip for NTSC).
Don't hold out on us Cory, do tell more. :-) I'm assuming you built
something that converts interlaced vga to s-video using some kind of
s-video encoding
I've built one of these and the result is _much_ better than the
TV-out from my Nvidia card. The image quality is even higher than from
my Sony DVB set-top-box.
Doesn't surprise me. Even a 20 year old VGA card can do twice the
bandwidth required of SDTV. Newer ones are more like 20x (350MHz dot
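The bandwidth claim above checks out against the standard numbers: SD video is sampled at 13.5 MHz (ITU-R BT.601), while even the original 640x480@60 VGA mode used a 25.175 MHz dot clock, and 2005-era RAMDACs are rated around 350 MHz.

```python
# Worked arithmetic for the claim above, using published clock rates.
SDTV_PIXEL_CLOCK_MHZ = 13.5   # ITU-R BT.601 luma sampling rate
VGA_DOT_CLOCK_MHZ = 25.175    # original 640x480@60 VGA mode
MODERN_RAMDAC_MHZ = 350.0     # typical 2005-era RAMDAC rating

print(round(VGA_DOT_CLOCK_MHZ / SDTV_PIXEL_CLOCK_MHZ, 2))   # ~1.86x, "twice"
print(round(MODERN_RAMDAC_MHZ / SDTV_PIXEL_CLOCK_MHZ, 1))   # ~25.9x, "20x"
```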
On Wed, 2005-03-09 at 12:14 +, Stephen Williams wrote:
> Of course, the way to get 'optimal' TV-out from your Nvidia (or other)
> card is not to use its TV-out facilities at all and build a VGA ->
> SCART converter for SDTVs (see http://www.sput.nl/hardware/tv-x.html
> for example).
But my TV
> Stephen Williams wrote:
> > Of course, the way to get 'optimal' TV-out from your Nvidia (or other)
> > card is not to use its TV-out facilities at all and build a VGA ->
> > SCART converter for SDTVs (see http://www.sput.nl/hardware/tv-x.html
> > for example). This is only true of home-built con
HI!
Stephen Williams wrote:
Of course, the way to get 'optimal' TV-out from your Nvidia (or other)
card is not to use its TV-out facilities at all and build a VGA ->
SCART converter for SDTVs (see http://www.sput.nl/hardware/tv-x.html
for example). This is only true of home-built converters, comme
Of course, the way to get 'optimal' TV-out from your Nvidia (or other)
card is not to use its TV-out facilities at all and build a VGA ->
SCART converter for SDTVs (see http://www.sput.nl/hardware/tv-x.html
for example). This is only true of home-built converters, commercial
converters perform sca
> Jeroen Brosens wrote:
>> Therefore, using bob is paramount for smooth video playback. All other
>> deinterlacers as well as no deinterlacing (!) can't provide this.
>
> Just a follow-up to this thread.
>
>
> I think the combination of the low CPU usage of XvMC and the Vsync
> provided by OpenG
Will Dormann wrote:
While the combination of settings I originally posted does give
excellent results, I've recently discovered that that particular
combination does not actually do Bob Deinterlacing.
Ok, I'm very close to getting this right!
For those of you running MythTV 0.17, I think the only
Jeroen Brosens wrote:
Therefore, using bob is paramount for smooth video playback. All other
deinterlacers as well as no deinterlacing (!) can't provide this.
Just a follow-up to this thread.
While the combination of settings I originally posted does give
excellent results, I've recently discove
5) In the MythFrontend Setup screen for TV Playback, enable:
- Deinterlace
- Mode: Bob (2x)
Why de-interlace?
Why not send the interlaced signal you recorded back to the TV
> in the format that it expects to be played in, interlaced, with each
> field being shown for 1/59.94th of a
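What "Bob (2x)" does can be shown with a toy frame. This is a simplified sketch of the idea, not MythTV's actual filter: each interlaced frame is split into its two fields, and each field is line-doubled so both can be shown full-height, one per 1/59.94 s field period.

```python
# Minimal illustration of bob deinterlacing on a 4-line frame whose
# scan lines are labelled e0/e1 (even, first field in time) and o0/o1
# (odd, second field).
def bob_deinterlace(frame):
    """frame: list of scan lines (top field = even indices).
    Returns two line-doubled field images at the original height."""
    top = frame[0::2]      # even lines: first field in time
    bottom = frame[1::2]   # odd lines: second field
    double = lambda field: [line for line in field for _ in (0, 1)]
    return double(top), double(bottom)

frame = ["e0", "o0", "e1", "o1"]
first, second = bob_deinterlace(frame)
print(first)   # ['e0', 'e0', 'e1', 'e1']
print(second)  # ['o0', 'o0', 'o1', 'o1']
```

This is why bob restores the original 59.94 Hz field cadence: the two fields inside one MPEG-2 frame are shown one after the other instead of being welded together, at the cost of halved vertical resolution per field.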
Brian J. Murrell wrote:
Hrm. Optimal meaning "as good as it can get with this card but not
quite ultimate?", or do you believe you have a TV-Out signal that
represents how the content was originally broadcast (i.e. perfectly
interlaced fields)?
"optimal" meaning the best quality out of the nVidia
On Mon, 2005-03-07 at 23:20 +0100, Jeroen Brosens wrote:
>
> You probably don't know how bob works too well. Using bobdeint is the
> *only* way to achieve the original field rate of the broadcast, as played
> on a TV. Just outputting frames that are captured by the MPEG2 encoder
> (which are a com
On Sat, 2005-03-05 at 14:35 -0500, Will Dormann wrote:
> After tweaking settings for quite a bit,
> here's how I achieved what I believe to be the optimal output.
Hrm. Optimal meaning "as good as it can get with this card but not
quite ultimate?", or do you believe you have a TV-Out signal that
This is a great post! I'm looking forward to trying this. Thanks for
sharing the knowledge.
Will Dormann wrote:
I recently added an nVidia graphics card to my Asus Pundit, in hopes
of improving the TV-out quality. The onboard SIS chip isn't bad, but
I figured I could do better. After twea