Re: [Xpert]Fixing XF86VidMode extension

2002-01-07 Thread Mike

Sottek, Matthew J <[EMAIL PROTECTED]> wrote:
>
>[...]
>works most of the time. Allowing user defined modelines in XF86Config
>is bad enough, but having Apps insert modelines on the fly is really
>[...]

Yes, but Xmame users need these user-defined modelines, and it
wouldn't be worth us running X if they weren't there.

Mike.






Re: [Xpert]Fixing XF86VidMode extension

2002-01-04 Thread 'Billy Biggs'

Sottek, Matthew J ([EMAIL PROTECTED]):

> Let me summarize the options you've discussed and comment on each.

  Matt!  Great summary of the options.

> Assume 59.94 video.

  Let's assume true 59.94 video, no 3:2 pulldown or other conversions,
and that I want to display the full motion or better.  Playback of
24/25fps video isn't a problem, and 29.97fps video isn't too big a deal
either, unless you're outputting to a TV.

> Option 1: Run the display at 59.94. This is what you were attempting
> to do by inserting modelines I presume? Using this method you don't
> introduce any more judder than already existed in the video sequence.
> 
> The issue I have here is that you are inserting unknown modelines.
> [...] The ideal solution here would be to let the driver have a set of
> "available" timings as well as the set of "active" ones (the ones that
> are in the Ctrl-Alt-+/- list).  Then your app could query for a
> driver-supported set of timings, even when the user isn't actively
> using them.  At least this way the driver has the ability to provide a
> "table" of known good timings.

  I'd love to be able to adjust the refresh rate through a nice API,
rather than giving raw modelines.  I think that would be ideal, and it
makes sense in a world of both LCD screens and CRTs.  I would really
like to see 59.94fps output working with the refresh rate synced before
I start trying to do fancy interpolations.

  With VidMode there is an API to query a list of dot clocks.  Do you
think it's reasonable to export a higher-level API on top of that?
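
  Something like this is what I mean by building on what's already
there.  It's a rough, untested sketch that just walks the mode list the
VidMode extension already exposes and prints the refresh each timing
implies (the dot clock is in kHz, so refresh = dotclock*1000 /
(htotal*vtotal)):

    /* rough sketch, untested: list the current VidMode timings and
     * their refresh rates.
     * build with: cc modes.c -o modes -lXxf86vm -lX11 */
    #include <stdio.h>
    #include <X11/Xlib.h>
    #include <X11/extensions/xf86vmode.h>

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);
        XF86VidModeModeInfo **modes;
        int i, nmodes;

        if (!dpy || !XF86VidModeGetAllModeLines(dpy, DefaultScreen(dpy),
                                                &nmodes, &modes))
            return 1;

        for (i = 0; i < nmodes; i++) {
            /* dotclock is reported in kHz */
            double refresh = (double) modes[i]->dotclock * 1000.0
                / ((double) modes[i]->htotal * modes[i]->vtotal);
            printf("%dx%d @ %.2f Hz\n", modes[i]->hdisplay,
                   modes[i]->vdisplay, refresh);
        }
        XFree(modes);
        XCloseDisplay(dpy);
        return 0;
    }

A higher-level API could hand back that kind of list, plus the timings
the driver knows about but that aren't currently active.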

> Option 2: Run the display really fast and hope nobody notices. This is
> the easiest and probably works pretty well. The faster the refresh the
> smaller the added judder, go fast enough and it just doesn't matter
> anymore.

  Yep.  But this only works if your monitor can go >= 95hz, which isn't
possible on any LCD screen I've seen.

> Option 3: Work on the video stream to make the judder go away. This is
> very hard but this seems to be the goal of your deinterlacer anyway
> RIGHT?

  Sure, but right now I'm spending a lot of time just showing the
frames as I get them.  Remember that I have to copy/transform the image
from v4l to a shared X buffer, and then X has to copy that again to
video memory, and then display it.  The extra copy hurts at 768x480,
16bpp, 60hz.
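
  (Rough arithmetic: 768 x 480 pixels x 2 bytes is about 720KB per
frame, so at ~60fps each of those copies is roughly 44MB/s of memory
bandwidth.)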

> Is it really that absurd to add in the additional step of weighting
> the pixels as was described in your link? Seems like that would
> produce excellent results. This also has another advantage, it scales
> up with faster processors.

  Well, sure, but note how the author pans linear interpolation as a
potentially bad idea.  I haven't tried it yet, and I will soon, but I'm
not optimistic that it's a reasonable conversion method.

-- 
Billy Biggs
[EMAIL PROTECTED]



RE: [Xpert]Fixing XF86VidMode extension

2002-01-04 Thread Sottek, Matthew J


Let me summarize the options you've discussed and comment on each.
Assume 59.94 video.

Option 1: Run the display at 59.94. This is what you were attempting to do
by inserting modelines I presume? Using this method you don't
introduce any more judder than already existed in the video sequence.

The issue I have here is that you are inserting unknown modelines.
Really the only one with any right to determine the available modelines
is the graphics driver. The driver usually has (on other OSes; XFree
is a little different) a set of canned timings and then may pare them
down or add a few more after talking it over with the monitor. XFree
moved most of this up into the device-independent portion since most
drivers make use of the same canned timings. This isn't ideal but it
works most of the time. Allowing user-defined modelines in XF86Config
is bad enough, but having apps insert modelines on the fly is really
scary.  The ideal solution here would be to let the driver have a
set of "available" timings as well as the set of "active" ones (the
ones that are in the Ctrl-Alt-+/- list).  Then your app could
query for a driver-supported set of timings, even when the user isn't
actively using them. At least this way the driver has the ability to
provide a "table" of known good timings.
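
Just to be concrete about what "known good" means (illustration only,
not real driver code): at minimum a timing has to land inside the
monitor's HorizSync and VertRefresh ranges from XF86Config, which is a
check the driver can make without any help from the app:

    /* illustration only: the minimal sanity check behind a "table of
     * known good timings".  The candidate must fit the monitor's
     * HorizSync (kHz) and VertRefresh (Hz) ranges from XF86Config. */
    #include <stdio.h>

    static int timing_ok(double dotclock_khz, int htotal, int vtotal,
                         double hsync_min, double hsync_max,
                         double vref_min, double vref_max)
    {
        double hfreq = dotclock_khz / htotal;                      /* kHz */
        double vfreq = dotclock_khz * 1000.0 / (htotal * vtotal);  /* Hz  */
        return hfreq >= hsync_min && hfreq <= hsync_max &&
               vfreq >= vref_min && vfreq <= vref_max;
    }

    int main(void)
    {
        /* example: the standard VESA 800x600@60 timing (40MHz dot clock,
         * htotal 1056, vtotal 628) against a 30-70kHz / 50-160Hz monitor */
        printf("%s\n", timing_ok(40000.0, 1056, 628, 30.0, 70.0,
                                 50.0, 160.0) ? "ok" : "rejected");
        return 0;
    }

A real driver would also check its dot clock and bandwidth limits, and
a panel's fixed timings, but the idea is the same.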

Option 2: Run the display really fast and hope nobody notices. This is
the easiest and probably works pretty well. The faster the refresh, the
smaller the added judder; go fast enough and it just doesn't matter
anymore.

Option 3: Work on the video stream to make the judder go away. This is
very hard, but it seems to be the goal of your deinterlacer anyway,
right?  The video you are getting at 59.94 may be the result of 3:2
pulldown, so it may already have judder. You have to detect this and
get back to the original 24fps to get rid of the judder. Plus you may
have to time-shift half the fields to get rid of the jaggies. Is it
really that absurd to add in the additional step of weighting the
pixels as was described in your link? Seems like that would produce
excellent results. This also has another advantage: it scales up with
faster processors.

For example, assume infinite processor power. If your video is 59.94
with 3:2 pulldown, you've got 24fps of real video. Assume your display
is going at 100hz. You could display 100fps by linearly weighting and
blending the pixels of your 24fps video to generate 100fps of unique
video. Basically this is motion blur for video.
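
Very roughly, all I mean by weighting and blending is something like
this (toy sketch: one grayscale plane, no pulldown detection, made-up
numbers):

    /* toy sketch: for each output refresh, blend the two 24fps source
     * frames that bracket its display time, weighted by closeness. */
    #include <stdio.h>
    #include <stdint.h>

    #define SRC_FPS 24.0
    #define OUT_HZ  100.0

    /* out = (1-w)*a + w*b, per pixel */
    static void blend_frame(const uint8_t *a, const uint8_t *b,
                            uint8_t *out, int npixels, double w)
    {
        int i;
        for (i = 0; i < npixels; i++)
            out[i] = (uint8_t) ((1.0 - w) * a[i] + w * b[i] + 0.5);
    }

    int main(void)
    {
        uint8_t a[4] = { 0, 64, 128, 255 }, b[4] = { 255, 128, 64, 0 };
        uint8_t out[4];
        int n;

        /* which pair of source frames does each 100hz refresh blend? */
        for (n = 0; n < 10; n++) {
            double s = (n / OUT_HZ) * SRC_FPS;   /* position in source */
            printf("refresh %2d: frames %d and %d, weight %.2f\n",
                   n, (int) s, (int) s + 1, s - (int) s);
        }

        blend_frame(a, b, out, 4, 0.25);
        printf("blend example: %d %d %d %d\n", out[0], out[1], out[2], out[3]);
        return 0;
    }

The expensive part is the per-pixel loop at full resolution for every
refresh, which is exactly why it scales with processor power.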

The link you gave also suggests that flat panels, with their "always
on" type pixels, are not ideal for video because the eye can detect the
judder more easily than with a CRT's "flashing" pixels. Blurring the
video would probably produce better results at high speed than clean
pixels would.

I vote for #3, let me know when you're done :)

-Matt





Re: [Xpert]Fixing XF86VidMode extension

2002-01-04 Thread Billy Biggs

Sottek, Matthew J ([EMAIL PROTECTED]):

> I understand why you are trying to generate the modelines on the fly,
> but I really think this is a bad idea.  [...] I actually have a flat
> panel (analog input) at home that is sooo finicky with timings that I
> have to have them exactly as expected or else the scaler gets shaky.

  The problem is eliminating judder when playing back high framerate
video.  If you have a v4l-supported card, you can try my app 'tvtime'
which deinterlaces TV input to 59.94fps video:

  http://www.dumbterm.net/graphics/tvtime/

  With 60fps video, you'll see nasty judder unless your refresh rate
exactly matches the video input rate.  That is, a refresh of 60hz for
video at 59.94fps will cause a 'jump' in the video every 20s or so.
It's really annoying when watching the scrolling text at the bottom of
CNN, for example.  I've reproduced this with tvtime+vidmode hacks.
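
  (That figure is just the beat between the two rates: the display and
the video slip by 60 - 59.94 = 0.06 frames per second, so a full frame
of error builds up every 1/0.06, or roughly 17 seconds.)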

  So, the trick is to sample the rate of the incoming TV channel (it
isn't predictable) and dynamically create a modeline to match it.
Otherwise, you can set your refresh to something sufficiently high that
you don't notice.  I find 95-100hz and up acceptable for 60fps video.
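
  The modeline math itself is easy; the hard part is that the hardware
only has a finite set of programmable dot clocks, so you usually can't
hit the measured rate exactly.  Here's the easy half as a sketch
(untested, VESA 800x600@60 numbers used only as a base):

    /* sketch: keep the geometry of an existing mode and rescale its
     * dot clock so the refresh matches the measured input rate. */
    #include <stdio.h>

    int main(void)
    {
        double measured_hz = 59.94;        /* sampled from the v4l input */
        int htotal = 1056, vtotal = 628;   /* geometry stays fixed */
        double dotclock_khz = measured_hz * htotal * vtotal / 1000.0;

        printf("need a %.3f MHz dot clock for %.2f Hz\n",
               dotclock_khz / 1000.0, measured_hz);
        return 0;
    }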

  However, I haven't seen a flat panel that can hit that kind of
refresh rate.  Dave Marsh, in the paper below, suggests that flat
panels should be run at the input video rate:

  http://www.microsoft.com/hwdev/archive/TVBROADCAST/TempRate.asp

  I'm attempting to implement his suggestion and see how well it holds
up in practice.

  Thoughts?

-- 
Billy Biggs
[EMAIL PROTECTED]



[Xpert]Fixing XF86VidMode extension

2002-01-04 Thread Billy Biggs

  Over the holidays I fixed XF86VidModeAddModeLine and
XF86VidModeSwitchToMode in my local copy of the 4_1-branch.  My patch,
though, is very strange and not entirely correct.

  I don't believe the current CVS code ever worked for anyone: I felt
I was completing the code rather than bugfixing it.  That said, I'm
worried that my patch may be missing the intention of the code and how
it is supposed to work.  For example, when should new modelines be
validated, and how should errors be returned to the user?

  Who is currently 'in charge' of this code?

  My application is my deinterlacer.  I was attempting to build
modelines on the fly to soft-genlock the monitor refresh to the incoming
video signal.  Since this is proving difficult, I now generate modelines
on the fly with sufficiently high refresh rates to hide judder effects.

  It's going ok so far. Much easier with a working VidMode extension. :)

-- 
Billy Biggs
[EMAIL PROTECTED]