berger patrick wrote:
>
> [EMAIL PROTECTED] wrote:
>
> > > I'd like to display a picture for 40 usec exactly (this value can be changed)
> > > with gtk or qt (I may use X so I can use the Matrox dual head)
> > > is it possible with rtlinux?
> >
> > Your CRT display displays a frame for at least 10 ms.
> > (Assuming 100 frame/sec, non-interlaced.)
> > There is no way to change the picture during this interval.
> > The electron beam won't run back in the middle of the screen
> > if you change the video RAM after 40 usec.
> >
> > BTW. What is this? Some subliminal advertisement? ;-)
> >
> > Gabor
>
> sorry :))) I meant 40 ms, and it's for a priming experiment
Ok... That's still on the order of the refresh period, though, and I
wouldn't expect anything like *exactly* 40 ms, even though you can
quite easily construct a suitable video timing for it.
Check out svgalib, its vga_addtiming() function, the config file etc.
- these should let you set up a refresh rate such that a whole number
of frames adds up exactly to the required 40 ms.
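For what it's worth, the arithmetic is simple: N frames of 1/f seconds
each cover exactly 40 ms when f is a multiple of 25 Hz. A trivial
sketch in plain C (no svgalib calls; the chosen rate would then go
into vga_addtiming() or the config file):

    /* Sketch: find refresh rates where a whole number of frames
     * spans exactly 40 ms. */
    #include <stdio.h>

    int main(void)
    {
        const double target_ms = 40.0;   /* desired display time */

        /* N frames of 1000/f ms each equal 40 ms when f = N * 1000/40,
         * i.e. when f is a multiple of 25 Hz. */
        for (int frames = 2; frames <= 4; frames++) {
            double refresh_hz = frames * 1000.0 / target_ms;
            printf("%d frames at %.0f Hz = %.1f ms\n",
                   frames, refresh_hz, frames * 1000.0 / refresh_hz);
        }
        return 0;
    }

So 2 frames at 50 Hz, 3 at 75 Hz, or 4 at 100 Hz would all nominally
give you 40.0 ms.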
X could probably do it as well, but if you really need a thorough
*guarantee* on those 40 ms, you're probably best off with svgalib and
SCHED_FIFO on a Linux kernel with Mingo's lowlatency patch. RTL + shared
memory could do as well; catch the VGA retrace IRQ with an RTL ISR, and
put code to blit in/remove the image there, or preferably in an RTL
thread that the ISR wakes up. (I think this could be done with X as
well, as it's really a hardware and memory management thing, rather than
anything that involves the driver in the time critical stuff. The driver
is only used to set up the mode and to make the video RAM available as
shared memory.)
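To make the lowlatency + SCHED_FIFO route concrete, here's a minimal
sketch; I'm assuming svgalib's vga_waitretrace() as the retrace wait,
and the mode setup and the blit itself are left as placeholders:

    /* Minimal sketch: lock memory and switch to SCHED_FIFO before the
     * time-critical display loop.  Needs root; the retrace wait and
     * the blit are placeholders. */
    #include <stdio.h>
    #include <sched.h>
    #include <sys/mman.h>

    int main(void)
    {
        struct sched_param sp = { .sched_priority = 50 };

        /* Avoid page faults during the timing loop. */
        if (mlockall(MCL_CURRENT | MCL_FUTURE) != 0)
            perror("mlockall");

        /* Real-time FIFO scheduling. */
        if (sched_setscheduler(0, SCHED_FIFO, &sp) != 0)
            perror("sched_setscheduler");

        /* ... set up the video mode and map the frame buffer here ... */

        for (int frame = 0; frame < 3; frame++) {  /* e.g. 3 frames at 75 Hz */
            /* vga_waitretrace();   wait for the vertical retrace      */
            /* draw the image on frame 0, remove it after the last one */
        }
        return 0;
    }

Under RTL, that same blit/remove code would instead sit in the ISR, or
in an RTL thread woken from the retrace interrupt, as described above.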
GGI on fbdev or KGI are two other options, but it seems like at least
the fbdev target lacks raster sync... (That rules out the
Linux/lowlatency + SCHED_FIFO way, but makes no difference to the RTL
way of using the IRQ directly.)
However, the problem WRT accuracy is the "afterglow" (persistence) of
the CRT. In order to minimize flickering, all CRTs have a certain time
constant on the light-emitting phosphor layer, and although this is a
very short time compared to that of an LCD, or even a PAL/NTSC TV set
CRT, I'm afraid it might stretch those 40 ms by a few percent.
Also, you may have to consider the fact that the image is drawn scan
line by scan line, from top to bottom, which actually takes the major
part of the refresh time. You can see this clearly if you take a photo
of a TV screen with a fast film and short shutter time; you'll actually
see pretty clearly where on the screen the raster beam was drawing when
the shutter was open!
BTW, LED displays (these exist in RGB graphic configurations as well,
at least in large, or rather huge, formats) can be very fast, and if
the drivers permit it, they could be "flashed" full-screen with any
frame duration. (Unfortunately, most standard LED display drivers
for bigger displays work pretty much like CRTs in this respect - by
repeatedly flashing one row, group or figure at a time, in order to save
components by not throwing in one driver stage per LED segment...)
Oh, well... Does that cover your questions? :-)
//David