Re: [Xpert]Is the XFree development stuck in a dead end?

2002-07-17 Thread rjh

At first I was thinking "What a high quality troll.  Good job."  But
since it appears to be somewhat serious, I suggest that if you really
want a better XFree86, you should define, document, and perform the
appropriate benchmark tests.  The XFree developers are then very likely
to make improvements that show up in that benchmark.

These words have meanings:

  Define - The benchmark should be structured so that it really
  represents a test of X, not some application.  If possible, it should
  reflect an understanding of the X environment so that individual
  contributing factors can be isolated and improved.

  Document - This means it should be reasonable for some third party to
  reproduce your results and perform these benchmarks without needing
  any further advice.

  Perform - Tell us what it does today (in your environment) and what
  you think should be achievable.


From some of the later discussion I started to wonder whether the
following optimization might be appropriate as an X option or extension.
At present, XMotionEvents are sent at a fairly high rate when there is
rapid mouse motion.  I have measured XMotionEvents in excess of 100 per
second. A well written application will collapse multiple XMotionEvents
into a single motion.  This is a routine optimization that is part of
every programming example that I have read.  
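The collapsing described above is usually done on the client side while
draining the event queue.  As a sketch of the policy itself (the Event
class and field names here are illustrative stand-ins, not Xlib's actual
types), run on a simulated queue:

```python
from dataclasses import dataclass

@dataclass
class Event:
    type: str          # e.g. "MotionNotify" or "ButtonPress"
    x: int = 0
    y: int = 0

def compress_motion(queue):
    """Collapse each run of consecutive MotionNotify events down to the
    newest one, the way a well-behaved client drains its event queue."""
    out = []
    for ev in queue:
        if ev.type == "MotionNotify" and out and out[-1].type == "MotionNotify":
            out[-1] = ev       # stale position, replace with the newer one
        else:
            out.append(ev)
    return out

queue = [Event("MotionNotify", 1, 1), Event("MotionNotify", 2, 2),
         Event("MotionNotify", 3, 3), Event("ButtonPress"),
         Event("MotionNotify", 9, 9)]
print([(e.type, e.x, e.y) for e in compress_motion(queue)])
```

Only the newest position in each run survives; events of other types
still break the run, so no ordering information is lost.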

I ran into, tracked down, and fixed one path through the WindowMaker
window manager where the combining was not being done.  This was a bug in
WindowMaker that really looked like an X bug.  You had to get just the
right configuration because almost all of the paths through WindowMaker
were combining events.  I happened to have the one that did not.  It
showed up first with X 4.0 because 4.0 was much faster than 3.5.

Perhaps there would be value to a motion rate limiting option in the X
server. This is not merely covering up for flawed applications.  In the
current environment, a fast application can respond to the XMotionEvent
faster than the events are sent.  So the fast application performs the
cycle of reading an event, updating the screen, and returning to read
another event several times per screen refresh.  This is wasted motion
and interferes with scheduler logic.  It would make sense to specify to
the X server that it should combine motion events for 10-20 milliseconds
before sending another motion event.  This avoids all the scheduling and
messaging traffic during that invisible interval.

But what will this break?  I know that making the rate limit too long
will definitely make the system very jerky and erratic.  Does this break
other aspects of the system?  I think that this has to be done at the
server level rather than at any lower level (like window), so I want to
be fairly sure that it will not cause problems.  The implementation will
have some difficult spots, like making sure that the final event does
make it out to the client when the time delay expires.
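As a sketch of what such a rate-limiting option might look like (the
names, the 15 ms interval, and the timer mechanism are all hypothetical;
this is a simulation of the policy, not server code):

```python
class MotionRateLimiter:
    """Sketch of server-side motion rate limiting: deliver at most one
    motion event per `interval_ms`; intermediate positions are absorbed,
    and the last absorbed position is flushed once the interval expires,
    so the client always ends up with the final pointer position."""
    def __init__(self, interval_ms=15):
        self.interval = interval_ms
        self.last_sent = None    # time of the last delivered event
        self.pending = None      # newest absorbed (time, position)

    def on_motion(self, t_ms, pos, send):
        if self.last_sent is None or t_ms - self.last_sent >= self.interval:
            send(t_ms, pos)
            self.last_sent = t_ms
            self.pending = None
        else:
            self.pending = (t_ms, pos)   # absorb; keep only the newest

    def on_timer(self, t_ms, send):
        # The difficult spot mentioned above: the final event must still
        # reach the client after the delay expires.
        if self.pending and t_ms - self.last_sent >= self.interval:
            send(t_ms, self.pending[1])
            self.last_sent = t_ms
            self.pending = None

sent = []
rl = MotionRateLimiter(15)
for t in range(0, 42, 5):                      # motion every 5 ms
    rl.on_motion(t, (t, t), lambda ts, p: sent.append(p))
rl.on_timer(60, lambda ts, p: sent.append(p))  # timer fires after motion stops
print(sent)
```

Nine motion events go in, four come out, and the deferred flush
guarantees the final position is not lost when motion stops mid-interval.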

R Horn

___
Xpert mailing list
[EMAIL PROTECTED]
http://XFree86.Org/mailman/listinfo/xpert



Re: [Xpert]Re: 10-bits per colour

2002-06-24 Thread rjh

On 23 Jun, [EMAIL PROTECTED] wrote:
 The minimum realistic requirement for medical work is a high quality
 monitor and a 10-bit DAC.  This lets you adjust the output LUT so that
 you can comply with the display standard while using 8-bit data.
 Assuming that there is compliance with the CPI profile, you can degrade
 10-bit to 8-bit image data with a minimum loss of utility.
 
 The digital imaging systems that I have been linking to
 provide raw 12 and 14 bit grey scale, this is then processed
 to provide contrast enhancement so that medical inserts can
 easily be seen even with low dosage X-ray monitoring. The
 advantage is that only one ADC/DAC pair is required. Colour
 is added to the processed image, when a drop to 8 bit can be
 accepted. 
 
 The x-ray systems had to be very high resolution 12 bit
 before acceptable as a replacement to film in a number of
 situations. 
 
Yes, although in all likelihood that requirement was driven by the needs
of the acquisition and image processing.  I rarely see display
situations where the viewing environment permits 12-bit image
discrimination.  The normal viewing environment is about 10 to 11-bit
capable.  But you often need more precise imaging data to permit
proper image processing or to deal with exposure and sensor variations.

I usually deal with 12-bit CR data, and occasionally with 10-bit.  In
very low exposure imagery you often need the 12-bit capable system so
that you get sufficient detail.  Then image processing can compensate
for exposure and acquisition problems and bring the image into proper
range for viewing with more ordinary monitors.
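For illustration, here is a minimal sketch of the kind of range
compression involved: a simple linear window/level mapping from
high-bit-depth data down to an 8-bit display range (the sample values
and window settings are made up):

```python
def window_level(pixels, center, width, out_max=255):
    """Linearly map the window [center - width/2, center + width/2] of
    high-bit-depth data onto 0..out_max, clamping values outside it.
    (A simplified version of the usual window/level operation.)"""
    lo = center - width / 2.0
    out = []
    for p in pixels:
        v = (p - lo) / width * out_max
        out.append(int(round(min(max(v, 0), out_max))))
    return out

# Hypothetical 12-bit samples from a low-exposure image, crowded near the
# bottom of the 0..4095 range; a narrow window spreads them for display.
print(window_level([200, 300, 400, 500, 600], center=400, width=400))
```

A narrow window stretches the small usable slice of the 12-bit range
across the full output range of an ordinary monitor.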

General radiography remains the most demanding of the medical imaging
applications. The CT, MR, and Ultrasound images are much less demanding.

R Horn




Re: [Xpert]Re: 10-bits per colour

2002-06-23 Thread rjh

On 21 Jun, Dr Andrew C Aitchison wrote:
 On Thu, 20 Jun 2002, Christoph Koulen wrote:
 The delimiting factor, I agree, would be the human eye! I wonder, if it
 is capable of distinguishing between 1024 shades of a primary color?

Yes, it is.  There is scientific work on the eye-brain vision system,
and one of its results is a clear yes.  At this level of detail you
must also be precise about the meaning of the word "distinguish".  I've
encountered several definitions of it:

 1) Able to differentiate two halves of a split circle contrast target
  on a neutral background.
 2) Able to correctly identify the polarity of a 3x3 checkerboard target,
  i.e., correctly identify the center square as darker or lighter.
 3) Able to correctly locate a contour line at a flat spot in an image.

The tests I worked with were done in grey and blue (because those are
what radiology works in).  When alert and wearing glasses my vision
quits somewhere around 1500-2000 levels.  In the semi-random sample of
several thousand field service engineers we found none that were below
300 levels.  Most were 500 or better in the mid-greys.

 
 Probably not, especially since there are colours too bright and too dim
 for a monitor to show. However with only 256 shades the steps between 
 adjacent colors are not always even (gamma mapping can reduce this problem)
 and it isn't difficult to find single steps which are very obvious,
 especially on a gray ramp. 1024 shades makes it easer to make the steps
 even, and maybe allow all of them to be invisible.
 

The luminance (the kind in cd/m^2) of image and environment are key
parameters in defining the number of visible levels.  The key parameters
when the display covers the full field of view are the brightness of
black (which includes reflection of ambient lighting) and the
brightness of white.  Ordinary CRT monitors under typical lighting are
often in a range where the eye is limited to under 256 levels.  As you
mentioned, constraining these 256 levels to be equally spaced voltage
steps into the monitor eliminates further levels, because those steps
are not placed uniformly in perceptual space.

The natural CRT gamma curve is a fair approximation to the eye's
response, which is why CRTs have been successful.  It is not perfect.
Increasing the DAC resolution to 10-bits voltage puts sufficient
adjustment into the exact positioning of luminance levels so that all of
the roughly 256 visible levels can be displayed.  This is one of the
major reasons for the need for 10-bit video output resolution.  Note
that a 10-bit DAC is enough.  8-bit RGB pixel storage remains sufficient
because the purpose of the 10-bit DAC is presentation using the eye's
response curve rather than the CRT's gamma curve.
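As a rough sketch of why the extra DAC bits matter, the following builds
an output LUT that places the 8-bit input codes at perceptually even
luminance steps through the CRT's gamma curve.  The cube law standing in
for the eye's response is my simplification for illustration; a real
calibration would use a measured response curve instead:

```python
def perceptual_lut(gamma_crt=2.2, in_bits=8, dac_bits=10):
    """Build an output LUT: map each 8-bit input code to the DAC value
    whose displayed luminance (through the CRT's gamma curve) sits at a
    perceptually even step.  A cube law stands in for the eye's response
    here; real calibration would use a measured curve instead."""
    n_in, n_dac = 2 ** in_bits, 2 ** dac_bits
    lut = []
    for i in range(n_in):
        target_lum = (i / (n_in - 1)) ** 3.0       # invert ~L^(1/3) perception
        dac = round(target_lum ** (1.0 / gamma_crt) * (n_dac - 1))
        lut.append(dac)
    return lut

lut10 = perceptual_lut(dac_bits=10)
lut8 = perceptual_lut(dac_bits=8)
# Near black, the 8-bit DAC collapses many distinct input codes onto the
# same output step; the 10-bit DAC keeps far more of them apart.
print(len(set(lut10)), len(set(lut8)))
```

The comparison shows the point made above: the 8-bit pixel data is
unchanged, but the 10-bit DAC provides the adjustment room needed to
keep the 256 levels distinct after repositioning.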

For specific examples of how this can be used, see  
http://medical.nema.org/dicom/2001.html/01_14PU.PDF  or Barten's book
"Contrast Sensitivity of the Human Eye and Its Effects on Image Quality".

Then follow the references to track down other major researchers on this
topic.

R Horn




Re: [Xpert]Re: 10-bits per colour

2002-06-23 Thread rjh

On 23 Jun, Detlef Grittner wrote:
 For specific examples of how this can be used, see
 http://medical.nema.org/dicom/2001.html/01_14PU.PDF  or Barten's book
 Contrast Sensitivity of the Human Eye and Its Effects on Image Quality

 Then follow the references to track down other major researchers on this
 topic.

 R Horn
 
 I found the correct link at http://medical.nema.org/dicom/2001/01_14PU.PDF 
 
 Thank you for the information. As I'm working on medical viewers people often ask me 
how many gray scales are needed.
 Some people even doubt that more than 8 Bit (256) gray scales are necessary.
 But typical radiological images often come with 10 Bit (1024) gray scales.
 If you have a display and a video card that can display distinguishable 1024 grays 
that would be invaluable.

Sorry about the typo.  If you are working in medical you must also take
a look at  http://www.rsna.org/IHE/tf/ihe_tf_index.shtml

In particular, it is becoming a market necessity to comply with the
Consistent Presentation of Images (CPI) profile.  At present it only
applies to greyscale images.  Calibrating color monitor presentation to
comply with the greyscale standard makes a significant improvement to
the quality of a color presentation.  The further work on color space
calibration remains in committee.  This came up very briefly at last
week's DICOM WG-6 meeting, mostly as a question of when the color
standard would be ready, with a response of "Don't know."  One of the
issues is the weakness of the scientific literature regarding the
diagnostic requirement for color consistency.

The minimum realistic requirement for medical work is a high quality
monitor and a 10-bit DAC.  This lets you adjust the output LUT so that
you can comply with the display standard while using 8-bit data.
Assuming that there is compliance with the CPI profile, you can degrade
10-bit to 8-bit image data with a minimum loss of utility.

There are a number of vendors for medical quality displays.  These are
all quite expensive because they provide both 10-bit input and 12+bit
DAC controls so that they can both calibrate the system to comply with
the display standard and convey 10-bit image data.  They also
incorporate the very high luminance required to achieve 10-bit
viewability.  Unfortunately from an XFree perspective, most of the
medical vendors will not disclose the programming information for the
display controllers.

The actual requirement for resolution depends on the imaging modality
and the purpose of viewing.  For many purposes, an 8-bit display that
meets the display standard will be sufficient.  For other purposes it
will not.  I would never consider doing general radiographic diagnosis
of a chest with anything under 10-bit.  I would never consider doing it
without controlled ambient lighting and the very high brightness of a
radiology-oriented monitor.  These are the norm in any reading room.  But
for ultrasound an 8-bit display with calibration and proper lighting
should suffice.  

There is also a big difference between diagnosis and other uses.

Further, just a warning about safety regulations.  The FDA regulates
medical devices under the Safe Medical Device Act.  You might be an
unwitting manufacturer. The definitions of device and manufacturer are
very broad. So check whether these safety laws apply to you. It is a
very serious crime to ship a medical device without a serious effort to
comply with the laws. It is a far less serious crime to make mistakes in
compliance. The website at http://www.fda.gov/cdrh/overview.html is a
good starting point.  FDA regulations include efficacy rules, and
questions about necessary display quality might be an efficacy question.

R Horn 




Re: [Xpert]Re: Re: XIE and PEX5?

2002-06-10 Thread rjh

On 10 Jun, Juliusz Chroboczek wrote:
 MH Even moreso I hope nobody uses PEX.  ;o)
 
 Could somebody smart explain why PHIGS was abandoned?  Is that because
 OpenGL is strictly more expressive?
 
 (Not being argumentative here, but genuinely incompetent and curious.)

My speculation is that it is because PHIGS is highly specialized towards
the needs of CAD/CAM operations and rather deficient from the
perspective of games.  CAD/CAM has remained a niche market.  Within the
CAD/CAM market there is little interest in separating the display from
the processing, so PHIGS is of interest only to the extent that the
common X implementation outperforms the equivalent implementation within
the CAD/CAM application.  Using the X primitives directly is probably
superior, provided you do not suffer round-trip delays during direct
interaction between the application and mouse motion.

Of course I stopped paying attention to PHIGS when I left the CAD/CAM
market.  There may be other reasons as well.

R Horn




Re: [Xpert]Re: libxml needs iconv.h ?

2002-02-20 Thread rjh

On 19 Feb, Keith Packard wrote:
 
 Around 14 o'clock on Feb 19, [EMAIL PROTECTED] wrote:
 
 I know that XML is the computer religion du jour, but as David and
 others have mentioned, the bulk of the problems are from situations
 where users are required to adjust the configuration files to deal with
 problems in the X server.
 
 The trouble is that many of us instinctively recoil from any wildly popular
 new mechanism assuming that it's being misapplied again and again.  Most of
 the time, this is true.  I've been considering this issue for several 
 years and have had many separate people suggest that perhaps XML is a good 
 fit for this job; please bear with me as we try to rid ourselves of our 
 common prejudices against anything we see in the popular press.
 

My concern with XML is whether we will encourage mis-use and unrealistic
expectations.  I have seen completely unrealistic expectations from XML
believers lead to very bad decisions.  I do not want to encourage that.

As a format, XML is fine.  I would prefer S-expressions, but a
restricted XML schema is merely S-expressions with pointy delimiters
instead of curved delimiters.  S-expressions are considered utterly
obscure and immensely difficult while XML is intuitively obvious to the
naive observer.  This is not due to the difference between pointy and
curved delimiters.  It is the impact of herd mentality in computers. I
understand S-expressions, both their strengths and weaknesses, and I
think that S-expressions are a good choice for configuration files.
Giving them pointy delimiters does not change that.

If the schema is designed and specified so that none of the XML
extensions beyond the S-expression analogues are permitted, then we can
also consider using a tiny customized XML parser when we want to reduce
the footprint of X.  S-expression parsing can be done in a very compact
form, and the equivalent restricted XML parser would be similarly
compact and fast.
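To make the equivalence concrete, here is a toy sketch (the
Device/Driver/BusID names are hypothetical, loosely modeled on
XF86Config sections): a deliberately tiny S-expression reader, and a
reader for the restricted XML analogue, producing the same tree:

```python
import xml.etree.ElementTree as ET

def parse_sexp(text):
    """A deliberately tiny S-expression reader: atoms and nested lists."""
    tokens = text.replace("(", " ( ").replace(")", " ) ").split()
    def read(i):
        if tokens[i] == "(":
            node, i = [], i + 1
            while tokens[i] != ")":
                child, i = read(i)
                node.append(child)
            return node, i + 1
        return tokens[i], i + 1
    return read(0)[0]

def xml_to_sexp(elem):
    """Flatten a restricted XML tree (tags and text only, no attributes)
    into the same nested-list shape."""
    node = [elem.tag]
    if elem.text and elem.text.strip():
        node.append(elem.text.strip())
    node.extend(xml_to_sexp(c) for c in elem)
    return node

sexp = parse_sexp('(Device (Driver r128) (BusID PCI:1:0:0))')
xml = xml_to_sexp(ET.fromstring(
    '<Device><Driver>r128</Driver><BusID>PCI:1:0:0</BusID></Device>'))
print(sexp == xml)   # same tree, pointy vs curved delimiters
```

The S-expression reader is a dozen lines; a hand-written parser for the
equally restricted XML subset would be only modestly larger, which is
the footprint argument above.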

I am using the term schema deliberately.  I think that the schema is the
proper route, not the DTD.  The differences are small but important.
Schema give you a little more rope with which to hang yourself, but the
DTD already gave you more than enough to get in real trouble.  Schema
permit you to document the contents and constraints much more clearly
than DTDs and that is important.  If we stick to S-expression
equivalents, the schema can be converted into a DTD automatically so
that older tools that only support DTDs can be kept happy.  

R Horn




Re: [Xpert]Re: libxml needs iconv.h ?

2002-02-19 Thread rjh

I know that XML is the computer religion du jour, but as David and
others have mentioned, the bulk of the problems are from situations
where users are required to adjust the configuration files to deal with
problems in the X server.  We should eliminate those first, because they
are problems regardless of config file format.

Then we should design our schema before releasing anything.  I have
dealt with various schema and have designed some XML schema.  Ad hoc,
inconsistent XML schema are worse than the current configuration file
format.  We need to have a very clear schema structure, clear rules on
how the schema is extended for particular driver features, etc.  If we
don't do this up front, then the distributed nature of X development
will lead to an incoherent and inconsistent XML mess.

This does not mean we should not do some prototypes and learning first.
It does mean that these should be recognized as such.  We should learn
from them and warn people that we will scrap the early XML formats later
(after we have learned what works well and what works poorly).  Then we
should make a coherent schema design that is well described, so that
independent developers create sub-schema that fit together in a
consistent manner.

R Horn




Re: [Xpert]Laptop backlight question

2002-01-06 Thread rjh

On  5 Jan, Kenneth Crudup wrote:
 On Sat, 5 Jan 2002 [EMAIL PROTECTED] wrote:
 
 I've lost track of which chip you are dealing with.
 
 S3 Virge MX.
 
 For the R128 chip I found that the DPMS only shuts down video output.
 
 I'd expect that, but I was under the impression that there was code
 specifically for the MX (an LCD-specific chip) that could shut down
 the associated backlight, too.

Not in the Xfree driver for R128.  The DPMS code shuts down the CRTC.
This certainly shuts down video, but it has nothing special for LCD.  If
the MX or LCD electronics uses video shutdown to trigger LCD shutdown
the same way that the CRT does, then the LCD will shut down.  If not,
then it will stay on.

The change that I made was to add code that instructs the R128 to enter
and exit S3 state when DPMS enters and exits shutdown mode.  I do not
know everything that it does for S3 state, but it is more than just
shutting down the CRTC.

R Horn




Re: [Xpert]Laptop backlight question

2002-01-05 Thread rjh

On  4 Jan, Kenneth Crudup wrote:
 On Fri, 4 Jan 2002, Kevin Brosius wrote:
 
 Does 'xset dpms force off' turn off the backlight on that machine?
 
 First thing I tried.
 

I've lost track of which chip you are dealing with.  For the R128 chip I
found that the DPMS only shuts down video output.  It does not activate
any of the other power saving modes.  This is for a good reason (at
least as default).  Undesirable interactions were reported with the APM
support when the X DPMS code used these other commands.

In my case I do not want APM or ACPI to activate.  I am cutting power in
a server environment, and APM/ACPI is aimed more at single-user
systems like laptops.  Also, the Linux APM/ACPI support remains flaky.
So I modified the R128 DPMS code to not only cut the video output, but
also to activate the shutdown for other parts of the controller.  I can
post the diffs if this helps.

This does raise a question for X users in general.  Should there be an
option to enable a more aggressive power management when DPMS is used?
This would probably interfere with APM/ACPI, so the option would need
appropriate warnings.  But it would help the multi-user systems that
need a more selective power management control.  In my situation, power
management consists of killing sound and display after idle, and
controlling idle spindown time on selected disk drives.  Some disks must
remain spinning to meet response goals.  Others can be spun down.  The
CPU must always remain ready for immediate response.

R Horn




Re: [Xpert]EGA-Cards on XFree ?

2001-09-19 Thread rjh

On 19 Sep, division by zero wrote:
 
 I have a nice display (lcd) with 640x480 but it needs to be driven by
 digital signals. My idea was to take an old EGA-Card I have here (AST3G)
 and if i'm right, EGA is digital. Now my question is, does anyone know,
 if EGA-cards are supported in anyway by XFree ? (And which one are
 supported ?)
 
 xrXO EGA? EGA is _not_ digital, it's _very_ analog, and pretty old. Far as I
 xrXO know, even XFree 3.3.x didn't support anything less than a standard VGA
 xrXO display device. Are you sure it's an EGA card you have?
 
 If you have an older LCD display, and you are SURE it is TTL/EGA/CGA
 and that it will sync to EGA horizontal scan, then it should work.
 But I couldn't even begin to speculate about Xfree86 support.

There were also some LCD and plasma displays built that came with their
own simple interface card that presented an EGA interface.  I used to
have one of these.  But the LCD/plasma interface was a proprietary
digital interface, not a normal video signal.  If you have one of these
old LCD/plasma systems you need to have the original EGA interface card
that came with it.  You cannot substitute a different card.

R Horn
