Re: [Xpert]pre-allocated colormap?

2001-09-24 Thread Juliusz Chroboczek

KP> Can DPS handle a non-uniform set of colors?

No.  The protocol requires a colour cube (not necessarily cubic) and a
gray ramp (possibly being the diagonal of the colour cube); in
addition, the relevant colourmap indices need to be in arithmetic
progression.

The current implementation additionally requires that the colour cube
be cubic.

KP> If you made DPS use Render, these questions would all be moot :-)

I'll most definitely move DPS to be above RENDER when I next have some
time to work on it (something that hasn't happened in over six months,
though).  But I'll still use Ghostscript's code for dithering.

Juliusz



Re: [Xpert]pre-allocated colormap?

2001-09-23 Thread Keith Packard


Around 16 o'clock on Sep 23, Juliusz Chroboczek wrote:

> The 9 + 4 x 4 x 4 scheme gives excellent results (to my eyes) while
> leaving a lot of colour cells free.  I was suggesting that you should
> try implementing a similar colourmap for RENDER, and see what the
> results look like.

I think a 4x4x4 cube has too many dark entries; I'd rather weight the
values so that it uses more light entries instead.

Can DPS handle a non-uniform set of colors?  Render is happy with 
essentially any set, but I suspect DPS will want to use relatively 
simplistic dithering...

> Just like you, I consider 8 bit colour displays to be obsolete, and I
> am not interested in putting much effort into making DPS work well on
> those.  (8 and even 4 bit grayscale displays are quite nice, on the
> other hand.)

If you made DPS use Render, these questions would all be moot :-)

[EMAIL PROTECTED]   XFree86 Core Team   SuSE, Inc.





Re: [Xpert]pre-allocated colormap?

2001-09-23 Thread Juliusz Chroboczek

>> On an eight-bit visual, XDPS will allocate a 9-level gray ramp, and a
>> 4x4x4 colour cube, for a total of 73 colourmap entries.  (Colour
>> allocation happens on the client side in XDPS.)

>> Can RENDER deal with such a configuration?

KP> RENDER must have the colors allocated within the server; there's no 
KP> provision for the extension to use a client-provided colormap.

I understand that.  I was asking whether RENDER can deal with a gray
ramp of a different size than the diagonal of the colour cube.

The 9 + 4 x 4 x 4 scheme gives excellent results (to my eyes) while
leaving a lot of colour cells free.  I was suggesting that you should
try implementing a similar colourmap for RENDER, and see what the
results look like.

> if you happened to create a colormap compatible with DPS for Render

Just like you, I consider 8 bit colour displays to be obsolete, and I
am not interested in putting much effort into making DPS work well on
those.  (8 and even 4 bit grayscale displays are quite nice, on the
other hand.)

Juliusz



Re: [Xpert]pre-allocated colormap?

2001-09-22 Thread Keith Packard


Around 19 o'clock on Sep 22, Juliusz Chroboczek wrote:

> On an eight-bit visual, XDPS will allocate a 9-level gray ramp, and a
> 4x4x4 colour cube, for a total of 73 colourmap entries.  (Colour
> allocation happens on the client side in XDPS.)

> Can RENDER deal with such a configuration?

RENDER must have the colors allocated within the server; there's no 
provision for the extension to use a client-provided colormap.

That being the case, the current render code is happy to use any colormap 
whatsoever, so if you happened to create a colormap compatible with DPS 
for Render, it will use it and DPS can share.

-keith





Re: [Xpert]pre-allocated colormap?

2001-09-22 Thread Juliusz Chroboczek

On an eight-bit visual, XDPS will allocate a 9-level gray ramp, and a
4x4x4 colour cube, for a total of 73 colourmap entries.  (Colour
allocation happens on the client side in XDPS.)

Can RENDER deal with such a configuration?

Juliusz




Re: [Xpert]pre-allocated colormap?

2001-09-16 Thread Mark Vojkovich

On Sun, 16 Sep 2001 [EMAIL PROTECTED] wrote:

> 3) do we have any idea how often things would fail if StaticColor were
> made the default?  I've certainly occasionally run into such servers,
> but not recently, and therefore don't have a feel for how often things break.

   The legacy PseudoColor apps will break.  Most of the time they
don't do any palette tricks; they just allocate cells instead of
colors.  If they had just called XAllocColor they would have worked with
all visuals.  If the default is StaticColor they probably won't
work at any depth anymore.

   I suspect there is another class of applications which try
to allocate cells only if they see a depth 8 visual.  Those would
have previously worked in all depths, but may now break in depth
8.
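
For reference, the shareable read-only path that works on every visual
class looks roughly like this (a minimal sketch; error handling and the
fallback choice are simplifications, not a recommendation):

    #include <X11/Xlib.h>

    /* Shareable, read-only allocation that works on any visual class.
     * The broken apps call XAllocColorCells()/XStoreColors() instead,
     * which only work on read/write visuals. */
    unsigned long
    readonly_pixel(Display *dpy, Colormap cmap,
                   unsigned short r, unsigned short g, unsigned short b)
    {
        XColor c;

        c.red = r; c.green = g; c.blue = b;
        c.flags = DoRed | DoGreen | DoBlue;
        if (!XAllocColor(dpy, cmap, &c))                 /* closest match, shareable */
            return BlackPixel(dpy, DefaultScreen(dpy));  /* crude fallback */
        return c.pixel;
    }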


Mark.




Re: [Xpert]pre-allocated colormap?

2001-09-16 Thread Jim Gettys

1) since monitors vary, I think some slop in matching colors into an existing
preallocated cube is OK.  Presumably we can figure out how much slop is
tolerable.  I don't think rws would be too upset (nor am I) if we
can come up with a defensible description of what we're doing.  It is
unpredictable behaviour that seems really wrong to me.

2) the size of the cube allocated should probably be settable when the
server starts.  We've lived with similar situations with Netscape in the
past, and I think we'll survive in the future this way.

3) do we have any idea how often things would fail if StaticColor were
made the default?  I've certainly occasionally run into such servers,
but not recently, and therefore don't have a feel for how often things break.

4) presumably, we could make it optional how much slop to tolerate; this
interacts with the amount of space left unallocated in some fashion.  Some
amount of unallocated space is needed for antique apps that use writable entries
to run at all without flashing.
  - Jim


--
Jim Gettys
Cambridge Research Laboratory
Compaq Computer Corporation
[EMAIL PROTECTED]




Re: [Xpert]pre-allocated colormap?

2001-09-16 Thread Jim Gettys

Sidik,

I know the image display hackery done for astronomy well: I used to write
this code myself (at SAO, in the OIR group, for some of the early CCD
systems) before I started hacking window systems.  So I can comment on
this from both sides of the issue: as a window system designer and a former astronomy
hacker.

A large part of why pseudocolor exists in X was to support such applications
on the 1-MIPS-class machines of the late 1980's, where displays often sat
behind 2 megabyte/second busses (with a tailwind).  But that is now over a decade
in the past.  We knew then (or at least I did) that pseudocolor was evil and
would eventually go away, but accepted it because certain apps, such as those
used by astronomers and for other imaging applications, were not possible
without it on that class of hardware.

Thankfully, that hardware is now hardware of the (long) past.

Any computer system with PCI-class display hardware is fast enough to
emulate colormap tricks entirely in software on the client side, and that
should be adequate for interactive, real-time contrast
enhancement/display trickery, particularly using
the shared memory extension.
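
A minimal sketch of that client-side approach (it assumes a 32-bpp
TrueColor visual with 8/8/8 masks and an already-attached MIT-SHM image;
names and the LUT shape are illustrative):

    #include <X11/Xlib.h>
    #include <X11/extensions/XShm.h>

    /* Keep the raw CCD frame in memory, map it through a contrast LUT into
     * a TrueColor shared-memory image, and push the result.  Shared-memory
     * setup (shmget/shmat/XShmAttach) is omitted; "lut" holds a precomputed
     * 0..255 value for every possible 16-bit sample. */
    void
    redisplay(Display *dpy, Window win, GC gc, XImage *shm_img,
              const unsigned short *raw, int w, int h,
              const unsigned char lut[65536])
    {
        unsigned int *out = (unsigned int *) shm_img->data;
        int i;

        for (i = 0; i < w * h; i++) {
            unsigned char v = lut[raw[i]];      /* new stretch => just rebuild lut */
            out[i] = ((unsigned int)v << 16) | (v << 8) | v;   /* grey pixel */
        }
        XShmPutImage(dpy, win, gc, shm_img, 0, 0, 0, 0,
                     (unsigned int)w, (unsigned int)h, False);
        XFlush(dpy);
    }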

I also know that most astronomers have access to most of their source code, and
it is, I believe, the right thing in your case to rewrite the app,
since hardware capable of pseudocolor at all is becoming hard to come by.
This doesn't address the case where source is not available.

I'll comment on the general discussions separately.
- Jim

--
Jim Gettys
Cambridge Research Laboratory
Compaq Computer Corporation
[EMAIL PROTECTED]




Re: [Xpert]pre-allocated colormap?

2001-09-15 Thread Keith Packard


Around 16 o'clock on Sep 15, Mark Vojkovich wrote:

>Like by temporarily lowering the sigBits?  Changing current allocation
> behavior makes me uneasy.  Doesn't the specification say something
> about this?

AllocColor:

"This request allocates a read-only colormap entry corresponding to the
 closest RGB values provided by the hardware. It also returns the pixel and
 the RGB values actually used. Multiple clients requesting the same
 effective RGB values can be assigned the same read-only entry, allowing
 entries to be shared."

This has always been interpreted to say that PseudoColor allocations 
return the best possible color and fail if no exactly matching cell is 
available.  And, I'm sure rws would concur that this was the intent.

However, we might consider stretching this to allow matching of existing 
entries that were not precisely the same as the requested value as a way 
to avoid application failure or flashing.  This is what StaticColor 
visuals already do, so it's not as if we'd be breaking new ground here.

The server could allocate a set of fixed colors that would be used to 
match all read-only requests; read-write requests would use the remainder 
of the colormap.  That seems like it would give applications the benefits 
of PseudoColor visuals while forcing other applications to share the 
available read-only cells better than they do today.

It would be easy to do; just make read-only allocations from clients match 
existing read-only cells in the colormap, then let the Render extension 
build the read-only cells and you're all set.
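
Roughly (a sketch only, not the actual server code; the entry layout and
names are made up), the matching step would just be a closest-colour
search over the shared cells:

    /* Satisfy a read-only AllocColor from the existing shared cells by
     * picking the closest one instead of failing. */
    struct cell { unsigned short red, green, blue; int shared; };

    static int
    closest_shared_cell(const struct cell *map, int ncells,
                        unsigned short r, unsigned short g, unsigned short b)
    {
        int i, best = -1;
        double best_d = 1e30;

        for (i = 0; i < ncells; i++) {
            double dr, dg, db, d;
            if (!map[i].shared)              /* leave read/write cells alone */
                continue;
            dr = (double)map[i].red   - r;
            dg = (double)map[i].green - g;
            db = (double)map[i].blue  - b;
            d = dr * dr + dg * dg + db * db;
            if (d < best_d) { best_d = d; best = i; }
        }
        return best;                         /* -1 only if no shared cells exist */
    }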

> Besides, the quality will drop noticeably.

Yes, applications allocating read-only colors will see a significant 
reduction in fidelity. Those same applications usually have an option to 
create their own private colormap, in which case they'll have complete 
access to the available hardware capabilities.

> How about a config file option for the size of Render's color cube?

That's my plan; just specify the maximum number of colors available to 
Render in the config file and make it suffer.  When presented with 0 
colors, it should disable itself.

[EMAIL PROTECTED]   XFree86 Core Team   SuSE, Inc.





Re: [Xpert]pre-allocated colormap?

2001-09-15 Thread Mark Vojkovich

On Sat, 15 Sep 2001, Keith Packard wrote:

> What do others think about making read-only PseudoColor allocations a bit 
> less strict and permitting them to match nearby entries with not exactly 
> the same RGB values?  That would reduce color fidelity while also avoiding
> application failure in many cases.
> 

   Like by temporarily lowering the sigBits?  Changing current allocation
behavior makes me uneasy.  Doesn't the specification say something
about this?  I would have thought that to be illegal.  Besides, the
quality will drop noticeably.  I used to run in 8 bit most of the
time on my 2 Meg cards.  There was a big difference between 6 and
8 sigBits in xv and Netscape.

   How about a config file option for the size of Render's color cube?


Mark.




Re: [Xpert]pre-allocated colormap?

2001-09-15 Thread Keith Packard


Around 15 o'clock on Sep 15, Mark Vojkovich wrote:

>   3) No preallocated colors by default. 
> 
> It might be reasonable to assume users stuck in PseudoColor are there
> because they have antiquated hardware, in which case, they don't have
> very high expectations for things like anti-aliasing.  Either that
> or they are in PseudoColor for the palettes, in which case, they
> might resent preallocations of any type.

That's equivalent to disabling the Render extension, which should probably 
be an option.  As Render operates strictly with RGBA values, not providing 
a mapping for pixels on the screen means there's no way to draw anything.

I'd argue that this is not a reasonable default, though; I hope that most
applications will be able to count on the existence of the Render
extension in the future and avoid significant client-side complexity as a
result.

The question then is what the right default is; I think Render should be 
enabled, but probably with fewer colors than it consumes right now.

What do others think about making read-only PseudoColor allocations a bit 
less strict and permitting them to match nearby entries with not exactly 
the same RGB values?  That would reduce color fidelity while also avoiding
application failure in many cases.

[EMAIL PROTECTED]   XFree86 Core Team   SuSE, Inc.





Re: [Xpert]pre-allocated colormap?

2001-09-15 Thread Mark Vojkovich

On Sat, 15 Sep 2001, Keith Packard wrote:

> I see two easy solutions for this problem:
> 
>   1)  Have Render use fewer colors from the default colormap
>   2)  Have XFree86 promote StaticColor as the usual default visual

   3) No preallocated colors by default. 

It might be reasonable to assume users stuck in PseudoColor are there
because they have antiquated hardware, in which case, they don't have
very high expectations for things like anti-aliasing.  Either that
or they are in PseudoColor for the palettes, in which case, they
might resent preallocations of any type.


Mark.




Re: [Xpert]pre-allocated colormap?

2001-09-15 Thread Mark Vojkovich

On Sat, 15 Sep 2001, Sidik Isani wrote:

> Keith Packard wrote:
> |
> |
> |Around 10 o'clock on Sep 15, Sidik Isani wrote:
> |
> |>   Unfortunately, there is.  We look at astronomy CCD images, and
> |>   the visualization tools like to manipulate a colormap to give a
> |>   quick contrast adjustment.
> |
> |If you have source, you could consider fixing these to use DirectColor 
> |instead; that would permit adjusting the luminance curves on the fly.
> |But, I suspect you also like to switch colors around, which this would not 
> |do.
> 
>   It may be enough.  Usually we look at our images with simple
>   grey scales, and just need a way to exaggerate contrast to bring
>   out features.  I'd like to learn more about DirectColor.  How much
> |>   control does it give over luminance curves?  Which XFree86 drivers
>   support it (at depths > 8bpp)?

   Most drivers support this for depths > 8bpp.  NVIDIA and Matrox
at least, and probably a lot of others.  TrueColor is essentially three
palette lookups, with the indices for the 3 channels (Red, Green and
Blue) being extracted from the pixel according to the RGB masks.
The difference between TrueColor and DirectColor is that you can
change the lookup tables with DirectColor.  Still, there is only
one hardware palette, so you get flashing when changing focus between
a TrueColor and a DirectColor window.  If you just want to change
the contrast a little this is not so annoying.  Switching colors
around or heavy thresholding, while possible with DirectColor,
will suffer the same flashing consequences as private PseudoColor maps.
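
For example, a contrast stretch on DirectColor could rewrite the three
lookup tables roughly like this (a sketch; it assumes a 24-bit
DirectColor visual with 8-bit 0xff0000/0xff00/0xff masks and a colormap
created with XCreateColormap(dpy, win, visual, AllocAll)):

    #include <X11/Xlib.h>

    /* Stretch [lo,hi] to full range in the per-channel LUTs, leaving the
     * image data untouched. */
    void
    set_contrast(Display *dpy, Colormap cmap, int lo, int hi)
    {
        XColor ramp[256];
        int i;

        for (i = 0; i < 256; i++) {
            int v = (i <= lo) ? 0 : (i >= hi) ? 255
                  : 255 * (i - lo) / (hi - lo);
            ramp[i].pixel = ((unsigned long)i << 16) | (i << 8) | i;
            ramp[i].red = ramp[i].green = ramp[i].blue = v * 257;  /* 8 -> 16 bit */
            ramp[i].flags = DoRed | DoGreen | DoBlue;
        }
        XStoreColors(dpy, cmap, ramp, 256);
        XFlush(dpy);
    }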


> 
> |I assume these tools are creating their own colormaps; if this is the 
> |case, you might consider using StaticColor for the default visual, this 
> |would avoid flashing among the other windows while still permitting the 
> |display to run at 8 bits.
> 
>   Thanks.  I'll give that a try too.

   But of course, the StaticColor palette is a simple color cube
and might not give you the range of colors you need.  Certainly not
a good number of greys.  And in the overlay mode only PseudoColor and
GrayScale visuals are offered, not StaticColor.  That is because the
transparency key steals a color and only in the read/write maps can
the key be preallocated.



Mark.




Re: [Xpert]pre-allocated colormap?

2001-09-15 Thread Keith Packard


Around 11 o'clock on Sep 15, Sidik Isani wrote:

>   It may be enough.  Usually we look at our images with simple
>   grey scales, and just need a way to exaggerate contrast to bring
>   out features.  I'd like to learn more about DirectColor.  How much
>   control does it give over luminance curves?  Which XFree86 drivers
>   support it (at depths > 8bpp)?

DirectColor provides a LUT for each component -- the 256 possible 
values for each component are mapped to any of 256 possible inputs to the 
DAC.  Traditionally, this is used to compensate for the monitor response 
curve while exposing linear intensity to applications.

Most PC graphics hardware supports this mode, although the i128 driver
doesn't currently expose it.  I don't know if this is a hardware 
limitation or just missing code in the driver though.

The apm, ati, mga, nv, s3, savage drivers look like they do support 
DirectColor; I don't know about the others.

>  Would "x2x" plus 2 screen layouts do the trick?

I don't know; I'm not very familiar with x2x.

>  That sounds like a reasonable solution.  Do any popular programs
>  require Render?  Can I just leave this extension out completely
>  for this application?

As Render is still new, most applications can work without it.  It 
provides for anti-aliased graphics and RGBA image compositing, and on your 
digital monitors performs sub-pixel optimized rendering to make text 
really sharp looking.  I'll see about adding a configuration option to 
tune the number of pixels used; it will work with only black & white which 
are always available.

[EMAIL PROTECTED]   XFree86 Core Team   SuSE, Inc.





Re: [Xpert]pre-allocated colormap?

2001-09-15 Thread Sidik Isani

Keith Packard wrote:
|
|
|Around 10 o'clock on Sep 15, Sidik Isani wrote:
|
|>   Unfortunately, there is.  We look at astronomy CCD images, and
|>   the visualization tools like to manipulate a colormap to give a
|>   quick contrast adjustment.
|
|If you have source, you could consider fixing these to use DirectColor 
|instead; that would permit adjusting the luminance curves on the fly.
|But, I suspect you also like to switch colors around, which this would not 
|do.

  It may be enough.  Usually we look at our images with simple
  grey scales, and just need a way to exaggerate contrast to bring
  out features.  I'd like to learn more about DirectColor.  How much
  control does it give over luminance curves?  Which XFree86 drivers
  support it (at depths > 8bpp)?

|I assume these tools are creating their own colormaps; if this is the 
|case, you might consider using StaticColor for the default visual, this 
|would avoid flashing among the other windows while still permitting the 
|display to run at 8 bits.

  Thanks.  I'll give that a try too.

|> Since we had some Matrox cards lying around, 
|> I gave the Overlay feature a try and our applications work well with that.
|> We'd have to order some PCI versions to be able to plug three in, and we'd
|> have to use multi-link converters on each monitor.
|
|Analog signals to your digital monitors.  Yuck.

  Indeed.  But Matrox has a digital version (or add-on?) to the card.
  It would still require the use of the multi-link to go from one
  digital interface to another though.  Yuck.

|>  If we run only one screen in Overlay mode and exclude it from the
|>  Xinerama desktop, I guess that could work, but we'd prefer one
|>  logical screen, of course, even if dragging a pseudocolor window
|>  out of the Overlay'd screen made it appear black/garbage on the
|>  other monitors which don't support Overlay.
|
|Xinerama exposes the intersection of the screen capabilities, not the 
|union.  It would be a significant undertaking to change that.  It also 
|doesn't allow you to split things so that some screens are xinerama'd and 
|others are not.

  Would "x2x" plus 2 screen layouts do the trick?

|In the long term, however, the RandR extension will allow unsupported 
|visual classes to be emulated in software.  This, in conjunction with 
|Xinerama, would allow you to use one card with hardware overlays while 
|emulating the overlay in software on the other two cards.  I'm still 
|struggling over how to make this work with the existing XFree86 
|architecture though; don't expect news on this front anytime soon.

  Ok, good to know though.

|I think that for now, you should whack the number of colors reserved for
|the Render extension; eventually that will probably make it into the
|configuration file.

  That sounds like a reasonable solution.  Do any popular programs
  require Render?  Can I just leave this extension out completely
  for this application?

Thanks,

- Sidik



Re: [Xpert]pre-allocated colormap?

2001-09-15 Thread Keith Packard


Around 10 o'clock on Sep 15, Sidik Isani wrote:

>   Unfortunately, there is.  We look at astronomy CCD images, and
>   the visualization tools like to manipulate a colormap to give a
>   quick contrast adjustment.

If you have source, you could consider fixing these to use DirectColor 
instead; that would permit adjusting the luminance curves on the fly.
But, I suspect you also like to switch colors around, which this would not 
do.

I assume these tools are creating their own colormaps; if this is the 
case, you might consider using StaticColor for the default visual; this
would avoid flashing among the other windows while still permitting the
display to run at 8 bits.
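
(For the depth-8 case that's roughly the following in the Screen section
of XF86Config-4; this is a sketch from memory, so check the XF86Config
man page rather than trusting it verbatim:)

    Section "Screen"
        ...
        DefaultDepth 8
        SubSection "Display"
            Depth  8
            Visual "StaticColor"
        EndSubSection
    EndSection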

> Since we had some Matrox cards lying around, 
> I gave the Overlay feature a try and our applications work well with that.
> We'd have to order some PCI versions to be able to plug three in, and we'd
> have to use multi-link converters on each monitor.

Analog signals to your digital monitors.  Yuck.

>  If we run only one screen in Overlay mode and exclude it from the
>  Xinerama desktop, I guess that could work, but we'd prefer one
>  logical screen, of course, even if dragging a pseudocolor window
>  out of the Overlay'd screen made it appear black/garbage on the
>  other monitors which don't support Overlay.

Xinerama exposes the intersection of the screen capabilities, not the 
union.  It would be a significant undertaking to change that.  It also 
doesn't allow you to split things so that some screens are xinerama'd and 
others are not.

In the long term, however, the RandR extension will allow unsupported 
visual classes to be emulated in software.  This, in conjunction with 
Xinerama, would allow you to use one card with hardware overlays while 
emulating the overlay in software on the other two cards.  I'm still 
struggling over how to make this work with the existing XFree86 
architecture though; don't expect news on this front anytime soon.

I think that for now, you should whack the number of colors reserved for
the Render extension; eventually that will probably make it into the
configuration file.

[EMAIL PROTECTED]   XFree86 Core Team   SuSE, Inc.





Re: [Xpert]pre-allocated colormap?

2001-09-15 Thread Sidik Isani


|Around 9 o'clock on Sep 15, Sidik Isani wrote:
|
||>   With 8 bpp pseudocolor, I'm finding that simply starting the
||>   X-Server allocates almost all the color cells in the default
||>   colormap.  There are only about 12 free cells, and most programs
||>   either fail or install a private colormap.  Is there an extension
||>   which may be doing this?
|
|Yes, the Render extension allocates most of your colormap now.  I was 
|planning on making the number it consumed configurable but I got 
|sidetracked by other work temporarily.  If you build your own X server, 
|you can easily hack this number in the source; look at
|
|   programs/Xserver/render/miindex.c:miInitIndexed
|
|What do others think we should do about this?  One possibility would be to 
|bend the protocol and match incoming requests to this palette when the 
|colormap was otherwise filled; that would eliminate color flashing while 
|still providing reasonable color matching.  The protocol is rather vague 
|on when this kind of sharing will occur; perhaps some less stringent 
|metric could be applied that allowed more distant colors to be used.
|
|The other option is to use StaticColor as your default visual; read-only 
|allocations will always succeed.  Is there some reason you can't run your 
|display at 16 or 24 bits?

Hello Keith -

  Unfortunately, there is.  We look at astronomy CCD images, and
  the visualization tools like to manipulate a colormap to give a
  quick contrast adjustment.  Many will not even run without a
  pseudocolor visual.  We are using 3 1600SW's for the task, and
  it is a real shame to run them at 8-bits!  Since we had some
  Matrox cards lying around, I gave the Overlay feature a try and
  our applications work well with that.  We'd have to order some
  PCI versions to be able to plug three in, and we'd have to use
  multi-link converters on each monitor.  (The power strip with all
  the AC adapters plugged in for this will become a fire hazard. ;-)

  Andrew Aitchison just posted that the Oxygen VX1-1600SW also
  has the Overlay feature (thanks!) so an interesting, but pricey
  solution for us might be to buy six of those (since we have two
  workstations which we'd like to keep the same).  This would
  allow direct digital connections to our 1600SWs, like we have
  now with the i128.

  If we run only one screen in Overlay mode and exclude it from the
  Xinerama desktop, I guess that could work, but we'd prefer one
  logical screen, of course, even if dragging a pseudocolor window
  out of the Overlay'd screen made it appear black/garbage on the
  other monitors which don't support Overlay.  I know it's a hack
  (but one which saves us about $1400) ... is it possible?

Thanks for the help,

- Sidik



Re: [Xpert]pre-allocated colormap?

2001-09-15 Thread Keith Packard


Around 12 o'clock on Sep 15, Mark Vojkovich wrote:

> Can Render allocate the colors as needed and use whatever it
> gets?  This would be more consistent with traditional PseudoColor
> behavior.  That is, first come, first served, and if you don't like that,
> use a private map.

Hmm.  Not with the current protocol.  Nor do I think it possible in 
general.

As you know, pixels don't have RGB values in the core protocol.  Render
imposes RGB values through PictFormats.  For indexed PictFormats, it
provides for a static pixel<->RGB mapping.

For each indexed visual (i.e. those other than Direct/True), Render creates
an indexed PictFormat and initializes the static mapping by allocating
colors from an appropriate colormap.  For the default visual, it uses the
default colormap.  For non-default visuals, it creates a private colormap
that can be shared among all windows using that visual.

As any RGB value can be generated as a result of an image composition
operation, Render needs to be able to select an appropriate pixel value 
with the assurance that it will be available.  Attempting to do this 
dynamically would require windows to switch colormaps on the fly -- if 
Render was unable to allocate the desired color, it would have little 
choice but to force the window to use a different colormap.

There isn't any provision to update the mapping dynamically; the data
are provided to applications without any asynchronous notification of
change.
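
For what it's worth, a client can read that static mapping; assuming the
libXrender binding is named XRenderQueryPictIndexValues (check your
Xrender.h), something like:

    #include <X11/extensions/Xrender.h>
    #include <stdio.h>

    /* Dump Render's fixed pixel<->RGBA mapping for an indexed PictFormat.
     * The list never changes for the life of the server, which is the
     * point above. */
    void
    dump_index_values(Display *dpy, XRenderPictFormat *fmt)
    {
        int n, i;
        XIndexValue *iv = XRenderQueryPictIndexValues(dpy, fmt, &n);

        for (i = 0; i < n; i++)
            printf("pixel %lu -> r %04x g %04x b %04x a %04x\n",
                   iv[i].pixel, iv[i].red, iv[i].green, iv[i].blue, iv[i].alpha);
        XFree(iv);
    }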

Given the ever-shrinking number of pseudo-color displays, I felt it was 
more important that applications developed on TrueColor hardware work 
without error on PseudoColor than that the PseudoColor hardware be used to 
its best advantage.  We've learned the hard way that developers make
incautious assumptions about how X visuals work; many PseudoColor 
applications still fail on TrueColor displays.  I was attempting to avoid 
the reverse problem when designing this new extension.

The extension could be changed to allocate the color cube when the first 
application requested information about its contents; that would make
environments not using the Render extension work just like they did 
before.  However, I think that's a band-aid solution; users would have to 
avoid most modern applications (all of KDE and Gnome) to make this work.

I see two easy solutions for this problem:

1)  Have Render use fewer colors from the default colormap
2)  Have XFree86 promote StaticColor as the usual default visual

The second would eliminate flashing for the vast bulk of applications while
causing strange application failures where assumptions about the default 
visual class were made.  The first would provide for Render applications 
to run at reduced color fidelity, which may be acceptable for people
continuing to use older hardware incapable of running deeper than 8 bits.

[EMAIL PROTECTED]   XFree86 Core Team   SuSE, Inc.





Re: [Xpert]pre-allocated colormap?

2001-09-15 Thread Mark Vojkovich

On Sat, 15 Sep 2001, Keith Packard wrote:

> 
> Around 9 o'clock on Sep 15, Sidik Isani wrote:
> 
> |>   With 8 bpp pseudocolor, I'm finding that simply starting the
> |>   X-Server allocates almost all the color cells in the default
> |>   colormap.  There are only about 12 free cells, and most programs
> |>   either fail or install a private colormap.  Is there an extension
> |>   which may be doing this?
> 
> Yes, the Render extension allocates most of your colormap now.  I was 
> planning on making the number it consumed configurable but I got 
> sidetracked by other work temporarily.  If you build your own X server, 
> you can easily hack this number in the source; look at
> 
>   programs/Xserver/render/miindex.c:miInitIndexed
> 
> What do others think we should do about this?  One possibility would be to 
> bend the protocol and match incoming requests to this palette when the 
> colormap was otherwise filled; that would eliminate color flashing while 
> still providing reasonable color matching.  The protocol is rather vague 
> on when this kind of sharing will occur; perhaps some less stringent 
> metric could be applied that allowed more distant colors to be used.

   Can Render allocate the colors as needed and use whatever it
gets?  This would be more consistent with traditional PseudoColor
behavior.  That is, first come, first served, and if you don't like that,
use a private map.


Mark.





Re: [Xpert]pre-allocated colormap?

2001-09-15 Thread Mark Vojkovich

On Sat, 15 Sep 2001, Sidik Isani wrote:

> Mark Vojkovich wrote:
> |On Fri, 14 Sep 2001, Sidik Isani wrote:
> |
> |> Hello -
> |> 
> |>   With 8 bpp pseudocolor, I'm finding that simply starting the
> |>   X-Server allocates almost all the color cells in the default
> |>   colormap.  There are only about 12 free cells, and most programs
> |>   either fail or install a private colormap.  Is there an extension
> |>   which may be doing this?
> |
> |   Not that I know of.  Run xcmap to visualize the colormap.
> |When the server starts up you will get approximately 2 colors
> |(black and white).  I don't think xterm should be allocating
> |very much of the palette.  If you find differently, this is
> |something that should be addressed.
> 
> Hello Mark -
> 
>   2 colors (black and white) is indeed what I get when I run 4.1.0.
>   But if I swap that server out for the CVS one (which I need in order
>   to have Xinerama working on the i128 driver) I get black, white, and
>   exactly 242 colors in a standard-looking colormap pattern.
> 
>   I wonder if the fact that I've compiled my X-Server without modules
>   is causing some extension to initialize itself?  I tried again
>   without BuildCup, but same problem.  Is there a call, such as
>   AllocColor(), to which I could add some debugging to track this
>   down?
> 

   Hmmm.  This appears to be caused by the Render extension.
Have PictureInitIndexedFormats() in xc/programs/Xserver/render/miindex.c
return immediately without doing anything to see if that fixes it.


Mark.




Re: [Xpert]pre-allocated colormap?

2001-09-15 Thread Keith Packard


Around 9 o'clock on Sep 15, Sidik Isani wrote:

|>   With 8 bpp pseudocolor, I'm finding that simply starting the
|>   X-Server allocates almost all the color cells in the default
|>   colormap.  There are only about 12 free cells, and most programs
|>   either fail or install a private colormap.  Is there an extension
|>   which may be doing this?

Yes, the Render extension allocates most of your colormap now.  I was 
planning on making the number it consumed configurable but I got 
sidetracked by other work temporarily.  If you build your own X server, 
you can easily hack this number in the source; look at

programs/Xserver/render/miindex.c:miInitIndexed

What do others think we should do about this?  One possibility would be to 
bend the protocol and match incoming requests to this palette when the 
colormap was otherwise filled; that would eliminate color flashing while 
still providing reasonable color matching.  The protocol is rather vague 
on when this kind of sharing will occur; perhaps some less stringent 
metric could be applied that allowed more distant colors to be used.

The other option is to use StaticColor as your default visual; read-only 
allocations will always succeed.  Is there some reason you can't run your 
display at 16 or 24 bits?

[EMAIL PROTECTED]   XFree86 Core Team   SuSE, Inc.





Re: [Xpert]pre-allocated colormap?

2001-09-15 Thread Sidik Isani

Mark Vojkovich wrote:
|On Fri, 14 Sep 2001, Sidik Isani wrote:
|
|> Hello -
|> 
|>   With 8 bpp pseudocolor, I'm finding that simply starting the
|>   X-Server allocates almost all the color cells in the default
|>   colormap.  There are only about 12 free cells, and most programs
|>   either fail or install a private colormap.  Is there an extension
|>   which may be doing this?
|
|   Not that I know of.  Run xcmap to visualize the colormap.
|When the server starts up you will get approximately 2 colors
|(black and white).  I don't think xterm should be allocating
|very much of the palette.  If you find differently, this is
|something that should be addressed.

Hello Mark -

  2 colors (black and white) is indeed what I get when I run 4.1.0.
  But if I swap that server out for the CVS one (which I need in order
  to have Xinerama working on the i128 driver) I get black, white, and
  exactly 242 colors in a standard-looking colormap pattern.

  I wonder if the fact that I've compiled my X-Server without modules
  is causing some extension to initialize itself?  I tried again
  without BuildCup, but same problem.  Is there a call, such as
  AllocColor(), to which I could add some debugging to track this
  down?

Thanks,

- Sidik



Re: [Xpert]pre-allocated colormap?

2001-09-15 Thread Mark Vojkovich

On Fri, 14 Sep 2001, Sidik Isani wrote:

> Hello -
> 
>   With 8 bpp pseudocolor, I'm finding that simply starting the
>   X-Server allocates almost all the color cells in the default
>   colormap.  There are only about 12 free cells, and most programs
>   either fail or install a private colormap.  Is there an extension
>   which may be doing this?

   Not that I know of.  Run xcmap to visualize the colormap.
When the server starts up you will get approximately 2 colors
(black and white).  I don't think xterm should be allocating
very much of the palette.  If you find differently, this is
something that should be addressed.
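
If you don't have xcmap handy, a rough sketch like this (binary search
over XAllocColorCells; it only gives a snapshot, since other clients can
race with it) will tell you how many read/write cells are still free:

    #include <X11/Xlib.h>

    /* Largest number of read/write cells still allocatable from the
     * default colormap; the trial allocation is freed again. */
    int
    count_free_cells(Display *dpy)
    {
        Colormap cmap = DefaultColormap(dpy, DefaultScreen(dpy));
        unsigned long pixels[256], planes[1];
        int lo = 0, hi = 256;

        while (lo < hi) {
            int mid = (lo + hi + 1) / 2;
            if (XAllocColorCells(dpy, cmap, False, planes, 0,
                                 pixels, (unsigned int)mid)) {
                XFreeColors(dpy, cmap, pixels, mid, 0);
                lo = mid;
            } else {
                hi = mid - 1;
            }
        }
        return lo;
    }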


Mark.




Re: [Xpert]pre-allocated colormap?

2001-09-15 Thread Dr Andrew C Aitchison

On Fri, 14 Sep 2001, Sidik Isani wrote:

> P.S.: Other than mga, does anything else support "Overlay"?

The glint driver supports overlay on appropriate 3Dlabs cards
(Oxygen VX1-1600SW is the only one I tried, and it works).
*Some* S3 cards have suitable hardware (TVP3026 or RGB526 ramdacs,
possibly other members of those chip families); I don't think that
the drivers support the feature yet.

>   What happens if I include one mga screen with "overlay"
>   in a xinerama configuration where the other screens are
>   different cards not supporting overlay?

Xinerama only exports features available on all screens, so you would
lose overlay in this case.

-- 
Dr. Andrew C. Aitchison Computer Officer, DPMMS, Cambridge
[EMAIL PROTECTED]   http://www.dpmms.cam.ac.uk/~werdna




[Xpert]pre-allocated colormap?

2001-09-14 Thread Sidik Isani

Hello -

  With 8 bpp pseudocolor, I'm finding that simply starting the
  X-Server allocates almost all the color cells in the default
  colormap.  There are only about 12 free cells, and most programs
  either fail or install a private colormap.  Is there an extension
  which may be doing this?

  This also happens when starting the server with:

xinit /usr/X11R6/bin/xterm -- :0

  I'm running XFree4 and the i128 driver from yesterday's CVS.

Thanks for your help,

- Sidik

P.S.: Other than mga, does anything else support "Overlay"?
  What happens if I include one mga screen with "overlay"
  in a xinerama configuration where the other screens are
  different cards not supporting overlay?