Re: Linux OpenGL ABI discussion

2005-09-30 Thread Helge Hafting

Nicolai Haehnle wrote:

The real issue with an IHV-supplied libGL.so is mixing vendors' graphics 
cards. As an OpenGL user (i.e. a developer of applications that link 
against libGL), I regularly switch graphics cards around to make sure 
things work with all the relevant major vendors. Having a vendor-supplied 
libGL.so makes this unnecessarily difficult on the software side (add to 
that the custom-installed header files that have ever so slightly different 
semantics, and there is a whole lot of fun to be had).


Not to mention the use case with two graphics cards installed at the same 
time, from different vendors. While the above problem is annoying but 
acceptable, there's simply no reasonable way to use two graphics cards from 
vendors that insist on their custom libGL.so. Having to hack around with 
LD_LIBRARY_PATH and the likes is ridiculous.


Hear hear. 


I have such a two-card system. I want to upgrade _one_
of them (an unstable PCI Radeon that hangs the machine
while doing 3D), but my options are limited indeed.  There
is no use installing some card if it comes with software
that disables my one working card - that way I'd
still only have one working 3D card. :-(

Helge Hafting


---
This SF.Net email is sponsored by:
Power Architecture Resource Center: Free content, downloads, discussions,
and more. http://solutions.newsforge.com/ibmarch.tmpl
--
___
Dri-devel mailing list
Dri-devel@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/dri-devel


Re: Linux OpenGL ABI discussion

2005-09-29 Thread Alan Cox
On Iau, 2005-09-29 at 09:49 +0200, Christoph Hellwig wrote:
 On Wed, Sep 28, 2005 at 04:07:56PM -0700, Andy Ritger wrote:
  Some of the topics raised include:
  
  - minimum OpenGL version required by libGL
  - SONAME change to libGL
  - libGL installation path
 
 I think the single most important point is to explicitly disallow
 vendor-supplied libGL binaries in the LSB.  Every other LSB component
 relies on a single backing implementation for a reason, and in practice

That is not actually true. It defines a set of API and ABI behaviours
which are generally based on a single existing common implementation.

 the Nvidia libGL just causes endless pain where people accidentally
 link against it.  The DRI libGL should be declared the one and official
 one, and people who need extended features over it that aren't in the
 driver-specific backend will need to contribute them back.

If the LSB standard deals with libGL API/ABI interfaces then any
application using other interfaces/feature-set items would not be LSB
compliant. Educating users to link with the base libGL is an education
problem, not directly inside the LSB remit beyond the LSB test tools.

In addition, the way GL extensions work means it's fairly sane for an
application to ask for extensions and continue with different
approaches if they are not available. In fact this is done anyway for
hardware reasons. There is a lack of an "is XYZ accelerated?" query in
the API, but that is an upstream flaw.
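The query-and-fall-back pattern Alan describes takes only a few lines. Here is an illustrative Python sketch, with the extension string hardcoded where a real application would call glGetString(GL_EXTENSIONS):

```python
# Illustrative sketch of the "ask for an extension, fall back if absent"
# pattern.  The extension string is hardcoded here; a real application
# would obtain it from glGetString(GL_EXTENSIONS) in C.

def has_extension(ext_string, name):
    """Check for an extension as a whole space-separated token.

    A plain substring test is wrong: searching for "GL_ARB_texture"
    would falsely match "GL_ARB_texture_env_dot3".
    """
    return name in ext_string.split()

def choose_dot3_path(ext_string):
    # Prefer the DOT3 texture-environment extension; otherwise fall
    # back to a multi-pass approach -- the same decision applications
    # already make for hardware reasons.
    if has_extension(ext_string, "GL_ARB_texture_env_dot3"):
        return "dot3"
    return "multipass"

extensions = "GL_ARB_multitexture GL_ARB_texture_env_dot3 GL_EXT_texture3D"
print(choose_dot3_path(extensions))                 # -> dot3
print(has_extension(extensions, "GL_ARB_texture"))  # -> False
```

The whole-token match is the one subtlety worth noting; everything else is exactly the "ask, then degrade gracefully" flow described above.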

Alan





Re: Linux OpenGL ABI discussion

2005-09-29 Thread Nicolai Haehnle
On Thursday 29 September 2005 18:30, Alan Cox wrote:
 On Iau, 2005-09-29 at 09:49 +0200, Christoph Hellwig wrote:
  On Wed, Sep 28, 2005 at 04:07:56PM -0700, Andy Ritger wrote:
   Some of the topics raised include:
   
   - minimum OpenGL version required by libGL
   - SONAME change to libGL
   - libGL installation path
  
  I think the single most important point is to explicitly disallow
  vendor-supplied libGL binaries in the LSB.  Every other LSB component
  relies on a single backing implementation for a reason, and in practice
 
 That is not actually true. It defines a set of API and ABI behaviours
 which are generally based on a single existing common implementation.
 
  the Nvidia libGL just causes endless pain where people accidentally
  link against it.  The DRI libGL should be declared the one and official
  one, and people who need extended features over it that aren't in the
  driver-specific backend will need to contribute them back.
 
 If the LSB standard deals with libGL API/ABI interfaces then any
 application using other interfaces/feature-set items would not be LSB
 compliant. Educating users to link with the base libGL is an education
 problem, not directly inside the LSB remit beyond the LSB test tools.
 
 In addition, the way GL extensions work means it's fairly sane for an
 application to ask for extensions and continue with different
 approaches if they are not available. In fact this is done anyway for
 hardware reasons. There is a lack of an "is XYZ accelerated?" query in
 the API, but that is an upstream flaw.

The real issue with an IHV-supplied libGL.so is mixing vendors' graphics 
cards. As an OpenGL user (i.e. a developer of applications that link 
against libGL), I regularly switch graphics cards around to make sure 
things work with all the relevant major vendors. Having a vendor-supplied 
libGL.so makes this unnecessarily difficult on the software side (add to 
that the custom-installed header files that have ever so slightly different 
semantics, and there is a whole lot of fun to be had).

Not to mention the use case with two graphics cards installed at the same 
time, from different vendors. While the above problem is annoying but 
acceptable, there's simply no reasonable way to use two graphics cards from 
vendors that insist on their custom libGL.so. Having to hack around with 
LD_LIBRARY_PATH and the likes is ridiculous.

I'm not too familiar with the exact details of the DRI client-server 
protocol, so it may be necessary to turn the libGL.so into even more 
of a skeleton, and reduce the basic DRI protocol to a simple "tell me the 
client-side driver name" query, so that IHVs can combine (for example) custom 
GLX extensions with direct rendering.

cu,
Nicolai




Re: Linux OpenGL ABI discussion

2005-09-29 Thread Adam Jackson
On Thursday 29 September 2005 04:35, Dave Airlie wrote:
 I have to agree with Christoph, the libGL should be a
 one-size-fits-all and capable of loading drivers from any vendor, I'm
 not sure what is so hard about this apart from the fact that neither
 vendor has seemed willing to help out infrastructure on the basis of
 some belief that they shouldn't have to (maybe because they don't on
 Windows) or maybe because they don't want to be seen to collaborate on
 things there is hardly any major secrets in the libGL interface
 that should stop it...

There is exactly one secret: how to go from GL entrypoint to driver dispatch 
table as fast as possible while still being thread-correct, etc.  However, 
this can be read right out of the compiled object with any reasonable 
disassembler, so it's not much of a secret.

 As far as I know idr did a lot of work recently on libGL so we can
 expose GL extensions for vendors like ATI without them having to ship
 their own driver (I'm not sure if ATI contributed anything more than a
 list of things needed).. I think he mentioned this was a bit more
 difficult for glx.. but I'm sure it should be possible...

We already had this thread:

http://lists.freedesktop.org/archives/dri-egl/2005-July/000565.html

In particular, Andy's response about why they're uninterested in a common 
libGL is basically "The Last Word" on the subject.  It would require that 
nvidia expend time, effort, and money to get to the same level of 
functionality they already have.  This applies equally to any other IHV, and 
to ISVs like XiG and SciTech too for that matter.  You can have whatever 
opinion you like about that stance, but it's simply an economic reality.

It's also irrelevant.  libGL simply needs to provide ABI guarantees.  
Specifying driver compatibility is outside the scope of the LSB.

I would make the case that the sonumber for a libGL that supports OpenGL 2.0 
should start with 1.  DSO version numbers are for ABI changes, and OpenGL 2.0 
is simply not backwards-incompatible with OpenGL 1.5 for the set of 
entrypoints they share.  It's not like 2.0 changes the prototype for glEnd() 
or anything.  So, 1.6.  Or 1.10 or whatever, if we really think that people 
want to do more GL 1.x versions.
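The soname-vs-feature-version distinction above can be made concrete: an application checks the runtime GL version independently of the library's DSO version. A hedged Python sketch (the version strings are made-up examples; a C program would parse the result of glGetString(GL_VERSION)):

```python
# Sketch: parse a GL_VERSION string and check a required minimum at
# run time.  DSO sonames track ABI breaks; the GL *feature* version is
# a separate, runtime-queried property.  Version strings below are
# illustrative only.

def parse_gl_version(version_string):
    """GL_VERSION is "major.minor" or "major.minor.release", optionally
    followed by vendor-specific text."""
    head = version_string.split()[0]
    parts = head.split(".")
    return int(parts[0]), int(parts[1])

def meets_minimum(version_string, major, minor):
    return parse_gl_version(version_string) >= (major, minor)

print(parse_gl_version("1.5.2 NVIDIA 76.64"))  # -> (1, 5)
print(meets_minimum("2.0", 1, 5))              # -> True
```

This is exactly why a 2.0-capable libGL can keep soname 1: code built against the 1.x entrypoints keeps working, and anything version-sensitive is discovered at run time anyway.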

I would also make the case that the LSB should in no case require an 
implementation to have features unavailable in open source.  In particular, 
requiring GL 2.0 would be broken.  Remember what the L stands for here.

The deeper issue here is whether it's actually useful to require some minimum 
level of functionality even when large swaths of it will be software.  If I 
don't have cube map support in hardware, do I really want to try it in 
software?  Is that a useful experience for developers or for users?

Perhaps what I would like is a new set of glGetString tokens that describe 
what version and extensions the hardware is actually capable of accelerating, 
rather than what the software supports.  Because in some sense, advertising 
GL 2.0 on a Riva is so inaccurate as to be worse than lying.

 This is as far as I know how MS's OpenGL ICD system works, there is
 one frontend and your driver can expose extra things via it...

It's not.  MS's MCD (mini client driver) system had something like our current 
system, where you have one GL dispatch layer and the vendor provides a driver 
that gets loaded by the system.  In the ICD scheme, opengl32.dll (or whatever 
it is) is provided per-vendor.

- ajax




Re: Linux OpenGL ABI discussion

2005-09-29 Thread Allen Akin
On Thu, Sep 29, 2005 at 01:54:00PM -0400, Adam Jackson wrote:
| The deeper issue here is whether it's actually useful to require some minimum 
| level of functionality even when large swaths of it will be software.  If I 
| don't have cube map support in hardware, do I really want to try it in 
| software?  Is that a useful experience for developers or for users?

For OpenGL at least, history suggests the answer is usually yes.  The
argument goes back to the pre-1.0 days, when texture mapping was only
available on fairly exotic hardware.  The decision was made to require
it in the standard, and it turned out to be valuable on pure software
implementations because (1) it was fast enough to be usable for a
surprisingly large range of apps; (2) people with older hardware still
had the option to use it, rather than having that option closed off
up-front by the people defining the standard, and they found uses that
were worthwhile; (3) development could occur on older hardware for
deployment on newer hardware; (4) it served as a reference for hardware
implementations and a debugging tool for apps.

This experience was repeated with a number of other features as OpenGL
evolved.

If there's no consensus in the ARB about the desirability of a given
piece of functionality, it tends to be standardized as an extension (or
very rarely as a subset, like the Imaging Operations).  Extensions are
optional, so they provide middle ground.  But eventually, if a piece of
functionality proves valuable enough to achieve consensus, it moves into
the OpenGL core and software implementations become mandatory.

OpenGL ES has taken a slightly different route (with API profiles).  I
don't have firsthand knowledge of how well that's worked out.

| Perhaps what I would like is a new set of glGetString tokens that describe 
| what version and extensions the hardware is actually capable of accelerating, 
| rather than what the software supports.

This question also goes back to the very earliest days of OpenGL.

The fundamental technical problem is that there is no tractable way to
define an operation so that you can make a simple query to learn
whether it's accelerated.  So much depends on the current graphics state
(how many TMUs are enabled, the size of image or texture operands vs.
the size of available video memory, whether colors are specified by
floats or unsigned chars, whether vertices lie in DMAable or
CPU-accessible address space, etc., etc., ad infinitum) that most of the
time you can't even express a simple question like "Is triangle drawing
accelerated?"  A number of other APIs have gone down this road in the
past, and none of them found a viable solution to the problem.

In practice, two approaches are used with OpenGL.  One is simply to
benchmark the operations you want to perform and determine whether a
given OpenGL implementation is fast enough.  (This is used by
isfast-style libraries and by game developers, always during
development but occasionally during installation or initialization.) The
other is to assume that if an extension is advertised (via glGetString),
then it's accelerated; if an extension is present but not advertised,
then it's probably not accelerated.
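The first of those two approaches can be sketched briefly. Python stands in for C here, and the workload is a placeholder for the real GL drawing (textured triangles, say, followed by glFinish()) that an isfast-style library would actually time:

```python
import time

# Sketch of the "benchmark the operations you care about" approach.
# draw() is a stand-in workload; an isfast-style library would issue
# real GL drawing calls and call glFinish() before stopping the clock.

def ops_per_second(draw, iterations=1000):
    start = time.perf_counter()
    for _ in range(iterations):
        draw()
    elapsed = time.perf_counter() - start
    return iterations / elapsed if elapsed > 0 else float("inf")

def is_fast_enough(draw, required_ops_per_sec):
    # Typically run once, at development time or at install/init time,
    # to decide whether a rendering path is usable on this system.
    return ops_per_second(draw) >= required_ops_per_sec

workload = lambda: sum(range(100))      # placeholder drawing operation
print(is_fast_enough(workload, 1.0))    # -> True for this trivial case
```

The point of the benchmark approach is that it sidesteps the state-dependence problem entirely: you measure the exact operation, under the exact state, that your application will use.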

There was interest a couple of years ago in implementing a more
sophisticated mechanism.  One option was a query of the form "If I try
to execute a given drawing operation right now, with all the graphics
state that currently holds, will it be accelerated?" (D3D has a pipeline
validation mechanism that's something like this).  Another was a query
of the form "Have any software fallbacks occurred since the last time I
asked?" that you could make after you'd actually run one or more
operations.  There were unanswered questions about whether either of
these could be made worthwhile.  I haven't tracked the ARB since late
last year so I don't know if any progress has been made on this front.

Sorry for the long reply.  These questions come up from time to time,
and I wanted to make sure everyone had the background information.

Allen




Re: Linux OpenGL ABI discussion

2005-09-29 Thread Dave Airlie

 I think the single most important point is to explicitly disallow
 vendor-supplied libGL binaries in the LSB.  Every other LSB component
 relies on a single backing implementation for a reason, and in practice
 the Nvidia libGL just causes endless pain where people accidentally
 link against it.  The DRI libGL should be declared the one and official
 one, and people who need extended features over it that aren't in the
 driver-specific backend will need to contribute them back.

I have to agree with Christoph: the libGL should be
one-size-fits-all and capable of loading drivers from any vendor. I'm
not sure what is so hard about this, apart from the fact that neither
vendor has seemed willing to help out with infrastructure, on the basis
of some belief that they shouldn't have to (maybe because they don't on
Windows) or maybe because they don't want to be seen to collaborate.
There are hardly any major secrets in the libGL interface
that should stop it...

As far as I know idr did a lot of work recently on libGL so we can
expose GL extensions for vendors like ATI without them having to ship
their own driver (I'm not sure if ATI contributed anything more than a
list of things needed).. I think he mentioned this was a bit more
difficult for glx.. but I'm sure it should be possible...

This is as far as I know how MS's OpenGL ICD system works, there is
one frontend and your driver can expose extra things via it...

Dave.




Re: Linux OpenGL ABI discussion

2005-09-29 Thread Christoph Hellwig
On Wed, Sep 28, 2005 at 04:07:56PM -0700, Andy Ritger wrote:
 Some of the topics raised include:
 
 - minimum OpenGL version required by libGL
 - SONAME change to libGL
 - libGL installation path

I think the single most important point is to explicitly disallow
vendor-supplied libGL binaries in the LSB.  Every other LSB component
relies on a single backing implementation for a reason, and in practice
the Nvidia libGL just causes endless pain where people accidentally
link against it.  The DRI libGL should be declared the one and official
one, and people who need extended features over it that aren't in the
driver-specific backend will need to contribute them back.





Re: Linux OpenGL ABI discussion

2005-09-29 Thread Ian Romanick

(I corrected the CC address for the lsb-desktop list.  It was
incorrectly listed as being at lists.freedesktop.org, so none of this
thread has made it to the list where the discussion should be.)

Allen Akin wrote:
 On Thu, Sep 29, 2005 at 01:54:00PM -0400, Adam Jackson wrote:
 | The deeper issue here is whether it's actually useful to require some 
 minimum 
 | level of functionality even when large swaths of it will be software.  If I 
 | don't have cube map support in hardware, do I really want to try it in 
 | software?  Is that a useful experience for developers or for users?
 
 For OpenGL at least, history suggests the answer is usually yes.  The
 argument goes back to the pre-1.0 days, when texture mapping was only
 available on fairly exotic hardware.  The decision was made to require
 it in the standard, and it turned out to be valuable on pure software
 implementations because (1) it was fast enough to be usable for a
 surprisingly large range of apps; (2) people with older hardware still
 had the option to use it, rather than having that option closed off
 up-front by the people defining the standard, and they found uses that
 were worthwhile; (3) development could occur on older hardware for
 deployment on newer hardware; (4) it served as a reference for hardware
 implementations and a debugging tool for apps.
 
 This experience was repeated with a number of other features as OpenGL
 evolved.
 
 If there's no consensus in the ARB about the desirability of a given
 piece of functionality, it tends to be standardized as an extension (or
 very rarely as a subset, like the Imaging Operations).  Extensions are
 optional, so they provide middle ground.  But eventually, if a piece of
 functionality proves valuable enough to achieve consensus, it moves into
 the OpenGL core and software implementations become mandatory.

This represents a goal of OpenGL: to lead the hardware.  The idea is that
the most current version of OpenGL defines the features that the next
generation of hardware will have as standard.  In terms of making
functionality available and leading developers, this is a really good
strategy to take.

However, that's not (or at least shouldn't be) our goal.  Our goal is to
define the minimum that is required to be available on our platform.  As
such, that should reflect what actually exists on our platform.  From
talking to people at the various distros, the most common piece of
graphics hardware is the Intel i830 chipset (and derived chips like
i845G, i855GM, etc.).  That hardware is only capable of OpenGL 1.3.

If all applications were well behaved (i.e., allowed users to enable or
disable the use of individual hardware features like DOT3 texture
environment or shadow maps), this wouldn't be a problem.  That is sadly
not the case.

I think there is an alternative middle ground here that will satisfy
most people's concerns.  I propose that we require 1.2 as the minimum
supported version.  I also propose that we provide a standard mechanism
to demand that the driver advertise a user-specified version, up to
1.5.  For example, a user might run an app like:

LIBGL_FORCE_VERSION=1.5 ./a.out

When 'a.out' queries the version string, it will get "1.5" even if the
driver has to do software fallbacks for the new 1.5 functionality.

This will prevent the unexpected performance cliff I mentioned in
another e-mail, and it will still provide more modern functionality to
users that need / want it.
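As a rough sketch of how a driver might honor such a variable: the LIBGL_FORCE_VERSION name is taken from the example above, but the clamping rules (never below the driver's real version, never above 1.5) are this sketch's assumption, not part of the proposal:

```python
import os

# Sketch of the proposed force-version mechanism, as a driver might
# implement it.  NATIVE_VERSION and the clamp ceiling are assumptions
# of this sketch; a real driver would report its true native version.

NATIVE_VERSION = (1, 2)   # the proposed minimum supported version
MAX_FORCED = (1, 5)       # highest version a user may force

def advertised_version(environ=os.environ):
    forced = environ.get("LIBGL_FORCE_VERSION")
    if forced:
        try:
            major, minor = (int(x) for x in forced.split("."))
        except ValueError:
            return NATIVE_VERSION   # ignore malformed requests
        # Clamp: never less than the real version, never above 1.5.
        return min(max((major, minor), NATIVE_VERSION), MAX_FORCED)
    return NATIVE_VERSION

print(advertised_version({"LIBGL_FORCE_VERSION": "1.5"}))  # -> (1, 5)
print(advertised_version({}))                              # -> (1, 2)
```

With this shape, the default behavior stays at the hardware-honest baseline, and the performance cliff only appears when a user explicitly opts into software fallbacks.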




Re: Linux OpenGL ABI discussion

2005-09-29 Thread Alan Cox
On Iau, 2005-09-29 at 22:02 +0200, Christoph Hellwig wrote:
 And replacing system libraries is not something we can allow anyone.
 It's totally reasonable to have different 3cards in the same systems
 and they're supposed to work. 

Agreed - but the LSB's job is still that of defining an ABI. Obviously
users who replace system libraries with ones they got from another
source get burned, whether it's a Perl upgrade required by a vendor or
libc.

Alan





Re: Linux OpenGL ABI discussion

2005-09-29 Thread Allen Akin
On Thu, Sep 29, 2005 at 01:05:55PM -0700, Ian Romanick wrote:
|  ...  Our goal is to
| define the minimum that is required to be available on our platform.  ...

If by "our goal" you mean the goal of the Linux OpenGL ABI effort, then
I agree.  I intended my previous note to address the more general
questions about performance queries and subsetting that Adam and Alan
raised.

Haven't had time to check the LSB mailing list yet, but I'll try to do
so in a day or two.

Allen




Re: Linux OpenGL ABI discussion

2005-09-29 Thread Christoph Hellwig
On Thu, Sep 29, 2005 at 01:54:00PM -0400, Adam Jackson wrote:
 http://lists.freedesktop.org/archives/dri-egl/2005-July/000565.html
 
 In particular, Andy's response about why they're uninterested in a common 
 libGL is basically "The Last Word" on the subject.  It would require that 
 nvidia expend time, effort, and money to get to the same level of 
 functionality they already have.  This applies equally to any other IHV, and 
 to ISVs like XiG and SciTech too for that matter.  You can have whatever 
 opinion you like about that stance, but it's simply an economic reality.

And it's a case where we shouldn't care about their economic issues.
Giving them a branding only if they play nicely with the open source
world is one of the few powers we have.

And replacing system libraries is not something we can allow anyone.
It's totally reasonable to have different 3D cards in the same system,
and they're supposed to work.  Where would we be if every SCSI card
came with its own SCSI stack and you could only use one brand at a
time?  Sure, we can't forbid SCSI vendors from doing that, but we do
everything in our power to avoid it - quite successfully so far.




Linux OpenGL ABI discussion

2005-09-28 Thread Andy Ritger

I apologize for the cross posting, but I'd like to call attention
to the discussion on updating the Linux OpenGL ABI that Jon Leech
initiated on the lsb-desktop email alias:


http://base4.freestandards.org/pipermail/lsb-desktop/2005-September/000146.html

Some of the topics raised include:

- minimum OpenGL version required by libGL
- SONAME change to libGL
- libGL installation path

I think the last point, in particular, deserves some attention,
and I've posted my thoughts on the subject:


http://base4.freestandards.org/pipermail/lsb-desktop/2005-September/000157.html

It would be great if those involved in developing and distributing
OpenGL implementations on Linux joined the discussion on the
lsb-desktop alias.

Thanks,
- Andy



