Re: [Mesa3d-dev] gallium raw interfaces

2010-04-01 Thread Miles Bader
On Thu, Apr 1, 2010 at 12:28 AM, Xavier Bestel xavier.bes...@free.fr wrote:
 On Wed, 2010-03-31 at 13:29 +0900, Miles Bader wrote:
 Luca Barbieri luca.barbi...@gmail.com writes:
  In fact, given the Gallium architecture, it may even make sense to
  support a variant of DirectX 10 as the main Mesa/Gallium API on all
  platforms, instead of OpenGL.

 The apparent benefit would seem to be greater compatibility with
 software written for windows -- but that benefit is unlikely to remain,
 as MS basically changes their interfaces drastically with each major
 revision.

 WINE can deal with that. The real showstopper is that WINE also has to
 work on Mac OS X and on Linux + the NVIDIA blob, where Gallium is
 unavailable.

Wine can deal with that how?

Once MS changes interfaces, there's _no advantage_ to using DX10
internally, regardless of what WINE does, and one might as well use
OpenGL.  Wine doesn't change that.

Given that OpenGL has other advantages (portability, a publicly
accessible standardization process, etc.), adopting DX10 would seem
pointless and misguided.

-Miles

-- 
Do not taunt Happy Fun Ball.



Re: [Mesa3d-dev] gallium raw interfaces

2010-04-01 Thread Luca Barbieri
 Once MS changes interfaces, there's _no advantage_ to using DX10
 internally, regardless of what WINE does, and one might as well use
 OpenGL.  Wine doesn't change that.

[resent to ML, inadvertently replied only to Miles]

Note that my proposal was not to use DirectX 10 internally, but rather
to expose DirectX 10 and promote it, initially as an API for testing
Gallium and later as the preferred Linux graphics API instead of
OpenGL, for the technical reason that a DirectX 10 implementation over
Gallium carries much less performance overhead than an OpenGL
implementation and is much simpler, thanks to the superior design of
DirectX 10.

Using an extended version of DirectX 10 internally could also be an
option, but I don't think it's worth doing right now, and it's likely
not worth doing at all.

Also note that Microsoft does not use DirectX 10 or 11 internally
either, but rather the "DirectX 10 DDI", or DirectX 10 Device Driver
Interface, which is also publicly documented.

The last time Microsoft made an incompatible interface change (DX10),
it was to move away from a fixed-function pipeline with scattered
state towards a shader-only pipeline with constant state objects.

Exactly the same change was achieved by the move from the classic Mesa
architecture to the Gallium architecture: you can think of the move to
Gallium as switching to something like DX10 internally, done purely
for technical reasons, partly the same ones that prompted Microsoft to
make the transition.

Actually, while this is not usually stated explicitly by the Gallium
designers, Gallium itself is steadily evolving towards being closer
to DirectX 10.
The biggest deviations are additional features needed to support
OpenGL features not included in DirectX 10.

For instance, looking at recent changes:
- Vertex element CSOs, recently added, are equivalent to DX10 input
layouts (see the sketch after this list)
- Sampler views, also recently added, are equivalent to DX10 shader
resource views
- Doing transfers per-context (recent work by Keith Whitwell) is what
DX10 does
- Having a resource concept (also recent work by Keith Whitwell) is
what DX10 does
- Gallium format values were changed from self-describing to a set of
stock values, like DX10's
- Gallium format names were later changed and made identical to the
DX10 ones (except that the former start with PIPE_FORMAT_, the latter
with DXGI_FORMAT_, and the enumerated values are different)
- It has been decided to follow the DX9 SM3/DX10 model for shader
semantic linkage, as opposed to the OpenGL one
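
To make the first point concrete, here is roughly what the Gallium
side looks like (an untested sketch from memory, not code from either
tree; field and function names may be slightly off):

   #include <string.h>
   #include "pipe/p_context.h"
   #include "pipe/p_state.h"
   #include "pipe/p_format.h"

   /* Describe one vec4 attribute at offset 0 of vertex buffer 0, get
    * an immutable CSO back, and bind it.  The DX10 equivalent is a
    * D3D10_INPUT_ELEMENT_DESC array passed to CreateInputLayout(),
    * which returns an ID3D10InputLayout. */
   static void *bind_simple_velems(struct pipe_context *pipe)
   {
      struct pipe_vertex_element ve;
      memset(&ve, 0, sizeof ve);
      ve.src_offset = 0;
      ve.vertex_buffer_index = 0;
      ve.src_format = PIPE_FORMAT_R32G32B32A32_FLOAT;
      void *cso = pipe->create_vertex_elements_state(pipe, 1, &ve);
      pipe->bind_vertex_elements_state(pipe, cso);
      return cso;
   }

In both APIs the driver can validate and translate this state once at
create time rather than at every draw call.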

I recently systematically compared Gallium and DirectX 10 and found
them to be mostly equivalent; the exceptions were usually either
additional features Gallium has for the sake of OpenGL, or Gallium
misdesigns that are being changed or looked into.

This is likely not for the sake of imitating Microsoft, but simply
because they made a good design, having taken the decision to redesign
the whole API from scratch when making DirectX 10.
It's also probably because VMware is apparently funding DirectX 10
support over Gallium, which obviously makes all the discrepancies
evident to the people working on that; since those discrepancies
generally exist because DirectX 10 is better, they lead those people
to improve the Gallium design, taking inspiration from DirectX 10.

If Microsoft were to change interfaces incompatibly again (note that
DX11 is a compatible change), Mesa would probably benefit from
introducing a further abstraction layer similar to the new Microsoft
interface, with a Gallium-to-NewLayer module, since such a change
would most likely be the result of a paradigm shift in graphics
hardware itself (e.g. a switch to fully software-based GPUs like
Larrabee).

Also, unless Microsoft holds patents on DirectX 10 (which would be a
showstopper, even though Gallium might violate them anyway), I don't
see any difference between having to implement DirectX 10 and having
to implement OpenGL, or any difference in the openness of the APIs.
It is indeed possible to participate in the ARB standardization
process, and some Mesa contributors/leaders do, but I'm not sure
whether this is particularly advantageous: decisions that work well
for Microsoft and Windows are also likely to work well for Linux/Mesa,
since the hardware is the same and the software works mostly
equivalently.

And should some decisions not work well, it is technically trivial to
provide an alternative API.



Re: [Mesa3d-dev] gallium raw interfaces

2010-04-01 Thread Corbin Simpson
On Thu, Apr 1, 2010 at 1:32 AM, Luca Barbieri luca.barbi...@gmail.com wrote:
 Note that my proposal was not to use DirectX 10 internally, but rather
 to expose DirectX 10 and promote it, initially as an API for testing
 Gallium and later as the preferred Linux graphics API instead of
 OpenGL [...]
 [snip]

Is it really so surprising that an API designed to expose a generic,
programmable, shaderful pipeline (Gallium) maps well onto multiple
APIs based on the same concept (D3D10, OGL 2.x)?

-- 
When the facts change, I change my mind. What do you do, sir? ~ Keynes

Corbin Simpson
mostawesomed...@gmail.com


Re: [Mesa3d-dev] gallium raw interfaces

2010-03-31 Thread Xavier Bestel
On Wed, 2010-03-31 at 13:29 +0900, Miles Bader wrote:
 Luca Barbieri luca.barbi...@gmail.com writes:
  In fact, given the Gallium architecture, it may even make sense to
  support a variant of DirectX 10 as the main Mesa/Gallium API on all
  platforms, instead of OpenGL.
 
 The apparent benefit would seem to be greater compatibility with
 software written for windows -- but that benefit is unlikely to remain,
 as MS basically changes their interfaces drastically with each major
 revision.

WINE can deal with that. The real showstopper is that WINE also has to
work on Mac OS X and on Linux + the NVIDIA blob, where Gallium is
unavailable.

Xav




Re: [Mesa3d-dev] gallium raw interfaces

2010-03-31 Thread Luca Barbieri
 WINE can deal with that. The real showstopper is that WINE also has to
 work on Mac OS X and on Linux + the NVIDIA blob, where Gallium is
 unavailable.

We could actually consider writing a Gallium driver that uses OpenGL
to do the rendering.

If the app uses DirectX 10, this may not degrade performance
significantly, and should instead increase it appreciably wherever a
native Gallium driver is available.

On the other hand, for DirectX 9 apps this could decrease performance
significantly (because DirectX 9 has immediate mode and doesn't
require CSOs).
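
To make the CSO point concrete: in Gallium, as in DX10, state such as
blending is baked into an object once and merely re-bound afterwards,
so a GL-based backend could do its translation up front.  A rough,
untested sketch (struct fields from memory, possibly off):

   #include <string.h>
   #include "pipe/p_context.h"
   #include "pipe/p_state.h"

   /* Create-once, bind-many: the driver (or an OpenGL-based backend)
    * can do all validation and translation here instead of at every
    * draw call. */
   static void *make_alpha_blend_cso(struct pipe_context *pipe)
   {
      struct pipe_blend_state blend;
      memset(&blend, 0, sizeof blend);
      blend.rt[0].blend_enable = 1;
      blend.rt[0].rgb_src_factor = PIPE_BLENDFACTOR_SRC_ALPHA;
      blend.rt[0].rgb_dst_factor = PIPE_BLENDFACTOR_INV_SRC_ALPHA;
      blend.rt[0].alpha_src_factor = PIPE_BLENDFACTOR_SRC_ALPHA;
      blend.rt[0].alpha_dst_factor = PIPE_BLENDFACTOR_INV_SRC_ALPHA;
      blend.rt[0].colormask = PIPE_MASK_RGBA;
      return pipe->create_blend_state(pipe, &blend);
   }

DirectX 9, by contrast, lets the app flip individual render states
between draws, which is much harder to layer efficiently.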



Re: [Mesa3d-dev] gallium raw interfaces

2010-03-30 Thread Keith Whitwell
On Sun, 2010-03-28 at 23:56 -0700, Chia-I Wu wrote:
 On Mon, Mar 29, 2010 at 1:51 AM, Keith Whitwell
 keith.whitw...@googlemail.com wrote:
  I've just pushed a variation on a theme a couple of people have
  explored in the past, i.e. an interface to gallium without an
  intervening state-tracker.
  [snip]
 I happened to be playing with the idea yesterday.  My take is to
 define an EGL extension, EGL_MESA_gallium.  The extension defines
 Gallium as a rendering API of EGL.
 [snip]

I'm not sure how far to take any of these naked gallium approaches.
My motivation was to build something that provides a very controlled
environment for the bringup of new drivers - basically getting to the
first triangle and not much further.  After that, existing state
trackers with stable ABIs are probably preferable.
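
For what it's worth, a bring-up test along those lines might look
roughly like this (a hypothetical sketch, not code from the tree;
graw_create_window()'s real arguments are whatever the winsys needs,
so window setup is elided):

   #include <stdio.h>
   #include "pipe/p_screen.h"

   struct pipe_screen *graw_init( void );

   int main(void)
   {
      struct pipe_screen *screen = graw_init();
      if (!screen) {
         fprintf(stderr, "graw_init failed\n");
         return 1;
      }
      /* First sign of life from a new driver: */
      printf("screen: %s (%s)\n",
             screen->get_name(screen),
             screen->get_vendor(screen));
      /* ...create a context, clear, draw the first triangle... */
      return 0;
   }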

Keith




Re: [Mesa3d-dev] gallium raw interfaces

2010-03-30 Thread Luca Barbieri
An interesting option could be to provide a DirectX 10 implementation
using TGSI text as the shader interface, which should be much easier
than one would think at first.

DirectX 10 + TGSI text would provide a very thin binary-compatible
layer over Gallium, unlike all existing state trackers.

It could even run Windows games if integrated with Wine and something
that produces TGSI from either HLSL text or D3D10 bytecode (e.g.
whatever Wine uses to produce GLSL + the Mesa GLSL frontend +
st_mesa_to_tgsi).
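
For reference, TGSI text is already a simple textual serialization; a
pass-through vertex shader embedded as a C string would look something
like this (written from memory, so the dialect details may be slightly
off):

   /* Copies the input position straight to the output position. */
   static const char vs_text[] =
      "VERT\n"
      "DCL IN[0]\n"
      "DCL OUT[0], POSITION\n"
      "0: MOV OUT[0], IN[0]\n"
      "1: END\n";

A D3D10 layer would only need to emit strings like this (or the
equivalent binary tokens) instead of GLSL.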

In fact, given the Gallium architecture, it may even make sense to
support a variant of DirectX 10 as the main Mesa/Gallium API on all
platforms, instead of OpenGL.



Re: [Mesa3d-dev] gallium raw interfaces

2010-03-29 Thread Chia-I Wu
On Mon, Mar 29, 2010 at 1:51 AM, Keith Whitwell
keith.whitw...@googlemail.com wrote:
 I've just pushed a variation on a theme a couple of people have
 explored in the past, i.e. an interface to gallium without an
 intervening state-tracker.
 The purpose of this is to allow writing minimal test programs that
 exercise new gallium drivers in isolation from the rest of the
 codebase.
 In fact it doesn't really make sense to say "without a state tracker",
 unless you don't mind creating test programs which are specific to the
 windowing system you're currently working with.  Some similar work has
 avoided window-system issues altogether by dumping bitmaps to files,
 or by using e.g. python to abstract over window systems.
 This approach is a little different - I've defined a super-minimal API
 for creating/destroying windows, currently called "graw", and we
 have a tiny little co-state-tracker that each implementation provides.
  This is similar to the glut approach of abstracting over window
 systems, though much less complete.
 It currently consists of three calls:
   struct pipe_screen *graw_init( void );
   void *graw_create_window(...);
   void graw_destroy_window( void *handle );
 which are sufficient to build simple demos on top of.  A future
 enhancement would be to add a glut-style input handling facility.
 Right now there's a single demo, clear.c, which displays an ugly
 purple box.  Builds so far only with scons, using winsys=graw-xlib.
I happened to be playing with the idea yesterday.  My take is to define
an EGL extension, EGL_MESA_gallium.  The extension defines Gallium as a
rendering API of EGL.  The downside of this approach is that it depends
on st/egl.  The upside is that it will work on whatever platform st/egl
supports.

I've cleaned up my work a little bit.  You can find it in the
attachments.  There is a port of the clear raw demo to use
EGL_MESA_gallium.  The demo supports window resizing, and is
accelerated if a hardware EGL driver is used.

The demo renders into an X11 window.  It is worth noting that, when
there is no need to render into an EGLSurface, eglCreateWindowSurface
or eglMakeCurrent is not required.  To interface with X11, I've also
borrowed some code from the OpenVG demos and renamed it EGLUT.
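
Based only on the tokens the attached patches define, usage would
presumably look something like this (a sketch; the mechanism for
actually obtaining the pipe_screen is in the attached demo and is not
reproduced here):

   #include <EGL/egl.h>
   #include <EGL/eglext.h>

   /* Ask for a config that advertises Gallium as a renderable type,
    * then bind the extension-defined Gallium API. */
   static EGLConfig choose_gallium_config(EGLDisplay dpy)
   {
      const EGLint attribs[] = {
         EGL_RENDERABLE_TYPE, EGL_GALLIUM_BIT_MESA,
         EGL_NONE
      };
      EGLConfig config = NULL;
      EGLint num = 0;
      eglChooseConfig(dpy, attribs, &config, 1, &num);
      eglBindAPI(EGL_GALLIUM_API_MESA);
      return num > 0 ? config : NULL;
   }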

-- 
o...@lunarg.com
From 8b40fe56be030b19281971ac7db0a19625ae734e Mon Sep 17 00:00:00 2001
From: Chia-I Wu o...@lunarg.com
Date: Mon, 29 Mar 2010 01:46:16 +0800
Subject: [PATCH 1/4] egl: Export _eglCheckDisplayHandle.

---
 src/egl/main/egldisplay.h |2 +-
 1 files changed, 1 insertions(+), 1 deletions(-)

diff --git a/src/egl/main/egldisplay.h b/src/egl/main/egldisplay.h
index 21bf22b..57e7132 100644
--- a/src/egl/main/egldisplay.h
+++ b/src/egl/main/egldisplay.h
@@ -101,7 +101,7 @@ PUBLIC void
 _eglCleanupDisplay(_EGLDisplay *disp);
 
 
-extern EGLBoolean
+PUBLIC EGLBoolean
 _eglCheckDisplayHandle(EGLDisplay dpy);
 
 
-- 
1.7.0

From 636480770965187ce09d8b59a2c6bdec572dd52d Mon Sep 17 00:00:00 2001
From: Chia-I Wu o...@lunarg.com
Date: Mon, 29 Mar 2010 00:26:50 +0800
Subject: [PATCH 2/4] egl: Define EGL_MESA_gallium.

---
 include/EGL/eglext.h   |8 
 src/egl/main/eglconfig.c   |3 +++
 src/egl/main/eglcurrent.h  |6 ++
 src/gallium/include/state_tracker/st_api.h |3 +++
 src/gallium/state_trackers/egl/common/egl_g3d.c|5 +
 src/gallium/state_trackers/egl/common/egl_g3d_st.c |3 +++
 src/gallium/state_trackers/egl/common/egl_g3d_st.h |5 +
 7 files changed, 33 insertions(+), 0 deletions(-)

diff --git a/include/EGL/eglext.h b/include/EGL/eglext.h
index a9e598d..e797523 100644
--- a/include/EGL/eglext.h
+++ b/include/EGL/eglext.h
@@ -197,6 +197,14 @@ typedef const char * (EGLAPIENTRYP PFNEGLQUERYMODESTRINGMESA) (EGLDisplay dpy, E
 #endif /* EGL_MESA_screen_surface */
 
 
+/* EGL_MESA_gallium extension   PRELIMINARY  */
+#ifndef EGL_MESA_gallium
+#define EGL_MESA_gallium 1
+#define EGL_GALLIUM_API_MESA			0x30A3
+#define EGL_GALLIUM_BIT_MESA			0x0010	/* EGL_RENDERABLE_TYPE mask bits */
+#endif /* EGL_MESA_gallium */
+
+
 #ifndef EGL_MESA_copy_context
 #define EGL_MESA_copy_context 1
 
diff --git a/src/egl/main/eglconfig.c b/src/egl/main/eglconfig.c
index 21d13cb..9a4de7e 100644
--- a/src/egl/main/eglconfig.c
+++ b/src/egl/main/eglconfig.c
@@ -332,6 +332,9 @@ _eglValidateConfig(const _EGLConfig *conf, EGLBoolean for_matching)
EGL_OPENVG_BIT |
EGL_OPENGL_ES2_BIT |
EGL_OPENGL_BIT;
+#ifdef EGL_MESA_gallium
+mask |= EGL_GALLIUM_BIT_MESA;
+#endif
 break;
  default:
 assert(0);
diff --git a/src/egl/main/eglcurrent.h b/src/egl/main/eglcurrent.h
index e5c94ce..5bd11ae 100644
--- a/src/egl/main/eglcurrent.h
+++ b/src/egl/main/eglcurrent.h
@@ -13,7 +13,13 @@
 
 
 #define _EGL_API_FIRST_API EGL_OPENGL_ES_API
+
+#ifdef EGL_MESA_gallium
+#define _EGL_API_LAST_API EGL_GALLIUM_API_MESA

Re: [Mesa3d-dev] gallium raw interfaces

2010-03-28 Thread Luca Barbieri
I posted something similar some time ago; it could, however, use
hardware-accelerated drivers via DRI2 or KMS, provided a substantial
set of helpers, and offered a complement of three demos.

My solution to the window-system issues was to simply have the
application provide a draw callback to the framework, which would
automatically create a maximized window with the application name in
the title, and call draw in a loop, presenting the results.
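
The framework's shape was roughly the following (a from-memory
reconstruction, not the original code; the names are illustrative):

   struct pipe_context;
   struct pipe_surface;

   /* The application supplies only a name and a draw callback; the
    * framework picks DRI2 or KMS, creates the window, and loops. */
   struct demo_ops {
      const char *name;
      void (*draw)(struct pipe_context *pipe,
                   struct pipe_surface *backbuffer);
   };

   /* Creates a maximized window titled ops->name, then repeatedly
    * calls ops->draw() and presents the result. */
   int demo_run(const struct demo_ops *ops);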

Then I had a path that would use the X DRI2 interface if possible, and
another that would use the Linux DRM KMS API (and initially some
EGL + ad-hoc extension paths that were later dropped).

It no longer works due to Gallium interface changes, but maybe it can
be resurrected and merged with graw.

However, there is a disadvantage to having Gallium programs in-tree:
they break every time the Gallium interface is changed, and avoiding
that means that, in addition to fixing all drivers and state trackers,
you also need to fix all the programs for each change.



Re: [Mesa3d-dev] gallium raw interfaces

2010-03-28 Thread Corbin Simpson
On Sun, Mar 28, 2010 at 12:19 PM, Luca Barbieri luca.barbi...@gmail.com wrote:
 I posted something similar some time ago; it could, however, use
 hardware-accelerated drivers via DRI2 or KMS, provided a substantial
 set of helpers, and offered a complement of three demos.
 [snip]
 However, there is a disadvantage to having Gallium programs in-tree:
 they break every time the Gallium interface is changed, and avoiding
 that means that, in addition to fixing all drivers and state trackers,
 you also need to fix all the programs for each change.

Presumably this will no longer be a problem once Gallium is a more
mature, stable interface.  I much prefer this "try it and see"
mentality over the design-by-committee mess that has popped up
elsewhere.

--
When the facts change, I change my mind. What do you do, sir? ~ Keynes

Corbin Simpson
mostawesomed...@gmail.com
