I finally got a chance over the weekend to try out the current DRI
code on my new 8500 card and compare it with the offering from ATI.
The following is a cheers & jeers style review.  Hopefully someone
will find it helpful.

My setup: 950MHz Athlon (early/pre-thunderbird), KT133 motherboard,
OEM 8500LE card at 250/250MHz.  I'm running code from the trunk; is
that correct?  I'm honestly pretty confused as to what the various
branches are.

First off, getting DRI to coexist with an existing X11 distribution is
a chore.  As everybody knows (translation: I only wasted 20 minutes on
this one), clients have to have an LD_LIBRARY_PATH containing the right
libGL.so, but even then you only get indirect rendering because the
client-side DRI driver isn't found.  Is it possible (via an
environment variable or compile time switch) to get the client-side
libGL to look somewhere *other* than /usr/X11R6/lib/modules/dri for
drivers?  I have to move the directory out of the way and make a
symlink every time I change X servers.
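
To make the request concrete, here's a rough sketch of the sort of
override I mean.  This is not the actual libGL loader code, and the
LIBGL_DRIVERS_PATH name is just my guess at what such a knob might be
called; the point is only the getenv() fallback:

 #include <stdio.h>
 #include <stdlib.h>

 /* Compiled-in default, i.e. where the client looks today. */
 #define DEFAULT_DRI_DIR "/usr/X11R6/lib/modules/dri"

 /* Build the path to a client-side driver such as r200_dri.so,
  * preferring a (hypothetical) LIBGL_DRIVERS_PATH override if set. */
 static void driver_path( const char *driver, char *buf, size_t len )
 {
    const char *dir = getenv( "LIBGL_DRIVERS_PATH" );
    if ( dir == NULL || *dir == '\0' )
       dir = DEFAULT_DRI_DIR;
    snprintf( buf, len, "%s/%s_dri.so", dir, driver );
 }

 int main( void )
 {
    char path[1024];
    driver_path( "r200", path, sizeof(path) );
    printf( "would dlopen: %s\n", path );
    return 0;
 }

Even a compile-time switch for that directory would save me the
symlink dance.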

The search path behavior would be really good to document somewhere; I
only discovered it
(via significantly more than 20 minutes of work) after running strace
on the clients and discovering that they were trying to read the wrong
r200_dri.so.  I've seen some other people complaining about not
getting direct rendering; I'll give even money that this is the
problem.  Alternatively, how about a more useful warning printed to
stderr if the client can't find the driver (or version) the server
told it to load?  Silently falling back to indirect rendering is an
awfully opaque way to handle configuration errors.  An order of
magnitude performance loss typically isn't an acceptable substitute
for what the user wants.
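
Something along these lines is all I'm asking for; the function and
the surrounding plumbing are invented for illustration, but a
dlerror() string on stderr would have saved me the strace session:

 #include <dlfcn.h>
 #include <stdio.h>

 /* Sketch only: complain on stderr when the driver the server asked
  * for can't be loaded, instead of silently dropping to indirect
  * rendering.  Returns NULL on failure, just like dlopen(). */
 static void *load_dri_driver( const char *path )
 {
    void *handle = dlopen( path, RTLD_NOW | RTLD_GLOBAL );
    if ( handle == NULL )
       fprintf( stderr,
                "libGL warning: failed to load DRI driver %s (%s); "
                "falling back to indirect rendering\n",
                path, dlerror() );
    return handle;
 }

 int main( void )
 {
    /* Deliberately bogus path, just to show the warning. */
    load_dri_driver( "/nonexistent/r200_dri.so" );
    return 0;
 }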

Likewise, I had to figure out for myself where the kernel modules were
(something like nine directories deep!), and that they didn't build
themselves.  Jeers all around.  I know this is an 18 year old code
base, but does it really have to build like one? :)

Both drivers get terribly confused if the other one touches the card
first.  I typically have to do a hard reset before changing X servers
or else the machine chokes (sometimes it's a hard lockup, other times
the X server just enters an infinite loop and I can reboot the machine
over the network).  Since you guys clearly can't be responsible for
what someone else does to the hardware, I haven't investigated this
very much.

ATI blows you guys away in glxgears.  I see 38% faster frame rates
with their drivers.  Since I doubt gears is doing anything but
glVertex calls (someone correct me if I'm wrong), I take this to mean
that there's significant room for improvement in the current vertex
transport.  I suspect that's a good thing, overall.

Running a real application (FlightGear), things were different.  You
guys are getting 71% better frame rates (in a thoroughly unscientific
test; we don't have a useful benchmark capability in FlightGear).
Something is broken with the ATI drivers; I've submitted a report to
them on this.  The specific fps numbers I'm seeing are roughly in line
with what I saw with the GeForce 2MX that used to live in this
machine.  I have no idea if that reflects a CPU-limited situation or
driver slowness.  Clearly the 8500 hardware "should" be faster.  For
the record, FlightGear does almost all of its rendering using 1.1-era
OpenGL, mostly out of standard, unaugmented (i.e. in-process RAM)
vertex arrays.  No multitexture.
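
For reference, here is the flavor of code path I mean.  The geometry
is a toy triangle and the exact calls are my own, but this is the
idea: plain GL 1.1 client-side arrays living in ordinary process
memory, handed to glDrawElements with no extensions involved.

 #include <GL/glut.h>

 /* Vertex data lives in plain static/heap memory; the driver has to
  * pull it across on every draw call. */
 static const GLfloat verts[] = {
    -1.0f, -1.0f, 0.0f,
     1.0f, -1.0f, 0.0f,
     0.0f,  1.0f, 0.0f,
 };
 static const GLubyte idx[] = { 0, 1, 2 };

 static void display( void )
 {
    glClear( GL_COLOR_BUFFER_BIT );

    glEnableClientState( GL_VERTEX_ARRAY );
    glVertexPointer( 3, GL_FLOAT, 0, verts );
    glDrawElements( GL_TRIANGLES, 3, GL_UNSIGNED_BYTE, idx );
    glDisableClientState( GL_VERTEX_ARRAY );

    glutSwapBuffers();
 }

 int main( int argc, char **argv )
 {
    glutInit( &argc, argv );
    glutInitDisplayMode( GLUT_RGB | GLUT_DOUBLE );
    glutCreateWindow( "vertex array sketch" );
    glutDisplayFunc( display );
    glutMainLoop();
    return 0;
 }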

You also don't exhibit a texture border bug that the ATI drivers have
(also submitted to them; no response yet), so at least I know it's not
a hardware limitation.  Good work. :)

There are sporadic rendering bugs in FlightGear, however.  Every 40
frames or so, I'll see a large triangle or two flash on the screen,
textured with what looks like random memory.  I don't think the
triangles are actual game geometry; they look like random garbage.
Is there a race with a DMA buffer somewhere?  I see no
glitches at all with gears.

I'm also seeing some stability trouble in FlightGear.  Within a few
minutes, I routinely see an X server lockup.  The server can be killed
from an ssh session, but the card doesn't recover if it is restarted.
The behavior is broadly similar to what I see if I run the ATI server
first.  As above, I don't see this with gears or the various OpenGL
screensavers which I have left running for a few tens of minutes.

And a random nit: The following appears in r200_state.c:

 static void r200PointSize( GLcontext *ctx, GLfloat size )
 {
   fprintf(stderr, "%s: %f\n", __FUNCTION__, size );
 }

I don't know what the intent was here, but I can verify that the
printf works just fine.  It also causes about a 5% loss in framerate
in FlightGear when flying at night due to all the console traffic
(runway lights are points). :)
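
The obvious fix is to put the trace behind a debug switch (or just
delete it).  As a toy standalone illustration, with an invented
R200_DEBUG_CHATTY environment variable standing in for whatever debug
mechanism the driver really has:

 #include <stdio.h>
 #include <stdlib.h>

 static int chatty = 0;

 /* Stand-in for r200PointSize: only chatter when the debug switch is
  * on.  Actually programming the hardware point size is a separate
  * job that I'm not attempting here. */
 static void point_size( float size )
 {
    if ( chatty )
       fprintf( stderr, "%s: %f\n", __FUNCTION__, size );
 }

 int main( void )
 {
    chatty = ( getenv( "R200_DEBUG_CHATTY" ) != NULL );
    point_size( 3.0f );
    return 0;
 }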

Andy

-- 
Andrew J. Ross                NextBus Information Systems
Senior Software Engineer      Emeryville, CA
[EMAIL PROTECTED]              http://www.nextbus.com
"Men go crazy in conflagrations.  They only get better one by one."
 - Sting (misquoted)


