On Mon, 04.08.2003 at 20:05, Ian Romanick wrote:
> I think the conclusion was that as a tool for anything other than 
> development / debugging, such a thing was not terribly useful.

Then I beg to differ. Many advanced 3D engines can switch between
several types of texture rendering techniques depending on what the
hardware can support. It has nothing to do with debugging, since this is
all about making an application or game give maximum performance on any
*user's* system, not on the developer's system. Perhaps the idea of
having apps automatically tune themselves to the user's system is
unheard of for the DRI, but it isn't in the real world. For example, I
believe the Max Payne engine could scale itself from taking advantage of
any cool environment-mapping features present in a high-end graphics
card, through relatively simple and flat graphics on a low-end card, and
even all the way down to doing its own software rendering on a 2D card
(at least the version of MAX-FX used in 3DMark2000 could). I'm also
pretty sure that the Grand Theft Auto 3 engine is similarly adaptable to
the texturing capabilities of the user's system. And the fact that some
Linux games also need this feature, and have worked around its absence
by checking vendor/renderer strings, should speak for itself.
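
For what it's worth, that vendor/renderer-string workaround looks
roughly like the sketch below. The particular strings and the
conclusion drawn from them are made-up examples of the kind of
per-driver guesswork apps are forced into today:

    #include <GL/gl.h>
    #include <cstring>

    // Crude capability guess based on the GL strings (needs a current
    // context).  The substrings below are only examples; a real game
    // has to maintain a list like this for every driver it cares about.
    static bool rendererLikelyHandlesEnvCombine()
    {
        const char *vendor   = reinterpret_cast<const char *>(glGetString(GL_VENDOR));
        const char *renderer = reinterpret_cast<const char *>(glGetString(GL_RENDERER));

        if (!vendor || !renderer)
            return false;

        // Example heuristic: assume this combiner setup falls back to
        // software on older Matrox parts and use the simple path there.
        if (std::strstr(vendor, "Matrox") && std::strstr(renderer, "G400"))
            return false;

        return true;
    }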

> The problem with it is, especially in the case of the MGA, changing one 
> subtle thing (like changing the texture constant color from 0x00ffffff 
> to 0x00ffff00) can change whether or not there is a fallback.  I don't 
> think that apps should or would, at run time, detect these types of 
> things and change their behavior.

But Direct3D allows them to do exactly that, and this facility is
*used*. A lot. Maybe you don't like it, but game developers are not
interested in having users complain about "your game runs like crap on
my G400, but it runs quake3 fine, can't you program?" if they can avoid
it by simply checking at runtime whether a nifty texturing method
causes a software fallback in the user's driver (probably because of
hardware limitations) and, if so, switching to a less cool method
instead. That way they get a decent framerate even on low-end cards and
can still take advantage of high-end ones.
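
For comparison, the Direct3D side of this is roughly the sketch below.
It leans on IDirect3DDevice8::ValidateDevice, with all of the
SetTextureStageState() setup left out, so treat it as an illustration
rather than working code:

    #include <windows.h>
    #include <d3d8.h>

    // After programming the texture stages for the "nifty" technique,
    // ask the driver whether it can actually render that setup in
    // hardware in a single pass.  Any failure means: pick the less
    // cool method instead of eating a slow path in the driver.
    static bool fancyTechniqueIsUsable(IDirect3DDevice8 *device)
    {
        DWORD passes = 0;
        HRESULT hr = device->ValidateDevice(&passes);
        return SUCCEEDED(hr) && passes == 1;
    }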

> Either it needs a constant color of 
> 0x00ffff00 or it doesn't.

If you mean the app, then this is a naive view. In most cases the
engine doesn't strictly *need* such a constant color; it can work fine
without it. It will just disable the particular effect that needs it,
replace it with a less realistic and less demanding one, or use a more
compatible multipass technique, since any of those is still better than
a software fallback. But it would still be nice for the engine to know
when this constant color does *not* cause a software fallback, so that
the more demanding technique can be used if the user upgrades his card.
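
To make that concrete, the kind of check I'd like to write on the
OpenGL side would look something like the sketch below. None of these
functions exist; driverReportsSoftwareFallback() just stands in for
whatever query the DRI could expose, and the two render paths are
placeholders for the engine's own code:

    // Hypothetical sketch: driverReportsSoftwareFallback() does not
    // exist, it stands in for whatever fallback query the driver could
    // provide.  The two draw functions are the engine's own paths.
    extern bool driverReportsSoftwareFallback();
    extern void drawWithConstantColorCombine();   // nicer single-pass effect
    extern void drawWithMultipassApproximation(); // less pretty, but stays in hardware

    static void pickEnvCombineTechnique()
    {
        // Program the combiner with the 0x00ffff00 constant color from
        // the example above, then ask whether that setup would fall back.
        if (driverReportsSoftwareFallback())
            drawWithMultipassApproximation();
        else
            drawWithConstantColorCombine();
    }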

Just look at GTA3 to see how important this is - it does not have *any*
3D options whatsoever - it's designed to Just Work on the user's system,
autodetecting its capabilities and tuning itself to them. And Direct3D
lets it - why can't OpenGL implementations be as end-user-friendly? Can
Linux really win on the desktop if 3D games can't be made this simple?



