new s3tc patch...
Since the patch broke again... a new version, together with some information on how to install it, can be found here (that also means I no longer have to spam the list with future versions):

http://homepage.hispeed.ch/rscheidegger/dri_experimental/s3tc_index.html

Roland

--
Dri-devel mailing list
[EMAIL PROTECTED]
https://lists.sourceforge.net/lists/listinfo/dri-devel
Re: [Dri-devel] new s3tc patch
I think I get what you're saying here, about white and black being your max and min. The answer is simple, to my simple mind: you need to compute your max and min based upon your full data set. Apply the means-extremes, if applicable, to get your new max and min. Bottom line: make black into brown with red and blue, then.

--- Roland Scheidegger [EMAIL PROTECTED] wrote:

Dieter Nützel wrote:
>> (needed for QuakeIII/rtcw, don't know about rtcw:et). This time I
>> haven't tested the r200 patch, only radeon so use at your own risk.
>
> Using 8/8/8 Color bits, 24 depth, 8 stencil display.
> GL_RENDERER: Mesa DRI R200 20030328 AGP 4x x86/MMX+/3DNow!+/SSE TCL
> Initializing OpenGL extensions
> ...using GL_S3_s3tc
> ...using GL_EXT_texture_env_add
> ...using GL_ARB_multitexture
> ...using GL_EXT_compiled_vertex_array
> ~160 fps at 640x480 fullscreen
> ~156 fps at 640x480 window
> On 1280x1024x24 desktop, both without sound.

Did you enable r_ext_compressed_textures 1? Otherwise, QuakeIII won't use compressed textures.

Note that the alpha decompression of DXT5 (in txformat_tmp.h) is horribly broken - stupid bug; the version from the earlier patch is actually correct (games likely never need this).

You will also get quite a few color banding artifacts. It's kinda funny: you can get absolutely HUGE color errors in theory (because a 16-pixel block needs to be encoded with only 4 colors, of which only 2 are specified directly; the others are calculated and lie somewhere in between the other 2 colors), but what you actually notice are (comparatively) very minimal color banding errors, because the encoded colors are only 16bit, not 32bit. A newer version of the encoder reduces this problem quite a bit (some stupid design flaw which led to rounding errors), and a really clever encoder could reduce it further (you can get better than 16bit color accuracy by means of the calculated colors, provided the GPU does the decoding in more than 16bit).

Another problem is that the encoding can sometimes fail pretty horribly. If you have one white and one black pixel in your 16-pixel block, those will get chosen as the 2 base values. If the other 14 pixels are all green, for instance, well, tough luck - they are now all either black, dark gray, light gray or white... The improved encoder I have here improves this a bit, but fundamentally it seems to be a hard problem to get the best (or at least a decent) encoding - not to mention you'd need to figure out how to define what a good encoding actually is...

Alpha (DXT5) encoding is also improved in the new version. There's no new patch available yet - trust me, you don't want to see the code ;-). Furthermore, I'll convert this to use the dlopen/dlsym stuff as discussed already before I'll submit another patch.

Roland
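Roland's description of the format can be made concrete. Below is a minimal sketch (not code from the patch) of how a single 64-bit DXT1 color block decodes: two RGB565 base colors plus sixteen 2-bit indices into a 4-entry palette whose middle entries are derived from the base pair. The function names are illustrative only.

```c
#include <stdint.h>

/* Expand a 5:6:5 color to 8:8:8. */
static void rgb565_to_888(uint16_t c, uint8_t rgb[3])
{
    rgb[0] = (uint8_t)(((c >> 11) & 0x1f) * 255 / 31);
    rgb[1] = (uint8_t)(((c >>  5) & 0x3f) * 255 / 63);
    rgb[2] = (uint8_t)(( c        & 0x1f) * 255 / 31);
}

/* Decode one 64-bit DXT1 block into 16 RGB pixels.  Layout: two
 * little-endian 16-bit base colors, then 32 bits of 2-bit indices.
 * When color0 > color1, the two extra palette entries are the 2/3 and
 * 1/3 blends of the base pair; otherwise entry 2 is the midpoint and
 * entry 3 is (punch-through) black. */
void dxt1_decode_block(const uint8_t blk[8], uint8_t out[16][3])
{
    uint16_t c0 = (uint16_t)(blk[0] | (blk[1] << 8));
    uint16_t c1 = (uint16_t)(blk[2] | (blk[3] << 8));
    uint8_t pal[4][3];
    rgb565_to_888(c0, pal[0]);
    rgb565_to_888(c1, pal[1]);
    for (int ch = 0; ch < 3; ch++) {
        if (c0 > c1) {
            pal[2][ch] = (uint8_t)((2 * pal[0][ch] + pal[1][ch]) / 3);
            pal[3][ch] = (uint8_t)((pal[0][ch] + 2 * pal[1][ch]) / 3);
        } else {
            pal[2][ch] = (uint8_t)((pal[0][ch] + pal[1][ch]) / 2);
            pal[3][ch] = 0;
        }
    }
    uint32_t bits = (uint32_t)blk[4] | ((uint32_t)blk[5] << 8) |
                    ((uint32_t)blk[6] << 16) | ((uint32_t)blk[7] << 24);
    for (int i = 0; i < 16; i++) {
        uint32_t idx = (bits >> (2 * i)) & 3;
        for (int ch = 0; ch < 3; ch++)
            out[i][ch] = pal[idx][ch];
    }
}
```

With a white/black base pair, every pixel in the block can only come out white, light gray, dark gray or black, which is exactly the failure mode described above.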
Re: [Dri-devel] new s3tc patch
On Wednesday, 14 January 2004 06:29, Dieter Nützel wrote:
> On Sunday, 28 December 2003 23:00, Roland Scheidegger wrote:
>> ok, here it is, the long-awaited, highly controversial new patch ;-).
>> (patches against current Mesa cvs; if you used the older version you
>> need to reverse it first). The radeon/r200 patches have a texture
>> alignment problem (with small textures/mipmaps whose height is bigger
>> than their width) fixed. I believe it is now 100% correct, except
>> possibly for rgba_dxt1 textures (still lack a test app). They also
>> enable the old, undocumented, dead and buried GL_S3_s3tc extension
>
> Can't find that string.

Oh boy, stupid me. I forgot to extract and apply radeon_txc.diff and r200_txc.diff. Overlooked them in your post ;-)))

>> (needed for QuakeIII/rtcw, don't know about rtcw:et). This time I
>> haven't tested the r200 patch, only radeon so use at your own risk.
>
> Using 8/8/8 Color bits, 24 depth, 8 stencil display.
> GL_RENDERER: Mesa DRI R200 20030328 AGP 4x x86/MMX+/3DNow!+/SSE TCL
> Initializing OpenGL extensions
> ...using GL_S3_s3tc
> ...using GL_EXT_texture_env_add
> ...using GL_ARB_multitexture
> ...using GL_EXT_compiled_vertex_array
> ~160 fps at 640x480 fullscreen
> ~156 fps at 640x480 window
> On 1280x1024x24 desktop, both without sound.
>
> I can test rtcw on r200 for you if we get it going.

Later.

>> The mesa patch now contains a DXTn compressor (and has a bug fixed in
>> the alpha decoding of DXT5, though this is untested). By default, all
>> software decompression/compression is disabled, but I'm sure you can
>> figure out how to change that ;-). I've also made some mesa changes,
>> since mesa based its decision whether a format is compressed (and
>> which one) on the suggested format supplied by the application, not
>> the format which actually got used. This will lead to crashes if, for
>> instance, the application suggests a compressed format and the driver
>> decides (which is perfectly legal) it wants to store it uncompressed.
>> Though I'm not convinced this is an elegant fix; maybe it would be
>> better just to remove the texImage->IsCompressed and
>> texImage->CompressedSize fields (just use
>> texImage->TexFormat->TexelSize instead; the CompressedSize could also
>> be calculated whenever it's needed).
>>
>> Issues:
>> - Not sure if the GL_S3_s3tc formats are really exactly the same as
>>   the GL_EXT_texture_compression_s3tc formats. Mesa assumed this, so
>>   I just assumed it too.
>
> I have the quake3-smp binary, too. But sadly something has been broken
> since Ian changed something in the GL context code some months ago. It
> worked before and gave an additional ~10 fps and a very nice system
> load. Now I only get a black (empty) window. He posted some hints
> several weeks ago, but my system disk crashed and it took some time to
> get back on par again...

I need some sleep...

...such beautiful work!

Cheers,
Dieter
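The suggested fix can be illustrated. This is a heavily simplified sketch with hypothetical structs (not actual Mesa code) of the idea behind dropping cached per-image fields: derive sizes from the format the driver actually chose, never from the format the application merely suggested.

```c
#include <stdbool.h>
#include <stdint.h>

/* Hypothetical, heavily simplified stand-ins for Mesa's structures. */
struct tex_format {
    bool     is_compressed;   /* property of the format actually chosen */
    uint32_t block_bytes;     /* bytes per 4x4 block, if compressed */
    uint32_t texel_bytes;     /* bytes per texel, if uncompressed */
};

struct tex_image {
    uint32_t width, height;
    const struct tex_format *chosen;  /* what the driver really stored */
};

/* Compute the image size from the chosen format on demand, the way a
 * cached CompressedSize field could be recalculated whenever needed.
 * If the driver stored the texture uncompressed despite a compressed
 * suggestion from the app, this still gives the right answer. */
uint32_t tex_image_size(const struct tex_image *img)
{
    if (img->chosen->is_compressed) {
        uint32_t bw = (img->width  + 3) / 4;   /* blocks per row */
        uint32_t bh = (img->height + 3) / 4;   /* block rows */
        return bw * bh * img->chosen->block_bytes;
    }
    return img->width * img->height * img->chosen->texel_bytes;
}
```

The rounding-up to whole 4x4 blocks also covers the small-mipmap cases (e.g. a 2x2 level still occupies one full block).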
Re: [Dri-devel] new s3tc patch
Dieter Nützel wrote:
>> (needed for QuakeIII/rtcw, don't know about rtcw:et). This time I
>> haven't tested the r200 patch, only radeon so use at your own risk.
>
> Using 8/8/8 Color bits, 24 depth, 8 stencil display.
> GL_RENDERER: Mesa DRI R200 20030328 AGP 4x x86/MMX+/3DNow!+/SSE TCL
> Initializing OpenGL extensions
> ...using GL_S3_s3tc
> ...using GL_EXT_texture_env_add
> ...using GL_ARB_multitexture
> ...using GL_EXT_compiled_vertex_array
> ~160 fps at 640x480 fullscreen
> ~156 fps at 640x480 window
> On 1280x1024x24 desktop, both without sound.

Did you enable r_ext_compressed_textures 1? Otherwise, QuakeIII won't use compressed textures.

Note that the alpha decompression of DXT5 (in txformat_tmp.h) is horribly broken - stupid bug; the version from the earlier patch is actually correct (games likely never need this).

You will also get quite a few color banding artifacts. It's kinda funny: you can get absolutely HUGE color errors in theory (because a 16-pixel block needs to be encoded with only 4 colors, of which only 2 are specified directly; the others are calculated and lie somewhere in between the other 2 colors), but what you actually notice are (comparatively) very minimal color banding errors, because the encoded colors are only 16bit, not 32bit. A newer version of the encoder reduces this problem quite a bit (some stupid design flaw which led to rounding errors), and a really clever encoder could reduce it further (you can get better than 16bit color accuracy by means of the calculated colors, provided the GPU does the decoding in more than 16bit).

Another problem is that the encoding can sometimes fail pretty horribly. If you have one white and one black pixel in your 16-pixel block, those will get chosen as the 2 base values. If the other 14 pixels are all green, for instance, well, tough luck - they are now all either black, dark gray, light gray or white... The improved encoder I have here improves this a bit, but fundamentally it seems to be a hard problem to get the best (or at least a decent) encoding - not to mention you'd need to figure out how to define what a good encoding actually is...

Alpha (DXT5) encoding is also improved in the new version. There's no new patch available yet - trust me, you don't want to see the code ;-). Furthermore, I'll convert this to use the dlopen/dlsym stuff as discussed already before I'll submit another patch.

Roland
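For reference, the cvar Roland asks about is set from the Quake III console or its config file; this is standard id Tech 3 usage, not something specific to the patch:

```
// in q3config.cfg, or typed into the in-game console:
seta r_ext_compressed_textures "1"
// then restart the renderer so textures are re-uploaded:
vid_restart
```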
RE: [Dri-devel] new s3tc patch
[replying to multiple people]

> They also enable the old, undocumented, dead and buried GL_S3_s3tc
> extension

http://oss.sgi.com/projects/ogl-sample/registry/S3/s3tc.txt

> Note that the alpha decompression of DXT5 (in txformat_tmp.h) is
> horribly broken - stupid bug, the version from the earlier patch is
> actually correct (games likely never need this).

The sole reason for using DXT5 over DXT1, apart from working around the below-mentioned HW features on GeForce 1-4 cards, is using the alpha channel, and we most certainly make a lot of use of the alpha channel of DXT3/5 textures; I would expect other games do the same.

> further (you can get better than 16bit color accuracy by means of the
> calculated colors, provided the GPU does the decoding in more than
> 16bit).

To the best of my knowledge, the only GPUs that don't do this are NVIDIA's GeForce 1-4 cards with DXT1 (they use 32 bit precision for DXT3/5). You definitely want to use 32 bit for doing the color1/color2 blending, as the artifacts are quite noticeable for e.g. lightmaps.

> dark gray, light gray or white... The improved encoder I have here
> improves this a bit, but fundamentally it seems to be a hard

Check out NVIDIA's encoder as a reference, as it is the best out there.
http://developer.nvidia.com/object/nv_texture_tools.html

-- Daniel, Epic Games Inc.
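Daniel's precision point can be made concrete. The sketch below (illustrative only, not any driver's actual code) compares computing the 2/3-1/3 palette blend in the raw 5-bit channel precision against expanding to 8 bits first; with a full-range base pair the low-precision path can land several code points away from the correct blend, which is the banding he describes on lightmaps.

```c
#include <stdint.h>

/* Expand a 5-bit channel to 8 bits (rounded). */
static uint8_t expand5(uint8_t v)
{
    return (uint8_t)((v * 255 + 15) / 31);
}

/* 2/3-1/3 blend done at 16bit texture precision: blend the 5-bit
 * channels first, then expand.  The intermediate value quantizes to
 * one of only 32 levels before expansion. */
uint8_t blend_lowprec(uint8_t r0_5bit, uint8_t r1_5bit)
{
    uint8_t blended = (uint8_t)((2 * r0_5bit + r1_5bit) / 3);
    return expand5(blended);
}

/* The same blend after expanding to 8 bits first: the interpolated
 * colors gain precision beyond the 16bit base pair, which is why
 * decoding in more than 16bit matters. */
uint8_t blend_highprec(uint8_t r0_5bit, uint8_t r1_5bit)
{
    return (uint8_t)((2 * expand5(r0_5bit) + expand5(r1_5bit)) / 3);
}
```

For the extreme pair (31, 0) the two paths disagree, illustrating why blending at higher precision produces visibly smoother gradients.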
Re: [Dri-devel] new s3tc patch
Daniel Vogel wrote:
>> They also enable the old, undocumented, dead and buried GL_S3_s3tc
>> extension
>
> http://oss.sgi.com/projects/ogl-sample/registry/S3/s3tc.txt

Yes, but everything there just says "unknown" - no compression format, nothing interesting at all (except the format tokens); even how to submit compressed textures is just a guess there! This is not what I'd call documentation.

>> Note that the alpha decompression of DXT5 (in txformat_tmp.h) is
>> horribly broken - stupid bug, the version from the earlier patch is
>> actually correct (games likely never need this).
>
> The sole reason for using DXT5 over DXT1, apart from working around
> the below-mentioned HW features on GeForce 1-4 cards, is using the
> alpha channel, and we most certainly make a lot of use of the alpha
> channel of DXT3/5 textures; I would expect other games do the same.

Yes, certainly. But software decompression of textures really should never happen in games (performance of software rasterization really isn't good...), and I don't think current games have much use for reading back decompressed textures.

>> further (you can get better than 16bit color accuracy by means of the
>> calculated colors, provided the GPU does the decoding in more than
>> 16bit).
>
> To the best of my knowledge, the only GPUs that don't do this are
> NVIDIA's GeForce 1-4 cards with DXT1 (they use 32 bit precision for
> DXT3/5). You definitely want to use 32 bit for doing the color1/color2
> blending, as the artifacts are quite noticeable for e.g. lightmaps.
>
>> dark gray, light gray or white... The improved encoder I have here
>> improves this a bit, but fundamentally it seems to be a hard
>
> Check out NVIDIA's encoder as a reference, as it is the best out there.
> http://developer.nvidia.com/object/nv_texture_tools.html

I did take a look there earlier, but
1) I couldn't figure out what the original license for the source code is (nvidia's license wouldn't permit the use of the source code for anything)
2) I have no idea if this is the same algorithm which is used in the DXT_TOOLS anyway
3) Being an offline solution, it might not be suitable for online compression (it might emphasize quality over performance too much)
4) at first look it seemed to miss the DXT5 encoder
5) It's way more fun to write my own implementation ;-)

Roland
RE: [Dri-devel] new s3tc patch
> Yes, certainly. But software decompression of textures really should
> never happen in games (performance of software rasterization really

FWIW, that's how we handle support for cards that don't support texture compression, as we store our textures pre-compressed. I agree though that it doesn't make sense to expose the extension if the hardware can't handle it. The app is much better suited to work around the lack of texture compression than DRI/Mesa. BTW, Kyro I/II is an interesting example, as they expose GL_EXT_texture_compression_s3tc though they only implement DXT1 in hardware.

> isn't good...), and I don't think current games have much use for
> reading back decompressed textures.

Correct.

> I did take a look there earlier but

Sorry, I meant as a quality reference.

-- Daniel, Epic Games Inc.
Re: [Dri-devel] new s3tc patch
On Sunday, 28 December 2003 23:00, Roland Scheidegger wrote:
> ok, here it is, the long-awaited, highly controversial new patch ;-).
> (patches against current Mesa cvs; if you used the older version you
> need to reverse it first). The radeon/r200 patches have a texture
> alignment problem (with small textures/mipmaps whose height is bigger
> than their width) fixed. I believe it is now 100% correct, except
> possibly for rgba_dxt1 textures (still lack a test app). They also
> enable the old, undocumented, dead and buried GL_S3_s3tc extension

Can't find that string.

> (needed for QuakeIII/rtcw, don't know about rtcw:et). This time I
> haven't tested the r200 patch, only radeon so use at your own risk.

r200. How can I enable this?

...loading libGL.so.1: Initializing OpenGL display
...setting mode 3: 640 480
Using XFree86-VidModeExtension Version 2.2
XF86DGA Mouse (Version 2.0) initialized
XFree86-VidModeExtension: Ignored on non-fullscreen/Voodoo
Using 8/8/8 Color bits, 24 depth, 8 stencil display.
GL_RENDERER: Mesa DRI R200 20030328 AGP 4x x86/MMX+/3DNow!+/SSE TCL
Initializing OpenGL extensions
...GL_S3_s3tc not found
...using GL_EXT_texture_env_add
...using GL_ARB_multitexture
...using GL_EXT_compiled_vertex_array
XF86 Gamma extension initialized
Trying SMP acceleration...
^3ERROR: SMP support was disabled at compile time
...failed.
GL_VENDOR: Tungsten Graphics, Inc.
GL_RENDERER: Mesa DRI R200 20030328 AGP 4x x86/MMX+/3DNow!+/SSE TCL

I can test rtcw on r200 for you if we get it going.

> The mesa patch now contains a DXTn compressor (and has a bug fixed in
> the alpha decoding of DXT5, though this is untested). By default, all
> software decompression/compression is disabled, but I'm sure you can
> figure out how to change that ;-). I've also made some mesa changes,
> since mesa based its decision whether a format is compressed (and
> which one) on the suggested format supplied by the application, not
> the format which actually got used. This will lead to crashes if, for
> instance, the application suggests a compressed format and the driver
> decides (which is perfectly legal) it wants to store it uncompressed.
> Though I'm not convinced this is an elegant fix; maybe it would be
> better just to remove the texImage->IsCompressed and
> texImage->CompressedSize fields (just use
> texImage->TexFormat->TexelSize instead; the CompressedSize could also
> be calculated whenever it's needed).
>
> Issues:
> - Not sure if the GL_S3_s3tc formats are really exactly the same as
>   the GL_EXT_texture_compression_s3tc formats. Mesa assumed this, so I
>   just assumed it too.

I have the quake3-smp binary, too. But sadly something has been broken since Ian changed something in the GL context code some months ago. It worked before and gave an additional ~10 fps and a very nice system load. Now I only get a black (empty) window. He posted some hints several weeks ago, but my system disk crashed and it took some time to get back on par again...

-Dieter
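The "...GL_S3_s3tc not found" line in the log above comes from the game scanning the GL extension string for a token. A small sketch of that check (hypothetical helper; the string would come from glGetString(GL_EXTENSIONS), passed in here for clarity):

```c
#include <string.h>

/* Check a space-separated OpenGL extension string for a token, the way
 * a game like Quake III decides whether GL_S3_s3tc is available.
 * Substring matches (e.g. finding GL_S3_s3tc inside a longer name)
 * must be rejected, hence the whole-token test. */
int has_extension(const char *extensions, const char *name)
{
    size_t len = strlen(name);
    const char *p = extensions;
    while ((p = strstr(p, name)) != NULL) {
        /* match only whole, space-delimited tokens */
        if ((p == extensions || p[-1] == ' ') &&
            (p[len] == ' ' || p[len] == '\0'))
            return 1;
        p += len;
    }
    return 0;
}
```

If the driver-side diffs (radeon_txc.diff/r200_txc.diff) are not applied, the token never appears in the string and this check fails, which matches the log Dieter posted.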
Re: [Dri-devel] new s3tc patch
Well well, I just tried the patched version of the CVS tree with ut2003, and it's simply great: textures look just fine and the game is far more playable than it used to be, though I had to disable TCL to avoid occasionally getting purple textures. It really would be a shame if this patch weren't integrated in some way into the dri project (or elsewhere), because now the dri drivers really can be compared to the official ati drivers (for me the dri drivers are way more stable, and they now offer the same visual quality, but not yet the same performance...).

P.S. Just to let you know, I played ut2003 at 1024x768 with the default quality values and TCL off, on a Radeon 9000 Mobility, a P4 2 GHz and 512 MB RAM...
P.P.S. CVS date is 26/12/2003.

Really, Steve, you've done a great job with this patch, thanks.

On Mon 29/12/2003 at 04:37, [EMAIL PROTECTED] wrote:
> Hi,
>
> Patch is working very nicely on my 9200; textures look great in NWN.
> q3a runs fine as well, although I didn't play around with the texture
> settings much. Many thanks.
>
> Regards,
> Steve.
Re: [Dri-devel] new s3tc patch
Hi,

Patch is working very nicely on my 9200; textures look great in NWN. q3a runs fine as well, although I didn't play around with the texture settings much. Many thanks.

Regards,
Steve.