Bernhard Wymann wrote:
Hi, nice to hear from you,

If you used glutInitDisplayString with depth>=16 you _should_ get a 24bit depth buffer if one is available, at least in theory - if
not, that would be a bug in glut or the glx visual matching code.


That might be true, but the code also runs on Windows and on proprietary
Linux drivers, and at least my TNT and GeForce 2 default to a 16 bit
depth buffer in Linux (with 16 bit color depth); the ATI Radeon
Mobility and GeForce Go also end up with 16 bit depth buffers even at 32 bit
color depth (in Windows).
Ah yes, I forgot everybody optimizes for frame rates nowadays instead of
quality. IIRC there are driver settings available to change the "default" z-buffer depth, but those might not even be accessible from the driver control panel. Well, the driver behaviour is probably not outright illegal, so I guess you really need to specifically ask for a 24bit z-buffer.
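Something along these lines should force the issue (untested sketch; the exact criteria string matching can differ between glut implementations, and the window title here is just a placeholder):

  /* Untested sketch: explicitly request a >= 24 bit depth buffer with
     glutInitDisplayString instead of relying on the driver default.
     If no matching visual/pixelformat exists, glutCreateWindow will fail. */
  #include <stdio.h>
  #include <GL/glut.h>

  int main(int argc, char **argv)
  {
      GLint depth_bits = 0;

      glutInit(&argc, argv);
      /* "depth>=24" is the important criterion; rgba/double are the usual flags */
      glutInitDisplayString("rgba double depth>=24");
      glutCreateWindow("depth buffer test");

      /* check what the driver actually handed us */
      glGetIntegerv(GL_DEPTH_BITS, &depth_bits);
      printf("got a %d bit depth buffer\n", (int)depth_bits);
      return 0;
  }

If the printed value is still 16 with such a request, that would point at the visual selection code rather than at the application.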


Also, the OP is using the drivers from matrox; he should try the "normal" drivers. If this bug is only in the matrox version of the
driver (I don't think it is, but it's possible), then
there's little the dri developers could do to help.


Ah, ok, I did not know how proprietary the matrox driver
actually is (how did you see that anyway? glxinfo talks about Mesa and
VA).
It is mentioned in the thread you linked to ;-). Though the parts of the mga driver which are not open-source should not be relevant to this bug AFAIK. But matrox might have screwed up something, who knows?

Roland

