Just an update: the only way I could get it to work was to include the 
sample_buffers kwarg.

Works:
Config(sample_buffers=1, depth_size=16, double_buffer=True)

Doesn't work:
Config(depth_size=16, double_buffer=True)
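
In case it helps anyone reproduce this, here's roughly how I'm creating the 
window with the working config. This is just a minimal sketch; the window 
size and the draw handler are placeholders:

import pyglet
from pyglet.gl import Config

# On my machine, explicitly requesting sample_buffers alongside the
# 16-bit depth buffer is what avoids the ContextException quoted below.
config = Config(sample_buffers=1, depth_size=16, double_buffer=True)
window = pyglet.window.Window(width=800, height=600, config=config)

@window.event
def on_draw():
    window.clear()

pyglet.app.run()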



On Saturday, July 21, 2018 at 12:32:58 PM UTC-5, Charlie wrote:
>
> I'm not entirely sure; my drivers are up to date. If I edit my config to 
> force a 16-bit depth buffer, I get the following:
>
>   File "c:\python27\lib\site-packages\pyglet\window\win32\__init__.py", line 131, in __init__
>     super(Win32Window, self).__init__(*args, **kwargs)
>   File "c:\python27\lib\site-packages\pyglet\window\__init__.py", line 559, in __init__
>     self._create()
>   File "c:\python27\lib\site-packages\pyglet\window\win32\__init__.py", line 263, in _create
>     self.context.attach(self.canvas)
>   File "c:\python27\lib\site-packages\pyglet\gl\win32.py", line 265, in attach
>     super(Win32ARBContext, self).attach(canvas)
>   File "c:\python27\lib\site-packages\pyglet\gl\win32.py", line 210, in attach
>     raise gl.ContextException('Unable to share contexts')
> pyglet.gl.ContextException: Unable to share contexts
>
>
> On Sunday, June 17, 2018 at 8:19:35 PM UTC-5, Benjamin Moran wrote:
>>
>> That's an interesting find, Charles. Does it seem like a driver bug, or 
>> dropped support? It's possibly an unnoticed bug, since most 3D games 
>> probably don't use a 16-bit depth buffer anymore, and 2D games likely 
>> leave this at the default. 
>>
>>
>>
>> On Saturday, June 16, 2018 at 3:46:40 AM UTC+9, Charles wrote:
>>>
>>> Just so people know, I tried setting a 16-bit depth buffer manually and 
>>> got an OpenGL context error. (Nvidia GTX 1070)
>>>
>>> Changing it back to 24-bit works without issue. I'd hold off on forcing 
>>> it as the default, since it seems some configurations don't support it.
>>>
>>> On Monday, May 28, 2018 at 4:32:55 PM UTC-5, Daniel Gillet wrote:
>>>>
>>>> Hi Adam,
>>>>
>>>> I purposefully used vsync=True to show the mode I was using. If you try 
>>>> the code yourself, you will notice that just changing vsync to False 
>>>> won't make the flip function any faster. Although flip() does wait for 
>>>> the next vsync, most of the time is actually spent transferring the data 
>>>> to the GPU and swapping buffers. Vsync was one of the things I tried 
>>>> when I was investigating.
>>>>
>>>> I agree with everybody above. This should probably find a place in the 
>>>> documentation, but I would also refrain from setting a 16-bit depth 
>>>> buffer as the default. For a 2D application there won't be any polygon 
>>>> z-fighting AFAIK, but I might be wrong here; I'm still learning. :) 
>>>> Pyglet probably does not assume anything regarding the projection, and 
>>>> a perspective projection would exhibit that issue. Reading a bit on the 
>>>> subject, it seems there is a nice technique that uses the inverse of 
>>>> the z value for better depth buffer precision, but anyway that is 
>>>> irrelevant to the current conversation.
>>>>
>>>> I fully agree with Benjamin. The documentation could use some love.
>>>>
>>>> Cheers,
>>>> Daniel
>>>>
>>>
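
Also, in case it helps anyone else who hits the "Unable to share contexts" 
error quoted above: a rough fallback I've been considering (untested beyond 
my machine, so treat it as a sketch) is to try the 16-bit config first and 
drop back to the driver default if context creation fails:

import pyglet
from pyglet.gl import Config, ContextException
from pyglet.window import NoSuchConfigException

def make_window():
    # Try the 16-bit depth config first; fall back to the driver default
    # (usually a 24-bit depth buffer) if the context can't be created.
    try:
        config = Config(sample_buffers=1, depth_size=16, double_buffer=True)
        return pyglet.window.Window(config=config)
    except (ContextException, NoSuchConfigException):
        return pyglet.window.Window()

window = make_window()
pyglet.app.run()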
