I agree it's a simple placeholder rule for now until a better solution is 
created (especially since anyone can override this rule manually). To 
expand on this, I would say the size limit for inclusion really isn't a good 
system if it only goes off of width or height.

Whether it's 512x512, 1024x256, or 2048x128, you are still at the same 
number of pixels. Limiting it to a set square size just discriminates 
against the kinds of images you can pack. I'm not sure of the best 
solution; possibly go off of pixel count instead? Maybe something to look 
into is how some other frameworks handle this?
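
For illustration, here is a minimal sketch of what an area-based check 
might look like (the threshold value and the helper name are just 
assumptions for the example, not anything pyglet currently does):

    # Hypothetical: decide atlas inclusion by pixel count rather than by
    # width or height alone. 512x512, 1024x256, and 2048x128 each contain
    # 262,144 pixels, so all three would be treated the same.
    MAX_ATLAS_IMAGE_PIXELS = 512 * 512

    def fits_in_atlas(width, height, atlas_size=2048):
        # A real check would still need each dimension to fit in the atlas.
        return (width <= atlas_size and height <= atlas_size
                and width * height <= MAX_ATLAS_IMAGE_PIXELS)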

My question is why put a limit on the size of objects that can be packed in 
the first place (other than the maximum the texture allows)? 

On Wednesday, February 8, 2017 at 4:12:02 AM UTC-6, Benjamin Moran wrote:
>
> I made some additional changes, and I think it makes sense now. There is a 
> new _get_max_texture_size() function inside the atlas module, which both 
> TextureAtlas and TextureBin validate against. If you request an impossible 
> size, you will get back the biggest size possible instead of crashing. I 
> also changed the default width/height to 2048, after much searching and 
> after seeing this:  
> http://feedback.wildfiregames.com/report/opengl/feature/GL_MAX_TEXTURE_SIZE
> It looks like 99% of people have cards that can provide 2048, so this 
> seemed reasonable. OpenGL 3.0 requires this size as well, so it seems a 
> good choice.
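>
> For reference, the driver query itself is cheap. Here's a minimal sketch 
> of the kind of check involved (it assumes an active GL context, and the 
> exact function in the atlas module may look different):
>
>     from ctypes import byref, c_int
>     from pyglet import gl
>
>     def _get_max_texture_size():
>         # Ask the driver for the largest texture dimension it supports.
>         # Requires an active GL context (e.g. after creating a Window).
>         size = c_int()
>         gl.glGetIntegerv(gl.GL_MAX_TEXTURE_SIZE, byref(size))
>         return size.value
>
>     def _clamp_size(requested):
>         # Give back the biggest size the card can actually provide.
>         return min(requested, _get_max_texture_size())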
>
> The resource module now defaults to 512x512 as the maximum size for 
> inclusion in the default TextureBin. Perhaps 1024 would also make sense. 
> I'll have to consider that some more. 
>
> -Ben
>
>
> On Wednesday, February 8, 2017 at 11:29:54 AM UTC+9, Benjamin Moran wrote:
>>
>> Hi Charles, you're right about that. I've been doing some tests with 
>> 4096x4096 atlases, and I've also increased those values in my tests out of 
>> necessity. Currently, it's excluding images that are 1/2 of the atlas size. 
>> That seems like a reasonable rule to keep for now. Long term, I think it 
>> would be great to enhance the pyglet.image.atlas.Allocator class that is 
>> used by the TextureAtlases. This is a nice little bit of pure-Python code 
>> that could be improved by someone who likes math :)
>>
>> I've actually just had a thought, and I think it may make sense to put 
>> the MAX_TEXTURE_SIZE checks into the TextureAtlas class itself. It could 
>> default to 4096x4096, with an internal check to confirm what it's capable 
>> of. If you can't get the atlas size you requested, it will simply return 
>> whatever smaller atlas it's capable of (while possibly making a call to the 
>> warnings module). Having the code in there would mean that you can create a 
>> TextureAtlas without having to think about the size. This could also mean 
>> that your code may not crash if run on an older GPU. I'm going to explore 
>> this idea. 
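>>
>> Roughly the kind of thing I have in mind, as a sketch only (the class 
>> name and layout here are placeholders, not what the pyglet change will 
>> necessarily look like):
>>
>>     import warnings
>>     from ctypes import byref, c_int
>>
>>     from pyglet import gl
>>     from pyglet.image import Texture
>>     from pyglet.image.atlas import Allocator
>>
>>     class SizeCheckedAtlas:
>>         """Sketch of an atlas that clamps its size to the driver limit."""
>>
>>         def __init__(self, width=4096, height=4096):
>>             # Needs an active GL context, since we query the driver and
>>             # create a texture here.
>>             limit = c_int()
>>             gl.glGetIntegerv(gl.GL_MAX_TEXTURE_SIZE, byref(limit))
>>             if width > limit.value or height > limit.value:
>>                 # Fall back to the largest supported size, and say so.
>>                 warnings.warn("Atlas size %dx%d exceeds GL_MAX_TEXTURE_SIZE "
>>                               "(%d); clamping." % (width, height, limit.value))
>>                 width = min(width, limit.value)
>>                 height = min(height, limit.value)
>>             self.texture = Texture.create(width, height)
>>             self.allocator = Allocator(width, height)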
>>
>>
>> On Wednesday, February 8, 2017 at 11:05:36 AM UTC+9, Charles wrote:
>>>
>>> Another consideration I saw when looking at the requirements is the 
>>> 128-pixel width OR height limit in texture loading, with a 256-sized Bin.
>>>
>>> If I load two 128x256 images, they would both fit in a 256x256 image, 
>>> but in the current code, since they are > 128, they would each be given 
>>> their own texture ID instead of being packed into one.
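>>>
>>> To illustrate with the pure-Python allocator itself (the positions in 
>>> the comments are just what I'd expect from the current packing):
>>>
>>>     from pyglet.image.atlas import Allocator
>>>
>>>     # No GL context needed; the Allocator is plain Python geometry.
>>>     allocator = Allocator(256, 256)
>>>     print(allocator.alloc(128, 256))  # e.g. (0, 0)
>>>     print(allocator.alloc(128, 256))  # e.g. (128, 0) -- both fit in one 256x256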
>>>
>>> Maybe there needs to be smarter packing checks?
>>>
>>>
>>> On Monday, January 30, 2017 at 1:26:16 AM UTC-6, Benjamin Moran wrote:
>>>>
>>>> Hi guys,
>>>>
>>>> Thinking about this issue on the tracker: 
>>>> https://bitbucket.org/pyglet/pyglet/issues/107/texture-atlas-sizes-too-small-for-most-gfx
>>>> Currently, the default texture size for a TextureAtlas is 
>>>> 256x256xRGBA.  The pyglet.resource module only places images into a 
>>>> TextureAtlas if they're smaller than 128x128.
>>>>
>>>> Even for older cards, these numbers are very small. We should probably 
>>>> bump this up so that it better fits current resolutions. This could cause 
>>>> issues on some older laptop hardware, but maybe it can fall back to a 
>>>> lower resolution if it throws an error. Doing a query for the maximum 
>>>> texture size probably wouldn't work, since modern cards can support some 
>>>> ridiculously large textures. Perhaps we can do the following:
>>>>
>>>> 1. Raise the default size to something like 1024x1024 or 2048x2048 
>>>> (or?), and if there is a texture creation exception, fall back to 
>>>> 512x512 (a rough sketch of this fallback follows below).
>>>> 2. Bump the maximum pyglet.resource size for adding to an Atlas to 
>>>> 256x256 (or?).
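>>>>
>>>> A rough sketch of the fallback in point 1 (the function name is a 
>>>> placeholder, and real code would catch the specific GL error pyglet 
>>>> raises rather than a bare Exception):
>>>>
>>>>     from pyglet.image.atlas import TextureAtlas
>>>>
>>>>     def create_atlas_with_fallback(size=2048, minimum=512):
>>>>         # Needs an active GL context; try the preferred size first and
>>>>         # halve it until texture creation succeeds.
>>>>         while size > minimum:
>>>>             try:
>>>>                 return TextureAtlas(size, size)
>>>>             except Exception:
>>>>                 size //= 2
>>>>         return TextureAtlas(minimum, minimum)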
>>>>
>>>> Any thoughts?
>>>>
>>>
