I think the main limitation here is that the pyglet.image.atlas.Allocator class uses a fairly simple packing algorithm. It works great for a bunch of 32x32 pixel sprites, but not so well for oblong images. In those cases it would be best to pre-pack the textures with another program, since pyglet doesn't (currently) repack textures depending on the dimensions of new textures coming in. (Or just make a TextureAtlas/Bin manually to suit the sizes; see the sketch below.)
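Something like this (untested, and the dimensions and file names are only placeholders) is what I mean by making a bin to suit the sizes:

    import pyglet
    from pyglet.image.atlas import TextureBin

    # Textures can only be created once a GL context exists; a hidden
    # window is enough for this example.
    window = pyglet.window.Window(visible=False)

    # A bin sized for wide, oblong images instead of the square default.
    oblong_bin = TextureBin(texture_width=2048, texture_height=512)

    names = ['grass_tiles.png', 'rock_tiles.png', 'water_tiles.png']
    regions = [oblong_bin.add(pyglet.image.load(name)) for name in names]
    # Each add() returns a TextureRegion, and images added this way can
    # share a single underlying texture instead of each getting their own.

You can then hand those regions to Sprites or batches just like any other image.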
If we go with 2048x2048 as the new default, and considering the simple packing algorithm, 1024 seems OK to me as the maximum size for inclusion. I've also added a query of the new pyglet.image.atlas.get_max_texture_size() function to the resource module, so that the limit will default to 1024, or the max size // 2. I think that everything should scale down smoothly without crashing on old cards, even though we've increased the default size by 8x. (A rough sketch of this inclusion check is at the bottom of this mail, below the quoted thread.)

About power-of-two textures: pyglet doesn't place any restrictions on that. If you make your own Textures or TextureAtlas/Bins, you can (try to) request whatever you want. It just happens that pyglet internally defaults to POT textures for things like the resource module, because it makes sense there.

On Thursday, February 9, 2017 at 5:32:49 AM UTC+9, Charles wrote:
>
> After thinking about this I can see the need for a limit; it doesn't make sense to blit a texture that's 2000x2000 onto a 2048x2048. Just a waste of IO and blitting time. I'm unsure on what might be best here, but I know that 25% is too little, and 50% minimum (of pixels) I think is a good baseline.
>
> The only reason I mention this is that a lot of frameworks support and recommend some sort of external texture packing, but even with packed images you can still get benefits.
>
> Here is an example: when I pack my sprites using another program, I specify the maximum width of the images to be 1024. Now this image can handle a couple hundred sprites. Much more preferable to loading 80 different sprites through I/O and blitting them to a texture. Even so, you can still get benefits from already-packed sprites. If I have 3 different images that are 1024x256 (one contains grass-ground tiles, one contains rock-ground tiles, the last my water tiles), they could all cleanly fit in one bigger texture. It wouldn't make sense to combine them externally since they are different sets and not all maps would use them. Internally it would, since all of the ground gets drawn at the same time.
>
> This just may be wildly specific to me, though. However, I think people looking to improve load times may already be packing externally (as loading many images can take a while, and using threads to load images through pyglet is not thread safe), and this could affect their performance.
>
> On Wednesday, February 8, 2017 at 2:07:42 PM UTC-6, Charles wrote:
>>
>> I agree it's a simple placeholder rule for now until a better solution is created (especially since anyone can override this rule manually). To expand on this, I would say the minimum-size inclusion really isn't a good system if it just goes off of width or height.
>>
>> Whether it's 512x512, 1024x256, or 2048x128, you are still at the same amount of pixels. Limiting it to a set square size just discriminates against the kinds of images you can pack. Not sure of the best solution; possibly go off of pixel count instead? Maybe something to look into is how some other frameworks handle this?
>>
>> My question is why put a limit on the size of objects that can be packed in the first place (other than the maximum the texture allows)?
>>
>> On Wednesday, February 8, 2017 at 4:12:02 AM UTC-6, Benjamin Moran wrote:
>>>
>>> I made some additional changes, and I think it makes sense now. There is a new _get_max_texture_size() function inside the atlas module, which both TextureAtlas and TextureBin validate against. If you request an impossible size, you will get back the biggest size possible instead of crashing. I also changed the default width/height to 2048, after much searching and after seeing this:
>>> http://feedback.wildfiregames.com/report/opengl/feature/GL_MAX_TEXTURE_SIZE
>>> It looks like 99% of people have cards that can provide 2048, so this seemed reasonable. OpenGL 3.0 requires this size as well, so it seems a good choice.
>>>
>>> The resource module now defaults to 512x512 as the maximum size for inclusion in the default TextureBin. Perhaps 1024 would also make sense. I'll have to consider that some more.
>>>
>>> -Ben
>>>
>>> On Wednesday, February 8, 2017 at 11:29:54 AM UTC+9, Benjamin Moran wrote:
>>>>
>>>> Hi Charles, you're right about that. I've been doing some tests with 4096x4096 atlases, and I've also increased those values in my tests out of necessity. Currently, it's excluding images that are 1/2 of the atlas size. That seems like a reasonable rule to keep for now. Long term, I think it would be great to enhance the pyglet.image.atlas.Allocator class that is used by the TextureAtlases. This is a nice little bit of pure-Python code that could be improved by someone who likes math :)
>>>>
>>>> I've actually just had a thought, and I think it may make sense to put the MAX_TEXTURE_SIZE checks into the TextureAtlas class itself. It could default to 4096x4096, with an internal check to confirm what it's capable of. If you can't get the atlas size you requested, it will simply return whatever smaller atlas it's capable of (while possibly making a call to the warnings module). Having the code in there would mean that you can create a TextureAtlas without having to think about the size. This could also mean that your code may not crash if run on an older GPU. I'm going to explore this idea.
>>>>
>>>> On Wednesday, February 8, 2017 at 11:05:36 AM UTC+9, Charles wrote:
>>>>>
>>>>> Another consideration I saw when I was looking at the requirements is the 128 width OR height limit in the texture loading with a 256-sized Bin.
>>>>>
>>>>> If I load two 128x256 images, they would both fit in a 256x256 image, but in the code, since it's > 128, they would each be given their own texture ID instead of being put into one.
>>>>>
>>>>> Maybe there need to be smarter packing checks?
>>>>>
>>>>> On Monday, January 30, 2017 at 1:26:16 AM UTC-6, Benjamin Moran wrote:
>>>>>>
>>>>>> Hi guys,
>>>>>>
>>>>>> Thinking about this issue on the tracker:
>>>>>> https://bitbucket.org/pyglet/pyglet/issues/107/texture-atlas-sizes-too-small-for-most-gfx
>>>>>> Currently, the default texture size for a TextureAtlas is 256x256xRGBA. The pyglet.resource module only places images into a TextureAtlas if they are smaller than 128x128.
>>>>>>
>>>>>> Even for older cards, these numbers are very small. We should probably bump this up so that it better fits current resolutions. This could cause issues on some older laptop hardware, but maybe it can fall back to a lower resolution if it throws an error. Doing a query for the maximum texture size probably wouldn't work, since modern cards can support some ridiculously large textures. Perhaps we can do the following:
>>>>>>
>>>>>> 1. Raise the default size to something like 1024x1024 or 2048x2048 (or?), and if there is a texture creation exception, fall back to 512x512.
>>>>>> 2. Bump the minimum pyglet.resource size for adding to an Atlas to 256x256 (or?).
>>>>>>
>>>>>> Any thoughts?
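As mentioned above, the inclusion check boils down to roughly the following. This is a simplified sketch, not the exact code in the resource module; the min() combination is just my reading of "1024, or the max size // 2", and the helper name is only for illustration:

    import pyglet
    from pyglet.image import atlas

    # get_max_texture_size() queries GL_MAX_TEXTURE_SIZE, so a GL context
    # must already exist; a hidden window is enough here.
    window = pyglet.window.Window(visible=False)

    # Cap the "pack into the shared TextureBin" size at 1024, or at half
    # the maximum texture size on cards that can't manage that.
    max_allocation_size = min(1024, atlas.get_max_texture_size() // 2)

    def fits_in_atlas(image):
        # Simplified stand-in for the check in pyglet.resource: anything
        # larger than this in either dimension gets its own texture.
        return (image.width <= max_allocation_size and
                image.height <= max_allocation_size)

That way a card that only supports 1024x1024 textures would still end up with a sensible 512 inclusion limit instead of crashing.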
