The C++ code example appears to be using multitexture. Multitexture support is in the pyglet trunk, thanks to Alex, as a vertex list attribute. I'm not sure what you're ultimately trying to do, but multitexturing might work for you, depending on your goal. Of course, the resulting texture will still not have alpha preserved - and that may be why nobody suggested multitexture as an option for you. I posted a crude multitexturing example here earlier that works in pyglet 1.1.
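
Roughly, the immediate-mode version looks like the sketch below - raw GL calls through pyglet.gl rather than the vertex list attribute. The image file names are placeholders, it assumes power-of-two images so the plain 0..1 texture range is valid, and it assumes your drivers expose the GL 1.3 multitexture entry points:

from pyglet import window, image, app
from pyglet.gl import *

win = window.Window()

# Placeholder images - any two same-sized, power-of-two textures will do.
base = image.load('base.png').get_texture()
detail = image.load('detail.png').get_texture()

@win.event
def on_draw():
    win.clear()

    # Bind one texture to each texture unit.
    glActiveTexture(GL_TEXTURE0)
    glEnable(base.target)
    glBindTexture(base.target, base.id)
    glActiveTexture(GL_TEXTURE1)
    glEnable(detail.target)
    glBindTexture(detail.target, detail.id)
    glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE)

    # One quad, with texture coordinates supplied for both units.
    w, h = base.width, base.height
    glBegin(GL_QUADS)
    for s, t, x, y in ((0, 0, 0, 0), (1, 0, w, 0), (1, 1, w, h), (0, 1, 0, h)):
        glMultiTexCoord2f(GL_TEXTURE0, s, t)
        glMultiTexCoord2f(GL_TEXTURE1, s, t)
        glVertex2f(x, y)
    glEnd()

    # Turn unit 1 back off so pyglet's normal drawing isn't affected.
    glDisable(detail.target)
    glActiveTexture(GL_TEXTURE0)

app.run()
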
With regards to your distorted pink square: you are blitting the pink square first, one time, before the window is ever cleared - then you enter your app loop, and every time on_draw is called you blit the green square, but you are not clearing the window. Every time the loop runs the buffers are flipped, but the pink square is only in one of them (along with who knows what else). If you place a 'screen.clear()' line before the green_square.blit line, the result is as expected - the pink square is blitted and then almost instantly covered by the green one, since the pink square is never drawn again.
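
For reference, here is your example again with just the clear() added (and a couple of comments) - a quick sketch of the fix, nothing else changed:

from pyglet import *

the_color_pink = image.SolidColorImagePattern( ( 255, 0, 255, 255 ) )
the_color_green = image.SolidColorImagePattern( ( 0, 255, 0, 255 ) )

pink_square = the_color_pink.create_image( 100, 100 )
green_square = the_color_green.create_image( 100, 100 )

screen = window.Window()

@screen.event
def on_draw():
    screen.clear()                # clear the back buffer every frame
    green_square.blit( 0, 0 )
    pass

pink_square.blit( 0, 0 )          # drawn once, then covered on the first on_draw

app.run()
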
Pyglet excels at giving easy, stable, cross-platform access to OpenGL. If that were all it was, it would be great, but it has also grown nifty things like graphics batches, sprites, etc. All of these are Python-based but ultimately use OpenGL to make things happen - just in a way that is pythonic and surprisingly speedy. When you want something that is outside the scope of OpenGL, you are faced with using other modules (introducing dependencies) or using pure Python (probably slow).

Collision detection is an example. While there may be tricks to get OpenGL to help with collision detection, it really boils down to a task that requires boatloads of comparisons. This places a significant limitation on how many sprites you can animate if you want to detect collisions across all of them. Adding collision detection as a feature in pyglet would require introducing a dependency, and not one like OpenGL that is nigh-universal. But there happens to be a module called pymunk, which uses ctypes to access Chipmunk physics, and with which a pyglet app can do collision detection and animation of hundreds of sprites. It might be very useful to many people. Alex could decide to integrate support for Chipmunk physics into pyglet, but that would radically change what pyglet is. On the other hand, nothing stops me from doing that myself.

-price

On Sep 8, 4:55 am, 3TATUK <[EMAIL PROTECTED]> wrote:
> :) Thanks for the reply Alex!
>
> Nice to see there's actually some code for mode changing going and
> it's on a wishlist. I just thought you were very inflexible as to the
> whole idea when, from my perspective, and not only to myself, the idea
> is rather significant.
>
> Please to add alpha blit_into()'ing to the wishlist? ^_^
>
> And as far as working directly with framebuffers.. it's not quite as
> easy as I thought.
>
> First off, I don't want to create off-screen FBO's because then the
> framebuffer extension will be required. So that leaves working with
> the current active framebuffer as a temporary 'drawing board' to blit
> to (with alpha) then save from..
>
> The issue that pops up right away with that is what if the destination
> texture is bigger than the window size? My 'solution' is to 'simply'
> work in 'pieces' which are the size of the window, then blit piece-by-
> piece into a new resultant texture..
>
> Obviously this can potentially still require lots of blit_into() calls
> for a single blit_into() call with desired alpha. Slow.
>
> And that's also not the only issue with using the active framebuffer
> as a temp board for alpha blitting..
>
> I've already started to try doing this 'piece construction' but
> something strange happens when blitting textures outside of the
> context of @window.event or the function specified by scheduling... The
> texture gets like misplaced and distorted and stretched out? So it's
> definitely not too feasible.
>
> But on the other hand.. there _has_ to be some lower-level OpenGL
> method of blitting one texture to another with alpha. I've even seen
> the C/++ code for it and tried with pyglet but get that same
> distortion result.
>
> Here's the C/++ code:
> http://www.gamedev.net/community/forums/topic.asp?topic_id=372568&whi...
>
> And here's my simplified distortion example:
>
> from pyglet import *
>
> the_color_pink = image.SolidColorImagePattern( ( 255, 0, 255, 255 ) )
> the_color_green = image.SolidColorImagePattern( ( 0, 255, 0, 255 ) )
>
> pink_square = the_color_pink.create_image( 100, 100 )
> green_square = the_color_green.create_image( 100, 100 )
>
> screen = window.Window()
>
> @screen.event
> def on_draw():
>     green_square.blit( 0, 0 )
>     pass
>
> pink_square.blit( 0, 0 )
>
> app.run()
>
> # Notice both green_square and pink_square are 100x100 pixels but the
> # pink one is 'distorted', whereas the green box turns out okay
>
> # Thank you for your work on Pyglet, Alex!
>
> On Sep 8, 2:35 am, "Alex Holkner" <[EMAIL PROTECTED]> wrote:
>
> > On Mon, Sep 8, 2008 at 4:37 PM, Brian Fisher <[EMAIL PROTECTED]> wrote:
>
> > > If you really want this stuff, I would suggest you go start writing it.
> > > Resolution changing isn't that hard really, there is plenty of source for
> > > it, SDL has it for tons of platforms, and ctypes makes it easy to call
> > > platform stuff from python.
>
> > If you're interested in monitor resolution switching, there's (mostly)
> > working code in trunk/experimental/modeswitch. The code there even
> > handles segfaults cleanly on Linux. It's not yet stable enough for
> > inclusion into pyglet, nor am I willing to support the code that's
> > there in its current state.
>
> > > Customized texture to texture blitting is
> > > actually even easier to write as long as it can go super slow.
>
> > Well, it's impossible at the texture level. If the texture you're
> > blitting to is also in memory (or you're willing to pull it out of
> > graphics memory), you can use PIL or another graphics toolkit to
> > recompose the image, and then upload it again. There's not much
> > functionality pyglet could grow here, unless it's growing a complete
> > image toolkit (which would be great! but infeasible in Python in my
> > opinion). As I suggested to the OP, it's usually much faster/easier
> > to composite on the framebuffer instead.
>
> > See also the wishlist at
> > http://code.google.com/p/pyglet/wiki/ReleaseSchedule if you haven't
> > already; I collect requests/ideas here.
>
> > Alex.
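
P.S. The PIL route Alex describes above would look roughly like this - a sketch only, not benchmarked, and it starts from image files rather than pulling an existing texture back out of video memory; 'base.png' and 'overlay.png' are placeholder names:

import Image                             # classic PIL; newer installs use: from PIL import Image
from pyglet import image

# Composite two RGBA images in main memory, using the overlay's alpha.
base = Image.open('base.png').convert('RGBA')
overlay = Image.open('overlay.png').convert('RGBA')
base.paste(overlay, (10, 10), overlay)   # third argument is the mask - here the overlay's own alpha

# Hand the result back to pyglet and upload it as a texture.
# PIL rows run top-to-bottom, so tell pyglet with a negative pitch.
w, h = base.size
data = base.tostring()                   # tobytes() on newer Pillow versions
composited = image.ImageData(w, h, 'RGBA', data, pitch=-w * 4)
texture = composited.get_texture()       # upload to the card; blit as usual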
