I'm currently writing an iOS backend for a cross-platform program whose
platform-independent engine writes all of its graphics into 32-bit pixel
buffers, in RGBA order. The alpha byte isn't used; the graphics are always
opaque, so I don't need alpha blending.
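
For concreteness, this is how I think that layout maps onto Core Graphics
constants (the names below are just mine, nothing from the engine): each pixel
is four bytes in memory, R, G, B, then the ignored alpha byte.

#import <CoreGraphics/CoreGraphics.h>
#import <stdint.h>

// Hypothetical view of the engine's frame buffer: 32 bits per pixel,
// bytes in memory are R, G, B, then an unused alpha byte.
static const size_t kEngineWidth       = 320;
static const size_t kEngineHeight      = 240;
static const size_t kEngineBytesPerRow = kEngineWidth * 4;

// Core Graphics description of that layout: read R, G, B and skip the
// trailing (unused) alpha byte.
static const CGBitmapInfo kEngineBitmapInfo =
    (CGBitmapInfo)kCGImageAlphaNoneSkipLast | kCGBitmapByteOrder32Big;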

What is the most efficient way to draw and scale these pixel buffers into
my CGContextRef inside my drawRect: method? The pixel buffers are usually
only 320x240 pixels and need to be scaled up to completely fill my view's
dimensions, e.g. 1024x768 on non-Retina iPads and 2048x1536 on Retina iPads.
That's a lot of scaling work, so it's best done on the GPU. But how can I
get iOS to draw and scale on the GPU without using OpenGL?

I've tried CGContextDrawImage(), but it's really slow, probably because
everything is done on the CPU.
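
Roughly what I'm doing now, as a sketch; the class name and the pixels
property are placeholders for however the engine buffer reaches the view:

#import <UIKit/UIKit.h>

@interface EngineView : UIView
// Points at the engine's 320x240 RGBA buffer; the view does not own it.
@property (nonatomic) uint32_t *pixels;
@end

@implementation EngineView

- (void)drawRect:(CGRect)rect
{
    CGContextRef ctx = UIGraphicsGetCurrentContext();

    const size_t width = 320, height = 240, bytesPerRow = 320 * 4;

    // Wrap the raw buffer without copying it.
    CGDataProviderRef provider =
        CGDataProviderCreateWithData(NULL, self.pixels,
                                     bytesPerRow * height, NULL);

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGImageRef image = CGImageCreate(width, height,
                                     8,             // bits per component
                                     32,            // bits per pixel
                                     bytesPerRow,
                                     colorSpace,
                                     (CGBitmapInfo)kCGImageAlphaNoneSkipLast |
                                         kCGBitmapByteOrder32Big,
                                     provider,
                                     NULL,          // no decode array
                                     true,          // interpolate when scaling
                                     kCGRenderingIntentDefault);

    // Scale the 320x240 image up to the full view bounds; this is the call
    // that appears to run entirely on the CPU.
    CGContextDrawImage(ctx, self.bounds, image);

    CGImageRelease(image);
    CGColorSpaceRelease(colorSpace);
    CGDataProviderRelease(provider);
}

@end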

I've also had a look at the CIImage APIs because they are apparently
GPU-accelerated, but the problem is that CIImage objects are immutable, so
I'd have to create a new CIImage for every frame I draw, which would
probably kill performance as well.
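
To illustrate what I mean: since a CIImage can't be modified after creation,
every frame would need something like the sketch below (the helper name is
mine, and the CIContext that would eventually render the image isn't shown):

#import <CoreImage/CoreImage.h>

// Sketch of the per-frame CIImage churn I'd like to avoid. `pixels` is
// assumed to point at the engine's current 320x240 RGBA frame.
static CIImage *CIImageForCurrentFrame(const void *pixels)
{
    NSData *data = [NSData dataWithBytesNoCopy:(void *)pixels
                                        length:320 * 240 * 4
                                  freeWhenDone:NO];

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    // A brand-new immutable CIImage has to be built for every single frame.
    CIImage *image = [CIImage imageWithBitmapData:data
                                      bytesPerRow:320 * 4
                                             size:CGSizeMake(320, 240)
                                           format:kCIFormatRGBA8
                                       colorSpace:colorSpace];
    CGColorSpaceRelease(colorSpace);
    return image;
}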

I could of course go with OpenGL, but I'd first like some feedback on whether
there is an easier way to get what I want here.

Thanks for any ideas!

-- 
Best regards,
 Andreas Falkenhahn                          mailto:andr...@falkenhahn.com
