Hello everyone, and please forgive a newcomer's question. I need to observe the visible contents of a UIWebView rendered in my iOS application, and each time the visible contents change, extract them and transmit them to our server. A bit like screen sharing on the Mac, but not for the whole screen, just a single view.
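For context, this is roughly how I grab the visible region today (just a sketch; `captureVisibleRegion` is my own helper name, not an API, and I'm assuming a plain on-screen UIWebView):

```swift
import UIKit

// Sketch: snapshot only the currently visible region of a web view.
// Rendering the view's layer draws what is on screen, clipped to the
// view's bounds, so scrolled-away content is excluded.
func captureVisibleRegion(of webView: UIWebView) -> UIImage? {
    UIGraphicsBeginImageContextWithOptions(webView.bounds.size,
                                           true,  // opaque
                                           0)     // use the screen's scale
    defer { UIGraphicsEndImageContext() }
    guard let context = UIGraphicsGetCurrentContext() else { return nil }
    webView.layer.render(in: context)
    return UIGraphicsGetImageFromCurrentImageContext()
}
```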
I started looking for the right delegate call, and found the differences between macOS and iOS confusing. I do NOT need to re-transmit just because the user rotated his iOS device and the view rotated with it, but I DO want to re-transmit when the user pinches to zoom in and out, or scrolls the view. I do not want the whole contents transmitted, just the visible part. The contents of this UIWebView might have subviews I'm not aware of, and it may be rendering a URL to some file (PDF or other).

I already know how to extract the image buffer from the CALayer's context, and I can set up a timer to do this N times a second. However, I'd like to avoid transmitting anything while the view is still, and I want to avoid frame-differencing 30 times a second to detect that. The "right" way, for me, would be to get notified whenever the UIWebView decides to re-display (or redraw?) its contents after some manipulation (drawing, zooming, rotating, animating, etc.).

Can anyone hint at where to start?

Thanks,
Motti Shneor

e-mail: motti.shn...@gmail.com
phone: +972-8-9267730
mobile: +972-54-3136621
----------------------------------------
Ceterum censeo Microsoftinem delendam esse
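P.S. In case it helps frame the question: the closest hook I've found so far is that UIWebView exposes its underlying UIScrollView (the `scrollView` property, since iOS 5), so scrolling and zooming can be watched via KVO on `contentOffset` and `zoomScale`. That still misses pure content redraws, which is the part I'm asking about. A sketch of the KVO side, with the caveat that KVO on these scroll-view properties is a widely used but not formally documented technique, and the class and method names below are mine:

```swift
import UIKit

// Sketch: observe the UIWebView's scroll view for scroll/zoom changes,
// coalescing bursts of events so we only re-capture once the view settles.
final class WebViewChangeObserver: NSObject {
    private let scrollView: UIScrollView
    private let onChange: () -> Void
    private var offsetObservation: NSKeyValueObservation?
    private var zoomObservation: NSKeyValueObservation?

    init(webView: UIWebView, onChange: @escaping () -> Void) {
        self.scrollView = webView.scrollView
        self.onChange = onChange
        super.init()
        // KVO on contentOffset fires on every scroll tick; on zoomScale
        // on every pinch update. Both funnel into one coalesced capture.
        offsetObservation = scrollView.observe(\.contentOffset) { [weak self] _, _ in
            self?.scheduleCapture()
        }
        zoomObservation = scrollView.observe(\.zoomScale) { [weak self] _, _ in
            self?.scheduleCapture()
        }
    }

    // Cancel any pending capture and reschedule, so a burst of scroll
    // events yields a single callback shortly after the view goes still.
    private func scheduleCapture() {
        NSObject.cancelPreviousPerformRequests(withTarget: self,
                                               selector: #selector(fire),
                                               object: nil)
        perform(#selector(fire), with: nil, afterDelay: 0.1)
    }

    @objc private func fire() { onChange() }
}
```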