On Sun, Mar 15, 2015, at 02:22 AM, Patrick J. Collins wrote:
> My next approach was to save my drawn waveform to an NSImage and use
> that as a background for my view...  If you have a better suggestion for
> how I could handle this, I'd love to hear it.

This is a good idea. It would be better yet if the waveform were
rendered in a view that returns YES from -canDrawConcurrently, but you
have to be careful to synchronize access to the audio data between the
view's background drawing thread and the thread that's generating it.
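
Something like this minimal sketch, assuming a hypothetical
WaveformView that owns the sample buffer (all the names here are mine,
not from your project):

    #import <Cocoa/Cocoa.h>

    // Hypothetical view that opts in to threaded drawing. AppKit may
    // then call -drawRect: on a background thread, so the sample buffer
    // shared with the audio-producing thread is guarded with a lock.
    @interface WaveformView : NSView
    - (void)updateSamples:(NSData *)samples; // called from the audio thread
    @end

    @implementation WaveformView {
        NSLock *_sampleLock;  // guards _samples
        NSData *_samples;     // written by the audio thread, read while drawing
    }

    - (instancetype)initWithFrame:(NSRect)frameRect {
        if ((self = [super initWithFrame:frameRect])) {
            _sampleLock = [[NSLock alloc] init];
            [self setCanDrawConcurrently:YES]; // allow background -drawRect:
        }
        return self;
    }

    - (void)updateSamples:(NSData *)samples {
        [_sampleLock lock];
        _samples = [samples copy];
        [_sampleLock unlock];

        // Invalidation still has to happen on the main thread.
        dispatch_async(dispatch_get_main_queue(), ^{
            [self setNeedsDisplay:YES];
        });
    }

    - (void)drawRect:(NSRect)dirtyRect {
        // Snapshot the buffer so the audio thread can't swap it
        // mid-draw; NSData is immutable once we hold our own reference.
        [_sampleLock lock];
        NSData *samples = _samples;
        [_sampleLock unlock];
        if (samples.length == 0) return;

        // ... render `samples` (or a cached NSImage of them) into dirtyRect ...
    }

    @end

The window's -allowsConcurrentViewDrawing defaults to YES, so setting
the view's flag is usually all it takes to opt in.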

> And to answer your question, the selection would be selecting frames of
> audio, so that when it is played, the playhead starts there and ends at
> the end of the rectangle.

The problem with an XOR-based selection highlight is that it assumes
everything draws into the same bitmap context. To get an XOR effect to
work correctly over an arbitrary region of a window, you would need to
read back the flattened contents of the view hierarchy, invert them,
and write them back out. In modern GUI environments (and especially
with Core Animation layer-backed views), not everything renders into
the same graphics context; views may render into separate bitmaps and
let the window manager composite them together.

Instead, as you drag a selection across your waveform, you could note
the selection region and send -setNeedsDisplayInRect: to your view, and
have -drawRect: invert the colors of the waveform image wherever the
dirty rect intersects the selection region.
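
Something along these lines, continuing the hypothetical WaveformView
above, with _selectionRect (NSRect) and _waveformImage (NSImage *) as
assumed ivars; NSCompositeDifference requires 10.10:

    - (void)mouseDragged:(NSEvent *)event {
        // Selection origin was set in -mouseDown: (omitted here); a real
        // implementation would also normalize a leftward drag.
        NSPoint p = [self convertPoint:[event locationInWindow] fromView:nil];
        NSRect oldSelection = _selectionRect;
        _selectionRect.size.width = p.x - NSMinX(_selectionRect);

        // Invalidate only what changed: the union of old and new selections.
        [self setNeedsDisplayInRect:NSUnionRect(oldSelection, _selectionRect)];
    }

    - (void)drawRect:(NSRect)dirtyRect {
        // Draw the cached waveform image first.
        [_waveformImage drawInRect:self.bounds
                          fromRect:NSZeroRect
                         operation:NSCompositeSourceOver
                          fraction:1.0];

        // Then invert just where the selection overlaps the dirty area.
        // Filling with white in the Difference blend mode flips the
        // colors underneath, which reads as an inverted selection with
        // no pixel read-back.
        NSRect invertRect = NSIntersectionRect(_selectionRect, dirtyRect);
        if (!NSIsEmptyRect(invertRect)) {
            [[NSColor whiteColor] set];
            NSRectFillUsingOperation(invertRect, NSCompositeDifference);
        }
    }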

--Kyle Sluder