Re: On NSIncrementalStore UUID Uniqueness

2017-01-15 Thread Daryle Walker

> On Jan 14, 2017, at 2:32 PM, Jens Alfke wrote:
> 
> 
>> On Jan 14, 2017, at 2:41 AM, Daryle Walker wrote:
>> 
>>  I’m seemingly stuck since the data format doesn’t have a UUID field within 
>> it and I can’t base a UUID off of a hash of the file since it would change 
>> after each edit.
> 
> There’s really no way to store any custom metadata in the file? I assume it’s 
> some sort of database-like file (since it can be used to store CoreData 
> objects), so couldn’t you create an extra record and store a UUID in it?

No, my file format is straight-up dumb data.

I’ve read for years that Core Data can support custom storage formats. Looking 
into it, I see that there are caveats. My first thought experiment, e-mail 
messages, was stymied by each non-primitive data block needing a database-ish 
ID. My second thought experiment, mbox files, is now stymied by the file as a 
whole needing a database-ish ID too. (Since mbox files can be multi-gigabyte, 
I’d make their loading read-only, letting me use each record’s byte offset as 
the base for an ID.) These IDs need to be consistently derivable; randomly 
generated IDs are no good.

If I continue this idea, I’ll stick in a constant UUID and hope Core Data 
doesn’t really need universally-unique IDs for all potential stores.
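
Concretely, something like this is the sketch I have in mind (the class name 
and the constant are just placeholders): an NSIncrementalStore subclass has to 
report NSStoreTypeKey and NSStoreUUIDKey from loadMetadata(), and nothing 
obviously stops that UUID from being a constant.

import CoreData

// Sketch only; the class name and UUID value are placeholders.
final class DumbDataStore: NSIncrementalStore {

    static let storeType = "DumbDataStore"
    // The same UUID for every store instance; this is the part I'm hoping
    // Core Data will tolerate.
    static let fixedUUID = "00000000-0000-0000-0000-000000000000"

    override func loadMetadata() throws {
        metadata = [
            NSStoreTypeKey: DumbDataStore.storeType,
            NSStoreUUIDKey: DumbDataStore.fixedUUID
        ]
    }
}

(The store class would still need to be registered with 
NSPersistentStoreCoordinator before use, of course.)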

— 
Daryle Walker
Mac, Internet, and Video Game Junkie
darylew AT mac DOT com 


Re: making a video from still frames that don't change that often plus audio

2017-01-15 Thread davelist

> On Jan 15, 2017, at 3:12 PM, Quincey Morris wrote:
> 
> On Jan 15, 2017, at 09:22 , davel...@mac.com wrote:
>> 
>> I have an iOS presentation app 
>> (https://itunes.apple.com/app/redraw/id1114820588?mt=8) that I currently 
>> make videos from by AirPlaying it to my Mac and using Screenflow on the Mac 
>> to show the iPad screen and record my audio from a microphone (and then 
>> edit). I'd like to build this functionality into my app directly
> 
> AVFoundation doesn’t seem to be able to capture screen video on iOS; 
> AVCaptureScreenInput is documented as Mac-only. That would rule out 
> AVFoundation for the basic video capture within your app. You might be able 
> to capture a series of screenshots, but it has to be done in real time, and 
> that’s going to be tricky to get right on iOS, where you’ll need to buffer 
> the captured images to storage that might not be fast enough.
> 
> If you mean you want to write a companion Mac app, then I guess you can use 
> AVCaptureScreenInput to capture the raw video, and then you could use 
> AVAssetWriter to export your final, composed video. However, AVAssetWriter is 
> *not* a real-time function, so you couldn’t rely on it keeping up if you 
> tried to export as the user interleaves the still images with the raw video. 
> What you’d need to do is add a playback/edit phase, where you played the raw 
> video, captured the timing of the user edits (letting the playback skip 
> frames if the edits held up the playback), then export the “composition” when 
> the user is done. (Or, you could export in the background *during* editing, 
> which would mean it would be done soon after the user finishes, but this may 
> have adverse effects on playback on a lower-end Mac.)
> 
> AVCaptureScreenInput does let you choose the screen, though.
> 
> FWIW, since I’m not sure I properly understood exactly what solution you’re 
> looking for.

I'm talking about doing this on the iPad (not with a separate Mac app). I know 
the only option for recording the screen itself is ReplayKit. I don't really 
need to record the screen, though. I want to write a video in real time that 
consists of the audio from the microphone and an image that changes 
periodically (that image happens to be shown in a UIImageView on the second 
screen of my app). So given that AVAssetWriter is not real time, I think my 
best option (if I want to do all the work on the iPad without a separate Mac 
app) is to use ReplayKit.

The other option would be to let the user navigate to a specific image on the 
screen, record audio for that image, navigate to the next image, record audio 
for that image, and so on. Then I could probably use AVAssetWriter to write 
that first audio segment and the fixed image, then the next audio segment and 
the next fixed image, etc.
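
For the still-image side, this is roughly the shape I have in mind (just a 
sketch; the size settings and the pixel-buffer source are placeholders). 
Reusing one CVPixelBuffer and appending it at successive presentation times 
lets a single image cover an arbitrary stretch of video without holding 
anything extra in memory:

import AVFoundation
import CoreVideo

// Sketch: write one still image as `seconds` worth of video frames.
// The microphone audio would go into a second AVAssetWriterInput on the
// same writer.
func writeStillSegment(to url: URL,
                       pixelBuffer: CVPixelBuffer,
                       seconds: Double,
                       fps: Int32 = 30) throws {
    let writer = try AVAssetWriter(outputURL: url, fileType: .mov)
    let settings: [String: Any] = [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: 1024,     // placeholder size
        AVVideoHeightKey: 768
    ]
    let videoInput = AVAssetWriterInput(mediaType: .video, outputSettings: settings)
    videoInput.expectsMediaDataInRealTime = false
    let adaptor = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: videoInput,
                                                       sourcePixelBufferAttributes: nil)
    writer.add(videoInput)

    guard writer.startWriting() else {
        throw writer.error ?? CocoaError(.fileWriteUnknown)
    }
    writer.startSession(atSourceTime: CMTime(value: 0, timescale: fps))

    // Append the same buffer once per frame; memory use stays flat no
    // matter how long the image is on screen.
    let frameCount = Int(seconds * Double(fps))
    for frame in 0..<frameCount {
        while !videoInput.isReadyForMoreMediaData {
            Thread.sleep(forTimeInterval: 0.01)   // crude wait, fine for a sketch
        }
        let time = CMTime(value: CMTimeValue(frame), timescale: fps)
        if !adaptor.append(pixelBuffer, withPresentationTime: time) { break }
    }
    videoInput.markAsFinished()
    writer.finishWriting { /* check writer.status / writer.error */ }
}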

Thanks,
Dave Reed



Re: making a video from still frames that don't change that often plus audio

2017-01-15 Thread Quincey Morris
On Jan 15, 2017, at 09:22 , davel...@mac.com wrote:
> 
> I have an iOS presentation app 
> (https://itunes.apple.com/app/redraw/id1114820588?mt=8) that I currently 
> make videos from by AirPlaying it to my Mac and using Screenflow on the Mac 
> to show the iPad screen and record my audio from a microphone (and then 
> edit). I'd like to build this functionality into my app directly

AVFoundation doesn’t seem to be able to capture screen video on iOS; 
AVCaptureScreenInput is documented as Mac-only. That would rule out 
AVFoundation for the basic video capture within your app. You might be able to 
capture a series of screenshots, but it has to be done in real time, and that’s 
going to be tricky to get right on iOS, where you’ll need to buffer the 
captured images to storage that might not be fast enough.

If you mean you want to write a companion Mac app, then I guess you can use 
AVCaptureScreenInput to capture the raw video, and then you could use 
AVAssetWriter to export your final, composed video. However, AVAssetWriter is 
*not* a real-time function, so you couldn’t rely on it keeping up if you tried 
to export as the user interleaves the still images with the raw video. What 
you’d need to do is add a playback/edit phase, where you played the raw video, 
captured the timing of the user edits (letting the playback skip frames if the 
edits held up the playback), then export the “composition” when the user is 
done. (Or, you could export in the background *during* editing, which would 
mean it would be done soon after the user finishes, but this may have adverse 
effects on playback on a lower-end Mac.)
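
To make the not-real-time point concrete, the export side ends up shaped 
roughly like this pull model (a sketch; the frame-supplying closure is a 
placeholder for your composed raw-video-plus-stills timeline). The writer input 
asks for data whenever it is ready, at whatever pace it can sustain, which is 
exactly why you can't count on it tracking the wall clock:

import AVFoundation

// Sketch: feed composed frames only when the writer asks for them. Assumes
// startWriting() and startSession(atSourceTime:) have already been called.
func export(frames nextFrame: @escaping () -> (CVPixelBuffer, CMTime)?,
            writer: AVAssetWriter,
            videoInput: AVAssetWriterInput,
            adaptor: AVAssetWriterInputPixelBufferAdaptor,
            queue: DispatchQueue) {
    videoInput.requestMediaDataWhenReady(on: queue) {
        while videoInput.isReadyForMoreMediaData {
            guard let (buffer, time) = nextFrame() else {
                videoInput.markAsFinished()
                writer.finishWriting { /* inspect writer.status / writer.error */ }
                return
            }
            if !adaptor.append(buffer, withPresentationTime: time) { return }
        }
    }
}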

AVCaptureScreenInput does let you choose the screen, though.

FWIW, since I’m not sure I properly understood exactly what solution you’re 
looking for.


Re: making a video from still frames that don't change that often plus audio

2017-01-15 Thread davelist
I misunderstood the 8-minute limitation - I thought it would interrupt you 
after 8 minutes of continuous recording, but when I tried it, I was able to 
record for over 8 minutes without the dialog appearing. The problem I still see 
is that it only records the main screen. I want it to record what I'm showing 
on a second screen, and I don't see anything in the ReplayKit API that would 
support that.

The relatively easy option would be to have a record mode where I only show 
what I want recorded. There are also some restrictions on what you can do with 
the ReplayKit video when you're done - you can share it but your app itself 
doesn't have access to the video. That may not be a problem for me, but I was 
thinking about including some editing functionality too.

I'll have to give it some more thought, as I would prefer the ability to make 
the video directly from the second-screen images, but the ease of ReplayKit may 
win out.
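
If ReplayKit does win out, the calls themselves look simple enough; here's a 
sketch of what I'd expect to write (the view controller is hypothetical, and 
note the finished video only comes back as a preview controller for the user to 
share, not as a file my app can touch):

import ReplayKit
import UIKit

class PresentationViewController: UIViewController, RPPreviewViewControllerDelegate {

    let recorder = RPScreenRecorder.shared()

    func startScreenRecording() {
        recorder.isMicrophoneEnabled = true
        recorder.startRecording { error in
            if let error = error {
                print("Could not start recording: \(error)")
            }
        }
    }

    func stopScreenRecording() {
        recorder.stopRecording { [weak self] previewController, error in
            guard let self = self, let previewController = previewController else { return }
            // ReplayKit hands back a share/preview UI, not the movie file itself.
            previewController.previewControllerDelegate = self
            self.present(previewController, animated: true)
        }
    }

    func previewControllerDidFinish(_ previewController: RPPreviewViewController) {
        previewController.dismiss(animated: true)
    }
}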

Thanks,
Dave Reed



> On Jan 15, 2017, at 12:29 PM, Saagar Jha wrote:
> 
> The 8 minute limitation is for the time the user’s choice is remembered for 
> the authorization prompt, is it not? Is there a limit for the actual 
> recording as well?
> 
> Saagar Jha
> 
>> On Jan 15, 2017, at 9:22 AM, davel...@mac.com wrote:
>> 
>> I have an iOS presentation app 
>> (https://itunes.apple.com/app/redraw/id1114820588?mt=8) that I currently 
>> make videos from by AirPlaying it to my Mac and using Screenflow on the Mac 
>> to show the iPad screen and record my audio from a microphone (and then 
>> edit). I'd like to build this functionality into my app directly. I see 
>> ReplayKit which would probably work (haven't checked if it can record the 
>> external screen my app makes) but the 8 minute limitation is probably a deal 
>> breaker. If that were 20 minutes, I would probably use it.
>> 
>> Basically what I want to do is to have the app record some audio (while the 
>> user is interacting with the app) and make a video that includes this audio 
>> and a series of sporadically updated images. For example, I have an image 
>> that is being shown on the screen of the app and want that image to be in 
>> the video until the user presses a button to change the image and then I 
>> want that image to be the image that is shown in the video (as the audio 
>> continues). Most of the time the same image might be shown for 10 or so 
>> seconds at a time, but occasionally the image might be updated at 30 fps.
>> 
>> So basically I want to make a video from this audio and periodically I will 
>> update the image that should be displayed and that frame should continue to 
>> be shown in the movie until I tell it to use a new image for the subsequent 
>> frames.
>> 
>> It looks like AVFoundation and specifically AVAssetWriter may be what I 
>> need? Is that the correct approach? 
>> 
>> I found this that sort of looks like what I want:
>> 
>> http://stackoverflow.com/questions/3741323/how-do-i-export-uiimage-array-as-a-movie
>> 
>> Although in my case I don't have all the images saved ahead of time (and 
>> don't want to save the images in memory while recording the audio because I 
>> would likely run out of memory). I just want to write the movie to the flash 
>> storage as the audio is recorded.
>> 
>> My app is written in Swift although I wrote Objective-C code for 6 or so 
>> years so pointers to example code written in Objective-C are fine also.
>> 
>> Thanks,
>> Dave Reed



Re: making a video from still frames that don't change that often plus audio

2017-01-15 Thread Saagar Jha
The 8 minute limitation is for the time the user’s choice is remembered for the 
authorization prompt, is it not? Is there a limit for the actual recording as 
well?

Saagar Jha

> On Jan 15, 2017, at 9:22 AM, davel...@mac.com wrote:
> 
> I have an iOS presentation app 
> (https://itunes.apple.com/app/redraw/id1114820588?mt=8) that I currently make 
> videos from by AirPlaying it to my Mac and using Screenflow on the Mac to 
> show the iPad screen and record my audio from a microphone (and then edit). 
> I'd like to build this functionality into my app directly. I see ReplayKit 
> which would probably work (haven't checked if it can record the external 
> screen my app makes) but the 8 minute limitation is probably a deal breaker. 
> If that were 20 minutes, I would probably use it.
> 
> Basically what I want to do is to have the app record some audio (while the 
> user is interacting with the app) and make a video that includes this audio 
> and a series of sporadically updated images. For example, I have an image 
> that is being shown on the screen of the app and want that image to be in the 
> video until the user presses a button to change the image and then I want 
> that image to be the image that is shown in the video (as the audio 
> continues). Most of the time the same image might be shown for 10 or so 
> seconds at a time, but occasionally the image might be updated at 30 fps.
> 
> So basically I want to make a video from this audio and periodically I will 
> update the image that should be displayed and that frame should continue to 
> be shown in the movie until I tell it to use a new image for the subsequent 
> frames.
> 
> It looks like AVFoundation and specifically AVAssetWriter may be what I need? 
> Is that the correct approach? 
> 
> I found this that sort of looks like what I want:
> 
> http://stackoverflow.com/questions/3741323/how-do-i-export-uiimage-array-as-a-movie
> 
> Although in my case I don't have all the images saved ahead of time (and 
> don't want to save the images in memory while recording the audio because I 
> would likely run out of memory). I just want to write the movie to the flash 
> storage as the audio is recorded.
> 
> My app is written in Swift although I wrote Objective-C code for 6 or so 
> years so pointers to example code written in Objective-C are fine also.
> 
> Thanks,
> Dave Reed


making a video from still frames that don't change that often plus audio

2017-01-15 Thread davelist
I have an iOS presentation app 
(https://itunes.apple.com/app/redraw/id1114820588?mt=8) that I currently make 
videos from by AirPlaying it to my Mac and using Screenflow on the Mac to show 
the iPad screen and record my audio from a microphone (and then edit). I'd like 
to build this functionality into my app directly. I see ReplayKit which would 
probably work (haven't checked if it can record the external screen my app 
makes) but the 8 minute limitation is probably a deal breaker. If that were 20 
minutes, I would probably use it.

Basically what I want to do is to have the app record some audio (while the 
user is interacting with the app) and make a video that includes this audio and 
a series of sporadically updated images. For example, I have an image that is 
being shown on the screen of the app and want that image to be in the video 
until the user presses a button to change the image and then I want that image 
to be the image that is shown in the video (as the audio continues). Most of 
the time the same image might be shown for 10 or so seconds at a time, but 
occasionally the image might be updated at 30 fps.

So basically I want to make a video from this audio and periodically I will 
update the image that should be displayed and that frame should continue to be 
shown in the movie until I tell it to use a new image for the subsequent frames.

It looks like AVFoundation and specifically AVAssetWriter may be what I need? 
Is that the correct approach? 

I found this that sort of looks like what I want:

http://stackoverflow.com/questions/3741323/how-do-i-export-uiimage-array-as-a-movie

Although in my case I don't have all the images saved ahead of time (and don't 
want to save the images in memory while recording the audio because I would 
likely run out of memory). I just want to write the movie to the flash storage 
as the audio is recorded.
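
In case it helps clarify what I mean, the audio half is roughly this shape in 
my head (just a sketch, untested): an AVCaptureSession delivers microphone 
sample buffers to a delegate, which appends them to an AVAssetWriterInput as 
they arrive, so nothing needs to pile up in memory.

import AVFoundation

class AudioWriter: NSObject, AVCaptureAudioDataOutputSampleBufferDelegate {

    let session = AVCaptureSession()
    let writer: AVAssetWriter
    let audioInput: AVAssetWriterInput
    private var startedSession = false

    // Assumes the caller also adds a video input, then calls
    // writer.startWriting() and session.startRunning().
    init(writer: AVAssetWriter) throws {
        self.writer = writer
        audioInput = AVAssetWriterInput(mediaType: .audio, outputSettings: [
            AVFormatIDKey: kAudioFormatMPEG4AAC,
            AVSampleRateKey: 44_100,
            AVNumberOfChannelsKey: 1
        ])
        audioInput.expectsMediaDataInRealTime = true
        writer.add(audioInput)
        super.init()

        let mic = AVCaptureDevice.default(for: .audio)!   // placeholder: handle nil properly
        session.addInput(try AVCaptureDeviceInput(device: mic))
        let output = AVCaptureAudioDataOutput()
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "audio.writer"))
        session.addOutput(output)
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Start the writer's timeline at the first audio buffer.
        if !startedSession {
            writer.startSession(atSourceTime: CMSampleBufferGetPresentationTimeStamp(sampleBuffer))
            startedSession = true
        }
        if audioInput.isReadyForMoreMediaData {
            _ = audioInput.append(sampleBuffer)   // returns false on failure; check writer.error
        }
    }
}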

My app is written in Swift although I wrote Objective-C code for 6 or so years 
so pointers to example code written in Objective-C are fine also.

Thanks,
Dave Reed

