I think you need to write your own Flash- or Java-applet-based chunked
uploader, or use an existing one and let us know, so that we can use
it too.
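
Something roughly like the following could sit on the App Engine side of
such a chunked uploader. It is only a sketch under assumptions: the
/upload_chunk URL, the LedgerTransaction model and the semicolon-separated
line format are made up here, not taken from your app. The applet would
slice the ledger file and POST one slice per request, so every request
stays well inside the 30-second limit and a single batch put (at most 500
entities) covers each slice.

from google.appengine.ext import db
from google.appengine.ext import webapp
from google.appengine.ext.webapp.util import run_wsgi_app


class LedgerTransaction(db.Model):
    account = db.StringProperty()
    amount = db.FloatProperty()
    memo = db.StringProperty()


class UploadChunkHandler(webapp.RequestHandler):
    def post(self):
        # The applet POSTs a few hundred ledger lines as the request body.
        entities = []
        for line in self.request.body.splitlines():
            account, amount, memo = line.split(';', 2)
            entities.append(LedgerTransaction(
                account=account, amount=float(amount), memo=memo))
        db.put(entities)  # one batch put per chunk, max 500 entities
        self.response.out.write('stored %d transactions' % len(entities))


application = webapp.WSGIApplication([('/upload_chunk', UploadChunkHandler)])

if __name__ == '__main__':
    run_wsgi_app(application)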

On Aug 12, 11:36 pm, Stakka <henrik.lindqv...@gmail.com> wrote:
> I'm working on a browser-based accounting app which has a feature to
> import ledger transactions through file uploads. I'm currently only
> running on the local dev server, but from what I've read datastore
> puts -- even batch puts -- are very slow and CPU (quota) intensive when
> deployed live.
>
> How do I overcome this problem if the user uploads a large file with
> thousands of transactions?
>
> I've seen solutions where you batch put entities in chunks of 500.
> That only works if you run a custom upload tool on your computer, not
> from a browser, since the request is limited to 30 seconds. Am I forced
> to use the Task Queue? But where do I store the raw uploaded file, or
> preferably the parsed interim transaction entities, while the task
> isn't executing?
>
> Funny that App Engine has a 10-megabyte request (file upload) size limit
> when storing 10 megabytes' worth of entities seems to be so hard.
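
For the Task Queue question quoted above, one option -- again only a
sketch with made-up names (UploadStaging, /upload, /process_upload) --
is to park the raw bytes in a datastore entity and let a task chew
through them 500 entities at a time, re-adding itself with an offset
until the file is done. Note that a single entity tops out at 1 MB, so
a full 10 MB upload would still have to be split across several staging
entities or chunked in the browser as above.

from google.appengine.api.labs import taskqueue  # google.appengine.api.taskqueue in later SDKs
from google.appengine.ext import db
from google.appengine.ext import webapp
from google.appengine.ext.webapp.util import run_wsgi_app


class UploadStaging(db.Model):
    raw_file = db.BlobProperty()  # entities are capped at 1 MB each


class LedgerTransaction(db.Model):
    account = db.StringProperty()
    amount = db.FloatProperty()
    memo = db.StringProperty()


class UploadHandler(webapp.RequestHandler):
    def post(self):
        # Store the raw upload and return immediately; parsing happens
        # in the background.
        staging = UploadStaging(raw_file=db.Blob(self.request.get('file')))
        staging.put()
        taskqueue.add(url='/process_upload',
                      params={'key': str(staging.key()), 'offset': '0'})


class ProcessUploadHandler(webapp.RequestHandler):
    def post(self):
        staging = UploadStaging.get(db.Key(self.request.get('key')))
        offset = int(self.request.get('offset'))
        lines = staging.raw_file.splitlines()
        batch = lines[offset:offset + 500]  # datastore batch put limit
        entities = []
        for line in batch:
            account, amount, memo = line.split(';', 2)
            entities.append(LedgerTransaction(
                account=account, amount=float(amount), memo=memo))
        db.put(entities)
        if offset + 500 < len(lines):
            # Re-enqueue with the next offset until the file is imported.
            taskqueue.add(url='/process_upload',
                          params={'key': self.request.get('key'),
                                  'offset': str(offset + 500)})


application = webapp.WSGIApplication([('/upload', UploadHandler),
                                      ('/process_upload', ProcessUploadHandler)])

if __name__ == '__main__':
    run_wsgi_app(application)

Each task request gets its own 30-second window, so the import just keeps
picking up where the previous batch left off.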