I'm working on a browser-based accounting app which has a feature to
import ledger transactions through file uploads. Currently it only
runs on the local dev server, but from what I've read, datastore
puts -- even batched ones -- are very slow and CPU (quota) intensive
when deployed live.

So, how do I overcome this problem when a user uploads a file with
thousands of transactions?

I've seen solutions where you batch-put entities in chunks of 500.
That only works if you run a custom upload tool on your own machine,
not from a browser, since the request is limited to 30 seconds. Am I
forced to use the Task Queue? If so, where do I store the raw uploaded
file -- or preferably the parsed interim transaction entities -- while
the task isn't executing? A sketch of what I mean follows.
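
To make the question concrete, here's roughly the shape I have in
mind -- a minimal sketch assuming the Python runtime with webapp and
the taskqueue API (labs.taskqueue on older SDKs). UploadedFile,
Transaction, and /tasks/import are made-up names, and parking the raw
file in a single BlobProperty only works while it stays under the
1 MB entity size limit:

    from google.appengine.api import taskqueue
    from google.appengine.ext import db, webapp


    class UploadedFile(db.Model):
        # Hypothetical holding entity for the raw upload.
        data = db.BlobProperty()


    class Transaction(db.Model):
        # Hypothetical ledger transaction entity.
        line = db.TextProperty()


    class UploadHandler(webapp.RequestHandler):
        def post(self):
            # Stash the raw file, then hand the heavy lifting to a
            # task so the browser request returns well under 30 s.
            raw = UploadedFile(data=db.Blob(self.request.get('file')))
            raw.put()
            taskqueue.add(url='/tasks/import',
                          params={'key': str(raw.key())})


    class ImportWorker(webapp.RequestHandler):
        def post(self):  # tasks arrive as POST requests
            raw = UploadedFile.get(self.request.get('key'))
            rows = raw.data.splitlines()
            # Batch-put in chunks of 500, the datastore's batch limit.
            for i in range(0, len(rows), 500):
                db.put([Transaction(line=row)
                        for row in rows[i:i + 500]])

Even then, a single task run might not get through every chunk before
its own deadline, which is part of why I'm asking where the interim
data should live.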

It's funny that App Engine has a 10-megabyte request (file upload)
size limit when storing 10 megabytes' worth of entities is such a
hassle.
