[google-appengine] Re: Massive datastore batch put, how to?

2009-08-21 Thread Mark Jones
That is similar to what I am seeing. Writing to the datastore is VERY expensive. 130K items for me consume nearly 6.5 hours of CPU. Not very efficient. On Aug 16, 5:36 pm, Stakka henrik.lindqv...@gmail.com wrote: I implemented a rough version of my solution, and it seems to work up to ~15k
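(Back of the envelope, using the figures quoted above: 6.5 CPU-hours is 23,400 CPU-seconds, so 23,400 / 130,000 ≈ 0.18 CPU-seconds per entity written.)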

[google-appengine] Re: Massive datastore batch put, how to?

2009-08-21 Thread Stakka
I wonder how Google thinks the app providers should bill their customers? A flat fee isn't feasible when a single user action costs so much of the app's profit. Limiting the number of actions (imports) is a possibility, but that's bad for the app's productivity. Google should remove the cost of

[google-appengine] Re: Massive datastore batch put, how to?

2009-08-16 Thread Stakka
I implemented a rough version of my solution, and it seems to work up to ~15k entities. Above that I hit the undocumented transaction write limit you mention when trying to commit 36408 entities serialized into 24 blobs of 60 bytes: java.lang.IllegalArgumentException: datastore transaction
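A minimal sketch of one workaround, assuming the low-level Java datastore API: commit the entities in small non-transactional batches instead of one giant transaction. The chunk size of 500 is an assumption, and the trade-off is that the import is no longer atomic.

    import com.google.appengine.api.datastore.DatastoreService;
    import com.google.appengine.api.datastore.DatastoreServiceFactory;
    import com.google.appengine.api.datastore.Entity;
    import java.util.List;

    public class ChunkedPut {
        // Assumed safe batch size; batch puts are capped at a few hundred
        // entities per call, and transactions to a single entity group.
        private static final int CHUNK = 500;

        public static void putInChunks(List<Entity> entities) {
            DatastoreService ds = DatastoreServiceFactory.getDatastoreService();
            for (int i = 0; i < entities.size(); i += CHUNK) {
                int end = Math.min(i + CHUNK, entities.size());
                // Each put is its own non-transactional batch, so no single
                // transaction has to absorb the whole multi-thousand-entity commit.
                ds.put(entities.subList(i, end));
            }
        }
    }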

[google-appengine] Re: Massive datastore batch put, how to?

2009-08-15 Thread Juraj Vitko
I agree with everything you said. Just one thing to consider: first storing the uploaded data, then retrieving it for reprocessing, and then storing the processed data again will consume additional resources / quota for your app. GAE really appears to be designed for apps with very high

[google-appengine] Re: Massive datastore batch put, how to?

2009-08-14 Thread Juraj Vitko
I think you need to write your own Flash or Java Applet based chunked uploader. Or use an existing one and let us know, so that we can use it too. On Aug 12, 11:36 pm, Stakka henrik.lindqv...@gmail.com wrote: I'm working on a browser-based accounting app which has a feature to import ledger
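For what it's worth, a minimal sketch of the client side, assuming a plain HTTP POST per chunk (the endpoint, the part-number query parameter, and the chunk size are all hypothetical; a Flash or applet uploader would wrap something like this):

    import java.io.*;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class ChunkedUploader {
        // Hypothetical chunk size, kept well under App Engine's request size limit.
        private static final int CHUNK_SIZE = 512 * 1024;

        public static void upload(File file, String endpoint) throws IOException {
            byte[] buf = new byte[CHUNK_SIZE];
            InputStream in = new FileInputStream(file);
            try {
                int n, part = 0;
                while ((n = in.read(buf)) > 0) {
                    // One POST per chunk; the server reassembles by part number.
                    URL url = new URL(endpoint + "?part=" + part++);
                    HttpURLConnection conn = (HttpURLConnection) url.openConnection();
                    conn.setDoOutput(true);
                    conn.setRequestMethod("POST");
                    conn.setRequestProperty("Content-Type", "application/octet-stream");
                    OutputStream out = conn.getOutputStream();
                    out.write(buf, 0, n);
                    out.close();
                    if (conn.getResponseCode() != 200) {
                        throw new IOException("Chunk " + (part - 1) + " failed");
                    }
                }
            } finally {
                in.close();
            }
        }
    }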

[google-appengine] Re: Massive datastore batch put, how to?

2009-08-14 Thread Stakka
Thanks for the tip, but why build a web app if a Java applet is required? That wouldn't be a good solution. Also, the uploaded file needs to be parsed in its entirety (CRC check, value references, etc.), and it's not XML. I think I have to parse the file server-side, populate (Java) Entity

[google-appengine] Re: Massive datastore batch put, how to?

2009-08-13 Thread Nick Johnson (Google)
Hi Stakka, My suggestion would be to do something like this: - Split the uploaded file into 'jobs'. One job per 500k might be about right; it depends on your processing overhead. In any case, the job needs to be less than 1MB. - Insert the jobs into the datastore. - Add a task queue job for each
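A minimal sketch of the first two steps, assuming the low-level datastore API and the labs-era task queue API (the "ImportJob" kind, the /worker URL, and the fixed-size byte split are assumptions; a real split would respect record boundaries, and task queue method names changed across SDK releases):

    import com.google.appengine.api.datastore.Blob;
    import com.google.appengine.api.datastore.DatastoreService;
    import com.google.appengine.api.datastore.DatastoreServiceFactory;
    import com.google.appengine.api.datastore.Entity;
    import com.google.appengine.api.datastore.Key;
    import com.google.appengine.api.labs.taskqueue.Queue;
    import com.google.appengine.api.labs.taskqueue.QueueFactory;
    import static com.google.appengine.api.labs.taskqueue.TaskOptions.Builder.url;

    public class ImportJobs {
        // ~500k per job, comfortably under the 1MB entity limit.
        private static final int JOB_SIZE = 500 * 1024;

        public static void enqueue(byte[] upload) {
            DatastoreService ds = DatastoreServiceFactory.getDatastoreService();
            Queue queue = QueueFactory.getDefaultQueue();
            for (int off = 0; off < upload.length; off += JOB_SIZE) {
                int len = Math.min(JOB_SIZE, upload.length - off);
                byte[] slice = new byte[len];
                System.arraycopy(upload, off, slice, 0, len);
                // Store the job payload as its own entity...
                Entity job = new Entity("ImportJob");
                job.setProperty("payload", new Blob(slice));
                Key key = ds.put(job);
                // ...then enqueue one task per job; /worker would load the job
                // by key, parse its slice, and batch-put the real entities.
                queue.add(url("/worker").param("job", String.valueOf(key.getId())));
            }
        }
    }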