The typical answer is to split up your operation.
You can solve this particular problem in a few ways:
 * Split the data -- if it only takes about 30 seconds to parse your
entire file, splitting it into two halves should easily get each piece
under the 30-second request limit on the server. Do note that the
datastore's performance characteristics on the server are quite
different from those in dev_appserver.
 * Use remote_api -- write your translation tool with remote_api in
mind and run it locally, pointed at your production datastore (a
rough sketch follows this list).
 * Use the bulkloader -- effectively the same thing as remote_api,
wrapped in code that handles backoff and retries for you. However, it
may be harder to fit your custom XML translation into it.
 * Run it locally, importing into dev_appserver, and then use the
bulkloader's new --dump and --restore features to move the data to
your app on App Engine.
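
If you go the remote_api route, the setup looks roughly like the
script below, using the Python SDK. This is only a sketch: the SDK
path, the app id, the LegacyRecord model and the parse_legacy_xml()
helper are placeholders you'd swap for your own code, and the
configuration call shown is the form described in the remote_api
article linked below.

#!/usr/bin/env python
# Sketch of a local importer that writes to the production datastore
# via remote_api.  Paths, the model class and parse_legacy_xml() are
# hypothetical placeholders -- adapt them to your own project.
import getpass
import sys

# Make the App Engine SDK importable (adjust to your install path).
sys.path.append('/usr/local/google_appengine')
sys.path.append('/usr/local/google_appengine/lib/yaml/lib')

from google.appengine.ext import db
from google.appengine.ext.remote_api import remote_api_stub


class LegacyRecord(db.Model):
    # Hypothetical model; mirror whatever your new schema looks like.
    name = db.StringProperty()
    payload = db.TextProperty()


def auth_func():
    return raw_input('Email: '), getpass.getpass('Password: ')


def parse_legacy_xml(path):
    """Hypothetical helper: yield dicts of field values from the XML."""
    # ... your existing translation code goes here ...
    return []


def main():
    app_id = 'your-app-id'  # replace with your application id
    host = '%s.appspot.com' % app_id
    # Exact call per the remote_api article; the signature has changed
    # in later SDK versions, so check the docs for your SDK.
    remote_api_stub.ConfigureRemoteDatastore(
        app_id, '/remote_api', auth_func, host)

    batch = []
    for fields in parse_legacy_xml('legacy_export.xml'):
        batch.append(LegacyRecord(**fields))
        if len(batch) >= 100:  # put in chunks to keep each RPC small
            db.put(batch)
            batch = []
    if batch:
        db.put(batch)


if __name__ == '__main__':
    main()

Because every datastore call goes over HTTP, batching the puts matters
a lot; the import will be slower than a purely local run, but there is
no 30-second deadline on your own machine.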

http://code.google.com/appengine/docs/python/tools/uploadingdata.html
http://code.google.com/appengine/articles/remote_api.html

--Matthew

On Sep 3, 11:43 am, dogan kaya berktas <dkberk...@gmail.com> wrote:
> I am converting a legacy project to a newly developed Java project on
> GAE. I exported the old data into an XML file and want to parse it and
> add it to GAE once. However, it takes around 30 seconds to read the
> file and create the new data structure on my local machine, which is
> why I cannot run it on GAE, even though it only needs to run once.
>
> Any suggestions for long batch operations on GAE?
>
> Thanks