Check Ikai Lan's blog post about the bulk load mapper. A MapReduce job is the solution you are looking for: the mapper framework splits the work into many short task queue requests, so no single request runs into the 30 second deadline.
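
If you do not want to pull in the mapper library for a one-off import, the Task Queue API on its own also works: kick the import off from your admin page, then let a chain of tasks do the actual datastore writes. Here is a rough sketch of the kickoff servlet. The /tasks/import-chunk URL, the "startLine" parameter and the class names are made-up placeholders, and on older SDKs the task queue classes live under com.google.appengine.api.labs.taskqueue rather than com.google.appengine.api.taskqueue:

import java.io.IOException;

import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import com.google.appengine.api.taskqueue.QueueFactory;
import com.google.appengine.api.taskqueue.TaskOptions;

public class StartImportServlet extends HttpServlet {
    @Override
    protected void doPost(HttpServletRequest req, HttpServletResponse resp)
            throws IOException {
        // Enqueue the first chunk and return right away, so the admin page
        // loads immediately instead of blocking on the whole import.
        QueueFactory.getDefaultQueue().add(TaskOptions.Builder
                .withUrl("/tasks/import-chunk")
                .param("startLine", "0"));
        resp.getWriter().println("Import started; check the task queue in the admin console.");
    }
}

A matching chunk handler is sketched below your quoted message.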

On Jun 22, 4:58 am, finder-auto_admin <gontran.mag...@gmail.com>
wrote:
> Hello all,
>
> I just tried to import data into my datastore from a big CSV file and, like
> others, I got the DeadlineExceededException because my request took more
> than 30 seconds.
>
> I know nobody would wait 30s for a page to load, but this data load is
> triggered when I click a "load data" button on an admin page of my
> application (not the official App Engine admin console, just a page I made).
>
> One solution would be to divide this task into several smaller tasks, but I
> don't really know how to do it. In my case the import runs in a servlet right
> before it includes a JSP page. Should I end my servlet and start other
> requests, each handling part of the job? How can this be done?
>
> Another solution I thought of is using a cron job or task queues. Are task
> queues subject to the 30 second limit too? Same question for cron jobs.
>
> What is the best way to do this?
>
> Many thanks,
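
To answer the task queue question in a bit more detail: task queue requests get their own deadline too, so each task should only do a bounded amount of work and then enqueue a follow-up task for the rest (the exact per-task limit depends on the runtime and SDK version). A hypothetical chunk handler for the kickoff servlet sketched above could look like this; reading the CSV from /WEB-INF/data.csv, the "CsvRow" kind and the 500-row chunk size are all just assumptions for the sketch:

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.util.ArrayList;
import java.util.List;

import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import com.google.appengine.api.datastore.DatastoreService;
import com.google.appengine.api.datastore.DatastoreServiceFactory;
import com.google.appengine.api.datastore.Entity;
import com.google.appengine.api.taskqueue.QueueFactory;
import com.google.appengine.api.taskqueue.TaskOptions;

public class ImportChunkServlet extends HttpServlet {

    // Rows handled per task; tune this so one chunk comfortably fits in a request.
    private static final int CHUNK_SIZE = 500;

    @Override
    protected void doPost(HttpServletRequest req, HttpServletResponse resp)
            throws IOException {
        int startLine = Integer.parseInt(req.getParameter("startLine"));

        // The CSV is read from a file bundled with the app here; it could just
        // as well come from the Blobstore or a datastore blob.
        BufferedReader reader = new BufferedReader(new InputStreamReader(
                getServletContext().getResourceAsStream("/WEB-INF/data.csv"), "UTF-8"));
        for (int i = 0; i < startLine; i++) {
            reader.readLine(); // skip lines already imported by earlier tasks
        }

        List<Entity> batch = new ArrayList<Entity>();
        String line;
        while (batch.size() < CHUNK_SIZE && (line = reader.readLine()) != null) {
            String[] cols = line.split(",");
            Entity row = new Entity("CsvRow");   // hypothetical entity kind
            row.setProperty("col0", cols[0]);    // map the columns you need
            batch.add(row);
        }
        reader.close();

        DatastoreService ds = DatastoreServiceFactory.getDatastoreService();
        ds.put(batch); // one batched put per chunk is cheaper than per-row puts

        // If this chunk was full there is probably more data, so chain the next
        // task. Each task runs in its own request, so the import as a whole is
        // not bound by a single request deadline.
        if (batch.size() == CHUNK_SIZE) {
            QueueFactory.getDefaultQueue().add(TaskOptions.Builder
                    .withUrl("/tasks/import-chunk")
                    .param("startLine", String.valueOf(startLine + CHUNK_SIZE)));
        }
    }
}

Re-reading and skipping lines on every task is wasteful for a really big file; passing a byte offset instead, or just using the mapper library, avoids that, but the chaining pattern stays the same either way.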
