There's an explanation of this exact problem, and a proposed solution, in the Google I/O 2010 session "Building high-throughput data pipelines on Google App Engine": http://www.google.com/events/io/2010/sessions/high-throughput-data-pipelines-appengine.html
You use the task queue as part of a data pipeline. The first task is put into the queue, say by a cron job. That task queries the datastore for a list of emails to send, limited to 10 email addresses. If the query returns 10 addresses, assume there are more than 10 emails to send, and have the task create a child task that queries the datastore for the next 10 addresses. After the child task has been created, the parent task actually sends the email to each of its 10 recipients. That way you load the task queue with email-sending tasks, which will more than likely run in parallel, improving throughput. Just make sure you set the task name property so that if for some reason one of the tasks is enqueued again you don't get a fork bomb.

On Jan 21, 7:57 pm, Mayumi Liyanage <mayumi.liyan...@coldwin.com> wrote:
> Hi, I need to iterate over all the entities in the Datastore and send
> out emails once a day, asynchronously to the actual app.
> The usual way to do this would be to invoke a Servlet using cron which
> would iterate over all the entities to send the emails out. However, our
> data is growing at a rapid rate and sooner or later we will have an
> issue with the 30 sec limit.
> What would be the best way to do this operation on App Engine
> without worrying about the 30 sec limit?
>
> Can we do the above using the Mapper API? If so, how can we invoke a
> mapper from the servlet?
>
> Thanks.

--
You received this message because you are subscribed to the Google Groups "Google App Engine" group. To post to this group, send email to google-appengine@googlegroups.com. To unsubscribe from this group, send email to google-appengine+unsubscr...@googlegroups.com. For more options, visit this group at http://groups.google.com/group/google-appengine?hl=en.
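For anyone who wants to see the chaining shape concretely, here's a minimal self-contained sketch of the pattern described above. It is a simulation, not real App Engine code: the datastore is a plain list, the task queue is a deque, and the fork-bomb guard is a set of task names. In production you'd use taskqueue.add(name=...) with a query cursor instead; all names here (send_batch_task, BATCH, etc.) are made up for illustration.

```python
from collections import deque

# Fake "datastore" of entities with email addresses.
datastore = [f"user{i}@example.com" for i in range(25)]
sent = []        # emails actually "sent"
BATCH = 10       # page size per task

queue = deque()
seen_names = set()  # stands in for task-name dedup (the fork-bomb guard)

def enqueue(name, offset):
    """Like taskqueue.add(name=...): a duplicate name is silently dropped."""
    if name in seen_names:
        return
    seen_names.add(name)
    queue.append((name, offset))

def send_batch_task(offset):
    batch = datastore[offset:offset + BATCH]
    if len(batch) == BATCH:
        # Got a full page: assume more remain, so chain the child task
        # FIRST, before doing the slow work of sending.
        enqueue(f"send-batch-{offset + BATCH}", offset + BATCH)
    for email in batch:      # then the parent does its own 10 sends
        sent.append(email)

# A cron job would seed the first task.
enqueue("send-batch-0", 0)
while queue:                 # the task-queue service draining tasks
    _, offset = queue.popleft()
    send_batch_task(offset)

print(len(sent))  # 25 addresses processed across 3 chained tasks
```

Enqueuing the child before sending matters: each task stays well under the 30-second deadline because it only ever handles one page, and the named tasks make a retried or double-enqueued task a no-op instead of a fork bomb.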