A feature we have in the pipeline is batch inserting of tasks, i.e.
inserting multiple tasks with a single API call.  This will
significantly increase the number of tasks that can be inserted in the
context of a single request.
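
Until that API lands, the batching workaround Eli describes below can be sketched like this: chunk the user list and enqueue one task per chunk, so each Task carries many addresses instead of one. This is only an illustration of the idea, not tested code — `BATCH_SIZE`, the joined-`emails` parameter, and the `add_task` callable (standing in for `Task(...).add('emailqueue')` from the snippet below) are all made-up names, and your `/emailqueue` handler would have to split the list back apart.

```python
# Sketch of Eli's batching suggestion: one task per chunk of users
# instead of one task per user. All names here are illustrative.
BATCH_SIZE = 100  # assumed tuning knob; 10/100/1000 per the thread

def chunks(seq, size):
    """Yield successive slices of `seq` of length `size`."""
    for i in range(0, len(seq), size):
        yield seq[i:i + size]

def enqueue_email_tasks(users, body, add_task):
    """Call `add_task(params)` once per batch of users.

    `add_task` stands in for creating a Task and adding it to the
    'emailqueue' queue, as in the original per-user loop. Returns the
    number of tasks enqueued.
    """
    count = 0
    for batch in chunks(users, BATCH_SIZE):
        # Pack the whole batch into a single task payload; the handler
        # is assumed to split on ',' and send one email per address.
        emails = ','.join(u.email for u in batch)
        add_task({'emails': emails, 'body': body})
        count += 1
    return count
```

With 300 users this issues 3 task inserts instead of 300, which is where the savings in the original bottleneck would come from.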

On Nov 3, 1:07 pm, Eli <eli.jo...@gmail.com> wrote:
> Modify your task to allow emails to be sent out in batches of 10, 100,
> or 1000 users at a time..
>
> This seems to be the most common approach to all bottlenecks..
> inserting to the DB.. etc.
>
> On Nov 2, 8:55 pm, Greg Tracy <greg.tr...@att.net> wrote:
>
> > I've found that creating and adding Task objects is quite expensive.
> > Consider the following...
>
> >          for u in users:
> >              task = Task(url='/emailqueue',
> >                          params={'email': u.email, 'body': body})
> >              task.add('emailqueue')
>
> > Each user in this loop adds another 34.5 "cpu usages". It is by far
> > the most expensive operation I have. Right now, I don't think it will
> > scale much higher than 300 loop iterations before I start running into
> > timeouts.
>
> > Is there a better way to do this?
>
> > Thanks.
--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups 
"Google App Engine" group.
To post to this group, send email to google-appengine@googlegroups.com
To unsubscribe from this group, send email to 
google-appengine+unsubscr...@googlegroups.com
For more options, visit this group at 
http://groups.google.com/group/google-appengine?hl=en
-~----------~----~----~----~------~----~------~--~---
