Each pageview and each task waiting on URLFetch to return eats into
your "simultaneous dynamic request" limit. So this won't work if you
have several users accessing it at once.

This was my experience, anyway. It's especially bad if you have
URLFetches timing out--each one consumes one of your allotted
processes for the entire ten seconds it spends waiting. I've got logs
full of requests terminated with 500s :-(

My advice is to be careful with URLFetch. You can control how fast
your own code executes, but you can't control how long remote servers
will take to respond, and both sorts of delay bring you closer to
500-ville.
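One way to limit the damage is to cap URLFetch's deadline so a slow remote server releases your request slot sooner. This is a sketch only, using the `deadline=` parameter of `urlfetch.fetch()` from the App Engine Python SDK; the import fallback is just so the snippet runs outside App Engine for illustration.

```python
# Sketch: fail fast on slow remote servers instead of pinning a
# dynamic-request slot for the full timeout.
try:
    from google.appengine.api import urlfetch
except ImportError:
    urlfetch = None  # not running on App Engine


def fetch_with_cap(url, seconds=5):
    """Fetch url, but give up after `seconds` instead of the default."""
    if urlfetch is None:
        return None  # outside App Engine: nothing to fetch with
    try:
        # deadline= caps how long this call may block.
        return urlfetch.fetch(url, deadline=seconds)
    except urlfetch.DownloadError:
        # Remote server too slow or unreachable; return early rather
        # than waiting out the maximum timeout.
        return None
```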

On Jan 21, 7:31 pm, tiburondude <david.jonathan.nel...@gmail.com>
wrote:
> Let's say I want to build an app where, for each http request, a task is
> placed in the task queue.  This task does 16 parallel http calls per
> run, and is run constantly, writing results to memcache and the
> datastore.  Will this work?
>
> Another use case.  Let's say an http request comes in and I do a
> urlfetch for 16 parallel http calls, then join the results and output
> to the browser.  Let's say each call takes 300 milliseconds and the
> joining/parsing of the results takes 500 milliseconds or less.  The
> total http request would take 800 milliseconds to process.  Will this
> work?
>
> Thanks,
> David
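For what it's worth, the 16-parallel-call pattern in the second use case would look something like this with the async URLFetch API (`create_rpc` / `make_fetch_call` from the App Engine Python SDK). This is a sketch under those assumptions, not tested code; the import fallback only exists so it runs outside App Engine. Note each in-flight fetch still counts against the limits above.

```python
# Sketch: start all fetches at once, then join the results, so total
# wall time is roughly the slowest single fetch rather than the sum.
try:
    from google.appengine.api import urlfetch
except ImportError:
    urlfetch = None  # not running on App Engine


def fetch_all(urls, deadline=5):
    """Issue all fetches in parallel and return their results in order."""
    if urlfetch is None:
        return [None] * len(urls)  # outside App Engine: stub results
    rpcs = []
    for url in urls:
        rpc = urlfetch.create_rpc(deadline=deadline)  # per-call cap
        urlfetch.make_fetch_call(rpc, url)            # fetch starts now
        rpcs.append(rpc)
    # get_result() blocks until each call finishes (the "join" step).
    return [rpc.get_result() for rpc in rpcs]
```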

-- 
You received this message because you are subscribed to the Google Groups 
"Google App Engine" group.