As Nick said, 200 fetches may already be too many for a single request, so
2000 will very likely time out, yes. Of course, this also depends partly on
the response times of the remote servers.
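If you do stay with larger batches, each individual fetch can at least be guarded so that one slow or dead server doesn't abort the whole run. A generic standard-library sketch (on App Engine itself you would wrap urlfetch.fetch(url) in try/except urlfetch.DownloadError instead of catching urllib errors):

```python
import urllib.request
import urllib.error

def fetch_safely(url, timeout=10):
    """Fetch one URL; return its body, or None if it fails or times out."""
    try:
        return urllib.request.urlopen(url, timeout=timeout).read()
    except (urllib.error.URLError, OSError):
        # DNS failure, connection refused, socket timeout, etc.
        return None
```

The 10-second per-fetch timeout is my own choice here, not an App Engine default.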

While the try-catch idea is certainly more elegant, you may just want to set
up multiple cron jobs, each hitting only a subset of the URLs, in order to
avoid possible timeouts.
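Splitting the work could be as simple as slicing the URL list into fixed-size batches and giving each batch its own cron-triggered handler. A minimal sketch (the batches helper and the 100-per-job size are my assumptions, not anything App Engine prescribes):

```python
def batches(urls, size):
    """Yield successive slices of `urls`, each at most `size` items long."""
    for start in range(0, len(urls), size):
        yield urls[start:start + size]

# 2000 URLs at 100 per request -> 20 cron jobs, each well under the
# 30-second request deadline mentioned below.
urls = ["http://example.com/feed/%d" % i for i in range(2000)]
jobs = list(batches(urls, 100))
```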

- Jason

2009/4/28 Roberto López <roberto.lopez.del...@gmail.com>

> If I request 2000 URLs, will I receive a timeout?
>
> 2009/4/28 Tom Wu <service.g2...@gmail.com>
>
>> 30 seconds
>>
>>
>> 2009/4/28 Nick Johnson <nick.john...@google.com>
>>
>>
>>> You may be interested in the support for asynchronous URL fetching.
>>> Pubsubhubbub has a module for it here:
>>>
>>> http://code.google.com/p/pubsubhubbub/source/browse/trunk/hub/urlfetch_async.py
>>> Examples of how to use it can be found elsewhere in the code.
>>>
>>> 200 may still be too many for a single request, I'm not sure, but it's
>>> certainly more practical than fetching them serially. :)
>>>
>>> -Nick Johnson
>>>
>>>
>>>
>>
>>
>>
>
>
> --
> Roberto López del Río
>
>
>
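For what it's worth, the win from Nick's asynchronous approach is issuing the fetches in parallel rather than one after another. Outside App Engine the same idea can be sketched with a standard thread pool (plain urllib here, not the urlfetch_async module he linked):

```python
import urllib.request
from concurrent.futures import ThreadPoolExecutor

def fetch(url, timeout=10):
    """Fetch one URL, returning (url, body), or (url, None) on failure."""
    try:
        return url, urllib.request.urlopen(url, timeout=timeout).read()
    except OSError:
        return url, None

def fetch_all(urls, workers=20):
    """Fetch many URLs concurrently: total time is roughly the slowest
    single fetch, not the sum of all fetches as with a serial loop."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return dict(pool.map(fetch, urls))
```

The worker count of 20 is an arbitrary illustration; tune it to whatever concurrency the runtime allows.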

--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups 
"Google App Engine" group.
To post to this group, send email to google-appengine@googlegroups.com
To unsubscribe from this group, send email to 
google-appengine+unsubscr...@googlegroups.com
For more options, visit this group at 
http://groups.google.com/group/google-appengine?hl=en
-~----------~----~----~----~------~----~------~--~---