>
> Case 1:
> User requests a search => you (web2py) dispatch all 5 (up to 20) 
> sub-external-searches and, *only* after all of the sub-external-searches 
> finish, send the response to the user.
>
> Case 2:
> User requests a search => you (web2py) put all 5-20 sub-external-searches 
> in a background task and immediately send the user a response like "Your 
> search is being performed" (you can use javascript to poll the server and 
> show the final result after the background task has finished).
>

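For the background-task route in Case 2, one option (my suggestion, not something the quoted post specifies) is web2py's built-in scheduler. A rough sketch, where everything other than Scheduler/queue_task (the task function, the search_result table, build_sub_search_urls) is an illustrative assumption:

# models/scheduler.py
import urllib2
from gluon.scheduler import Scheduler

def run_sub_searches(query, urls):
    """Background task: run every sub-external-search and store the combined result."""
    results = [urllib2.urlopen(u, timeout=10).read() for u in urls]
    db.search_result.insert(query=query, payload=repr(results))  # hypothetical table

scheduler = Scheduler(db)

# controllers/default.py
def start_search():
    """Queue the task and return immediately; the page then polls for the result."""
    scheduler.queue_task(run_sub_searches,
                         pvars=dict(query=request.vars.q,
                                    urls=build_sub_search_urls(request.vars.q)))  # hypothetical helper
    return response.json(dict(status='Your search is being performed'))
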
A couple other options:

   - Similar to Case 1, but make the request to web2py via Ajax, and flash 
   the "Your search is being performed" message to the user while waiting for 
   the Ajax request to complete (a controller sketch follows this list). 
   Similar user experience to Case 2, but without a background task.
   - Assuming you don't really need to process any of the results on the 
   server (i.e., to store in the db, etc.), you might consider doing the whole 
   thing from the browser in Javascript (i.e., have the browser directly fetch 
   the URLs via Ajax and assemble the results using Javascript).

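For the first option, the web2py side could be as simple as a controller action that runs the sub-searches and returns JSON. A minimal sketch, where the action name and build_sub_search_urls are made up for illustration:

# controllers/default.py
import urllib2

def run_search():
    """Called via Ajax; runs each sub-external-search and returns combined JSON."""
    urls = build_sub_search_urls(request.vars.q)  # hypothetical helper, 5-20 URLs
    results = [urllib2.urlopen(u, timeout=10).read() for u in urls]
    return response.json(dict(results=results))

On the client, a jQuery $.getJSON call to this action (web2py ships with jQuery) can display the "Your search is being performed" message until the response arrives. The loop above fetches the URLs one at a time; the links below cover fetching them in parallel.
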
Also, check out 
http://stackoverflow.com/questions/3490173/how-can-i-speed-up-fetching-pages-with-urllib2-in-python.
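
The thread-pool idea discussed there boils down to fetching with a handful of worker threads instead of sequentially; a rough sketch of that approach (Python 2-style imports, URLs are placeholders):

import urllib2
from threading import Thread
from Queue import Queue  # 'queue' on Python 3

def fetch_all(urls, num_workers=5):
    """Fetch all URLs concurrently using a small pool of worker threads."""
    q = Queue()
    results = {}

    def worker():
        while True:
            url = q.get()
            try:
                results[url] = urllib2.urlopen(url, timeout=10).read()
            except Exception:
                results[url] = None  # failed sub-search; handle as needed
            q.task_done()

    for _ in range(num_workers):
        t = Thread(target=worker)
        t.daemon = True
        t.start()

    for url in urls:
        q.put(url)
    q.join()  # block until every sub-search has completed
    return results
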
The Requests library also does async fetching using gevent: 
http://docs.python-requests.org/en/v0.10.6/user/advanced/#asynchronous-requests.
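
If I'm reading those docs correctly, the 0.10.x usage was roughly as follows (requires gevent installed; this module was later split out into the separate grequests package, and build_sub_search_urls is again a hypothetical helper):

from requests import async

urls = build_sub_search_urls(query)       # hypothetical helper, 5-20 URLs
pending = [async.get(u) for u in urls]    # build the requests without sending them
responses = async.map(pending)            # send them all concurrently via gevent
results = [r.content for r in responses]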

Anthony
