On Apr 21, 2012, at 11:02 AM, Carlos wrote:
> My environment: latest web2py trunk, ubuntu, postgresql, nginx, uwsgi.
> 
> Following are some requirements for my new potential project (where SYS is my 
> web2py system, and CLIENT and SERVER are two completely different / 
> independent non-web2py remote servers):
> 
> (for each web services call ...)
> 
> + CLIENT initiates the communication to SYS via web services to request some 
> data.
> 
> + during this same connection (in real time), SYS connects to the remote 
> SERVER to execute some SQL commands (in a MySQL database) and get the 
> requested data.
> 
> + finally SYS responds to CLIENT with the data.
> 
> In summary, SYS will act as the middle-man between CLIENT and SERVER.
> 
> Questions:
> 
> Is this doable? Should I be aware of any issues that might arise 
> (concurrency, others)?
> 
> Do you recommend other ways to accomplish this?
> 
> Do you know how my web2py server can connect remotely to a non-web2py server 
> to execute some SQL commands?
> 
> Will SYS have enough time for this process to complete on each web service 
> call?
> 
> I would just like to know your general (or specific) recommendations about 
> this scenario.
> 

It can work well, but there are some performance issues to consider. Your SYS 
application will have to set up a new connection to SERVER for each request, 
and this might or might not be expensive. Web2py itself implements this kind of 
architecture (where SERVER is the database server), but it goes to considerable 
lengths with connection pooling to reduce the overhead of repeated 
reconnections.
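As a rough sketch of the pooling idea (using sqlite3 as a stand-in for the remote MySQL server -- in web2py itself you would typically just pass `pool_size=N` to the DAL constructor and let it handle this for you):

```python
import sqlite3
import queue

class ConnectionPool:
    """Minimal connection pool: reuse open connections instead of
    paying the connect cost on every request."""
    def __init__(self, connect, size=5):
        self._connect = connect              # factory for new connections
        self._pool = queue.Queue(maxsize=size)

    def acquire(self):
        try:
            return self._pool.get_nowait()   # reuse an idle connection
        except queue.Empty:
            return self._connect()           # pool empty: open a new one

    def release(self, conn):
        try:
            self._pool.put_nowait(conn)      # keep it around for reuse
        except queue.Full:
            conn.close()                     # pool already full: discard

# Stand-in for the remote SERVER: an in-memory SQLite database.
pool = ConnectionPool(lambda: sqlite3.connect(":memory:"), size=2)

conn = pool.acquire()
result = conn.execute("SELECT 1 + 1").fetchone()[0]
pool.release(conn)
```

Whether you need this at all depends on how expensive a MySQL connection is over your particular network link; measure before optimizing.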

I've used this kind of architecture for a couple of apps, where CLIENT is a 
mobile device and SERVER is some third-party service (geolocation, video 
meta-data) that the CLIENT needs mediated access to. In both cases, the calls 
to SERVER are lightweight JSON-RPC requests, and I use caching where possible 
to reduce the need for calls to SERVER.
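The caching itself can be very simple. Here is an illustrative TTL cache around a remote lookup (`lookup_geolocation` and its return value are hypothetical stand-ins for a JSON-RPC call to SERVER; web2py's `cache.ram(key, f, time_expire)` gives you the same idea out of the box):

```python
import time

def ttl_cached(ttl_seconds):
    """Cache a function's results for ttl_seconds, keyed by its arguments."""
    def decorator(fn):
        store = {}  # args -> (expiry_timestamp, value)
        def wrapper(*args):
            now = time.time()
            hit = store.get(args)
            if hit and hit[0] > now:
                return hit[1]                    # fresh cached value
            value = fn(*args)                    # miss or expired: call SERVER
            store[args] = (now + ttl_seconds, value)
            return value
        return wrapper
    return decorator

calls = []

@ttl_cached(ttl_seconds=60)
def lookup_geolocation(ip):
    # Hypothetical stand-in for the real JSON-RPC request to SERVER.
    calls.append(ip)
    return {"ip": ip, "country": "XX"}

first = lookup_geolocation("203.0.113.7")
second = lookup_geolocation("203.0.113.7")  # within the TTL: served from cache
```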
