I'm running a web2py website with public articles, and there are occasional 
peaks in traffic. 
I use Nginx as the web server, and uWSGI to run my web2py application.

Considering the articles are public (the HTML page of an article is the 
same for every visitor), I'm already doing some caching to improve 
performance (I'm using the @cache.action decorator with the Redis cache model).
However, and please correct me if I'm wrong, for every request made to the 
URL of an article, the models still need to be executed before the cached 
response can be served.
So I thought I could improve performance even more by caching the HTML 
directly in Nginx; that way I would save resources on my server.

However, I'm having a hard time getting it to work, and I wanted to know 
whether I should modify response.headers.
I've read that they come with defaults already set:
http://web2py.com/books/default/chapter/29/04/the-core#response

To do some tests, I have this simple web2py function:

def test():
    from datetime import datetime

    return datetime.now().strftime('%H:%M:%S')



On the Nginx side, the server block configuration is this:

uwsgi_cache_path /tmp/nginx_cache/ levels=1:2 keys_zone=mycache:10m max_size=10g inactive=10m use_temp_path=off;

server {
    ...

    location / {
        # response header to check if cache is a HIT or a MISS
        add_header              X-uWSGI-Cache $upstream_cache_status;

        # server cache
        uwsgi_cache  mycache;
        uwsgi_cache_valid  15m;
        uwsgi_cache_key  $request_uri;

        # client cache
        expires 3m;

        uwsgi_pass      unix:///tmp/web2py.socket;
        include         uwsgi_params;
        uwsgi_param     UWSGI_SCHEME $scheme;
    }
}
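As a sanity check on whether anything gets written to disk at all, my understanding is that with levels=1:2 nginx names each cached file after the MD5 of the cache key ($request_uri here), nested by the hash's last characters. A small sketch (hypothetical helper) to predict the path:

```python
import hashlib

# Sketch: with "levels=1:2", nginx names each cached file after the MD5 of
# the cache key ($request_uri in my config) and nests it under the hash's
# last character, then the two characters before that -- useful to check
# whether a response was actually written under /tmp/nginx_cache/.

def nginx_cache_file(cache_dir, key):
    h = hashlib.md5(key.encode()).hexdigest()
    return '%s/%s/%s/%s' % (cache_dir, h[-1], h[-3:-1], h)

path = nginx_cache_file('/tmp/nginx_cache', '/test')
```

If no file ever shows up there, the responses are presumably being treated as uncacheable.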


But every time I hit the test page, I check the response headers and I 
always see a MISS.
In other words, Nginx still sends every request to uWSGI, and the page is 
regenerated on each one.
I've found a forum post where someone says this:

*"...it looks to me like the issue is that the upstream server is just not 
sending response that contain an expiration date (Expires:) or a cache 
validator (for instance, Last-Modified:). (The cookie expiration time has 
nothing to do with caching.)*
*The HTTP 1.1 spec 
<http://www.w3.org/Protocols/rfc2616/rfc2616-sec13.html#sec13.4> says: 'If 
there is neither a cache validator nor an explicit expiration time 
associated with a response, we do not expect it to be cached, but certain 
caches MAY violate this expectation (for example, when little or no network 
connectivity is available).'"*
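If I read that correctly, the upstream response needs an explicit expiration and/or a validator before a cache will keep it. A minimal sketch (hypothetical helper, plain Python) of what such headers look like:

```python
import time
from email.utils import formatdate

# Sketch (hypothetical helper): the two things the quoted post says a
# cacheable response needs -- an explicit expiration (Expires /
# Cache-Control: max-age) and/or a validator (Last-Modified) -- with the
# dates formatted as RFC 7231 HTTP-dates.

def cacheable_headers(max_age, last_modified_ts):
    now = time.time()
    return {
        'Cache-Control': 'public, max-age=%d' % max_age,
        'Expires': formatdate(now + max_age, usegmt=True),
        'Last-Modified': formatdate(last_modified_ts, usegmt=True),
    }

hdrs = cacheable_headers(max_age=180, last_modified_ts=time.time() - 3600)
```

Presumably my upstream responses are missing one or more of these.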


So I thought I still needed to use the @cache.action decorator (with 
cache_model=None so that it only sets the response headers that allow 
client caching):

@cache.action(time_expire=222, cache_model=None, session=False, vars=False,
              public=True)
def test():
    from datetime import datetime

    return datetime.now().strftime('%H:%M:%S')


However, I still can't get it to work.
I set time_expire=222 to check whether the "expires 3m;" directive in 
Nginx's configuration would override it, and it does: the responses have 
Cache-Control: max-age=180 (that is 3 minutes, not 222 seconds).
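That's how I'm checking which layer's value wins on the client side, with a quick parse of the Cache-Control header (hypothetical helper):

```python
import re

# Quick client-side check (hypothetical helper): pull max-age out of the
# Cache-Control header the browser receives, to see which layer's value
# actually reached the client.

def parse_max_age(cache_control):
    m = re.search(r'max-age=(\d+)', cache_control)
    return int(m.group(1)) if m else None
```

With my setup, the received header parses to 180 (nginx's 3 minutes), not web2py's 222.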

*I don't intend to get into Nginx's configuration variables, but I'm 
tempted to ask: am I missing something on the web2py side?* 
Do I need to modify response.headers in some other way to let Nginx cache 
the response from uWSGI?

