> Hi!
>
> On Tue, Apr 28, 2009 at 7:54 AM, Wilson Hernandez - MSD, S. A.
> <w...@msdrd.com> wrote:
>> Hello everybody.
>>
>> I am somewhat confused on how squid helps to save bandwidth. I know it
>> saves visited websites to cache and when someone else requests the same
>> site it will serve it from the cache. Please correct me if that is
>> wrong.
>
> Yes, but I think that it "verifies" if the page is outdated (hence the
> request).

Correct. These are seen as 3xx responses and IMS_HIT or REFRESH_HIT in the
logs. Bandwidth equal to the size of the headers is spent in order to save
the much larger amount that re-fetching the whole object would cost.
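
For illustration, the revalidation exchange with the origin server looks
roughly like this (URL and dates are made up):

  GET /logo.png HTTP/1.1
  Host: example.com
  If-Modified-Since: Mon, 27 Apr 2009 10:00:00 GMT

  HTTP/1.1 304 Not Modified
  Date: Tue, 28 Apr 2009 08:00:00 GMT

Only the headers cross the wire; the object body itself is served out of
the local cache.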

>
>>
>> Now, I've been checking my traffic before (external nic) and after
>> (inside network) squid. Every time I request a page (google.com) the
>> request is sent to the internet as well, so in this case there isn't
>> much saving done. But, if I have offline_mode on I get the old page
>> stored locally. It seems that bandwidth is only being saved when I have
>> offline_mode enabled.
>
> In offline mode, I think it no longer verifies if the pages are
> outdated.  Also, due to the "interactive" web pages these days, there
> are lots of sites which include the "no-cache" directive on the
> headers, and Squid has to honor these.
>
> You should also verify the maximum and minimum object size, in order
> to tune the cache for your particular case.
>
> Please somebody correct me if I'm wrong.

You are correct. Google.com, being one of those websites which explicitly
specify that their page must be reloaded after 0 seconds, is a particularly
bad test of caching.
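
As a rough illustration (not the exact headers any particular site sends),
a response like this marks the copy as stale immediately:

  HTTP/1.1 200 OK
  Date: Tue, 28 Apr 2009 08:00:00 GMT
  Cache-Control: private, max-age=0
  Expires: Tue, 28 Apr 2009 08:00:00 GMT

"max-age=0" (and an Expires equal to Date) means the object is stale the
moment it arrives, so every later request costs at least a revalidation,
and "private" forbids a shared cache like Squid from storing it at all.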

Use http://www.ircache.net/cgi-bin/cacheability.py to see which pages can
be cached, and why not when they cannot.
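
Regarding the object size tuning mentioned above, these are the relevant
squid.conf directives (the values below are only examples, adjust them to
your own traffic):

  # largest reply body Squid will store on disk
  maximum_object_size 4096 KB
  # smallest reply body worth storing (0 = no lower limit)
  minimum_object_size 0 KB
  # largest object kept in the memory cache
  maximum_object_size_in_memory 8 KB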

FWIW: offline mode assumes that no bandwidth can be spent updating
information. A lot of the web simply breaks; some well-designed parts keep
working for a time.
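
If you want to experiment with it, the directive is simply:

  # serve whatever is in the cache and never try to revalidate it
  offline_mode on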

Amos
