On 23/12/10 07:32, Michael Cole wrote:
Hi,

I'd like to cache files locally, but only from a small list of
domains.  The setup might be something like this:

http://wiki.squid-cache.org/ConfigExamples/Intercept/LinuxLocalhost

I haven't found a way to cache only certain requests (e.g.
http://ftp.drupal.org/*) while not caching any other traffic (web
development isn't improved by caching).

http://www.squid-cache.org/Doc/config/cache

  acl someDomains dstdomain ...
  cache deny !someDomains
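A concrete version of that pair, filling in ftp.drupal.org from the
example URLs below (the ACL name is just a placeholder), might be:

  acl someDomains dstdomain ftp.drupal.org
  cache deny !someDomains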


How does squid know when the cached version has expired?  Some requests are:
  - http://ftp.drupal.org/files/projects/views-6.x-2.12.tar.gz
    -   this file is always the same data
  - http://ftp.drupal.org/files/projects/views-7.x-3.x-dev.tar.gz
    -   this file might change daily

Would this work?  Is this how Squid should be used?  Is there a better way?

> How does squid know when the cached version has expired?

The HTTP server sends Cache-Control and Expires information about how old the object is and when it expires.

In this case it is probably your Squid contacting the FTP server via FTP protocol so things are a little different...

> Would this work?

An FTP server which supports the MDTM feature will supply Squid with the timestamp to use as Last-Modified in HTTP.

The rest (expiry) has to be estimated. This is done by refresh_pattern, the same as for a web server which only provides Last-Modified.

> Is this how Squid should be used?

You will see in the default "refresh_pattern -i ftp://.* "...

This sets the minimum, percentage-of-age and maximum times to store the objects. The default is to store an object for as long as it was already old when fetched, but no less than a day and no more than a week.
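Putting the numbers from that description into the directive's syntax
(min and max are in minutes; these values illustrate the behaviour
described above, not necessarily the exact line shipped in your
version's squid.conf):

  # min 1440 min (1 day), 100% of object age, max 10080 min (1 week)
  refresh_pattern -i ftp://.* 1440 100% 10080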

You can add your own regex patterns above the default to store the matching URLs for longer or shorter periods as desired.
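For instance, a hypothetical pair placed above the ftp default could
keep the -dev snapshots fresh while pinning the released tarballs for
a month (the regexes and times here are only illustrative; the first
matching pattern wins, so the more specific -dev line must come first):

  # -dev tarballs may change daily: cap storage at one day
  refresh_pattern -i ftp\.drupal\.org/.*-dev\.tar\.gz$ 0 20% 1440
  # released tarballs never change: keep up to 30 days
  refresh_pattern -i ftp\.drupal\.org/.*\.tar\.gz$ 10080 100% 43200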


Amos
--
Please be using
  Current Stable Squid 2.7.STABLE9 or 3.1.9
  Beta testers wanted for 3.2.0.3
