On Wed, 12 Jan 2005, Anders Nordby wrote:

Because of this I'm considering to not use Squid for caching these files
at all. The application vendor says they do not recommend using
non-random URLs for the files, which effectively prevents Squid from
caching them easily.

Which is probably why they do not recommend using non-random URLs in the first place.. they do not want proxies to cache the requests, as this would give the users a better experience visiting your site ;-)


The reason for this is most likely that they have received too many questions from uninformed webmasters running into cache problems when updating previously published URLs with new content. Another reason could be too low a request count in the statistics: the content is viewed more often than the statistics indicate, because proxies cache the requests to the joy of everyone involved except the usage statistics.

A third reason could obviously be that many use their software to publish frequently changing non-static content, and their software cannot express this properly in the HTTP headers..
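For what it's worth, frequently changing content can still be cached safely by Squid if the origin server expresses the freshness in its headers. A minimal sketch of what such software could emit (a hypothetical helper, not anything from the vendor's product): max-age=0 plus must-revalidate keeps the object cacheable but forces the proxy to revalidate it on every request, and a Last-Modified header lets that revalidation be a cheap conditional GET.

```python
from email.utils import formatdate

def cache_headers(max_age=0, must_revalidate=True):
    """Build response headers that let a proxy like Squid cache the
    object while revalidating it whenever it may have changed."""
    directives = ["max-age=%d" % max_age]
    if must_revalidate:
        directives.append("must-revalidate")
    return {
        "Cache-Control": ", ".join(directives),
        # Enables If-Modified-Since revalidation (304 Not Modified).
        "Last-Modified": formatdate(usegmt=True),
    }

print(cache_headers()["Cache-Control"])
```

With headers like these the vendor would not need to recommend random URLs at all; the proxy does the right thing on its own.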

Regards
Henrik
