Any ideas on this?  I have looked through some of the FAQs and haven't
found what I am looking for.  If it's covered somewhere in the docs/FAQ,
can someone point me to it?

Thanks,

Gary Wayne Smith

> -----Original Message-----
> From: Gary W. Smith [mailto:[EMAIL PROTECTED]
> Sent: Thursday, July 13, 2006 2:32 PM
> To: squid-users@squid-cache.org
> Subject: [squid-users] Odd caching problem
> 
> Hello,
> 
> I am using squid 2.0 that comes with RedHat EL 4.  We have set it up to
> become a transparent proxy for our network (via iptables).  It seems to
> work great for most sites but recently I have found a couple sites that
> I cannot access through the proxy server.  One of those sites is
> www.newegg.com.  We use the stock configuration file with the following
> changes:
> 
> httpd_accel_host virtual
> httpd_accel_port 80
> httpd_accel_with_proxy on
> httpd_accel_uses_host_header on
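
For reference, interception like this is normally paired with an iptables
REDIRECT rule on the gateway.  A minimal sketch, assuming the LAN-facing
interface is eth0 and Squid is listening on its default port 3128 (both
values are assumptions; adjust them to the actual setup):

    # Redirect HTTP traffic arriving from the LAN to the local Squid port.
    # eth0 and 3128 are placeholders for the real interface and http_port.
    iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 80 \
        -j REDIRECT --to-port 3128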
> 
> acl PURGE method PURGE
> acl localhost src 127.0.0.1
> http_access allow PURGE localhost
> http_access deny PURGE
> 
> acl CGI url_regex .cgi$
> acl PHP url_regex .php$
> acl ASP url_regex .asp$
> acl ASPNET url_regex .aspx$
> no_cache deny CGI
> no_cache deny PHP
> no_cache deny ASP
> no_cache deny ASPNET
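
As written, these url_regex patterns use unescaped dots and are anchored on
the end of the URL, so a request like /page.asp?id=1 would not match and
could still be cached.  A tighter sketch, assuming the intent is to skip
caching for any dynamic URL (the ACL name DYNAMIC is made up here):

    # Escape the dots and also match URLs that continue with a query string.
    acl DYNAMIC urlpath_regex -i \.(cgi|php|aspx?)(\?|$)
    no_cache deny DYNAMIC

The stock squid.conf ships a similar pair (acl QUERY urlpath_regex cgi-bin \?
plus no_cache deny QUERY) for the same purpose.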
> 
> We assumed that it had something to do with the dynamic ASP being cached,
> so we added it to the no_cache list.  But it doesn't seem to make a
> difference.  When the users go to the page we see the entry in the
> squid log file, but in the browser it just sits there.
> 
> Here is a log example:
> 
> 1152760837.202  37190 10.0.16.85 TCP_MISS/000 0 GET
> http://www.newegg.com/ - DIRECT/204.14.213.185 -
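
A status of TCP_MISS/000 with a size of 0 means Squid never got an HTTP
reply from the origin server at all, which points at a connection problem
between the proxy box and www.newegg.com rather than at caching.  One way
to narrow it down is to fetch the page from the Squid host itself, both
through the proxy and directly; a sketch, assuming Squid listens on 3128
and that curl is available (squidclient may be installed as "client"
depending on the package):

    # Fetch through the local Squid instance.
    squidclient -h 127.0.0.1 -p 3128 http://www.newegg.com/

    # Fetch directly from the Squid box, bypassing the proxy entirely.
    curl -v --max-time 30 http://www.newegg.com/

If the direct fetch from the proxy host also stalls, the problem is below
Squid (routing, MTU, or TCP settings on that box) rather than in squid.conf.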
> 
> But if I remove the transparent proxy setting from iptables and go
> direct, it works.  If I later re-enable the setting, it will continue to
> work for a little while (not sure how long, haven't timed it), but then
> it will eventually fail with the same TCP_MISS entries in the log file.
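
When it is in the failing state, it may also be worth confirming that the
interception rule is still matching traffic; the packet counters in the nat
table show whether requests are actually being redirected (this assumes the
rule lives in the PREROUTING chain, as in the sketch above):

    # List the nat PREROUTING chain with packet/byte counters.
    iptables -t nat -L PREROUTING -n -v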
> 
> Any ideas?
> 
> Gary Smith
