[squid-users] Authentication problem
Hi,

I am trying to access a page that requires a username and a password. The page is hosted on IIS.

1) If I bypass Squid completely, I get through (after the XP authentication dialog).

2) If I go through Squid, the browser prompts for the username and password three times and I then get "HTTP Error 401.1 - Unauthorized: Access is denied due to invalid credentials". Each of these three attempts generates a TCP_MISS/401 in access.log.

3) I then logged in to the proxy server and ran "lynx -auth=user:password www.domain.com". Lynx reported:

Alert!: Invalid header 'WWW-Authenticate: Negotiate'
Alert!: Invalid header 'WWW-Authenticate: NTLM'
401.2 Unauthorized: Access is denied due to server configuration.

Any ideas how to solve this?
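[For context: the WWW-Authenticate headers above show the IIS site wants NTLM/Negotiate, which authenticate a TCP connection rather than a single request, so the proxy has to keep both the client-side and server-side connections persistent for the handshake to survive. A minimal squid.conf sketch, assuming a Squid version (2.6 or later) that supports these directives; check your version's documentation before relying on it:]

```
# Sketch only -- directive names as in Squid 2.6+; verify against your
# release's squid.conf documentation.
# NTLM/Negotiate bind credentials to a TCP connection, so both sides
# must stay persistent for the multi-step handshake to work through
# the proxy.
client_persistent_connections on
server_persistent_connections on
# Pipelining can interleave other requests onto an NTLM-authenticated
# connection, so it is usually kept off.
pipeline_prefetch off
```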
[squid-users] Best non-graphical Squid log analysis tool for Linux shell?
Hi,

Which Squid log-file analysis tool would you recommend that:

1) runs from the Linux command line,
2) produces clear, human-readable plain-text reports (no HTML, no graphics),
3) can list the most-used domains (those causing the most traffic, measured in MB/day),
4) can list the most-used single objects (again measured in MB/day), and
5) can produce historical traffic statistics (MB by hour or MB by weekday)?

Recommendations for cache optimization tools and helpers would be welcome, too.

Thanks
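[While waiting for tool recommendations, items 3 and 4 can be approximated straight from Squid's native access.log with awk: in the default native format the transfer size is field 5 and the URL is field 7. A sketch using a made-up sample log file; the path and log lines are hypothetical:]

```shell
# Hypothetical sample in Squid's native access.log format
# (time, elapsed, client, code/status, bytes, method, URL, ident,
# hierarchy/peer, content-type).
cat > /tmp/access.log.sample <<'EOF'
1170000000.000    120 10.0.0.1 TCP_MISS/200 5000 GET http://example.com/a.html - DIRECT/1.2.3.4 text/html
1170000001.000     80 10.0.0.1 TCP_HIT/200 3000 GET http://example.com/b.png - NONE/- image/png
1170000002.000    200 10.0.0.2 TCP_MISS/200 7000 GET http://other.org/c.zip - DIRECT/5.6.7.8 application/zip
EOF

# Sum bytes (field 5) per host taken from the URL (field 7), biggest
# first; for per-object totals, key on $7 itself instead of the host.
awk '{ split($7, u, "/"); bytes[u[3]] += $5 }
     END { for (h in bytes) printf "%-30s %10d\n", h, bytes[h] }' \
    /tmp/access.log.sample | sort -k2 -rn
```

For real logs, swap in /var/log/squid/access.log (or wherever your access_log directive points) and divide by 1048576 for MB.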
[squid-users] Squid+radio/satellite: How to deny updating a cached object until bandwidth increases?
Hi,

I need to install Squid on a vessel that uses slow radio and satellite connections. Sometimes there is enough bandwidth, but usually it is below modem speeds.

There are a few (dozen) very large files that need to be cached in a special way: Squid must ignore all attempts to fetch newer versions of these listed files from the source unless

- the object is completely missing from the cache, OR
- there is enough bandwidth (or alternatively: the time is between xx:yy and zz:vv).

The fact that these files might have changed does not play a role; the old version must be served from the cache until the bandwidth is back again.

How do I implement this? I could use some wget + scripting magic, if needed...

Thanks.
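[For reference, the "never revalidate, serve the cached copy" part can be approximated in squid.conf; a sketch assuming Squid 2.6/2.7-era refresh_pattern options, with a made-up file-name pattern standing in for your list of large files:]

```
# Sketch only -- check your Squid release notes for these options.
# Treat the listed large files as fresh for a year (525600 minutes) and
# ignore client reloads, so Squid serves them from cache without ever
# revalidating against the origin.
refresh_pattern -i \.(iso|img|zip)$ 525600 100% 525600 override-expire override-lastmod ignore-reload

# Heavier hammer: offline_mode makes Squid prefer cached objects and
# avoid validation entirely. A bandwidth- or clock-watching script could
# toggle this line and run "squid -k reconfigure" during the xx:yy-zz:vv
# window to allow updates.
# offline_mode on
```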