The mime.conf file employs a dollar sign at the end of the regex for file extensions, e.g. \.exe$. I omitted the dollar in my own blacklist of filetypes, and the ACL still seems to deny downloads of the filetypes I don't want. How does the dollar make a difference?
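The anchoring difference can be sketched with a pair of ACLs (the ACL names and the extension are illustrative, assuming an urlpath_regex ACL of the kind used for filetype blocking):

```conf
# Unanchored: \.exe matches anywhere in the URL path, so it also
# catches names like /file.exe.txt or /report.executive.html
acl badfiles_loose urlpath_regex -i \.exe

# Anchored: \.exe$ matches only when .exe ends the URL path,
# e.g. /downloads/setup.exe
acl badfiles urlpath_regex -i \.exe$
http_access deny badfiles
```

So omitting the dollar still denies the filetypes you listed; it simply matches a superset of URLs, which is why the blacklist appears to work either way.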
... I can do direct proxying if I specify the proxy server on a machine; this works. But it would be tedious to set up all the desktops this way, not to mention the complaints if someone tampers with it, the questions about why it needs to be done, and the new machines being added...
I understand
I have a user who went to look up a winnebago (no... really...) as a reference for artwork. He kept getting "page cannot be retrieved / access denied". The strange thing is that it won't even look up winnebago as a web search, although I can type hard drive in images and web and they both
On my Win XP Home clients, I expunged all traces of DNS (disabled the DNS Client and Server services, and deleted the IP addresses from the DNS fields; there is no DNS on the internal LAN), and Firefox still happily browses the web through Squid. The DNS settings in squid.conf are at their defaults. This system seems to work, but I would be grateful for your observations and suggestions.
My Security Primer for the school classroom.
Classroom internet access is for academic research, not entertainment, and definitely not titillation.
HIERARCHY OF HARDWARE
WAN termination (campus
Web browsing clients must specify one IP address as their proxy, so I suppose this box must be available for browsing to work. If there were a second, more responsive Squid box available most (but not all) of the time, how might the two Squid caches be related so that the more
How might I write an ACL to catch all numeric-IP destination addresses, so that I may deny attempts to circumvent the URL regex filters?
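One possible sketch, assuming a dstdom_regex ACL (the ACL name is illustrative): a destination given as a raw dotted-quad address can be matched like this:

```conf
# Deny requests whose destination host is a bare IPv4 address,
# e.g. http://203.0.113.7/ typed to dodge domain-based filters
acl numeric_dst dstdom_regex ^[0-9]+\.[0-9]+\.[0-9]+\.[0-9]+$
http_access deny numeric_dst
```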
Is there a difference in function between the statements "Socket created at 0.0.0.0" and "Accepting connections at 0.0.0.0"?
It seems that the NetBIOS command net view \\munro4, issued from a Win XP box, shows up in the Squid cache log on the munro4 box as: The request OPTIONS http://munro4/ is DENIED because it matched ... This is more mysterious because the LAN protocols are NetBIOS and TCP/IP, not NetBIOS
How can I determine which squid options are compiled into a binary?
John Sutherland
Phone/Fax +61 2 4683 1511
9 Meryla Street, Couridjah NSW 2571 Australia
Is Squid socksified, i.e. able to work with a SOCKS daemon on a bastion host?
** Reply to note from Carles gine [EMAIL PROTECTED] Thu, 24 Feb 2005 23:19:30 +0100
From: Carles gine [EMAIL PROTECTED]
To: squid-users@squid-cache.org
Date: Thu, 24 Feb 2005 23:19:30 +0100
Subject: [squid-users] squid in schoool
Hello everyone.
I need to put squid in a
If I place my ACL definitions in a text file and add URLs to the file during working hours, is it sufficient just to save the file for the new URLs to be allowed, or is it necessary to do something like rotating the logs or restarting Squid?
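For what it's worth, Squid reads such files only when its configuration is loaded, so saving the file alone is not enough; a sketch (the filename is illustrative):

```conf
# squid.conf: keep the allow list in an external file
acl allowed_sites dstdomain "/etc/squid/allowed_sites.txt"
http_access allow allowed_sites

# After editing the file, tell the running Squid to re-read its
# configuration (no full restart or log rotation is needed):
#   squid -k reconfigure
```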
Interleaving the acls and http_access lines should work just fine. I'd change the dstdom_regex to dstdomain because, as it stands now, anything with .gov anywhere in the domain (where the dot can represent any character, e.g. thegovenator.com) will be allowed through. Same thing for the
ACLs don't seem to be checked when Squid serves cached content (likely in the interest of speed).
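The suggested change can be sketched as follows (ACL names illustrative):

```conf
# Regex form: the unescaped dot matches any character, so this
# also lets through e.g. thegovenator.com
acl gov_loose dstdom_regex .gov

# Domain form: matches the .gov domain and its subdomains literally
acl gov_sites dstdomain .gov
http_access allow gov_sites
```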
Many thanks, Chris, for your generous offer and your suggestions. Also to Henrik for clarifying the structure of URLs. My frustration, which I tried to conceal in my posts, but
In Squid 2.5.s8_OS2_VAC my squid.conf included this example from FAQ 10.11:
acl xxx dst 0.0.0.0/0.0.0.0
http_access deny xxx
However, web pages not previously allowed by the sequence of rules were nevertheless served from the cache, contrary to my wishes.
I understand that
It seems that Squid allows us to place our ACL definitions in a file separate from squid.conf, but I see no mention of similarly placing http_access rules in a separate file. Is this how it is?
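As far as I know, Squid 2.5 has no such facility, but later releases (3.2 onwards, if memory serves) added an include directive that can pull http_access rules in from a separate file; a sketch with illustrative paths:

```conf
# squid.conf
acl blocked dstdomain "/etc/squid/blocked_domains.txt"

# /etc/squid/access_rules.conf can hold the http_access lines
include /etc/squid/access_rules.conf
```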
Date: Sat, 5 Feb 2005 23:26:41 +0100 (CET)
From: Henrik Nordstrom [EMAIL PROTECTED]
To: Martin Joseph [EMAIL PROTECTED]
Cc: Squid Users squid-users@squid-cache.org
Subject: Re: [squid-users] ACL defaults
On Sat, 5 Feb 2005, Martin Joseph wrote:
If you have http_access lines but
Date: Sat, 5 Feb 2005 12:09:04 +0100 (CET)
From: Henrik Nordstrom [EMAIL PROTECTED]
To: [EMAIL PROTECTED]
Cc: Squid Users squid-users@squid-cache.org
Subject: Re: [squid-users] Failing to serve cached objects
On Sat, 5 Feb 2005 [EMAIL PROTECTED] wrote:
I moved my squid.conf to a
Squid.conf seems not to change much across recent versions, so these remarks probably apply to the .conf you are using. For the tag http_access, my .conf says:
NOTE on default values:
If there are no 'access' lines present, the default is to deny the request.
This implies
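Concretely, since an empty rule set means deny-everything, a minimal working configuration usually ends with an explicit allow/deny pair such as (the network below is illustrative):

```conf
acl all src 0.0.0.0/0.0.0.0       # Squid 2.5 style; later versions predefine "all"
acl localnet src 192.168.1.0/24
http_access allow localnet
# Safety net: anything not explicitly allowed above is denied
http_access deny all
```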
I moved my squid.conf to a newer build of Squid and, voila! Cached objects are now being served. Pity the ACLs seem to be broken (all URLs are accessible).
I was hoping to employ Squid in a classroom situation so that, when all the students wanted to open the same web page at the same time, only one copy of the page would be downloaded and all subsequent requests would be served from the Squid cache. However, I have not been able to persuade
I have reverted to the default squid.conf and I am getting the same behaviour as when I first installed Squid. It runs without error but fails to serve cached objects. The access log reports TCP_REFRESH_MISS/200. How can the objects be STALE half a minute after being refreshed from the origin
What is the intent of http_access deny to_localhost?
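For context, the stock squid.conf defines to_localhost along these lines; the deny stops clients from reaching services on the proxy's loopback interface via a hostile DNS name that resolves to 127.0.0.1:

```conf
acl to_localhost dst 127.0.0.0/8 0.0.0.0/32
http_access deny to_localhost
```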
O'Reilly's Definitive Guide to HTML and XHTML has no reference in the index for either Expires or Cache-Control, so why would any web page contain such information?
A web page wouldn't, but the HTTP response containing the HTML object/page may.
** Reply to note from Henrik Nordstrom [EMAIL PROTECTED] Thu, 20 Jan 2005
14:49:25 +0100 (CET)
I have edited all.js and disabled and zeroed the disk and memory caches:
pref("browser.cache.disk.enable", false);
pref("browser.cache.disk.capacity", 0);
I have searched the FAQ and Google but can't find any reference to this problem.
I got into this situation by changing the refresh_pattern, because all the cache entries were being reported as stale when I did not think they were. Now my access log shows TCP_IMS_HIT, indicating
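For reference, Squid's stock refresh_pattern lines look like the following; the three numbers are the minimum age in minutes, the percentage of the object's age since modification, and the maximum age in minutes used when deciding staleness:

```conf
refresh_pattern ^ftp:    1440  20%  10080
refresh_pattern ^gopher: 1440   0%   1440
refresh_pattern .           0  20%   4320
```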