On 10/12/10 07:16, ant2ne wrote:

AWESOME it is working mostly flawlessly!!

I notice that the whitelist file (/etc/squid3/whitelist1.sites) doesn't take
comments or duplicate/redundant entries, like .ftp.debian.org when there is
already a .debian.org. It errors out and doesn't work. But once I got past
that it seems to be working nicely. As long as you surf the white list you
aren't prompted for a password. But if you go off the white list you are!!
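For reference, a minimal whitelist setup along these lines might look like the following (acl and file names are assumptions, not taken from the thread). Note that dstdomain entries must not overlap: .debian.org already covers .ftp.debian.org, which is why the duplicate entry errors out.

  # /etc/squid3/whitelist1.sites contains one domain per line, e.g.:
  #   .debian.org
  #   .ubuntu.com
  acl whitelist dstdomain "/etc/squid3/whitelist1.sites"

  # Allow whitelisted sites without authentication; everything
  # else falls through to the auth rules further down.
  http_access allow whitelist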

Is it possible to direct browsers that fail to authenticate to a website? I
could direct them to the internal web server with instructions on how to get
valid credentials.

Yes, you can do this:

  # "bump" matches every client; it exists only to attach the deny_info page
  acl bump src all
  # require proxy authentication
  acl users proxy_auth REQUIRED
  # send denied requests to this URL instead of Squid's error page
  deny_info http://example.com/ bump
  http_access deny !users bump

It produces a 302 response instead of a 407. So if you are not careful the browser will never receive the challenge asking it to send credentials, and the user will always get the rejection page.

It's best done when you have a custom authenticator plugged in to do authentication outside of Squid, i.e. something in the landing page which triggers the browser to send real auth credentials *to the proxy*.
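For anyone unfamiliar with plugging an authenticator into Squid: a basic-auth helper is just a program that reads "username password" lines on stdin and answers "OK" or "ERR" per line on stdout (wired up via auth_param basic program). Here is a minimal sketch; the credential store and all names are illustrative only, not anything from the contract work mentioned below.

```python
#!/usr/bin/env python3
# Sketch of a Squid basic-auth helper. Squid writes one
# "username password" line per request and expects a one-line
# "OK" or "ERR" verdict in response.
import sys

# Toy in-memory credential store, for illustration only.
# A real helper would check PAM, LDAP, a database, etc.
CREDENTIALS = {"alice": "secret"}

def check(line: str) -> str:
    """Return the helper verdict for one input line."""
    parts = line.strip().split(None, 1)
    if len(parts) != 2:
        return "ERR"        # malformed input
    user, password = parts
    return "OK" if CREDENTIALS.get(user) == password else "ERR"

def main() -> None:
    # Unbuffered line-by-line loop: Squid keeps the helper
    # running and feeds it one lookup per line.
    for line in sys.stdin:
        print(check(line), flush=True)

if __name__ == "__main__":
    main()
```

It would be referenced from squid.conf with something like "auth_param basic program /usr/local/bin/my_helper.py" (path assumed), alongside the proxy_auth acl shown above.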

(Sadly I can't present the problem solution right now due to client contracts in the area. If they permit it to go public I will be adding a config example to the wiki)

Amos
--
Please be using
  Current Stable Squid 2.7.STABLE9 or 3.1.9
  Beta testers wanted for 3.2.0.3