I don't see how this rule makes sense. If you want to block all sites except blacklist sites, then the rule would be:

   pass blacklist none

The redirect line only applies to blocked sites.
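Spelled out as a full default block in squidGuard.conf, that would look roughly like this (a sketch; the redirect URL is a placeholder, not from the original messages):

   acl {
        default {
                pass blacklist none
                redirect http://yourserver.com/blocked.html
        }
   }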

More typical is something like:

   pass oksites !badsites all

This will pass sites in oksites even if they are also listed in badsites, but will block sites listed in badsites. If a site is not in oksites and not in badsites, it is passed. The rule exits on a specific match but continues if a site is not blocked. I would expect your rule to block everything, but maybe I don't understand how "pass ... none" works when "none" terminates a rule with other lists in it.
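In squidGuard.conf that rule sits inside dest and acl blocks, roughly like so (a sketch; the list names oksites/badsites, the domainlist paths, and the redirect URL are placeholders assuming the usual dbhome layout):

   dest oksites {
        domainlist oksites/domains
   }
   dest badsites {
        domainlist badsites/domains
   }
   acl {
        default {
                pass oksites !badsites all
                redirect http://yourserver.com/blocked.html
        }
   }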

Somebody correct me if I'm wrong.
Marco Simon wrote:

Thanks,

some solutions can be so easy...
The problem was the missing redirect line...

But anyway:
It seems as if I had misunderstood the following
lines:
   default {
        pass !blacklist none
        redirect http://myurl
   }

What I thought it meant:
        If no other rule matches, then pass nothing except the domains
given by blacklist.
What actually happens:
        All URLs are passed except the ones listed in blacklist/domains.

Where is the mistake in my thinking?!
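For comparison, the two readings correspond to two different pass lines (a sketch, using the blacklist from the messages above):

   # What the line actually does: pass everything NOT in blacklist,
   # then block whatever remains (i.e. block only the blacklist sites)
   pass !blacklist none

   # What was intended: pass only the blacklist sites, block all the rest
   pass blacklist none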




-----Original Message----- From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On behalf of Rick Matthews Sent: Saturday, 21 August 2004 05:29 To: Fernando Henrique Giorgetti; [EMAIL PROTECTED] Subject: RE: SquidGuard (redirect cache problem)


Try using a '302:' on the redirect in your default acl. (That's the one that goes to login.cgi, right?) So it would be:

        redirect        302:http://yourserver.com/login.cgi

Without the 302: the browser shows the redirected page (login.cgi?)
while displaying the requested address (www.squidguard.org, in your
example).  Adding the 302: will cause the browser to correctly
show the address of the login page.

Although it seems odd at first, redirecting without the 302:
is preferred for most redirects.  If a user clicks on a link and
doesn't realize that it links to a page on www.playboy.com, he'll be
surprised when he receives a block page.  Without the 302: the user
can look at the address and see where the link was trying to send
him.  That's much more informative than the address of the block
page itself.  And in the case of a block, it doesn't really matter
that the block page is being cached for www.playboy.com.

Your login.cgi is an entirely different matter.  I think adding the
302: will solve the problem for you.

Rick

Fernando Henrique Giorgetti wrote:



I have been using SquidGuard (since 2001) to control my users' access.
But that is not its only task. It is also used to log my users in
before they start browsing, and to apply an individual policy to
each user dynamically, in this way: when a user who is not yet logged
in (that is, one who does not have an src acl in squidguard.conf)
tries to access something, he is redirected to a login.cgi that
asks for a user name and password. After the user posts that
information, he is redirected again to a pre-defined website
(for example: www.squidguard.org).

The problem occurs when a user has a start page defined in his
browser (for example www.squidguard.org). When the browser is
opened, the user is redirected by squidGuard to our login.cgi.
So the user posts his user/pass, and after this the CGI logs him
in and redirects him again to www.squidguard.org. The browser then
begins an endless loop, because it has cached the start page as
login.cgi (which redirects the logged-in user to
www.squidguard.org, which to the browser is login.cgi).

I have already put the cache headers in my HTML and CGI, but I
can't eliminate this problem.
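For reference, a cache-proof redirect response from login.cgi would need to look something like this (a sketch; the Location URL is just the post-login target from the example above):

   HTTP/1.1 302 Found
   Location: http://www.squidguard.org/
   Cache-Control: no-cache, no-store, must-revalidate
   Pragma: no-cache
   Expires: 0

Under HTTP/1.1, a 302 response should not be cached unless Cache-Control or Expires explicitly allows it, which is also why Rick's 302: suggestion on the squidGuard side breaks the loop.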

Do you know what's going on?

Thanks,

Fernando.






