I think your URL list cannot contain HTML pages. A URL should end with a
directory name (for instance, somedomain.com/allowedsubdir). Any file
located in the "allowedsubdir" directory will then match.

As the squidGuard web site says:
URLlists
The urllist file format is simply URLs separated by newline, but with the
"proto://((www|web|ftp)[0-9]*)?" and "(:port)?" parts and normally also the
ending "(/|/[^/]+\.[^/]+)$" part (i.e. ending "/" or "/filename") chopped
off. (i.e. "http://www3.foo.bar.com:8080/what/ever/index.html" =>
"foo.bar.com/what/ever")
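The stripping rules quoted above can be sketched in a few lines of Python. This is only an illustration of the documented regexes, not squidGuard's actual implementation:

```python
import re

def normalize(url):
    """Strip the parts squidGuard ignores, per the documented urllist rules.
    Illustrative sketch only, not squidGuard's real matching code."""
    # Drop "proto://" and an optional "www|web|ftp" (plus digits) host prefix
    url = re.sub(r'^[a-z]+://((www|web|ftp)[0-9]*\.)?', '', url)
    # Drop an optional ":port" part
    url = re.sub(r':[0-9]+(?=/|$)', '', url)
    # Drop the ending "/" or "/filename" (a last component containing a dot)
    url = re.sub(r'(/|/[^/]+\.[^/]+)$', '', url)
    return url

print(normalize("http://www3.foo.bar.com:8080/what/ever/index.html"))
# -> foo.bar.com/what/ever
print(normalize("somedomain.com/allowedpage.html"))
# -> somedomain.com  (the filename is chopped off)
```

Note how the second example collapses to the bare domain: this is exactly why an urllist entry that is a single HTML page cannot work.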

In other words, squidGuard does not filter on HTML file names, only on
domain and directory names. In your example, the ending "/allowedpage.html"
gets chopped off, so your URL and domain lists are effectively identical.
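Concretely, using the names from Michael's example (with "allowedsubdir" as a hypothetical directory holding the pages to allow), the two list files would look like this:

allows/urls:

```
somedomain.com/allowedsubdir
```

blocks/domains:

```
somedomain.com
```

With "pass allows !blocks all", anything under somedomain.com/allowedsubdir/ passes while the rest of the domain is blocked. A single page at the domain root cannot be allowed this way, since chopping "/allowedpage.html" leaves just "somedomain.com".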


HTH,
Remi.

> -----Original Message-----
> From: Michael Wray [mailto:[EMAIL PROTECTED] 
> Sent: Monday, 28 March 2005 19:26
> To: SquidGuard List
> Subject: Blocking most of a domain, and allowing other parts.
> 
> 
> Is it possible to block a domain, then allow certain URLs 
> in that domain?
> 
> I thought I had previously done this and now it's not working.
> 
> I.E.  
> 
> I have a urllist that has:
> 
> somedomain.com/allowedpage.html
> 
> 
> and I have a domainlist that has:
> 
> somedomain.com
> dest allows {
>   urllist allows/urls
> }
> dest blocks {
>   domainlist blocks/domains
> }
> 
> 
> My acl reads
> 
> acl {
>      default {
>    pass allows !blocks all
>    redirect http://notallowed.com/page.cgi?blah=blah   
>     }
> }
> 
> 
> No matter what I do, either the whole domain is blocked or 
> the whole domain is 
> allowed.  Have even tested in a scenario where those were the 
> only things in 
> the files.  It seems to work if I want to allow the domain, but block 
> specific portions, but not the other way around.
> 
