Re: [squid-users] url blocking

2011-02-14 Thread Marcus Kool

Zartash,

can you upload the files
cache.log
ufdbguardd.log
ufdbGuard.conf

to http://upload.urlfilterdb.com ?
In case that the files are small you can send them directly to me.

Marcus


Zartash . wrote:

Thanks, I have installed ufdbGuard and defined it in Squid, but it doesn't
seem to redirect anything to ufdbGuard. The following is what I have
defined in squid.conf:


url_rewrite_program /usr/local/ufdbguard/bin/ufdbgclient
url_rewrite_children 64

Please help..



Date: Thu, 10 Feb 2011 12:36:59 -0200
From: marcus.k...@urlfilterdb.com
To: squ...@treenet.co.nz
CC: squid-users@squid-cache.org; zart...@hotmail.com
Subject: Re: [squid-users] url blocking

ufdbGuard is a URL filter for Squid that does exactly what Zartash needs.
It transforms codes like %xx to their respective characters and does
URL matching based on the normalised/translated URLs.
It also supports regular expressions, Google Safesearch enforcement and more.

Marcus


Amos Jeffries wrote:

On 10/02/11 18:25, Zartash . wrote:

So is there any way to block %?

If it actually exists in the URL (not just the browser display version) 
using '%' in the pattern will match it. Block with that ACL.



If it's encoding something then no, you can't block it directly. It's a
URL wire-level encoding byte.


You could decode the %xx code and figure out what character it is 
hiding. Match and block on that.


Or, if you don't care what character it's encoding, use the '.' regex control
to match any single byte.




Amos





RE: [squid-users] url blocking

2011-02-13 Thread Zartash .

Thanks, I have installed ufdbGuard and defined it in Squid, but it doesn't
seem to redirect anything to ufdbGuard. The following is what I have
defined in squid.conf:

url_rewrite_program /usr/local/ufdbguard/bin/ufdbgclient
url_rewrite_children 64

Please help..
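
For reference, a minimal sketch of the rewrite-helper wiring, assuming Squid
defaults elsewhere. By default every request is sent to the helper, so if a
url_rewrite_access rule exists and is too narrow, nothing ever reaches
ufdbgclient (the "allow all" line below is only illustrative):

url_rewrite_program /usr/local/ufdbguard/bin/ufdbgclient
url_rewrite_children 64
# optional; the default is to send all requests to the helper, but an
# overly restrictive rule here would stop requests reaching ufdbgclient
url_rewrite_access allow all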


 Date: Thu, 10 Feb 2011 12:36:59 -0200
 From: marcus.k...@urlfilterdb.com
 To: squ...@treenet.co.nz
 CC: squid-users@squid-cache.org; zart...@hotmail.com
 Subject: Re: [squid-users] url blocking
 
 ufdbGuard is a URL filter for Squid that does exactly what Zartash needs.
 It transforms codes like %xx to their respective characters and does
 URL matching based on the normalised/translated URLs.
 It also supports regular expressions, Google Safesearch enforcement and more.
 
 Marcus
 
 
 Amos Jeffries wrote:
  On 10/02/11 18:25, Zartash . wrote:
 
  So is there any way to block %?
 
  
  If it actually exists in the URL (not just the browser display version) 
  using '%' in the pattern will match it. Block with that ACL.
  
  
  If it's encoding something then no, you can't block it directly. It's a
  URL wire-level encoding byte.
  
  You could decode the %xx code and figure out what character it is 
  hiding. Match and block on that.
  
  Or, if you don't care what character it's encoding, use the '.' regex control
  to match any single byte.
  
  
  
  Amos
  


Re: [squid-users] url blocking

2011-02-10 Thread Amos Jeffries

On 10/02/11 18:25, Zartash . wrote:


So is there any way to block %?



If it actually exists in the URL (not just the browser display version) 
using '%' in the pattern will match it. Block with that ACL.



If it's encoding something then no, you can't block it directly. It's a
URL wire-level encoding byte.


You could decode the %xx code and figure out what character it is 
hiding. Match and block on that.


Or, if you don't care what character it's encoding, use the '.' regex control
to match any single byte.




Amos
--
Please be using
  Current Stable Squid 2.7.STABLE9 or 3.1.11
  Beta testers wanted for 3.2.0.4
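
A concrete (made-up) example of that last option, assuming, per the above,
that Squid decodes %xx before url_regex matching; the acl name and URL are
illustrative. To catch http://example.com/some%20file without caring which
byte the %20 hides, let '.' match it:

acl enc_byte url_regex example\.com/some.file
http_access deny enc_byte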


Re: [squid-users] url blocking

2011-02-10 Thread Marcus Kool

ufdbGuard is a URL filter for Squid that does exactly what Zartash needs.
It transforms codes like %xx to their respective characters and does
URL matching based on the normalised/translated URLs.
It also supports regular expressions, Google Safesearch enforcement and more.

Marcus
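
As an illustrative example of that normalisation (the URL is made up), a
request that arrives on the wire as

  http://www.example.com/%62%61%64%70%61%67%65

is decoded to

  http://www.example.com/badpage

before matching (%62='b', %61='a', %64='d', %70='p', %61='a', %67='g',
%65='e'), so a plain expression such as 'badpage' matches it even though the
literal request contains no such substring.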


Amos Jeffries wrote:

On 10/02/11 18:25, Zartash . wrote:


So is there any way to block %?



If it actually exists in the URL (not just the browser display version) 
using '%' in the pattern will match it. Block with that ACL.



If it's encoding something then no, you can't block it directly. It's a
URL wire-level encoding byte.


You could decode the %xx code and figure out what character it is 
hiding. Match and block on that.


Or, if you don't care what character it's encoding, use the '.' regex control
to match any single byte.




Amos


[squid-users] url blocking

2011-02-09 Thread Zartash .


Dear All,
We are blocking URLs using the url_regex feature (the URLs are stored in a
file), but we are unable to block URLs that contain special characters
(like complete YouTube video links, or URLs with a % sign, ?, etc.). Can
anyone let me know how we can block specific URLs (rather than blocking the
whole domain)?


Re: [squid-users] url blocking

2011-02-09 Thread Marcus Kool

You can use ufdbGuard. It is a URL filter for Squid.

ufdbGuard accepts URLs (domain and path), domains and expressions.

Marcus
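
To make that concrete, a hedged sketch of the three kinds of entries such a
filter table can hold; the format is assumed from the squidGuard-style lists
ufdbGuard works with, and the entries themselves are invented, so check the
ufdbGuard manual for the exact file layout:

  domain entry:      example.com
  URL entry:         example.com/videos/clip123
  expression entry:  watch\?v=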


Zartash . wrote:


Dear All,
We are blocking URLs using the url_regex feature (the URLs are stored in a
file), but we are unable to block URLs that contain special characters
(like complete YouTube video links, or URLs with a % sign, ?, etc.). Can
anyone let me know how we can block specific URLs (rather than blocking the
whole domain)?





Re: [squid-users] url blocking

2011-02-09 Thread Amos Jeffries
On Wed, 9 Feb 2011 13:37:26 +, Zartash . zart...@hotmail.com
wrote:
 Dear All,
 We are blocking URLs using the url_regex feature (the URLs are stored in a
 file), but we are unable to block URLs that contain special characters
 (like complete YouTube video links, or URLs with a % sign, ?, etc.). Can
 anyone let me know how we can block specific URLs (rather than blocking
 the whole domain)?

To regex-match characters that are reserved in regex, you need to escape
them with \ characters:

  url_regex http://example.com/\?hello=world

matches:  http://example.com/?hello=world


  % is slightly different: in URLs it is used to encode raw binary
characters, which are decoded back into binary form for the match. So a
pattern wanting to detect a specific one of these should match the binary
form, e.g. %20 encodes hex 0x20, i.e. character value 32 (the space character).

Amos
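
A short sketch of how such an escaped pattern plugs into a setup like
Zartash's, with the patterns kept in a file; the path and acl name below are
illustrative:

acl blocked_urls url_regex "/etc/squid/blocked_urls.txt"
http_access deny blocked_urls

where /etc/squid/blocked_urls.txt holds one escaped pattern per line, e.g.
http://example\.com/\?hello=world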



RE: [squid-users] url blocking

2011-02-09 Thread Zartash .

So is there any way to block %?


 Date: Wed, 9 Feb 2011 21:27:25 +
 From: squ...@treenet.co.nz
 To: squid-users@squid-cache.org
 Subject: Re: [squid-users] url blocking

 On Wed, 9 Feb 2011 13:37:26 +, Zartash . 
 wrote:
  Dear All,
  We are blocking URLs using the url_regex feature (the URLs are stored in
  a file), but we are unable to block URLs that contain special characters
  (like complete YouTube video links, or URLs with a % sign, ?, etc.). Can
  anyone let me know how we can block specific URLs (rather than blocking
  the whole domain)?

 To regex-match characters that are reserved in regex, you need to escape
 them with \ characters:

 url_regex http://example.com/\?hello=world

 matches: http://example.com/?hello=world


 % is slightly different: in URLs it is used to encode raw binary
 characters, which are decoded back into binary form for the match. So a
 pattern wanting to detect a specific one of these should match the binary
 form, e.g. %20 encodes hex 0x20, i.e. character value 32 (the space character).

 Amos
 


[squid-users] url blocking using url_regex not working on squid2.5

2009-09-01 Thread g f
Hello all,
I am running Squid 2.5.STABLE14 on RHEL4.
I am close to rolling out Squid 3 on Debian, but unfortunately I still
need to support the above RHEL build.
Red Hat doesn't seem to have a 2.6 RPM for RHEL4, so I cannot go to 2.6.

All is working fine, but I need to implement URL blocking.
I followed the docs and numerous posts trying to implement URL blocking,
but Squid just seems to ignore these ACLs.

Here is a snippet of my config:
acl all src 0.0.0.0/0.0.0.0
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl to_localhost dst 127.0.0.0/8
acl SSL_ports port 443 563
acl Safe_ports port 80          # http
acl Safe_ports port 21          # ftp
acl Safe_ports port 443 563     # https, snews
acl Safe_ports port 70          # gopher
acl Safe_ports port 210         # wais
acl Safe_ports port 1025-65535  # unregistered ports
acl Safe_ports port 280         # http-mgmt
acl Safe_ports port 488         # gss-http
acl Safe_ports port 591         # filemaker
acl Safe_ports port 777         # multiling http
acl CONNECT method CONNECT
http_access allow manager localhost
http_access deny manager
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports

acl our_networks src 10.150.15.0/24
http_access allow our_networks
acl our_servers src 10.150.7.0/24
http_access allow our_servers
acl msn url_regex toyota
http_access deny msn

http_access allow localhost
http_access deny all
http_reply_access allow all
icp_access allow all


Now I also tried the following:
acl msn dstdomain .toyota.com
http_access deny msn

acl msn_file url_regex /etc/squid/blocker.txt
http_access deny msn_file

I started Squid in debug mode using '/usr/sbin/squid -NCd10' and got no errors.
It seems to just ignore these ACLs.

Any ideas?
Thanks in advance.
Graham


Re: [squid-users] url blocking using url_regex not working on squid2.5

2009-09-01 Thread Chris Robertson

g f wrote:

Hello all,
I am running Squid 2.5.STABLE14 on RHEL4.
I am close to rolling out Squid 3 on Debian, but unfortunately I still
need to support the above RHEL build.
Red Hat doesn't seem to have a 2.6 RPM for RHEL4, so I cannot go to 2.6.

All is working fine, but I need to implement URL blocking.
I followed the docs and numerous posts trying to implement URL blocking,
but Squid just seems to ignore these ACLs.

Here is a snippet of my config:
acl all src 0.0.0.0/0.0.0.0
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl to_localhost dst 127.0.0.0/8
acl SSL_ports port 443 563
acl Safe_ports port 80          # http
acl Safe_ports port 21          # ftp
acl Safe_ports port 443 563     # https, snews
acl Safe_ports port 70          # gopher
acl Safe_ports port 210         # wais
acl Safe_ports port 1025-65535  # unregistered ports
acl Safe_ports port 280         # http-mgmt
acl Safe_ports port 488         # gss-http
acl Safe_ports port 591         # filemaker
acl Safe_ports port 777         # multiling http
acl CONNECT method CONNECT
http_access allow manager localhost
http_access deny manager
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports

acl our_networks src 10.150.15.0/24
http_access allow our_networks
  


With this, you allow all traffic (that hasn't already been denied) from 
10.150.15.0/24.  For clients in this IP range, no more access rules will 
be checked.  Have a look at the FAQ 
(http://wiki.squid-cache.org/SquidFaq/SquidAcl) for more.
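
A reordered sketch along those lines, keeping the acl definitions from the
original snippet and moving the deny rules ahead of the broad allows. Note
the quotes around the path: Squid reads patterns from a file only when the
name is quoted, otherwise the path itself is treated as a single regex
pattern.

acl msn_file url_regex "/etc/squid/blocker.txt"
http_access deny msn_file
http_access allow our_networks
http_access allow our_servers
http_access allow localhost
http_access deny all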



acl our_servers src 10.150.7.0/24
http_access allow our_servers
acl msn url_regex toyota
http_access deny msn

http_access allow localhost
http_access deny all
http_reply_access allow all
icp_access allow all


Now I also tried the following:
acl msn dstdomain .toyota.com
http_access deny msn

acl msn_file url_regex /etc/squid/blocker.txt
http_access deny msn_file

I started Squid in debug mode using '/usr/sbin/squid -NCd10' and got no errors.
It seems to just ignore these ACLs.

Any ideas?
Thanks in advance.
Graham
  


Chris



[squid-users] url blocking

2003-02-18 Thread Glen Hernaez Supan
Dear all,
I'm a newbie Linux user. Can anybody here help me configure my Squid to
allow only specific URLs? e.g. yahoo.com, hotmail.com.
My Squid is currently configured with no URL restrictions, meaning all
URLs are accessible to all clients with internet access.

more power

glen
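
A minimal whitelist sketch along the lines Glen describes, assuming the
stock 'all' acl from the default squid.conf; the acl name is arbitrary and
the domains are his examples:

acl allowed_sites dstdomain .yahoo.com .hotmail.com
http_access allow allowed_sites
http_access deny all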