Re: [squid-users] Invoked sites by allowed websites.

2007-12-17 Thread Cody Jarrett
So this is what I have now, and the way I see it, it says allow all  
goodsites and sites that have referers, but it still doesn't appear to  
work properly.


acl goodsites dstdom_regex "/etc/squid/allowed-sites.squid"
acl has_referer referer_regex .
http_access allow goodsites has_referer
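In Squid, multiple ACL names on a single http_access line are ANDed together, so the rule above only matches requests that are both to a goodsite and carry a Referer header. A minimal sketch of the split configuration Henrik suggests below (the allowed-referers.squid filename is an assumption, not from the thread):

```
# ANDed form: matches only requests to a goodsite that ALSO carry a Referer.
http_access allow goodsites has_referer

# Sketch: separate http_access rules are tried top-down (OR), and the
# Referer is matched against patterns for the approved sites themselves,
# so only links followed *from* approved pages are allowed through.
acl goodsites dstdom_regex "/etc/squid/allowed-sites.squid"
acl from_goodsites referer_regex "/etc/squid/allowed-referers.squid"
http_access allow goodsites
http_access allow from_goodsites
http_access deny all
```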

On Dec 14, 2007, at 2:21 PM, Henrik Nordstrom wrote:


On fre, 2007-12-14 at 11:45 -0600, Cody Jarrett wrote:

I think I almost have it. I can access sites in my allowed file. But
when I access a site that isn't, it gives me the "Error, access denied
when trying to retrieve the url: http://google.com/", but if I click
the link, http://www.google.com, it takes me to the site, which isn't
wanted. I think there is something wrong with the order of ACLs, or
maybe I need to combine them on one line.

#allow only the sites listed in the following file
acl goodsites dstdom_regex "/etc/squid/allowed-sites.squid"
acl has_referer referer_regex .
http_access allow goodsites
http_access allow has_referer


This says allow access to follow any link, no matter where that link is
or no matter where it was found.

You need to make patterns of the sites from where following links /
loading inlined content is allowed.

Regards
Henrik





Re: [squid-users] Invoked sites by allowed websites.

2007-12-14 Thread Henrik Nordstrom
On fre, 2007-12-14 at 11:45 -0600, Cody Jarrett wrote:
> I think I almost have it. I can access sites in my allowed file. But
> when I access a site that isn't, it gives me the "Error, access denied
> when trying to retrieve the url: http://google.com/", but if I click
> the link, http://www.google.com, it takes me to the site which isn't
> wanted. I think there is something wrong with the order of acl's or I
> need to combine them on one line maybe.
> 
> #allow only the sites listed in the following file
> acl goodsites dstdom_regex "/etc/squid/allowed-sites.squid"
> acl has_referer referer_regex .
> http_access allow goodsites
> http_access allow has_referer

This says allow access to follow any link, no matter where that link is
or no matter where it was found.

You need to make patterns of the sites from where following links /
loading inlined content is allowed.

Regards
Henrik
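Concretely, Henrik's suggestion amounts to a second pattern file whose regexes match the Referer header rather than the destination domain. A hedged sketch (the file name and example patterns are illustrative, not from the thread):

```
# /etc/squid/allowed-referers.squid (hypothetical) -- one regex per line,
# matched against the full Referer URL of the request, e.g.:
#   ^http://(www\.)?kbb\.com/
#   ^http://(www\.)?example\.com/

acl from_goodsites referer_regex "/etc/squid/allowed-referers.squid"
http_access allow from_goodsites
```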




Re: [squid-users] Invoked sites by allowed websites.

2007-12-14 Thread Cody Jarrett

I think I almost have it. I can access sites in my allowed file. But
when I access a site that isn't, it gives me the "Error, access denied
when trying to retrieve the url: http://google.com/", but if I click
the link, http://www.google.com, it takes me to the site, which isn't
wanted. I think there is something wrong with the order of ACLs, or
maybe I need to combine them on one line.

#allow only the sites listed in the following file
acl goodsites dstdom_regex "/etc/squid/allowed-sites.squid"
acl has_referer referer_regex .
http_access allow goodsites
http_access allow has_referer

On Dec 13, 2007, at 6:18 PM, Adrian Chadd wrote:


On Thu, Dec 13, 2007, Cody Jarrett wrote:

Do you know how I would allow access based on the referer? I'm
searching for how to do this and would like to try it out.


acl aclname referer_regex [-i] regexp ...



adrian


On Dec 12, 2007, at 6:52 PM, Adrian Chadd wrote:


On Wed, Dec 12, 2007, Cody Jarrett wrote:

I'm using squid 2.6 and have it configured to block all websites
except for a few that I specify are ok. The problem I'm having is,
several sites that are fine to access, such as kbb.com, have content
invoked from other sites. So when I view kbb.com for example, the page
is missing most of its content and looks really messed up in firefox,
and this happens with other sites. Is there some way to allow access
to approved sites, and further sites that are invoked?


There's no easy way for squid (or any proxy, really!) to properly
determine "and further sites that are invoked."

You could possibly allow access based on referrer URL as well - which
should show up as having been referred by your list of approved URLs -
but referrer URLs can't be trusted as anyone can just fake them.



Adrian


http_port 10.1.0.1:3128
http_port 127.0.0.1:3128
visible_hostname server.blah.com
hierarchy_stoplist cgi-bin ?
acl QUERY urlpath_regex cgi-bin \?
no_cache deny QUERY
cache_dir ufs /var/spool/squid 400 16 256
refresh_pattern ^ftp:           1440    20%     10080
refresh_pattern ^gopher:        1440    0%      1440
refresh_pattern .               0       20%     4320
acl all src 0.0.0.0/0.0.0.0
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl to_localhost dst 127.0.0.0/8
acl SSL_ports port 443 563
acl Safe_ports port 80  # http
acl Safe_ports port 21  # ftp
acl Safe_ports port 443 563 # https, snews
acl Safe_ports port 70  # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535  # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl CONNECT method CONNECT

#allow only the sites listed in the following file
acl goodsites dstdom_regex "/etc/squid/allowed-sites.squid"
http_access allow goodsites
http_access allow manager localhost
http_access deny manager
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access deny to_localhost

acl lan_network src 10.1.1.0/24

#deny http access to all other sites
http_access deny lan_network
http_access deny itfreedom_network
http_access allow localhost
http_access deny all
acl to_lan_network dst 10.1.45.0/24
http_access allow to_lan_network
http_reply_access allow all
icp_access allow all








Re: [squid-users] Invoked sites by allowed websites.

2007-12-13 Thread Adrian Chadd
On Thu, Dec 13, 2007, Cody Jarrett wrote:
> Do you know how I would allow access based on the referer? I'm  
> searching for how to do this and would like to try it out.

acl aclname referer_regex [-i] regexp ...



adrian
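For a single site, the syntax above could be used like this (the domain and ACL name are illustrative only, and note the thread's caveat that Referer headers are trivially forged):

```
# -i makes the regex case-insensitive; the pattern is tested against
# the Referer header sent with each request.
acl kbb_referred referer_regex -i ^http://(www\.)?kbb\.com/
http_access allow kbb_referred
```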
> 
> On Dec 12, 2007, at 6:52 PM, Adrian Chadd wrote:
> 
> >On Wed, Dec 12, 2007, Cody Jarrett wrote:
> >>I'm using squid 2.6 and have it configured to block all websites
> >>except for a few that I specify are ok. The problem I'm having is,
> >>several sites that are fine to access, such as kbb.com, have content
> >>invoked from other sites. So when I view kbb.com for example, the page
> >>is missing most of its content and looks really messed up in firefox,
> >>and this happens with other sites. Is there some way to allow access
> >>to approved sites, and further sites that are invoked?
> >
> >There's no easy way for squid (or any proxy, really!) to properly
> >determine "and further sites that are invoked."
> >
> >You could possibly allow access based on referrer URL as well - which
> >should show up as having been referred by your list of approved URLs -
> >but referrer URLs can't be trusted as anyone can just fake them.
> >
> >
> >
> >Adrian
> >
> >>http_port 10.1.0.1:3128
> >>http_port 127.0.0.1:3128
> >>visible_hostname server.blah.com
> >>hierarchy_stoplist cgi-bin ?
> >>acl QUERY urlpath_regex cgi-bin \?
> >>no_cache deny QUERY
> >>cache_dir ufs /var/spool/squid 400 16 256
> >>refresh_pattern ^ftp:       1440    20%     10080
> >>refresh_pattern ^gopher:    1440    0%      1440
> >>refresh_pattern .           0       20%     4320
> >>acl all src 0.0.0.0/0.0.0.0
> >>acl manager proto cache_object
> >>acl localhost src 127.0.0.1/255.255.255.255
> >>acl to_localhost dst 127.0.0.0/8
> >>acl SSL_ports port 443 563
> >>acl Safe_ports port 80  # http
> >>acl Safe_ports port 21  # ftp
> >>acl Safe_ports port 443 563 # https, snews
> >>acl Safe_ports port 70  # gopher
> >>acl Safe_ports port 210 # wais
> >>acl Safe_ports port 1025-65535  # unregistered ports
> >>acl Safe_ports port 280 # http-mgmt
> >>acl Safe_ports port 488 # gss-http
> >>acl Safe_ports port 591 # filemaker
> >>acl Safe_ports port 777 # multiling http
> >>acl CONNECT method CONNECT
> >>
> >>#allow only the sites listed in the following file
> >>acl goodsites dstdom_regex "/etc/squid/allowed-sites.squid"
> >>http_access allow goodsites
> >>http_access allow manager localhost
> >>http_access deny manager
> >>http_access deny !Safe_ports
> >>http_access deny CONNECT !SSL_ports
> >>http_access deny to_localhost
> >>
> >>acl lan_network src 10.1.1.0/24
> >>
> >>#deny http access to all other sites
> >>http_access deny lan_network
> >>http_access deny itfreedom_network
> >>http_access allow localhost
> >>http_access deny all
> >>acl to_lan_network dst 10.1.45.0/24
> >>http_access allow to_lan_network
> >>http_reply_access allow all
> >>icp_access allow all
> >
> 
> 

-- 
- Xenion - http://www.xenion.com.au/ - VPS Hosting - Commercial Squid Support -
- $25/pm entry-level VPSes w/ capped bandwidth charges available in WA -


Re: [squid-users] Invoked sites by allowed websites.

2007-12-13 Thread Cody Jarrett
Do you know how I would allow access based on the referer? I'm  
searching for how to do this and would like to try it out.


On Dec 12, 2007, at 6:52 PM, Adrian Chadd wrote:


On Wed, Dec 12, 2007, Cody Jarrett wrote:

I'm using squid 2.6 and have it configured to block all websites
except for a few that I specify are ok. The problem I'm having is,
several sites that are fine to access, such as kbb.com, have content
invoked from other sites. So when I view kbb.com for example, the page
is missing most of its content and looks really messed up in firefox,
and this happens with other sites. Is there some way to allow access
to approved sites, and further sites that are invoked?


There's no easy way for squid (or any proxy, really!) to properly
determine "and further sites that are invoked."

You could possibly allow access based on referrer URL as well - which
should show up as having been referred by your list of approved URLs -
but referrer URLs can't be trusted as anyone can just fake them.



Adrian


http_port 10.1.0.1:3128
http_port 127.0.0.1:3128
visible_hostname server.blah.com
hierarchy_stoplist cgi-bin ?
acl QUERY urlpath_regex cgi-bin \?
no_cache deny QUERY
cache_dir ufs /var/spool/squid 400 16 256
refresh_pattern ^ftp:           1440    20%     10080
refresh_pattern ^gopher:        1440    0%      1440
refresh_pattern .               0       20%     4320
acl all src 0.0.0.0/0.0.0.0
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl to_localhost dst 127.0.0.0/8
acl SSL_ports port 443 563
acl Safe_ports port 80  # http
acl Safe_ports port 21  # ftp
acl Safe_ports port 443 563 # https, snews
acl Safe_ports port 70  # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535  # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl CONNECT method CONNECT

#allow only the sites listed in the following file
acl goodsites dstdom_regex "/etc/squid/allowed-sites.squid"
http_access allow goodsites
http_access allow manager localhost
http_access deny manager
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access deny to_localhost

acl lan_network src 10.1.1.0/24

#deny http access to all other sites
http_access deny lan_network
http_access deny itfreedom_network
http_access allow localhost
http_access deny all
acl to_lan_network dst 10.1.45.0/24
http_access allow to_lan_network
http_reply_access allow all
icp_access allow all








Re: [squid-users] Invoked sites by allowed websites.

2007-12-12 Thread Adrian Chadd
On Wed, Dec 12, 2007, Cody Jarrett wrote:
> I'm using squid 2.6 and have it configured to block all websites  
> except for a few that I specify are ok. The problem I'm having is,  
> several sites that are fine to access, such as kbb.com, have content  
> invoked from other sites. So when I view kbb.com for example, the page  
> is missing most of its content and looks really messed up in firefox,
> and this happens with other sites. Is there some way to allow access  
> to approved sites, and further sites that are invoked?

There's no easy way for squid (or any proxy, really!) to properly
determine "and further sites that are invoked."

You could possibly allow access based on referrer URL as well - which
should show up as having been referred by your list of approved URLs -
but referrer URLs can't be trusted as anyone can just fake them.



Adrian

> http_port 10.1.0.1:3128
> http_port 127.0.0.1:3128
> visible_hostname server.blah.com
> hierarchy_stoplist cgi-bin ?
> acl QUERY urlpath_regex cgi-bin \?
> no_cache deny QUERY
> cache_dir ufs /var/spool/squid 400 16 256
> refresh_pattern ^ftp:         1440    20%     10080
> refresh_pattern ^gopher:      1440    0%      1440
> refresh_pattern .             0       20%     4320
> acl all src 0.0.0.0/0.0.0.0
> acl manager proto cache_object
> acl localhost src 127.0.0.1/255.255.255.255
> acl to_localhost dst 127.0.0.0/8
> acl SSL_ports port 443 563
> acl Safe_ports port 80# http
> acl Safe_ports port 21# ftp
> acl Safe_ports port 443 563   # https, snews
> acl Safe_ports port 70# gopher
> acl Safe_ports port 210   # wais
> acl Safe_ports port 1025-65535# unregistered ports
> acl Safe_ports port 280   # http-mgmt
> acl Safe_ports port 488   # gss-http
> acl Safe_ports port 591   # filemaker
> acl Safe_ports port 777   # multiling http
> acl CONNECT method CONNECT
> 
> #allow only the sites listed in the following file
> acl goodsites dstdom_regex "/etc/squid/allowed-sites.squid"
> http_access allow goodsites
> http_access allow manager localhost
> http_access deny manager
> http_access deny !Safe_ports
> http_access deny CONNECT !SSL_ports
> http_access deny to_localhost
> 
> acl lan_network src 10.1.1.0/24
> 
> #deny http access to all other sites
> http_access deny lan_network
> http_access deny itfreedom_network
> http_access allow localhost
> http_access deny all
> acl to_lan_network dst 10.1.45.0/24
> http_access allow to_lan_network
> http_reply_access allow all
> icp_access allow all

-- 
- Xenion - http://www.xenion.com.au/ - VPS Hosting - Commercial Squid Support -
- $25/pm entry-level VPSes w/ capped bandwidth charges available in WA -


[squid-users] Invoked sites by allowed websites.

2007-12-12 Thread Cody Jarrett
I'm using squid 2.6 and have it configured to block all websites
except for a few that I specify are ok. The problem I'm having is,
several sites that are fine to access, such as kbb.com, have content
invoked from other sites. So when I view kbb.com for example, the page
is missing most of its content and looks really messed up in firefox,
and this happens with other sites. Is there some way to allow access
to approved sites, and further sites that are invoked?


http_port 10.1.0.1:3128
http_port 127.0.0.1:3128
visible_hostname server.blah.com
hierarchy_stoplist cgi-bin ?
acl QUERY urlpath_regex cgi-bin \?
no_cache deny QUERY
cache_dir ufs /var/spool/squid 400 16 256
refresh_pattern ^ftp:           1440    20%     10080
refresh_pattern ^gopher:        1440    0%      1440
refresh_pattern .               0       20%     4320
acl all src 0.0.0.0/0.0.0.0
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl to_localhost dst 127.0.0.0/8
acl SSL_ports port 443 563
acl Safe_ports port 80  # http
acl Safe_ports port 21  # ftp
acl Safe_ports port 443 563 # https, snews
acl Safe_ports port 70  # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535  # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl CONNECT method CONNECT

#allow only the sites listed in the following file
acl goodsites dstdom_regex "/etc/squid/allowed-sites.squid"
http_access allow goodsites
http_access allow manager localhost
http_access deny manager
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access deny to_localhost

acl lan_network src 10.1.1.0/24

#deny http access to all other sites
http_access deny lan_network
http_access deny itfreedom_network
http_access allow localhost
http_access deny all
acl to_lan_network dst 10.1.45.0/24
http_access allow to_lan_network
http_reply_access allow all
icp_access allow all
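One thing worth noting about the configuration as posted: Squid evaluates http_access rules top-down and stops at the first match, so `http_access allow goodsites` appearing before the port and CONNECT denies means requests to allowed sites skip those safety checks (and `itfreedom_network` is referenced but never defined as an ACL). A sketch of the conventional ordering, not a drop-in replacement:

```
# Safety denies first, then the allow rules, then a final deny.
http_access allow manager localhost
http_access deny manager
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access deny to_localhost
http_access allow goodsites
http_access allow localhost
http_access deny all
```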