[squid-users] How to prevent illegal activity on squid?

2010-08-11 Thread Al - Image Hosting Services

Hi,

I have a squid server set up to block objectionable content from the 
web. It is publicly available, and free. All anyone needs to do to 
use it is sign up for a free account. The problem is that so far 
everyone who has signed up either wants to send spam through the 
squid proxy, go to sites that are banned in their country of 
origin, or do something else illegal. What are all the different 
things that squid can be used for that are illegal, and how do I 
prevent people from using it for that?


Sincerely,
Al


Re: [squid-users] Help with accelerated site

2010-03-25 Thread Al - Image Hosting Services

Hi,

Although you can't have apache and squid listening on port 80 on the same 
IP, you can have them both running on port 80 on the same machine. Just do 
this:


Change your apache config to:
"Listen 127.0.0.1:80"

Change your squid config to:
"cache_peer 127.0.0.1 parent 80 0 no-query originserver" 
"http_port 1.2.3.4:80 accel vhost"


Where 1.2.3.4 is, put your public IP.
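
If access is then denied, the accelerator setup usually also needs http_access 
rules for the site being served; a minimal sketch, with www.example.com standing 
in for your domain:

acl accel_site dstdomain www.example.com
http_access allow accel_site
http_access deny all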

-Al






On Thu, 25 Mar 2010, a...@gmail wrote:


Date: Thu, 25 Mar 2010 16:30:33 -
From: "a...@gmail" 
To: Ron Wheeler 
Cc: Amos Jeffries , squid-users@squid-cache.org
Subject: Re: [squid-users] Help with accelerated site

Hi All,
Thank you guys for your help.
I have tried your suggestions.
Yes Ron, I know that two programs can't both listen on the same port at the 
same time, but I thought Apache was essential for the proxy server, so thanks 
for the suggestion.
I am including bits of my config here, because now I am getting "Access 
Denied" even from the local network.
Can you guys please take a look at it and see if you can spot what's causing 
the access denied?
Note that I have tried to allow everything and removed all the "deny" 
directives, and yet it still denies any access from my local network.
That is why I get so confused with Squid; I don't understand its logic, to be 
perfectly honest. And let me remind you that this config used to work just 
fine; at least it used to allow access to the internet for all the clients on 
my local network.



#
# Other Access Controls
#
acl manager proto cache_object
acl localhost src 127.0.0.1/32
acl to_localhost dst 127.0.0.0/8 0.0.0.0/32
acl our_networks dst 192.168.1.0/32
acl our_sites dstdomain www.mysite.org
acl localnet src 10.0.0.0/8  # RFC1918 possible internal network
acl localnet src 172.16.0.0/12  # RFC1918 possible internal network
# acl localnet src 192.168.0.0/32 # RFC1918 possible internal network
acl localnet src 192.168.1.0/32  #Local Network
acl myaccelport port 80

# acl FTP proto FTP
acl SSL_ports port 443
acl Safe_ports port 80  # http
acl Safe_ports port 21  # ftp
acl Safe_ports port 443  # https
acl Safe_ports port 70  # gopher
acl Safe_ports port 210  # wais
acl Safe_ports port 1025-65535 # unregistered ports
acl Safe_ports port 280  # http-mgmt
acl Safe_ports port 488  # gss-http
acl Safe_ports port 591  # filemaker
acl Safe_ports port 777  # multiling http
acl CONNECT method CONNECT

http_access allow manager localhost
#http_access deny manager
# http_access deny !Safe_ports
http_access allow localnet
#http_access deny all
# http_access allow intranet
# http_access deny all
http_access allow our_networks

icp_access allow localnet
#icp_access deny all
htcp_access allow localnet
#htcp_access deny all
http_acceess allow CONNECT
#http_access deny all
hosts_file /etc/hosts
visible_hostname proxy

http_port  3128

hierarchy_stoplist cgi-bin ?

cache_effective_user squid
access_log /usr/local/squid/var/logs/access.log squid
cache_log /usr/local/squid/var/logs/cache.log
cache_store_log /usr/local/squid/var/logs/store.log
pid_filename /usr/local/squid/var/logs/squid.pid

refresh_pattern ^ftp:  1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern .  0 20% 4320

icp_port 3130
htcp_port 4827
# allow_underscore on

coredump_dir /usr/local/squid/var/cache


Can anyone see what's wrong with this config? If you can point it out 
to me, your help would be much appreciated.


Thanking you in advance
Regards
Adam

- Original Message - From: "Ron Wheeler" 


To: "a...@gmail" 
Cc: "Amos Jeffries" ; 
Sent: Thursday, March 25, 2010 1:58 AM
Subject: Re: [squid-users] Help with accelerated site



a...@gmail wrote:

Hello there,
Thanks for the reply Ron and Amos


Maybe my original e-mail wasn't clear and was a bit confusing; I am sorry if I 
confused you.


I have squid running on Machine A with let's say local ip 192.168.1.4
the backend server is running on machine B and ip address 192.168.1.3

Now, instead of getting the website that is located on Machine B 
(192.168.1.3), which is listening on port 81, not 80, 
I am getting the default Apache page on the proxy server machine, which is 
192.168.1.4.


And I do have the vhost in my configuration.
There are two Apaches running on the two machines, the proxy machine 
and the web-server machine, except the web-server Apache listens on port 
81. Logically (technically) speaking it should work, but for some reason 
it doesn't.

I hope what I am trying to describe here makes more sense to you now.


Very helpful.
You cannot have Apache listening on port 80 on 192.168.1.4 and Squid 
trying to do the same thing.

Only one process can have port 80.
You will very likely find a note in the squid logs saying something to 
the effect that squid cannot bind to port 80.
If you shut down Apache on 192.168.1.4 and restart Squid, your proxy will 
work (if the rest of the configuration is correct).
If you then try to start Apache on 192.168.1.4, it will certainly complain 
loudly.

Re: [squid-users] reverse proxy question

2010-03-25 Thread Al - Image Hosting Services

Hi,

I just emailed "squid-users@squid-cache.org". I would think that they 
would use majordomo to forward the email to everyone on their "list". You 
are on their "list". So, I didn't have your email address, but I do now 
(because you emailed me directly).


To unsubscribe send a message to: 
squid-users-unsubscr...@squid-cache.org.


I hope this helps!

-Al





On Thu, 25 Mar 2010, b...@billfair.com wrote:


Date: Thu, 25 Mar 2010 17:54:21 -0500
From: b...@billfair.com
To: Al - Image Hosting Services 
Subject: Re: [squid-users] reverse proxy question

Can you tell me how you got my email? I want to stop receiving information 
about squid and can't seem to get unsubscribed.

Thanks,

Bill

b...@billfair.com

Auction!...the most accurate Price Discovery Mechanism today!

for upcoming auctions see www.billfair.com

785-887-6966 Desk Phone
800-887-6929 Anytime

Bill Fair and Company, Inc.
478 N. 1950 Rd.
Lecompton, KS 66050







On Mar 22, 2010, at 2:33 PM, Al - Image Hosting Services wrote:


Hi,

I have a reverse proxy setup. It has worked well, except that now the apache server 
is getting overloaded. I would like to change my load balancing so that I send 
all the dynamic content, like php, to the apache server and all the 
static content, like .gif, .jpg, .html, to another webserver. Is there a way to 
do this, and where is it documented? Also, could someone recommend a 
lightweight server for static content?

Thanks,
Al





[squid-users] reverse proxy question

2010-03-22 Thread Al - Image Hosting Services

Hi,

I have a reverse proxy setup. It has worked well, except that now the apache 
server is getting overloaded. I would like to change my load balancing so 
that I send all the dynamic content, like php, to the apache server and all 
the static content, like .gif, .jpg, .html, to another webserver. Is there a 
way to do this, and where is it documented? Also, could someone recommend a 
lightweight server for static content?
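
Something along these lines is what I am picturing, if it is even possible (a 
rough sketch only; the second server's address and the regex are just 
placeholders):

acl static_files urlpath_regex -i \.(gif|jpe?g|png|css|js|html?)$
cache_peer 10.10.1.4 parent 80 0 no-query originserver name=dynamic
cache_peer 10.10.1.5 parent 80 0 no-query originserver name=static
cache_peer_access static allow static_files
cache_peer_access static deny all
cache_peer_access dynamic deny static_files
cache_peer_access dynamic allow all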


Thanks,
Al


Re: [squid-users] FileDescriptor Issues

2010-03-22 Thread Al - Image Hosting Services

Hi,

Did you try using ulimit?
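
The raised limit has to be in effect in the shell or init script that actually 
starts squid, and if I remember right the compiled-in ceiling matters for 3.0 
as well; roughly (the numbers are only examples):

ulimit -n 46622
# and, if squid was built with a lower compiled-in limit, rebuild with e.g.:
./configure --with-filedescriptors=46622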

Best Regards,
Al


On Mon, 22 Mar 2010, a...@gmail wrote:


Date: Mon, 22 Mar 2010 17:42:47 -
From: "a...@gmail" 
To: squid-users@squid-cache.org
Subject: [squid-users] FileDescriptor Issues

Hi All,

I have tried everything so far. I have definitely increased my file 
descriptors on my Ubuntu OS from 1024 to 46622.
But when I start Squid 3.0 STABLE25, it doesn't seem to detect the real 
descriptor limit.


I have checked /etc/sysctl.conf, and I have checked the system to make sure 
that the size is correct.
When I run "more /proc/sys/fs/file-max" I get 46622, but Squid 3.0 only 
seems to detect 1024. Is there anything that I am not doing, please?

I don't know what else to do
Thank you
Regards
Adam


Re: [squid-users] Mail client with squid

2010-03-22 Thread Al - Image Hosting Services

Hi,

You need a pop3/smtp proxy. Maybe someone else can recommend one.

Best Regards,
Al




On Mon, 22 Mar 2010, Impact Services wrote:


Date: Mon, 22 Mar 2010 23:21:45 +0530
From: Impact Services 
To: Nyamul Hassan 
Cc: Squid Users 
Subject: Re: [squid-users] Mail client with squid

Hi,

Thanks for your reply. I know squid is a proxy server. Maybe I didn't
explain the problem correctly.

I am able to manage my internet traffic through squid - filtering,
logging etc. The problem is that I want to enable the client computers
to use outlook express as well, but since internet access is routed
through squid and squid doesn't handle pop and smtp traffic, outlook
express on the client computers is unable to send or receive emails. What
is the workaround for client computers to access the internet through
squid but at the same time allow outlook express to operate as well?

Regards
Gorav

On Mon, Mar 22, 2010 at 7:46 PM, Nyamul Hassan  wrote:

Hi,
Your mail suggests that you are attempting to use email clients.  This
is a Squid mailing list, and Squid is an HTTP proxy, with some support for
FTP and HTTPS.
If you meant something else, please be more specific.
Regards
HASSAN


On Mon, Mar 22, 2010 at 7:37 PM, Impact Services
 wrote:


Hi all,

How do I enable mail clients like outlook express on client machines
while filtering internet access through squid? I went through a lot of
forums and it looks like it is possible through iptables. I tried
that but am not getting the correct settings.

Also, I want to enable one particular mac address (or, failing that, one ip
address) to access the internet without routing through squid, but all other
clients on the network to go through squid. Is that possible?

Any help would be highly appreciated.

--
Regards
Gorav







--
Regards
Gorav
Impact Services
Manapakkam, Chennai
Mob. 98401 64646


Re: [squid-users] custom url_rewrite_program

2010-03-05 Thread Al - Image Hosting Services

Hi,


Re: [squid-users] custom url_rewrite_program

Amos Jeffries
Mon, 01 Mar 2010 14:36:02 -0800

On Mon, 1 Mar 2010 14:26:21 -0600 (CST), Al - Image Hosting Services
 wrote:

> Hi,
>
> I could not get a rewriter that would do the things we wanted, so we 
> wrote our own. It uses mysql as the database for its list, which is 
> great because I can now update mysql and instantly you can see the 
> change in what is being blocked and what is not. The issue we have is 
> that if the mysql server goes down then the rewrite program tries to 
> reconnect, but if it can't it dies. Ideally, if it could not connect 
> to mysql then it would send a url of an error page, then try to 
> reconnect on the next request. My question is, how long will squid 
> wait for the rewrite program before it will kill the process and start 
> a new one?

>
> Maybe someone has a better idea on how to deal with this.
>
> Best Regards,
> Al

Squid will wait until the next shutdown or the client goes away.

My helpers that do this try 2 connection attempts with a 1 second sleep
in between, so as not to cause too much client annoyance. That may differ 
with the impatience of your clients.


Amos



We rewrote our helpers (both the url_rewrite_program and the auth program) 
so that now they will try to reconnect if they lose contact with the mysql 
server. Of course, without mysql authentication fails or an error page is 
displayed, and as soon as mysql is restarted everything works just like 
nothing ever happened. I wanted to thank everyone who helped me with this, 
and I have one last question: will squid go down if there are a lot of 
requests and the helpers are very slow to respond?
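
For reference, roughly the pattern we ended up with in the rewriter, much 
simplified (the connection details, the lookup itself and the error URL below 
are only placeholders):

#!/usr/bin/perl
use strict;
use warnings;
use DBI;

$| = 1;    # flush every answer back to squid straight away

my ($dsn, $user, $pw) = ("DBI:mysql:database=filter;host=localhost", "dbuser", "dbpass");
my $dbh;

# try to connect twice, a second apart, then give up until the next request
sub db_connect {
    for (1 .. 2) {
        my $h = DBI->connect($dsn, $user, $pw,
                             { RaiseError => 0, PrintError => 0 });
        return $h if $h;
        sleep 1;
    }
    return undef;
}

$dbh = db_connect();

while (<STDIN>) {
    chomp;
    my ($url) = split /\s+/;                 # first field squid sends is the URL
    $dbh = db_connect() if !$dbh || !$dbh->ping;
    if (!$dbh) {
        print "http://www.example.com/db-error.html\n";   # rewrite to an error page
        next;                                # and retry the database on the next request
    }
    # ... normal lookup on $url here: print a rewritten URL, or ...
    print "\n";                              # a blank line for "no change"
}

The auth helper does the same thing, except that it prints ERR instead of the 
error page URL while the database is unreachable.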


Thanks,
Al


[squid-users] custom url_rewrite_program

2010-03-01 Thread Al - Image Hosting Services

Hi,

I could not get a rewriter that would do the things we wanted, so we wrote 
our own. It uses mysql as the database for its list, which is great 
because I can now update mysql and instantly you can see the change in 
what is being blocked and what is not. The issue we have is that if the 
mysql server goes down then the rewrite program tries to reconnect, but if 
it can't it dies. Ideally, if it could not connect to mysql then it would 
send the URL of an error page, then try to reconnect on the next request. My 
question is, how long will squid wait for the rewrite program before it 
will kill the process and start a new one?


Maybe someone has a better idea on how to deal with this.

Best Regards,
Al


Re: [squid-users] (SOLVED) setting up different filtering based on port number

2010-02-23 Thread Al - Image Hosting Services

Hi,

I have a solution:

acl custom-auth proxy_auth REQUIRED
acl mysite dstdomain .zickswebventures.com
acl blocklistA dstdomain .facebook.com .youtube.com
acl blocklistB dstdomain .youtube.com
acl portA myport 8100
acl portB myport 8101
acl portC myport 8102
acl portJ myport 8109
http_access deny blocklistA portA
http_access deny blocklistB portB
url_rewrite_access allow portA
url_rewrite_access allow portB
url_rewrite_access allow portC
url_rewrite_program /bin/squidGuard -c /etc/squid/squidGuard.conf
url_rewrite_children 3
http_access allow mysite
http_access allow custom-auth all
http_access deny all

I copied the page that I have squidGuard redirect to into 
ERR_ACCESS_DENIED, but is there a way to remove the part where it says 
"Generated Tue, 23 Feb 2010 17:21:36 GMT by ..." so the page will look the 
same regardless of how it was blocked?
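
One thing I have been thinking about trying instead (untested) is sending the 
deny straight to our own page with deny_info, which as I understand it keys off 
the last acl on the matching deny line, so the acl order would have to change:

http_access deny portA blocklistA
http_access deny portB blocklistB
deny_info http://www.zickswebventures.com/blocked.html blocklistA
deny_info http://www.zickswebventures.com/blocked.html blocklistB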


Also, what is this going to do to performance on a heavily loaded server, 
and is there a smarter way to do this?


Best Regards,
Al






On Mon, 15 Feb 2010, linuxlo...@gmail.com wrote:


Date: Mon, 15 Feb 2010 01:45:09 +
From: linuxlo...@gmail.com
To: Al - Image Hosting Services ,
squid-users@squid-cache.org
Subject: Re: [squid-users] setting up different filtering based on port number

Need to know a bit more about the origins of the user requests.

Sounds like a good candidate for an external helper: a pre-screening of the 
inbound-to-proxy request to determine which proxy port - and thereby which 
ACLs - to direct it to.

Perhaps a primary proxy port 8082 which would do such decision making, with 
reverse proxy mappings to your 8080 and 8081 ports, so it would be seamless to 
the end user, and that way you have a single "master" proxy service for all 
users.


--Original Message--
From: Al - Image Hosting Services
To: squid-users@squid-cache.org
Subject: [squid-users] setting up different filtering based on port number
Sent: Feb 14, 2010 6:21 PM

Hi,

I know that this is a little bit off topic for this list, but I asked on
the squidguard list and they said that I need to run 2 instances of squid.
I know that squid can listen on 2 ports very easily, and I have setup
squid to listen on 2 different ports. Port 8080 uses squidguard to filter,
but port 8081 doesn't. What I would really like to be able to do is to
have less restrictive filtering on port 8081. For example, I would like to
block youtube on port 8080, but not on port 8081. Still I would like to be
able to block porn on port 8081. Could someone give me some assistance on
how to do this or point me to a how to?

Best Regards,
Al





Sent via BlackBerry by AT&T


Re: [squid-users] setting up different filtering based on port number

2010-02-15 Thread Al - Image Hosting Services

Hi,

On Mon, 15 Feb 2010, Amos Jeffries wrote:

On Sun, 14 Feb 2010 18:21:25 -0600 (CST), Al - Image Hosting Services
 wrote:

Hi,

I know that this is a little bit off topic for this list, but I asked on
the squidguard list and they said that I need to run 2 instances of squid.
I know that squid can listen on 2 ports very easily, and I have setup
squid to listen on 2 different ports. Port 8080 uses squidguard to filter,
but port 8081 doesn't. What I would really like to be able to do is to
have less restrictive filtering on port 8081. For example, I would like to
block youtube on port 8080, but not on port 8081. Still I would like to be
able to block porn on port 8081. Could someone give me some assistance on
how to do this or point me to a how to?

Best Regards,
Al


Use the "myport" ACL type and url_rewrite_access to prevent things
being sent to the squidguard re-writer.

http://www.squid-cache.org/Doc/config/url_rewrite_access/


I should have explained that differently, so I will give it another try.

This is what I have in my squid.conf now:

acl custom-auth proxy_auth REQUIRED
acl mysite dstdomain .zickswebventures.com
acl portA myport 8080
acl portB myport 8081
url_rewrite_access allow portA
url_rewrite_program /bin/squidGuard -c /etc/squid/squidGuard.conf
url_rewrite_children 3
http_access allow mysite
http_access allow custom-auth all
http_access deny all

It works perfectly: requests sent to portA are filtered and requests that 
are sent to portB are not, but I need to add a sort of intermediate level 
of filtering.


Solution 1: It looks like squidguard can filter based on IP. If I created 
a portC in squid.conf, would I be able to add this to my squidguard.conf:


 src portC {
 ip 0.0.0.0:8082
 }

 src portA {
 ip 0.0.0.0:8080
 }

My question is, does squid pass the port along with the IP address to 
squidguard? If it does, then is my config wrong or does squidguard just 
not know what to do with the port information?


Solution 2: Call 2 instances of squidguard with a different config. 
Although, I don't know if this is possible without knowing more about how 
squid passes information to squidguard.


Solution 3: Create a blocklist within squid of maybe 5 to 30 sites, so my 
squid.conf would look like:


acl custom-auth proxy_auth REQUIRED
acl mysite dstdomain .zickswebventures.com
acl block dstdomain .facebook.com .twitter.com
acl portA myport 8080
acl portB myport 8081
acl portB myport 8082

url_rewrite_access allow portA portB
url_rewrite_program /bin/squidGuard -c /etc/squid/squidGuard.conf
url_rewrite_children 3
http_access allow mysite
http_access allow custom-auth all
http_access deny all

Of course, the blank line is where I would need to tell squid to redirect 
to zickswebventures.com/blocked.html if it sees one of the urls 
being blocked, but only on portA. Could this be done?
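
What I am picturing for that blank line is something like this (untested), with 
the block acl last on the deny line so that deny_info picks it up:

http_access deny portA block
deny_info http://www.zickswebventures.com/blocked.html block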


Best Regards,
Al




[squid-users] setting up different filtering based on port number

2010-02-14 Thread Al - Image Hosting Services

Hi,

I know that this is a little bit off topic for this list, but I asked on 
the squidguard list and they said that I need to run 2 instances of squid. 
I know that squid can listen on 2 ports very easily, and I have setup 
squid to listen on 2 different ports. Port 8080 uses squidguard to filter, 
but port 8081 doesn't. What I would really like to be able to do is to 
have less restrictive filtering on port 8081. For example, I would like to 
block youtube on port 8080, but not on port 8081. Still I would like to be 
able to block porn on port 8081. Could someone give me some assistance on 
how to do this or point me to a how to?


Best Regards,
Al





Re: [squid-users] https in transparent mode (fwd)

2009-10-15 Thread Al - Image Hosting Services

Hi,

On Wed, 14 Oct 2009, Amos Jeffries wrote:


WPAD/PAC will do that.

http://wiki.squid-cache.org/SquidFaq/ConfiguringBrowsers#Fully_Automatically_Configuring_Browsers_for_WPAD



YES! This is just what I am looking for. Thankyou!!! Thankyou!!! 
Thankyou!!!




Re: [squid-users] https in transparent mode (fwd)

2009-10-13 Thread Al - Image Hosting Services

Hi,

On Wed, 14 Oct 2009, Amos Jeffries wrote:


Date: Wed, 14 Oct 2009 13:22:48 +1300
From: Amos Jeffries 
To: Andres Salazar 
Cc: squid 
Subject: Re: [squid-users] https in transparent mode

On Tue, 13 Oct 2009 18:28:15 -0500, Andres Salazar 
wrote:

Hello,

I've been searching for ways to push https through the transparent
mode of squid. This is because I'd like to use squid's ACLs, not so much
the caching, which obviously doesn't work with this protocol.

Are there ways I can proxy https? I've heard somebody mention that it
is possible by specifying that it should go with a CONNECT method...
I tried searching the FAQ for an example of this but I wasn't
successful...

Please advise.

Andres


Squid will not do what you want.

HTTPS was created and designed explicitly to prevent traffic interception
security attacks (aka transparent mode proxies).

The CONNECT method is a plain HTTP wrapper, only used when the browser knows
it is talking to a proxy.


But is there a universal way to make all browsers on an end user's system (like 
Windows or Mac) use the CONNECT method? It seems like there should be a way to 
force this behavior. The only thing that I found is to have a sort of script 
that modifies each browser's config file, which means that I have to write one 
for each browser out there.


Best Regards,
Al


[squid-users] problems

2009-10-02 Thread Al - Image Hosting Services

Hi,

I seem to have created a lot of problems for myself. We are using squid 
with custom-written software to filter web content. Because the server is 
in one location and my users are in other locations, and because of the 
large number of hours spent helping people set up their computers to use 
the proxy, I had software written to push everything on ports 80, 443, and 
21 to the squid servers and to prevent people from changing the settings. 
This is where I ran into problems. Both https and ftp are filtered fine 
when configured in the browser, but don't work when just pushed to the 
proxy through the software. Since the software runs on the end users' 
computers, it seems like I should be able to make ftp and https work. Does 
anyone have any suggestions on how to do this?


Best Regards,
Al




[squid-users] transparent proxy with https

2009-09-14 Thread Al - Image Hosting Services

Hi,

I ran into basically the same issue with https. If https requests are just 
rerouted to squid then it doesn't work. It looks like the browser sends 
the request encrypted when just routed to the proxy, and that it sends 
the request in plain text when you have the browser configured to use 
the proxy. Can someone confirm this? And if this is the case, is there a 
way to use a transparent proxy with https?


Best Regards,
Al


Re: [squid-users] port 21 to squid

2009-09-13 Thread Al - Image Hosting Services

Hi,

On Sat, 12 Sep 2009, Jakob Curdes wrote:


Date: Sat, 12 Sep 2009 19:06:04 +0200
From: Jakob Curdes 
To: Al - Image Hosting Services 
Cc: squid-users@squid-cache.org
Subject: Re: [squid-users] port 21 to squid

Al - Image Hosting Services schrieb:

Hi,

I have routing set up to push everything from port 21 to the squid proxy 
port. This doesn't seem to work even with IE using ftp on Windows. However 
on linux FireFox works fine when I configure it to use the proxy with ftp 
connections. It also seems to work fine with wget, although using gFTP 
seems to have some issues. I was hoping to use squid to block some ftp 
sites. Is there any way to do this? 
Only if you block port 21 and tell the browsers to use squid as an FTP proxy. 
squid does FTP proxying via HTTP; it is not a true FTP proxy (but such proxies 
exist!).
Most current FTP clients can operate via an HTTP proxy in the download 
direction; uploads are a different issue. This should be ok for the 
occasional driver download; if you use FTP seriously you should look for a 
dedicated FTP proxy program.


HTH,
Jakob Curdes



Thanks for the info, I just wanted to make sure that I was seeing what I 
thought I was seeing. I am looking to block certain ftp sites and to stop 
http traffic on port 21.
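
Roughly what I have in mind, with the domain only a placeholder:

acl ftp_proto proto FTP
acl banned_ftp dstdomain .example.com
http_access deny ftp_proto banned_ftp
acl port_21 port 21
http_access deny !ftp_proto port_21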


Best Regards,
Al


Re: [squid-users] port 21 to squid

2009-09-13 Thread Al - Image Hosting Services

Hi,

On Sun, 13 Sep 2009, Amos Jeffries wrote:

Jakob Curdes wrote:

Al - Image Hosting Services schrieb:

Hi,

I have routing set up to push everything from port 21 to the squid proxy 
port. This doesn't seem to work even with IE using ftp on Windows. However 
on linux FireFox works fine when I configure it to use the proxy with ftp 
connections. It also seems to work fine with wget, although using gFTP 
seems to have some issues. I was hoping to use squid to block some ftp 
sites. Is there any way to do this? 
Only if you block port 21 and tell the browsers to use squid as an FTP proxy. 
squid does FTP proxying via HTTP; it is not a true FTP proxy (but such 
proxies exist!).
Most current FTP clients can operate via an HTTP proxy in the download 
direction; uploads are a different issue. This should be ok for the 
occasional driver download; if you use FTP seriously you should look for a 
dedicated FTP proxy program.


HTH,
Jakob Curdes



frox is the FTP proxy I'd recommend.

Amos
--
Please be using
 Current Stable Squid 2.7.STABLE6 or 3.0.STABLE19
 Current Beta Squid 3.1.0.13



Thanks! I am taking a look at frox, I think this could be a better 
solution.


Best Regards,
Al


[squid-users] port 21 to squid

2009-09-12 Thread Al - Image Hosting Services

Hi,

I have routing set up to push everything from port 21 to the squid proxy 
port. This doesn't seem to work even with IE using ftp on Windows. However 
on linux FireFox works fine when I configure it to use the proxy with ftp 
connections. It also seems to work fine with wget, although using gFTP 
seems to have some issues. I was hoping to use squid to block some ftp 
sites. Is there any way to do this?


Best Regards,
Al



Re: [squid-users] authentication retries

2009-06-15 Thread Al - Image Hosting Services

Hi,

On Mon, 15 Jun 2009, Amos Jeffries wrote:

On Sun, 14 Jun 2009 20:28:28 -0500 (CDT), Al - Image Hosting Services
 wrote:

Hi,

After thinking about it, I decided that if a person lost their password,
I should have a way for them to retrieve it without needing me, so I
added an acl to unblock a site so it would work without authentication.
Where I have a problem is that it looks like you can try wrong usernames
and passwords all day. Could someone tell me how many times a user will be
able to type in their username and password before squid will give the
ERR_CACHE_ACCESS_DENIED page? Or if there is even a way to change this
number. I would like people to see the error page after maybe 10 tries. If
this can't be changed, then I will need to find another way to deal with
this issue.

Best Regards,
Al


Zero times. It is displayed immediately when auth credentials are missing
or bad.

The problem you have now is that the error page is hidden by the browsers
and converted into that popup everyone is so familiar with.


I must admit that I really expected to get this answer, but I need to be 
sure. Do you know if there is any kind of workaround?


Thanks,
Al


[squid-users] authentication retries

2009-06-14 Thread Al - Image Hosting Services

Hi,

After thinking about it, I decided that if a person lost their password, 
I should have a way for them to retrieve it without needing me, so I 
added an acl to unblock a site so it would work without authentication. 
Where I have a problem is that it looks like you can try wrong usernames 
and passwords all day. Could someone tell me how many times a user will be 
able to type in their username and password before squid will give the 
ERR_CACHE_ACCESS_DENIED page? Or if there is even a way to change this 
number. I would like people to see the error page after maybe 10 tries. If 
this can't be changed, then I will need to find another way to deal with 
this issue.


Best Regards,
Al



Re: [squid-users] multiport config question (solved)

2009-06-14 Thread Al - Image Hosting Services

Hi,

On Sat, 13 Jun 2009, Amos Jeffries wrote:

Al - Image Hosting Services wrote:
Correct. Those will match requests arriving in "http_port 8080" and 
"http_port 8081" respectively.


It looks like this sets the port numbers, but I am not sure how to apply an 
acl to "url_rewrite_program /usr/local/bin/squidGuard -c 
/usr/local/etc/squid/squidGuard.conf", or even if that is possible.






http://www.squid-cache.org/Doc/config/url_rewrite_access


I already have: acl custom-auth proxy_auth REQUIRED
http_access allow custom-auth
http_access allow localhost
http_access deny all

for authentication, so I think that will also complicate things. 
Would anyone be able to give me some ideas on this?




http://wiki.squid-cache.org/SquidFaq/SquidAcl

Squid has full boolean logic in its ACLs. (A and (B or X) but not Y) etc.  If 
you can state your needs in such a way then it can be configured.


Rows are evaluated vertically; the first match wins. 'acl' lines define 'OR' 
groups. *_access lines define an 'AND' condition out of multiple named ACL 
groups. Placing '!' before an acl name on a *_access line makes it 'NOT'.




The issue was the order. In this order it works:
acl custom-auth proxy_auth REQUIRED
acl portA myport 8080
acl portB myport 8181
url_rewrite_access allow portA
url_rewrite_program /usr/local/bin/squidGuard -c /usr/local/etc/squid/squidGuard.conf
url_rewrite_children 5
http_access allow custom-auth portA
http_access allow custom-auth portB

Best Regards,
Al


Re: [squid-users] custom auth not working

2009-06-12 Thread Al - Image Hosting Services

Hi,

Ok, I will give this a try. Thank you for the idea. As you can probably 
tell, we are really not perl programmers, but with your help it looks like 
we are going to be able to make it work.


Thankyou,
Al



On Thu, 11 Jun 2009, Chris Robertson wrote:


Date: Thu, 11 Jun 2009 16:13:23 -0800
From: Chris Robertson 
To: squid-users@squid-cache.org
Subject: Re: [squid-users] custom auth not working

Al - Image Hosting Services wrote:

Hi,

On Thu, 11 Jun 2009, Chris Robertson wrote:

# Flush STDOUT
$|=1;



That fixed it. So, it is working. Would you have an idea on how to get it 
to reconnect to the mysql server, if the connection goes down?


Replace...

$sth->execute();

...with...

$sth->execute() or die $dbh->errstr;

...so your helper just exits on that condition.  Squid will kick off another 
one (and I think retry).  If something is really wrong, Squid will quit with 
a "Helpers dying too rapidly" message.




Best Regards,
Al


Chris



[squid-users] multiport config question

2009-06-12 Thread Al - Image Hosting Services

Hi,

I am hoping that someone can give me an example. I want to run squid on 
two ports, with the idea that on port 8080 it will be filtered and on port 
8081 it will not be. I think that I can use:


acl with_filter myport 8080
acl without_filter myport 8081

It looks like this sets the port numbers, but I am not sure how to apply an 
acl to "url_rewrite_program /usr/local/bin/squidGuard -c 
/usr/local/etc/squid/squidGuard.conf", or even if that is possible.


I already have: 
acl custom-auth proxy_auth REQUIRED

http_access allow custom-auth
http_access allow localhost
http_access deny all

for authentication, so I think that will also complicate things. 
Would anyone be able to give me some ideas on this?


Best Regards,
Al



Re: [squid-users] custom auth not working

2009-06-11 Thread Al - Image Hosting Services

Hi,

On Thu, 11 Jun 2009, Chris Robertson wrote:

# Flush STDOUT
$|=1;



That fixed it. So, it is working. Would you have an idea on how to get it 
to reconnect to the mysql server, if the connection goes down?


Best Regards,
Al



Re: [squid-users] custom auth not working

2009-06-11 Thread Al - Image Hosting Services

Hi,

The program I tried to use is mysql_auth. You can find a link to it on 
http://www.squid-cache.org/Misc/related-software.dyn


Although, now when I try to download it, I can't. It says it is not found.

Thanks,
Al



On Fri, 12 Jun 2009, Amos Jeffries wrote:


Date: Fri, 12 Jun 2009 04:12:47 +1200
From: Amos Jeffries 
To: Al - Image Hosting Services 
Cc: squid-users@squid-cache.org
Subject: Re: [squid-users] custom auth not working

Al - Image Hosting Services wrote:

Hi,

I was not able to get the mysql authentication program to compile, after a 
day of playing with it, it would still just give me an ld error.


BTW, what mysql program?


Amos
--
Please be using
 Current Stable Squid 2.7.STABLE6 or 3.0.STABLE15
 Current Beta Squid 3.1.0.8 or 3.0.STABLE16-RC1



Re: [squid-users] custom auth not working

2009-06-11 Thread Al - Image Hosting Services

Hi,

On Fri, 12 Jun 2009, Amos Jeffries wrote:

Al - Image Hosting Services wrote:

print "OK\n";
} else {
print "ERR\n";
}
} else {
print "ERR\n";
}

$sth->finish();
}

$dbh->disconnect();



Of course, I changed the username and password that we are using for the 
database. Also, the line that is supposed to update the database is not 
working yet, but as you can see it is commented out. I was supposed to have 
this working Monday and it is now Thursday, so any help would be greatly 
appreciated.


Best Regards,
Al


Maybe all those print "...\n";

IIRC perl adds its own \n to the end for some outputs. Squid is expecting 
only one.


I have considered that, but "...\n" is what all the perl examples we can 
find use. 
Here are the 2 perl scripts that we based ours on:

http://www.tbits.org/download/squidauth.tgz
http://www.devet.org/squid/proxy_auth/contrib/auth.pl

I may be a little bit confused as to what squid wants back as a 
reply. I guess it really will not hurt to try it without the "...\n".


Thanks,
Al


[squid-users] custom auth not working

2009-06-11 Thread Al - Image Hosting Services

Hi,

I was not able to get the mysql authentication program to compile; after a 
day of playing with it, it would still just give me an ld error. I found a 
couple of examples of authentication software written in perl, so I 
thought "why not custom write it?". My oldest son does some programming, so 
he helped me with it, although both of us are at a loss as to why it 
doesn't work. It always returns the right response from the command line, 
either an OK or ERR. Maybe the response squid expects has changed. We have 
it logging what is sent to the perl script from squid, so I know that it 
is getting input from squid, but it doesn't look like it gives a response 
back to squid, or at least not one that squid can understand, because after 
putting in a username and password, the browser does nothing. Also, 
ncsa_auth works with my current config, and I am just changing that line 
in squid.conf to call custom_auth.pl.


Here is what we have:

#!/usr/bin/perl
use DBI;

# config
my $host = "localhost";
my $database = "filter";
my $tablename = "filteredusers";
my $user = "msyqlfilteruser";
my $pw = "faithhope";

# dbi connect
my $dbh = DBI->connect("DBI:mysql:database=$database;host=$host",
                       $user, $pw, {'RaiseError' => 1});


while (<STDIN>) {
    chop $_;

    open $file, '>>/var/log/custom_auth.log';
    print $file "$_\n";
    close $file;

    # get email and password
    my @info  = split(/ /, $_);
    my $email = $dbh->quote(shift(@info));
    my $pass  = $dbh->quote(shift(@info));

    # query for user status
    my $sth = $dbh->prepare("SELECT stat FROM $tablename WHERE email=$email AND passwd=$pass LIMIT 1");

    $sth->execute();

    # check
    if (my $ref = $sth->fetchrow_hashref()) {
        if ($ref->{'stat'} =~ m/[AF]/) {
            #$dbh->do("UPDATE $tablename SET login_date='2010-06-15 00:00:00' WHERE id='2' LIMIT 1";
            print "OK\n";
        } else {
            print "ERR\n";
        }
    } else {
        print "ERR\n";
    }

    $sth->finish();
}

$dbh->disconnect();



Of course, I changed the username and password that we are using for the 
database. Also, the line that is supposed to update the database is not 
working yet, but as you can see it is commented out. I was supposed to 
have this working Monday and it is now Thursday, so any help would be 
greatly appreciated.


Best Regards,
Al


[squid-users] OT: software to force the client to use the proxy

2009-05-12 Thread Al - Image Hosting Services

Hi,

I am using squid with a block list. It works great for everyone on the 
LAN, but the issue is that I am not able to effectively filter the internet 
for anyone who is not on the LAN without putting in some proxy settings. 
Is there software that could automatically set this up and lock the 
settings when not on the LAN?


Best Regards,
Al


Re: [squid-users] tuning an overloaded server

2008-12-01 Thread Al - Image Hosting Services

Hi,


On Sun, 23 Nov 2008, Amos Jeffries wrote:

Chris Robertson wrote:

Al - Image Hosting Services wrote:
I hope that someone can help me. I have 3 servers running squid acting as 
a web accelerator for a single http server. I have been having a large 
amount of problems with the http server locking up and having drive 
failures and squid seems to be a great solution. I was pleasantly 
surprised at the performance of squid when I first installed it. Even on 
the slowest of my servers, it seemed faster than directly from the http 
server, but after I put the load back on the slowest of my servers it was 
0% idle.


Where is the CPU usage at, User, System or Wait?

Since this server had a second hard drive that was basically not being 
used, I moved the squid cache over to it. I then did some googling and I 
made some other changes and the server is now only 0% idle a few times a 
day. I am really surprised at just how tunable squid is and it was 
wonderful that a change in configuration could make such a difference. But 
since the busy season is coming, I still think that more can be done to 
tune it.


Here are the changes that I made to the squid.conf:
redirect_rewrites_host_header off
acl all src 0.0.0.0/0.0.0.0
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl to_localhost dst 127.0.0.0/8
acl SSL_ports port 443
acl CONNECT method CONNECT
http_access allow manager localhost
http_access deny manager
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access allow all


Not a good idea; limit the access to just the domains you are serving 
wherever possible.
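
For example, something along these lines, with .example.com standing in for the 
accelerated domain, instead of "http_access allow all":

acl our_sites dstdomain .example.com
http_access allow our_sites
http_access deny all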



icp_access allow all
http_port 80 accel vhost
cache_peer 10.10.1.4 parent 80 0 no-query originserver
hierarchy_stoplist cgi-bin ?
cache_mem 16 MB


cache_mem is worth expanding; it lets more objects stay in memory and skip 
the disk IO waiting times.
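
For example, if the box has memory to spare, something like:

cache_mem 256 MB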



maximum_object_size_in_memory 128 KB
memory_replacement_policy heap GDSF
cache_replacement_policy heap GDSF
cache_dir ufs /var/squid/cache 200 16 256
maximum_object_size 2048 KB
cache_swap_low 90
cache_swap_high 98
access_log /var/log/squid/access.log squid
cache_log /var/log/squid/cache.log
cache_store_log /var/log/squid/store.log


Dump cache_store_log; it's a waste of disk IO unless you really need it for 
something weird.
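
It can be switched off outright, e.g.:

cache_store_log none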



emulate_httpd_log on
buffered_logs on
acl QUERY urlpath_regex cgi-bin \?
cache deny QUERY
refresh_pattern ^ftp:   144020% 10080
refresh_pattern ^gopher:14400%  1440
refresh_pattern .   0   20% 4320
acl apache rep_header Server ^Apache
broken_vary_encoding allow apache
half_closed_clients off
icp_port 0
log_icp_queries off
coredump_dir /var/squid/cache

Are there any other changes that can be made to improve cpu usage?

I also did an install from source, since I am using NetBSD systems I used 
pkgsrc to install it. There are several build options:


arp-acl
aufs
carp
coss
diskd
icmp
ipf-transparent
pam-helper
pf-transparent
snmp
ssl
unlinkd

I looked up what some of these are, but I am not sure how they are used by 
squid or if I really need them. My thought is that if I compiled squid 
without these options it could improve performance. I wish there was more 
information on running squid on BSD systems (especially recent info) and 
then maybe I could be sure what effect compiling without an option would 
have. For example, I know what pam is, but do I need it?


You are not using authentication, so could do without PAM.

I looked up what unlinkd does and I can see it running, but will it help 
cpu usage to use it?


Using --enable-truncate "gives a little performance improvement, but may 
cause problems when used with async I/O.  Truncate uses more filesystem 
inodes than unlink.."  asynch I/O refers to diskd, aufs and coss.  The only 
way to see if that will lower your CPU usage is to try, but...


I don't see diskd running as a separate process, so I don't know that I am 
even using it.


Your cache_dir line specifies "ufs".  To use diskd, you need to change that 
line to...


cache_dir diskd /var/squid/cache 200 16 256


I tried setting it to diskd and then had a problem with it shutting down 
after a few minutes. Is there something else that I need to do before I 
put in this change?


Thanks,
Al



[squid-users] tuning an overloaded server

2008-11-21 Thread Al - Image Hosting Services

Hi,

I hope that someone can help me. I have 3 servers running squid acting as 
a web accelerator for a single http server. I have been having a large 
amount of problems with the http server locking up and having drive 
failures and squid seems to be a great solution. I was pleasantly 
surprised at the performance of squid when I first installed it. Even on 
the slowest of my servers, it seemed faster than directly from the http 
server, but after I put the load back on the slowest of my servers it was 
0% idle. Since this server had a second hard drive that was basically not 
being used, I moved the squid cache over to it. I then did some googling 
and I made some other changes and the server is now only 0% idle a few 
times a day. I am really surprised at just how tunable squid is and it was 
wonderful that a change in configuration could make such a difference. But 
since the busy season is coming, I still think that more can be done to 
tune it.


Here are the changes that I made to the squid.conf:
redirect_rewrites_host_header off
acl all src 0.0.0.0/0.0.0.0
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl to_localhost dst 127.0.0.0/8
acl SSL_ports port 443
acl CONNECT method CONNECT
http_access allow manager localhost
http_access deny manager
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access allow all
icp_access allow all
http_port 80 accel vhost
cache_peer 10.10.1.4 parent 80 0 no-query originserver
hierarchy_stoplist cgi-bin ?
cache_mem 16 MB
maximum_object_size_in_memory 128 KB
memory_replacement_policy heap GDSF
cache_replacement_policy heap GDSF
cache_dir ufs /var/squid/cache 200 16 256
maximum_object_size 2048 KB
cache_swap_low 90
cache_swap_high 98
access_log /var/log/squid/access.log squid
cache_log /var/log/squid/cache.log
cache_store_log /var/log/squid/store.log
emulate_httpd_log on
buffered_logs on
acl QUERY urlpath_regex cgi-bin \?
cache deny QUERY
refresh_pattern ^ftp:   144020% 10080
refresh_pattern ^gopher:14400%  1440
refresh_pattern .   0   20% 4320
acl apache rep_header Server ^Apache
broken_vary_encoding allow apache
half_closed_clients off
icp_port 0
log_icp_queries off
coredump_dir /var/squid/cache

Are there any other changes that can be made to improve cpu usage?

I also did an install from source, since I am using NetBSD systems I used 
pkgsrc to install it. There are several build options:


arp-acl
aufs
carp
coss
diskd
icmp
ipf-transparent
pam-helper
pf-transparent
snmp
ssl
unlinkd

I looked up what some of these are, but I am not sure how they are used by 
squid or if I really need them. My thought is that if I compiled squid 
without these options it could improve performance. I wish there was more 
information on running squid on BSD systems (especially recent info) and 
then maybe I could be sure what effect compiling without an option would 
have. For example, I know what pam is, but do I need it? I looked up what 
unlinkd does and I can see it running, but will it help cpu usage to use 
it? I don't see diskd running as a separate process, so I don't know that 
I am even using it. The only one that I am sure that I need is ssl.


Any help would be greatly appreciated!

Best Regards,
Al