Re: [squid-users] Limiting Connections & MySQL through SSH Tunnel

2021-06-08 Thread squid3

On 2021-06-08 00:04, Grails UK wrote:

Hello,
I hope you are well. I have two questions:

1. Is there any easy way to limit concurrent connections by a single
squid user, or by the local IP the client connected to?


What are you trying to achieve that makes you think of doing that?




2. Our MySQL database is currently only accessible from our local
server on PythonAnywhere, and any external access has to be done via an
SSH tunnel. Is there any way to SSH tunnel when using
basic_db_auth or log_db_daemon?


Getting TCP/IP data to travel over SSH protocol tunnels is an OS routing 
detail.
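
The usual trick is an SSH local port forward, so the helper simply
connects to 127.0.0.1. A rough sketch, not tested config (the host name,
credentials and DSN here are placeholder assumptions):

  # keep a tunnel open, forwarding local port 3306 to MySQL on the server
  ssh -f -N -L 3306:127.0.0.1:3306 user@ssh.pythonanywhere.com

  # squid.conf: point basic_db_auth at the local end of the tunnel
  auth_param basic program /usr/lib/squid/basic_db_auth \
      --dsn "DBI:mysql:database=squid;host=127.0.0.1;port=3306" \
      --user squiduser --password secret

The same forwarding should work for log_db_daemon, since it also just
opens an ordinary MySQL connection.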




Amos
___
squid-users mailing list
squid-users@lists.squid-cache.org
http://lists.squid-cache.org/listinfo/squid-users


[squid-users] Limiting Connections & MySQL through SSH Tunnel

2021-06-07 Thread Grails UK
Hello,
I hope you are well. I have two questions:

1. Is there any easy way to limit concurrent connections by a single squid
user, or by the local IP the client connected to? I understand there is a
robust framework for limiting max connections via IP address, or the number of
IP addresses a user can use; however, this is slightly different. I did see
a post that said this is possible through an external ACL; what I
struggle with there is how the external ACL would see which concurrent
connections are still active / count the number of current connections to
determine whether a limit is necessary.

2. Our MySQL database is currently only accessible from our local server on
PythonAnywhere, and any external access has to be done via an SSH tunnel. Is
there any way to SSH tunnel when using the basic_db_auth or log_db_daemon?

Thanks!
___
squid-users mailing list
squid-users@lists.squid-cache.org
http://lists.squid-cache.org/listinfo/squid-users


Re: [squid-users] limiting connections

2012-05-29 Thread Carlos Manuel Trepeu Pupo
I am reviving this post because I made a few changes. Here you have it,
if anyone needs it:

#!/bin/bash
# external_acl_type helper: allow a download only while no other active
# request is fetching the same file name.
while read line; do

   # keep only the file name (the last path segment of the URL)
   shortLine=`echo "$line" | awk -F / '{print $NF}'`
   #echo "$shortLine" >> /home/carlos/guarda   # -> This is for debugging

   result=`squidclient -h 127.0.0.1 mgr:active_requests | grep -c "$shortLine"`

 if [ "$result" -eq 1 ]
   then
   echo 'OK'
   #echo 'OK' >> /home/carlos/guarda   # -> This is for debugging
 else
   echo 'ERR'
   #echo 'ERR' >> /home/carlos/guarda   # -> This is for debugging
 fi
done


The main change is to compare the file being downloaded rather than the full
URL, to stop mirrors being used to increase the number of simultaneous
connections.
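
For completeness, the squid.conf side that goes with this helper, as worked
out earlier in this thread (paths and ACL names are the ones from the
examples below; the ttl=0 options are the ones Amos said are needed to
disable caching of the helper results):

  external_acl_type one_conn ttl=0 negative_ttl=0 grace=0 %URI /home/carlos/contain
  acl limit external one_conn
  acl extensions url_regex /etc/squid3/extensions
  http_access deny extensions !limit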




 On Thu, Apr 5, 2012 at 12:52 PM, H h...@hm.net.br wrote:
 Carlos Manuel Trepeu Pupo wrote:
 On Thu, Apr 5, 2012 at 10:32 AM, H h...@hm.net.br wrote:
 Carlos Manuel Trepeu Pupo wrote:
 what is your purpose? solve bandwidth problems? Connection rate?
 Congestion? I believe that limiting to *one* download is not your real
 intention, because the browser could still open hundreds of regular
 pages and your download limit is nuked and was for nothing ...

 what is your operating system?

 I intend to solve bandwidth problems. For the people who use download
 managers or accelerators, just limit them to 1 connection. Previously I
 tried to solve it with delay_pools; the traffic that I deliver to the
 client was just as I configured, but with accelerators the upload
 saturates the channel.



 since you did not say what OS you're running I can give you only some
 direction; any or most Unix firewalls can solve this easily. If you use
 Linux you may like pf, with FBSD you should go with ipfw; the latter
 probably is easier to understand, but for both you will find zillions of
 examples on the net, look for short setups

 Sorry, I forgot !! Squid is on Debian 6.0, 32-bit. My firewall is
 Kerio, but on Windows, and I'm not so glad to use it !!!


 first you divide your bandwidth between your users

 First I searched for dynamic bandwidth with Squid, but squid does
 not do this, and then after much searching I only found ISA Server with a
 third-party plugin, but I prefer Linux.


 if you use TPROXY you can divide/limit the bandwidth on the outside
 interface in order to limit only access to the link, but if squid has the
 object in cache it might go out as fast as it can

 you still can manage the bandwidth pool with delay parameters if you wish

 I tried with delay_pools, but delay_pools only manage the download
 rate, not the upload, and I need both. The last time I tried
 with delay_pools, the download accelerator downloaded at the speed that
 I specified, but the proxy consumed the whole channel with the download,
 something that I never understood.



 I guess you meant download accelerator, not manager; you can then limit
 the connection rate within the bandwidth for each user and each
 protocol. For DL-accelerators you should pay attention to UDP packets as
 well. You did not say how many users and how much bandwidth you have, but
 limit the TCP connections to 25 and UDP to 40 to begin with, then test it
 until you come to something that suits your wish

 I have 128 kbps, and I have no idea about UDP packets !!! That's
 new to me !! Any documentation that I can read ???



 any of this we talk about has nothing to do with squid

 bw control, connection limiting etc you should handle with the firewall

 let squid do what it does well, cache and proxy

 you could consider a different setup, a Unix box with firewall on your
 internet connection and as your gateway, squid as TPROXY or transparent
 proxy if you need NAT, all on the same box

 if you use Linux you should look for the pf firewall, if you use FreeBSD you
 should use the ipfw firewall, and read the specific documentation; if this
 is all new for you, you might find it easier to use FreeBSD since all
 setups are straightforward, while linux and also pf are a little bit more
 complicated.
 As an example, setting up NAT on IPFW can be done with three lines of code; I
 believe pf needs at least 6 to work

 but before you dig deeper you might think about a new design of your
 concept of Internet access



 you still could check which DLaccel your 

Re: [squid-users] limiting connections

2012-04-05 Thread H
Carlos Manuel Trepeu Pupo wrote:
 On Tue, Apr 3, 2012 at 6:35 PM, H h...@hm.net.br wrote:
 Eliezer Croitoru wrote:
 On 03/04/2012 18:30, Carlos Manuel Trepeu Pupo wrote:
 On Mon, Apr 2, 2012 at 6:43 PM, Amos Jeffriessqu...@treenet.co.nz
 wrote:
 On 03.04.2012 02:21, Carlos Manuel Trepeu Pupo wrote:

 Thanks a lot !! That's what I was missing, everything works
 fine now. So this script can be used since it already works.

 Now, I need to know if there is any way to consult the active requests
 in squid that works faster than squidclient ...


 ACL types are pretty easy to add to the Squid code. I'm happy to
 throw an
 ACL patch your way for a few $$.

 Which comes back to my earlier, still unanswered, question about why
 you want to do this very, very strange thing?

 Amos



 OK !! Here is the complicated and strange explanation:

 Where I work we have 128 Kbps for the use of almost 80 PCs; a few of
 them use download accelerators and saturate the channel. I began to
 use the ACL maxconn but I still have a few problems. 60 of the clients
 are under an ISA server that I don't administrate, so I can't limit
 the maxconn for them like the others. Now with this ACL, everyone can
 download but with only one connection. That's the strange main idea.
 what do you mean by only one connection?
 if they're under one ISA server then all of them share the same external IP.


 Hi

 I am following this thread with mixed feelings of weirdness and
 admiration ...

 there are always two ways to reach a far point, it's left around or
 right around the world, depending on your position one of the ways is
 always the longer one. I can understand that some without hurry and
 money issues chose the longer one, perhaps also because of more chance
 for adventurous happenings, unknown and the unexpected

 so now I have explained in a similarly long way what I do not understand: why
 would you make such complicated, out-of-scope code, slow, certainly
 dangerous ... if at least it were perl, but bash calling an external
 prog and grepping, wow ... when you can solve it with a line of code ?

 this task would fit pf or ipfw much better, would be more elegant and
 zillions times faster and secure, not speaking about time investment,
 how much time you need to write 5/6 keywords of code?

 or is it for demonstration purpose, showing it as an alternative
 possibility?

 
 It's great to read this. I just know BASH SHELL, but if you tell me that
 I can make this safer and faster... In a previous post I talked about
 this!! I asked that someone tell me if there is a better way to do that; I'm
 new !! Please, guide me if you can
 


who knows ...

what is your purpose? solve bandwidth problems? Connection rate?
Congestion? I believe that limiting to *one* download is not your real
intention, because the browser could still open hundreds of regular
pages and your download limit is nuked and was for nothing ...

what is your operating system?



-- 
H
+55 11 4249.





Re: [squid-users] limiting connections

2012-04-05 Thread Carlos Manuel Trepeu Pupo
On Thu, Apr 5, 2012 at 7:01 AM, H h...@hm.net.br wrote:
 Carlos Manuel Trepeu Pupo wrote:
 On Tue, Apr 3, 2012 at 6:35 PM, H h...@hm.net.br wrote:
 Eliezer Croitoru wrote:
 On 03/04/2012 18:30, Carlos Manuel Trepeu Pupo wrote:
 On Mon, Apr 2, 2012 at 6:43 PM, Amos Jeffriessqu...@treenet.co.nz
 wrote:
 On 03.04.2012 02:21, Carlos Manuel Trepeu Pupo wrote:

 Thanks a lot !! That's what I was missing, everything works
 fine now. So this script can be used since it already works.

 Now, I need to know if there is any way to consult the active requests
 in squid that works faster than squidclient ...


 ACL types are pretty easy to add to the Squid code. I'm happy to
 throw an
 ACL patch your way for a few $$.

 Which comes back to my earlier, still unanswered, question about why
 you want to do this very, very strange thing?

 Amos



 OK !! Here is the complicated and strange explanation:

 Where I work we have 128 Kbps for the use of almost 80 PCs; a few of
 them use download accelerators and saturate the channel. I began to
 use the ACL maxconn but I still have a few problems. 60 of the clients
 are under an ISA server that I don't administrate, so I can't limit
 the maxconn for them like the others. Now with this ACL, everyone can
 download but with only one connection. That's the strange main idea.
 what do you mean by only one connection?
 if they're under one ISA server then all of them share the same external IP.


 Hi

 I am following this thread with mixed feelings of weirdness and
 admiration ...

 there are always two ways to reach a far point, it's left around or
 right around the world, depending on your position one of the ways is
 always the longer one. I can understand that some without hurry and
 money issues chose the longer one, perhaps also because of more chance
 for adventurous happenings, unknown and the unexpected

 so now I have explained in a similarly long way what I do not understand: why
 would you make such complicated, out-of-scope code, slow, certainly
 dangerous ... if at least it were perl, but bash calling an external
 prog and grepping, wow ... when you can solve it with a line of code ?

 this task would fit pf or ipfw much better, would be more elegant and
 zillions times faster and secure, not speaking about time investment,
 how much time you need to write 5/6 keywords of code?

 or is it for demonstration purpose, showing it as an alternative
 possibility?


 It's great to read this. I just know BASH SHELL, but if you tell me that
 I can make this safer and faster... In a previous post I talked about
 this!! I asked that someone tell me if there is a better way to do that; I'm
 new !! Please, guide me if you can



 who knows ...

 what is your purpose? solve bandwidth problems? Connection rate?
 Congestion? I believe that limiting to *one* download is not your real
 intention, because the browser could still open hundreds of regular
 pages and your download limit is nuked and was for nothing ...

 what is your operating system?


I intend to solve bandwidth problems. For the people who use download
managers or accelerators, just limit them to 1 connection. Previously I
tried to solve it with delay_pools; the traffic that I deliver to the
client was just as I configured, but with accelerators the upload
saturates the channel.



 --
 H
 +55 11 4249.



Re: [squid-users] limiting connections

2012-04-05 Thread H
Carlos Manuel Trepeu Pupo wrote:
  what is your purpose? solve bandwidth problems? Connection rate?
  Congestion? I believe that limiting to *one* download is not your real
  intention, because the browser could still open hundreds of regular
  pages and your download limit is nuked and was for nothing ...
 
  what is your operating system?
 
 I intend to solve bandwidth problems. For the people who use download
 managers or accelerators, just limit them to 1 connection. Previously I
 tried to solve it with delay_pools; the traffic that I deliver to the
 client was just as I configured, but with accelerators the upload
 saturates the channel.
 


since you did not say what OS you're running I can give you only some
direction; any or most Unix firewalls can solve this easily. If you use
Linux you may like pf, with FBSD you should go with ipfw; the latter
probably is easier to understand, but for both you will find zillions of
examples on the net, look for short setups

first you divide your bandwidth between your users

if you use TPROXY you can divide/limit the bandwidth on the outside
interface in order to limit only access to the link, but if squid has the
object in cache it might go out as fast as it can

you still can manage the bandwidth pool with delay parameters if you wish


I guess you meant download accelerator, not manager; you can then limit
the connection rate within the bandwidth for each user and each
protocol. For DL-accelerators you should pay attention to UDP packets as
well. You did not say how many users and how much bandwidth you have, but
limit the TCP connections to 25 and UDP to 40 to begin with, then test it
until you come to something that suits your wish
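
As a rough illustration of that kind of per-source cap with ipfw (hedged:
the rule numbers, client network and exact limits are example assumptions
based on ipfw's dynamic-rule syntax, not a tested ruleset):

  ipfw add 100 allow tcp from 192.168.0.0/24 to any setup limit src-addr 25
  ipfw add 200 allow udp from 192.168.0.0/24 to any limit src-addr 40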

you still could check which DLaccel your people are using and then limit
or block only those P2P ports, which used to be very effective




-- 
H
+55 11 4249.





Re: [squid-users] limiting connections

2012-04-05 Thread Carlos Manuel Trepeu Pupo
On Thu, Apr 5, 2012 at 10:32 AM, H h...@hm.net.br wrote:
 Carlos Manuel Trepeu Pupo wrote:
  what is your purpose? solve bandwidth problems? Connection rate?
  Congestion? I believe that limiting to *one* download is not your real
  intention, because the browser could still open hundreds of regular
  pages and your download limit is nuked and was for nothing ...
 
  what is your operating system?
 
 I intend to solve bandwidth problems. For the people who use download
 managers or accelerators, just limit them to 1 connection. Previously I
 tried to solve it with delay_pools; the traffic that I deliver to the
 client was just as I configured, but with accelerators the upload
 saturates the channel.



 since you did not say what OS you're running I can give you only some
 direction; any or most Unix firewalls can solve this easily. If you use
 Linux you may like pf, with FBSD you should go with ipfw; the latter
 probably is easier to understand, but for both you will find zillions of
 examples on the net, look for short setups

Sorry, I forgot !! Squid is on Debian 6.0, 32-bit. My firewall is
Kerio, but on Windows, and I'm not so glad to use it !!!


 first you divide your bandwidth between your users

First I searched for dynamic bandwidth with Squid, but squid does
not do this, and then after much searching I only found ISA Server with a
third-party plugin, but I prefer Linux.


 if you use TPROXY you can divide/limit the bandwidth on the outside
 interface in order to limit only access to the link, but if squid has the
 object in cache it might go out as fast as it can

 you still can manage the bandwidth pool with delay parameters if you wish

I tried with delay_pools, but delay_pools only manage the download
rate, not the upload, and I need both. The last time I tried
with delay_pools, the download accelerator downloaded at the speed that
I specified, but the proxy consumed the whole channel with the download,
something that I never understood.



 I guess you meant download accelerator, not manager; you can then limit
 the connection rate within the bandwidth for each user and each
 protocol. For DL-accelerators you should pay attention to UDP packets as
 well. You did not say how many users and how much bandwidth you have, but
 limit the TCP connections to 25 and UDP to 40 to begin with, then test it
 until you come to something that suits your wish

I have 128 kbps, and I have no idea about UDP packets !!! That's
new to me !! Any documentation that I can read ???


 you still could check which DLaccel your people are using and then limit
 or block only those P2P ports, which used to be very effective

Even if I do not permit CONNECT, can the users still use P2P ports ??

Thanks for this, it clears up many questions about squid that I have !!!





 --
 H
 +55 11 4249.



Re: [squid-users] limiting connections

2012-04-05 Thread H
Carlos Manuel Trepeu Pupo wrote:
 On Thu, Apr 5, 2012 at 10:32 AM, H h...@hm.net.br wrote:
 Carlos Manuel Trepeu Pupo wrote:
 what is your purpose? solve bandwidth problems? Connection rate?
 Congestion? I believe that limiting to *one* download is not your real
 intention, because the browser could still open hundreds of regular
 pages and your download limit is nuked and was for nothing ...

 what is your operating system?

 I intend to solve bandwidth problems. For the people who use download
 managers or accelerators, just limit them to 1 connection. Previously I
 tried to solve it with delay_pools; the traffic that I deliver to the
 client was just as I configured, but with accelerators the upload
 saturates the channel.



 since you did not say what OS you're running I can give you only some
 direction; any or most Unix firewalls can solve this easily. If you use
 Linux you may like pf, with FBSD you should go with ipfw; the latter
 probably is easier to understand, but for both you will find zillions of
 examples on the net, look for short setups
 
 Sorry, I forgot !! Squid is on Debian 6.0, 32-bit. My firewall is
 Kerio, but on Windows, and I'm not so glad to use it !!!
 

 first you divide your bandwidth between your users
 
 First I searched for dynamic bandwidth with Squid, but squid does
 not do this, and then after much searching I only found ISA Server with a
 third-party plugin, but I prefer Linux.
 

 if you use TPROXY you can divide/limit the bandwidth on the outside
 interface in order to limit only access to the link, but if squid has the
 object in cache it might go out as fast as it can

 you still can manage the bandwidth pool with delay parameters if you wish
 
 I tried with delay_pools, but delay_pools only manage the download
 rate, not the upload, and I need both. The last time I tried
 with delay_pools, the download accelerator downloaded at the speed that
 I specified, but the proxy consumed the whole channel with the download,
 something that I never understood.
 


 I guess you meant download accelerator, not manager; you can then limit
 the connection rate within the bandwidth for each user and each
 protocol. For DL-accelerators you should pay attention to UDP packets as
 well. You did not say how many users and how much bandwidth you have, but
 limit the TCP connections to 25 and UDP to 40 to begin with, then test it
 until you come to something that suits your wish
 
 I have 128 kbps, and I have no idea about UDP packets !!! That's
 new to me !! Any documentation that I can read ???
 


any of this we talk about has nothing to do with squid

bw control, connection limiting etc you should handle with the firewall

let squid do what it does well, cache and proxy

you could consider a different setup, a Unix box with firewall on your
internet connection and as your gateway, squid as TPROXY or transparent
proxy if you need NAT, all on the same box

if you use Linux you should look for the pf firewall, if you use FreeBSD you
should use the ipfw firewall, and read the specific documentation; if this
is all new for you, you might find it easier to use FreeBSD since all
setups are straightforward, while linux and also pf are a little bit more
complicated.
As an example, setting up NAT on IPFW can be done with three lines of code; I
believe pf needs at least 6 to work
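
For instance (a hedged sketch; the interface and network names are
placeholder assumptions), in-kernel NAT on ipfw really is about three lines:

  ipfw nat 1 config if em0
  ipfw add 100 nat 1 ip from 192.168.0.0/24 to any out via em0
  ipfw add 200 nat 1 ip from any to any in via em0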

but before you dig deeper you might think about a new design of your
concept of Internet access



 you still could check which DLaccel your people are using and then limit
 or block only those P2P ports, which used to be very effective
 
 Even if I do not permit CONNECT, can the users still use P2P ports ??
 

I do not understand this question; do you mean squid's CONNECT keyword? If
so, it has nothing to do with this ...

all I was talking about is on the firewall layer, before squid

DL-accels tend to fire lots of UDP packets to find a peer; these packets
can saturate small links easily if you do not limit them

you limit the max UDP connections as well as the max TCP connections, which
helps you get reasonable speed even with small bandwidth, as far as
128kbit/s can be reasonable
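
In pf the per-source caps would look roughly like this (hedged sketch:
max-src-conn applies to TCP only, so the UDP rule uses max-src-states,
and the network is a placeholder):

  pass in proto tcp from 192.168.0.0/24 to any keep state (max-src-conn 25)
  pass in proto udp from 192.168.0.0/24 to any keep state (max-src-states 40)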

you can run a simple squid setup
and you run a simple firewall setup

both on one machine


 Thanks for this, it clears up many questions about squid that I have !!!
 

you are welcome



-- 
H
+55 11 4249.





Re: [squid-users] limiting connections

2012-04-04 Thread Carlos Manuel Trepeu Pupo
On Tue, Apr 3, 2012 at 6:35 PM, H h...@hm.net.br wrote:
 Eliezer Croitoru wrote:
 On 03/04/2012 18:30, Carlos Manuel Trepeu Pupo wrote:
 On Mon, Apr 2, 2012 at 6:43 PM, Amos Jeffriessqu...@treenet.co.nz
 wrote:
 On 03.04.2012 02:21, Carlos Manuel Trepeu Pupo wrote:

 Thanks a lot !! That's what I was missing, everything works
 fine now. So this script can be used since it already works.

 Now, I need to know if there is any way to consult the active requests
 in squid that works faster than squidclient ...


 ACL types are pretty easy to add to the Squid code. I'm happy to
 throw an
 ACL patch your way for a few $$.

 Which comes back to my earlier, still unanswered, question about why
 you want to do this very, very strange thing?

 Amos



 OK !! Here is the complicated and strange explanation:

 Where I work we have 128 Kbps for the use of almost 80 PCs; a few of
 them use download accelerators and saturate the channel. I began to
 use the ACL maxconn but I still have a few problems. 60 of the clients
 are under an ISA server that I don't administrate, so I can't limit
 the maxconn for them like the others. Now with this ACL, everyone can
 download but with only one connection. That's the strange main idea.
 what do you mean by only one connection?
 if they're under one ISA server then all of them share the same external IP.


 Hi

 I am following this thread with mixed feelings of weirdness and
 admiration ...

 there are always two ways to reach a far point, it's left around or
 right around the world, depending on your position one of the ways is
 always the longer one. I can understand that some without hurry and
 money issues chose the longer one, perhaps also because of more chance
 for adventurous happenings, unknown and the unexpected

 so now I have explained in a similarly long way what I do not understand: why
 would you make such complicated, out-of-scope code, slow, certainly
 dangerous ... if at least it were perl, but bash calling an external
 prog and grepping, wow ... when you can solve it with a line of code ?

 this task would fit pf or ipfw much better, would be more elegant and
 zillions times faster and secure, not speaking about time investment,
 how much time you need to write 5/6 keywords of code?

 or is it for demonstration purpose, showing it as an alternative
 possibility?


It's great to read this. I just know BASH SHELL, but if you tell me that
I can make this safer and faster... In a previous post I talked about
this!! I asked that someone tell me if there is a better way to do that; I'm
new !! Please, guide me if you can


 --
 H
 +55 11 4249.



Re: [squid-users] limiting connections

2012-04-03 Thread Carlos Manuel Trepeu Pupo
On Mon, Apr 2, 2012 at 6:43 PM, Amos Jeffries squ...@treenet.co.nz wrote:
 On 03.04.2012 02:21, Carlos Manuel Trepeu Pupo wrote:

 Thanks a lot !! That's what I was missing, everything works
 fine now. So this script can be used since it already works.

 Now, I need to know if there is any way to consult the active requests
 in squid that works faster than squidclient ...


 ACL types are pretty easy to add to the Squid code. I'm happy to throw an
 ACL patch your way for a few $$.

 Which comes back to my earlier, still unanswered, question about why you want
 to do this very, very strange thing?

 Amos



OK !! Here is the complicated and strange explanation:

Where I work we have 128 Kbps for the use of almost 80 PCs; a few of
them use download accelerators and saturate the channel. I began to
use the ACL maxconn but I still have a few problems. 60 of the clients
are under an ISA server that I don't administrate, so I can't limit
the maxconn for them like the others. Now with this ACL, everyone can
download but with only one connection. That's the strange main idea.


Re: [squid-users] limiting connections

2012-04-03 Thread Eliezer Croitoru

On 03/04/2012 18:30, Carlos Manuel Trepeu Pupo wrote:

On Mon, Apr 2, 2012 at 6:43 PM, Amos Jeffriessqu...@treenet.co.nz  wrote:

On 03.04.2012 02:21, Carlos Manuel Trepeu Pupo wrote:


Thanks a lot !! That's what I was missing, everything works
fine now. So this script can be used since it already works.

Now, I need to know if there is any way to consult the active requests
in squid that works faster than squidclient ...



ACL types are pretty easy to add to the Squid code. I'm happy to throw an
ACL patch your way for a few $$.

Which comes back to my earlier, still unanswered, question about why you want
to do this very, very strange thing?

Amos




OK !! Here is the complicated and strange explanation:

Where I work we have 128 Kbps for the use of almost 80 PCs; a few of
them use download accelerators and saturate the channel. I began to
use the ACL maxconn but I still have a few problems. 60 of the clients
are under an ISA server that I don't administrate, so I can't limit
the maxconn for them like the others. Now with this ACL, everyone can
download but with only one connection. That's the strange main idea.

what do you mean by only one connection?
if they're under one ISA server then all of them share the same external IP.

--
Eliezer Croitoru
https://www1.ngtech.co.il
IT consulting for Nonprofit organizations
eliezer at ngtech.co.il


Re: [squid-users] limiting connections

2012-04-03 Thread Carlos Manuel Trepeu Pupo
On Tue, Apr 3, 2012 at 4:36 PM, Eliezer Croitoru elie...@ngtech.co.il wrote:
 On 03/04/2012 18:30, Carlos Manuel Trepeu Pupo wrote:

 On Mon, Apr 2, 2012 at 6:43 PM, Amos Jeffriessqu...@treenet.co.nz
  wrote:

 On 03.04.2012 02:21, Carlos Manuel Trepeu Pupo wrote:


 Thanks a lot !! That's what I was missing, everything works
 fine now. So this script can be used since it already works.

 Now, I need to know if there is any way to consult the active requests
 in squid that works faster than squidclient ...


 ACL types are pretty easy to add to the Squid code. I'm happy to throw an
 ACL patch your way for a few $$.

 Which comes back to my earlier, still unanswered, question about why you
 want to do this very, very strange thing?

 Amos



 OK !! Here is the complicated and strange explanation:

 Where I work we have 128 Kbps for the use of almost 80 PCs; a few of
 them use download accelerators and saturate the channel. I began to
 use the ACL maxconn but I still have a few problems. 60 of the clients
 are under an ISA server that I don't administrate, so I can't limit
 the maxconn for them like the others. Now with this ACL, everyone can
 download but with only one connection. That's the strange main idea.

 what do you mean by only one connection?
 if they're under one ISA server then all of them share the same external IP.


Yes, all the users under the ISA server together can only download the same
file with one connection, no more, because, as you say, they have the same IP.


 --
 Eliezer Croitoru
 https://www1.ngtech.co.il
 IT consulting for Nonprofit organizations
 eliezer at ngtech.co.il


Re: [squid-users] limiting connections

2012-04-03 Thread H
Eliezer Croitoru wrote:
 On 03/04/2012 18:30, Carlos Manuel Trepeu Pupo wrote:
 On Mon, Apr 2, 2012 at 6:43 PM, Amos Jeffriessqu...@treenet.co.nz 
 wrote:
 On 03.04.2012 02:21, Carlos Manuel Trepeu Pupo wrote:

 Thanks a lot !! That's what I was missing, everything works
 fine now. So this script can be used since it already works.

 Now, I need to know if there is any way to consult the active requests
 in squid that works faster than squidclient ...


 ACL types are pretty easy to add to the Squid code. I'm happy to
 throw an
 ACL patch your way for a few $$.

 Which comes back to my earlier, still unanswered, question about why
 you want to do this very, very strange thing?

 Amos



 OK !! Here is the complicated and strange explanation:

 Where I work we have 128 Kbps for the use of almost 80 PCs; a few of
 them use download accelerators and saturate the channel. I began to
 use the ACL maxconn but I still have a few problems. 60 of the clients
 are under an ISA server that I don't administrate, so I can't limit
 the maxconn for them like the others. Now with this ACL, everyone can
 download but with only one connection. That's the strange main idea.
 what do you mean by only one connection?
 if they're under one ISA server then all of them share the same external IP.
 

Hi

I am following this thread with mixed feelings of weirdness and
admiration ...

there are always two ways to reach a far point, it's left around or
right around the world, depending on your position one of the ways is
always the longer one. I can understand that some without hurry and
money issues chose the longer one, perhaps also because of more chance
for adventurous happenings, unknown and the unexpected

so now I have explained in a similarly long way what I do not understand: why
would you make such complicated, out-of-scope code, slow, certainly
dangerous ... if at least it were perl, but bash calling an external
prog and grepping, wow ... when you can solve it with a line of code ?

this task would fit pf or ipfw much better, would be more elegant and
zillions times faster and secure, not speaking about time investment,
how much time you need to write 5/6 keywords of code?

or is it for demonstration purpose, showing it as an alternative
possibility?


-- 
H
+55 11 4249.





Re: [squid-users] limiting connections

2012-04-02 Thread Carlos Manuel Trepeu Pupo
Thanks a lot !! That's what I was missing, everything works
fine now. So this script can be used since it already works.

Now, I need to know if there is any way to consult the active requests
in squid that works faster than squidclient ...

On Sat, Mar 31, 2012 at 9:58 PM, Amos Jeffries squ...@treenet.co.nz wrote:
 On 1/04/2012 7:58 a.m., Carlos Manuel Trepeu Pupo wrote:

 On Sat, Mar 31, 2012 at 4:18 AM, Amos Jeffriessqu...@treenet.co.nz
  wrote:

 On 31/03/2012 3:07 a.m., Carlos Manuel Trepeu Pupo wrote:


 Now I have the following question:
 The possible values to return are 'OK' or 'ERR'; if I take them as a
 Boolean answer, OK -> TRUE, ERR -> FALSE. Is this right ?


 Equivalent, yes. Specifically it means success / failure or match /
 non-match on the ACL.


 So, if I deny my acl:
 http_access deny external_helper_acl

 it works like this (with the http_access below):
 If it returns OK -> it is denied
 If it returns ERR -> it is not denied

 Is this right ??? Thanks again for the help !!!


 Correct.

 OK, following the idea of this thread that's what I have:

 #!/bin/bash
 while read line; do
         # -> This is for debugging (testing, I saw that it does not always
         # save to the file; maybe requests do not always pass through this ACL)
         echo "$line" >> /home/carlos/guarda

         result=`squidclient -h 10.11.10.18 mgr:active_requests | grep -c "$line"`

         if [ "$result" -eq 1 ]
         then
                 echo 'OK'
                 echo 'OK' >> /home/carlos/guarda
         else
                 echo 'ERR'
                 echo 'ERR' >> /home/carlos/guarda
         fi
 done

 In the squid.conf this is the configuration:

 acl test src 10.11.10.12/32
 acl test src 10.11.10.11/32

 acl extensions url_regex /etc/squid3/extensions
 # extensions contains:

 \.(iso|avi|wav|mp3|mp4|mpeg|swf|flv|mpg|wma|ogg|wmv|asx|asf|deb|rpm|exe|zip|tar|tgz|rar|ppt|doc|tiff|pdf)$
 external_acl_type one_conn %URI /home/carlos/contain
 acl limit external one_conn

 http_access allow localhost
 http_access deny extensions !limit
 deny_info ERR_LIMIT limit
 http_access allow test


 I start to download from:
 10.11.10.12 ->
  http://ch.releases.ubuntu.com//oneiric/ubuntu-11.10-desktop-i386.iso
 then start from:
 10.11.10.11 ->
  http://ch.releases.ubuntu.com//oneiric/ubuntu-11.10-desktop-i386.iso

 And it lets me download. What am I missing ???


 You must set ttl=0 negative_ttl=0 grace=0 as options for your
 external_acl_type directive, to disable caching optimizations on the helper
 results.

 Amos


Re: [squid-users] limiting connections

2012-04-02 Thread Amos Jeffries

On 03.04.2012 02:21, Carlos Manuel Trepeu Pupo wrote:

Thanks a lot !! That's what I was missing, everything works
fine now. So this script can be used since it already works.

Now, I need to know if there is any way to consult the active requests
in squid that works faster than squidclient ...



ACL types are pretty easy to add to the Squid code. I'm happy to throw 
an ACL patch your way for a few $$.


Which comes back to my earlier, still unanswered, question about why you
want to do this very, very strange thing?


Amos



Re: [squid-users] limiting connections

2012-03-31 Thread Amos Jeffries

On 31/03/2012 3:07 a.m., Carlos Manuel Trepeu Pupo wrote:


Now I have the following question:
The possible values to return are 'OK' or 'ERR'; if I take them as a
Boolean answer, OK -> TRUE, ERR -> FALSE. Is this right ?


Equivalent, yes. Specifically it means success / failure or match / 
non-match on the ACL.



So, if I deny my acl:
http_access deny external_helper_acl

it works like this (with the http_access below):
If it returns OK -> it is denied
If it returns ERR -> it is not denied

Is this right ??? Thanks again for the help !!!


Correct.

Amos



Re: [squid-users] limiting connections

2012-03-31 Thread Carlos Manuel Trepeu Pupo
On Sat, Mar 31, 2012 at 4:18 AM, Amos Jeffries squ...@treenet.co.nz wrote:
 On 31/03/2012 3:07 a.m., Carlos Manuel Trepeu Pupo wrote:


 Now I have the following question:
 The possible values to return are 'OK' or 'ERR'; if I take them as a
 Boolean answer, OK -> TRUE, ERR -> FALSE. Is this right ?


 Equivalent, yes. Specifically it means success / failure or match /
 non-match on the ACL.


 So, if I deny my acl:
 http_access deny external_helper_acl

 it works like this (with the http_access below):
 If it returns OK -> it is denied
 If it returns ERR -> it is not denied

 Is this right ??? Thanks again for the help !!!


 Correct.

OK, following the idea of this thread that's what I have:

#!/bin/bash
while read line; do
        # -> This is for debugging (testing, I saw that it does not always
        # save to the file; maybe requests do not always pass through this ACL)
        echo "$line" >> /home/carlos/guarda

        result=`squidclient -h 10.11.10.18 mgr:active_requests | grep -c "$line"`

        if [ "$result" -eq 1 ]
        then
                echo 'OK'
                echo 'OK' >> /home/carlos/guarda
        else
                echo 'ERR'
                echo 'ERR' >> /home/carlos/guarda
        fi
done

In the squid.conf this is the configuration:

acl test src 10.11.10.12/32
acl test src 10.11.10.11/32

acl extensions url_regex /etc/squid3/extensions
# extensions contains:
\.(iso|avi|wav|mp3|mp4|mpeg|swf|flv|mpg|wma|ogg|wmv|asx|asf|deb|rpm|exe|zip|tar|tgz|rar|ppt|doc|tiff|pdf)$
external_acl_type one_conn %URI /home/carlos/contain
acl limit external one_conn

http_access allow localhost
http_access deny extensions !limit
deny_info ERR_LIMIT limit
http_access allow test


I start to download from:
10.11.10.12 ->
http://ch.releases.ubuntu.com//oneiric/ubuntu-11.10-desktop-i386.iso
then start from:
10.11.10.11 ->
http://ch.releases.ubuntu.com//oneiric/ubuntu-11.10-desktop-i386.iso

And it lets me download. What am I missing ???


# -

http_access deny all




 Amos



Re: [squid-users] limiting connections

2012-03-31 Thread Amos Jeffries

On 1/04/2012 7:58 a.m., Carlos Manuel Trepeu Pupo wrote:

On Sat, Mar 31, 2012 at 4:18 AM, Amos Jeffriessqu...@treenet.co.nz  wrote:

On 31/03/2012 3:07 a.m., Carlos Manuel Trepeu Pupo wrote:


Now I have the following question:
The possible values to return are 'OK' or 'ERR'; if I take them as a
Boolean answer, OK -> TRUE, ERR -> FALSE. Is this right ?


Equivalent, yes. Specifically it means success / failure or match /
non-match on the ACL.



So, if I deny my acl:
http_access deny external_helper_acl

it works like this (with the http_access below):
If it returns OK -> it is denied
If it returns ERR -> it is not denied

Is this right ??? Thanks again for the help !!!


Correct.

OK, following the idea of this thread that's what I have:

#!/bin/bash
while read line; do
        # -> This is for debugging (testing, I saw that it does not always
        # save to the file; maybe requests do not always pass through this ACL)
        echo "$line" >> /home/carlos/guarda

        result=`squidclient -h 10.11.10.18 mgr:active_requests | grep -c "$line"`

        if [ "$result" -eq 1 ]
        then
                echo 'OK'
                echo 'OK' >> /home/carlos/guarda
        else
                echo 'ERR'
                echo 'ERR' >> /home/carlos/guarda
        fi
done

In the squid.conf this is the configuration:

acl test src 10.11.10.12/32
acl test src 10.11.10.11/32

acl extensions url_regex /etc/squid3/extensions
# extensions contains:
\.(iso|avi|wav|mp3|mp4|mpeg|swf|flv|mpg|wma|ogg|wmv|asx|asf|deb|rpm|exe|zip|tar|tgz|rar|ppt|doc|tiff|pdf)$
external_acl_type one_conn %URI /home/carlos/contain
acl limit external one_conn

http_access allow localhost
http_access deny extensions !limit
deny_info ERR_LIMIT limit
http_access allow test


I start to download from:
10.11.10.12 ->
http://ch.releases.ubuntu.com//oneiric/ubuntu-11.10-desktop-i386.iso
then start from:
10.11.10.11 ->
http://ch.releases.ubuntu.com//oneiric/ubuntu-11.10-desktop-i386.iso

And it lets me download. What am I missing ???


You must set ttl=0 negative_ttl=0 grace=0 as options for your
external_acl_type directive, to disable caching optimizations on the
helper results.
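
That is, with the names from your config, something like:

external_acl_type one_conn ttl=0 negative_ttl=0 grace=0 %URI /home/carlos/contain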


Amos


Re: [squid-users] limiting connections

2012-03-30 Thread Carlos Manuel Trepeu Pupo
On Thu, Mar 29, 2012 at 4:03 PM, Eliezer Croitoru elie...@ngtech.co.il wrote:
 On 29/03/2012 21:05, Carlos Manuel Trepeu Pupo wrote:

 On Tue, Mar 27, 2012 at 1:23 PM, Eliezer Croitoruelie...@ngtech.co.il
  wrote:

 On 27/03/2012 17:27, Carlos Manuel Trepeu Pupo wrote:


 On Mon, Mar 26, 2012 at 5:45 PM, Amos Jeffriessqu...@treenet.co.nz
  wrote:


 On 27.03.2012 10:13, Carlos Manuel Trepeu Pupo wrote:



 On Sat, Mar 24, 2012 at 6:31 PM, Amos Jeffriessqu...@treenet.co.nz
 wrote:



 On 25/03/2012 7:23 a.m., Carlos Manuel Trepeu Pupo wrote:

 On Thu, Mar 22, 2012 at 10:00 PM, Amos Jeffries wrote:




 On 23/03/2012 5:42 a.m., Carlos Manuel Trepeu Pupo wrote:




 I need to block each user to just one connection when downloading
 specific extension files, but I don't know how to say that they can
 make just one connection to each file, and not just one connection to
 every file with this extension.

 i.e:
 www.google.com #All connection that required
 www.any.domain.com/my_file.rar #just one connection to that file
 www.other.domain.net/other_file.iso #just connection to this file
 www.other_domain1.com/other_file1.rar #just one connection to that
 file

 I hope you understand me and can help me, I have my boss hurrying
 me
 !!!





 There is no easy way to test this in Squid.

 You need an external_acl_type helper which gets given the URI and
 decides
 whether it is permitted or not. That decision can be made by
 querying
 Squid
 cache manager for the list of active_requests and seeing if the URL
 appears
 more than once.




 Hello Amos, following your instructions I make this
 external_acl_type
 helper:

 #!/bin/bash
 result=`squidclient -h 192.168.19.19 mgr:active_requests | grep -c "$1"`
 if [ "$result" -eq 0 ]
 then
 echo 'OK'
 else
 echo 'ERR'
 fi

 # If the same URI is already active then I deny it. I made a few tests
 and it works for me. The problem is when I add the rule to squid. I do
 this:

 acl extensions url_regex /etc/squid3/extensions
 external_acl_type one_conn %URI /home/carlos/script
 acl limit external one_conn

 # where extensions have:





 \.(iso|avi|wav|mp3|mp4|mpeg|swf|flv|mpg|wma|ogg|wmv|asx|asf|deb|rpm|exe|zip|tar|tgz|rar|ppt|doc|tiff|pdf)$

 http_access deny extensions limit


 So when I run squid3 -k reconfigure, squid stops working

 What can be happening ???





 * The helper needs to be running in a constant loop.
 You can find an example




 http://bazaar.launchpad.net/~squid/squid/3.2/view/head:/helpers/url_rewrite/fake/url_fake_rewrite.sh
 although that is a re-writer and you do need to keep the OK/ERR for an
 external ACL.




 Sorry, this is my first helper; I do not understand the meaning of
 running in a constant loop, in the example I see something like what I do.
 Making some tests I found that without this line:
 result=`squidclient -h 192.168.19.19 mgr:active_requests | grep -c "$1"`
 the helper does not crash (it does not work either, but it does not
 crash), so I consider this to be in some way the problem.





 Squid starts helpers then uses the STDIN channel to pass them a series of
 requests, reading the STDOUT channel for the results. The helper, once
 started, is expected to continue until an EOL/close/terminate signal is
 received on its STDIN.

 Your helper is exiting without being asked to by Squid after only one
 request. That is logged by Squid as a crash.




 * eq 0 -> there should always be 1 request matching the URL, which is
 the request you are testing, to see whether it is >1 or not. You want
 to deny for the case where there are *2* requests in existence.




 This is true, but the way I saw it was: if the URL does not exist, it
 can't be duplicated; I think that isn't wrong !!




 It can't not exist. Squid is already servicing the request you are
 testing
 about.

 Like this:

  receive HTTP request -> (count=1)
  -> test ACL (count=1 -> OK)
  -> done (count=0)

  receive a HTTP request (count=1)
  -> test ACL (count=1 -> OK)
  receive b HTTP request (count=2)
  -> test ACL (count=2 -> ERR)
  -> reject b (count=1)
  done a (count=0)



 With your explanation and code from Eliezer Croitoru I made this:

 #!/bin/bash

 while read line; do
        result=`squidclient -h 192.168.19.19 mgr:active_requests | grep -c "$line"`

        # -> Add this line to see in a file the $URI passed to the helper
        echo "$line" >> /home/carlos/guarda

        # -> With your great explanation, I changed the test to 1
        if [ "$result" -eq 1 ]
        then
        echo 'OK'
        else
        echo 'ERR'
        fi
 done

 It looks like it's going to work, but here is another miss.
 1- The echo "$line" >> /home/carlos/guarda does not save anything to the
 file.
 2- When I return 'OK', in my .conf I can't make a rule like the one I
 wrote before; I have to make something like this: http_access deny
 extensions !limit. From the many helps you bring me, guys, I learned that
 the name limit here is not functional. The deny of limit is
 because when there is just one connection I can't block 

Fwd: [squid-users] limiting connections

2012-03-29 Thread Carlos Manuel Trepeu Pupo
On Tue, Mar 27, 2012 at 1:23 PM, Eliezer Croitoru elie...@ngtech.co.il wrote:
 On 27/03/2012 17:27, Carlos Manuel Trepeu Pupo wrote:

 On Mon, Mar 26, 2012 at 5:45 PM, Amos Jeffriessqu...@treenet.co.nz
  wrote:

 On 27.03.2012 10:13, Carlos Manuel Trepeu Pupo wrote:


 On Sat, Mar 24, 2012 at 6:31 PM, Amos Jeffriessqu...@treenet.co.nz
 wrote:


 On 25/03/2012 7:23 a.m., Carlos Manuel Trepeu Pupo wrote:

 On Thu, Mar 22, 2012 at 10:00 PM, Amos Jeffries wrote:



 On 23/03/2012 5:42 a.m., Carlos Manuel Trepeu Pupo wrote:



 I need to block each user to just one connection when downloading
 specific extension files, but I don't know how to say that they can
 make just one connection to each file, and not just one connection to
 every file with this extension.

 i.e:
 www.google.com #All connection that required
 www.any.domain.com/my_file.rar #just one connection to that file
 www.other.domain.net/other_file.iso #just connection to this file
 www.other_domain1.com/other_file1.rar #just one connection to that
 file

 I hope you understand me and can help me, I have my boss hurrying me
 !!!




 There is no easy way to test this in Squid.

 You need an external_acl_type helper which gets given the URI and
 decides
 whether it is permitted or not. That decision can be made by querying
 Squid
 cache manager for the list of active_requests and seeing if the URL
 appears
 more than once.



 Hello Amos, following your instructions I make this external_acl_type
 helper:

 #!/bin/bash
 result=`squidclient -h 192.168.19.19 mgr:active_requests | grep -c "$1"`
 if [ "$result" -eq 0 ]
 then
 echo 'OK'
 else
 echo 'ERR'
 fi

 # If the same URI is already active then I deny it. I made a few tests
 and it works for me. The problem is when I add the rule to squid. I do
 this:

 acl extensions url_regex /etc/squid3/extensions
 external_acl_type one_conn %URI /home/carlos/script
 acl limit external one_conn

 # where extensions have:




 \.(iso|avi|wav|mp3|mp4|mpeg|swf|flv|mpg|wma|ogg|wmv|asx|asf|deb|rpm|exe|zip|tar|tgz|rar|ppt|doc|tiff|pdf)$

 http_access deny extensions limit


 So when I run squid3 -k reconfigure, squid stops working

 What can be happening ???




 * The helper needs to be running in a constant loop.
 You can find an example



 http://bazaar.launchpad.net/~squid/squid/3.2/view/head:/helpers/url_rewrite/fake/url_fake_rewrite.sh
 although that is a re-writer and you do need to keep the OK/ERR for an
 external ACL.



 Sorry, this is my first helper; I do not understand the meaning of
 running in a constant loop, in the example I see something like what I do.
 Making some tests I found that without this line:
 result=`squidclient -h 192.168.19.19 mgr:active_requests | grep -c "$1"`
 the helper does not crash (it does not work either, but it does not
 crash), so I consider this to be in some way the problem.




 Squid starts helpers then uses the STDIN channel to pass them a series of
 requests, reading the STDOUT channel for the results. The helper, once
 started, is expected to continue until an EOL/close/terminate signal is
 received on its STDIN.

 Your helper is exiting without being asked to by Squid after only one
 request. That is logged by Squid as a crash.




 * eq 0 -> there should always be 1 request matching the URL, which is
 the request you are testing, to see whether it is >1 or not. You want
 to deny for the case where there are *2* requests in existence.



 This is true, but the way I saw it was: if the URL does not exist, it
 can't be duplicated; I think that isn't wrong !!



 It can't not exist. Squid is already servicing the request you are
 testing
 about.

 Like this:

  receive HTTP request -> (count=1)
  -> test ACL (count=1 -> OK)
  -> done (count=0)

  receive a HTTP request (count=1)
  -> test ACL (count=1 -> OK)
  receive b HTTP request (count=2)
  -> test ACL (count=2 -> ERR)
  -> reject b (count=1)
  done a (count=0)


 With your explanation and code from Eliezer Croitoru I made this:

 #!/bin/bash

 while read line; do
        result=`squidclient -h 192.168.19.19 mgr:active_requests | grep -c "$line"`

        # -> Add this line to see in a file the $URI passed to the helper
        echo "$line" >> /home/carlos/guarda

        # -> With your great explanation, I changed the test to 1
        if [ "$result" -eq 1 ]
        then
        echo 'OK'
        else
        echo 'ERR'
        fi
 done

 It looks like it's going to work, but here is another miss.
 1- The echo "$line" >> /home/carlos/guarda does not save anything to the
 file.
 2- When I return 'OK', in my .conf I can't make a rule like the one I
 wrote before; I have to make something like this: http_access deny
 extensions !limit. From the many helps you bring me, guys, I learned that
 the name limit here is not functional. The deny of limit is
 because when there is just one connection I can't block the page.
 3- With the script just as Eliezer typed it, the page with the URL to
 download stays loading infinitely.

 So, I have less work; can you help me ??


 1. the first is 

Re: [squid-users] limiting connections

2012-03-29 Thread Carlos Manuel Trepeu Pupo
On Thu, Mar 29, 2012 at 4:03 PM, Eliezer Croitoru elie...@ngtech.co.il wrote:
 On 29/03/2012 21:05, Carlos Manuel Trepeu Pupo wrote:

 On Tue, Mar 27, 2012 at 1:23 PM, Eliezer Croitoruelie...@ngtech.co.il
  wrote:

 On 27/03/2012 17:27, Carlos Manuel Trepeu Pupo wrote:


 On Mon, Mar 26, 2012 at 5:45 PM, Amos Jeffriessqu...@treenet.co.nz
  wrote:


 On 27.03.2012 10:13, Carlos Manuel Trepeu Pupo wrote:



 On Sat, Mar 24, 2012 at 6:31 PM, Amos Jeffriessqu...@treenet.co.nz
 wrote:



 On 25/03/2012 7:23 a.m., Carlos Manuel Trepeu Pupo wrote:

 On Thu, Mar 22, 2012 at 10:00 PM, Amos Jeffries wrote:




 On 23/03/2012 5:42 a.m., Carlos Manuel Trepeu Pupo wrote:




 I need to block each user to just one connection when downloading
 specific extension files, but I don't know how to say that they can
 make just one connection to each file, and not just one connection to
 every file with this extension.

 i.e:
 www.google.com #All connection that required
 www.any.domain.com/my_file.rar #just one connection to that file
 www.other.domain.net/other_file.iso #just connection to this file
 www.other_domain1.com/other_file1.rar #just one connection to that
 file

 I hope you understand me and can help me, I have my boss hurrying
 me
 !!!





 There is no easy way to test this in Squid.

 You need an external_acl_type helper which gets given the URI and
 decides
 whether it is permitted or not. That decision can be made by
 querying
 Squid
 cache manager for the list of active_requests and seeing if the URL
 appears
 more than once.




 Hello Amos, following your instructions I make this
 external_acl_type
 helper:

 #!/bin/bash
 result=`squidclient -h 192.168.19.19 mgr:active_requests | grep -c "$1"`
 if [ "$result" -eq 0 ]
 then
 echo 'OK'
 else
 echo 'ERR'
 fi

 # If the same URI is already active then I deny it. I made a few tests
 and it works for me. The problem is when I add the rule to squid. I do
 this:

 acl extensions url_regex /etc/squid3/extensions
 external_acl_type one_conn %URI /home/carlos/script
 acl limit external one_conn

 # where extensions have:





 \.(iso|avi|wav|mp3|mp4|mpeg|swf|flv|mpg|wma|ogg|wmv|asx|asf|deb|rpm|exe|zip|tar|tgz|rar|ppt|doc|tiff|pdf)$

 http_access deny extensions limit


 So when I run squid3 -k reconfigure, squid stops working

 What can be happening ???





 * The helper needs to be running in a constant loop.
 You can find an example




 http://bazaar.launchpad.net/~squid/squid/3.2/view/head:/helpers/url_rewrite/fake/url_fake_rewrite.sh
 although that is a re-writer and you do need to keep the OK/ERR for an
 external ACL.




 Sorry, this is my first helper; I do not understand the meaning of
 running in a constant loop, in the example I see something like what I do.
 Making some tests I found that without this line:
 result=`squidclient -h 192.168.19.19 mgr:active_requests | grep -c "$1"`
 the helper does not crash (it does not work either, but it does not
 crash), so I consider this to be in some way the problem.





 Squid starts helpers then uses the STDIN channel to pass them a series of
 requests, reading the STDOUT channel for the results. The helper, once
 started, is expected to continue until an EOL/close/terminate signal is
 received on its STDIN.

 Your helper is exiting without being asked to by Squid after only one
 request. That is logged by Squid as a crash.




 * eq 0 -> there should always be 1 request matching the URL, which is
 the request you are testing, to see whether it is >1 or not. You want
 to deny for the case where there are *2* requests in existence.




 This is true, but the way I saw it was: if the URL does not exist, it
 can't be duplicated; I think that isn't wrong !!




 It can't not exist. Squid is already servicing the request you are
 testing
 about.

 Like this:

  receive HTTP request -> (count=1)
  -> test ACL (count=1 -> OK)
  -> done (count=0)

  receive a HTTP request (count=1)
  -> test ACL (count=1 -> OK)
  receive b HTTP request (count=2)
  -> test ACL (count=2 -> ERR)
  -> reject b (count=1)
  done a (count=0)



 With your explanation and code from Eliezer Croitoru I made this:

 #!/bin/bash

 while read line; do
        result=`squidclient -h 192.168.19.19 mgr:active_requests | grep -c "$line"`

        # -> Add this line to see in a file the $URI passed to the helper
        echo "$line" >> /home/carlos/guarda

        # -> With your great explanation, I changed the test to 1
        if [ "$result" -eq 1 ]
        then
        echo 'OK'
        else
        echo 'ERR'
        fi
 done

 It looks like it's going to work, but here is another miss.
 1- The echo "$line" >> /home/carlos/guarda does not save anything to the
 file.
 2- When I return 'OK', in my .conf I can't make a rule like the one I
 wrote before; I have to make something like this: http_access deny
 extensions !limit. From the many helps you bring me, guys, I learned that
 the name limit here is not functional. The deny of limit is
 because when there is just one connection I can't block 

Re: [squid-users] limiting connections

2012-03-29 Thread Carlos Manuel Trepeu Pupo
On Thu, Mar 29, 2012 at 4:03 PM, Eliezer Croitoru elie...@ngtech.co.il wrote:
 On 29/03/2012 21:05, Carlos Manuel Trepeu Pupo wrote:

 On Tue, Mar 27, 2012 at 1:23 PM, Eliezer Croitoruelie...@ngtech.co.il
  wrote:

 On 27/03/2012 17:27, Carlos Manuel Trepeu Pupo wrote:


 On Mon, Mar 26, 2012 at 5:45 PM, Amos Jeffriessqu...@treenet.co.nz
  wrote:


 On 27.03.2012 10:13, Carlos Manuel Trepeu Pupo wrote:



 On Sat, Mar 24, 2012 at 6:31 PM, Amos Jeffriessqu...@treenet.co.nz
 wrote:



 On 25/03/2012 7:23 a.m., Carlos Manuel Trepeu Pupo wrote:

 On Thu, Mar 22, 2012 at 10:00 PM, Amos Jeffries wrote:




 On 23/03/2012 5:42 a.m., Carlos Manuel Trepeu Pupo wrote:




 I need to block each user to make just one connection to download
 specific extension files, but I dont know how to tell that can
 make
 just one connection to each file and not just one connection to
 every
 file with this extension.

 i.e:
 www.google.com #All connection that required
 www.any.domain.com/my_file.rar #just one connection to that file
 www.other.domain.net/other_file.iso #just connection to this file
 www.other_domain1.com/other_file1.rar #just one connection to that
 file

 I hope you understand me and can help me, I have my boss hurrying
 me
 !!!





 There is no easy way to test this in Squid.

 You need an external_acl_type helper which gets given the URI and
 decides
 whether it is permitted or not. That decision can be made by
 querying
 Squid
 cache manager for the list of active_requests and seeing if the URL
 appears
 more than once.




 Hello Amos, following your instructions I make this
 external_acl_type
 helper:

 #!/bin/bash
 result=`squidclient -h 192.168.19.19 mgr:active_requests | grep -c
 $1`
 if [ $result -eq 0 ]
 then
 echo 'OK'
 else
 echo 'ERR'
 fi

 # If I find the same URI then I deny it. I made a few tests and it
 worked for me. The problem is when I add the rule to squid. I did
 this:

 acl extensions url_regex /etc/squid3/extensions
 external_acl_type one_conn %URI /home/carlos/script
 acl limit external one_conn

 # where extensions have:





 \.(iso|avi|wav|mp3|mp4|mpeg|swf|flv|mpg|wma|ogg|wmv|asx|asf|deb|rpm|exe|zip|tar|tgz|rar|ppt|doc|tiff|pdf)$

 http_access deny extensions limit


 So when I run squid3 -k reconfigure, squid stops working

 What could be happening???





 * The helper needs to be running in a constant loop.
 You can find an example




 http://bazaar.launchpad.net/~squid/squid/3.2/view/head:/helpers/url_rewrite/fake/url_fake_rewrite.sh
 although that is a re-writer, and you do need to keep the OK/ERR for
 an external ACL.




 Sorry, this is my first helper; I do not understand the meaning of
 running in a constant loop, and in the example I see something like
 what I do. Making some tests I found that without this line:
 result=`squidclient -h 192.168.19.19 mgr:active_requests | grep -c "$1"`
 the helper does not crash. It doesn't work either, but it does not
 crash, so I consider this line is in some way the problem.





 Squid starts helpers then uses the STDIN channel to pass them a series
 of requests, reading the STDOUT channel for the results. The helper,
 once started, is expected to continue until an EOL/close/terminate
 signal is received on its STDIN.

 Your helper is exiting without being asked to by Squid after only one
 request. That is logged by Squid as a crash.




 * eq 0 - there should always be 1 request matching the URL, which is
 the request you are testing to see if it's 1 or not. You want to deny
 the case where there are *2* requests in existence.




 This is true, but the way I saw it was: if the URL does not exist, it
 can't be duplicated. I don't think that was wrong !!




 It can't not exist. Squid is already servicing the request you are
 asking about.

 Like this:

  receive HTTP request -> (count=1)
   - test ACL (count=1 -> OK)
   - done (count=0)

  receive "a" HTTP request (count=1)
   - test ACL (count=1 -> OK)
  receive "b" HTTP request (count=2)
   - test ACL (count=2 -> ERR)
   - reject "b" (count=1)
  done "a" (count=0)



 With your explanation and code from Eliezer Croitoru I made this:

 #!/bin/bash

 while read line; do
        result=`squidclient -h 192.168.19.19 mgr:active_requests | grep -c "$line"`

        echo "$line" >> /home/carlos/guarda   # -> add this line to see in a file the $URI passed to the helper

        if [ $result -eq 1 ]   # -> with your great explanation, I changed this to 1
        then
        echo 'OK'
        else
        echo 'ERR'
        fi
 done

 It looks like it's going to work, but here is what is still missing.
 1- The echo "$line" >> /home/carlos/guarda does not save anything to
 the file.
 2- When I return 'OK' I can't write the rule in my .conf the way I
 wrote it before; I have to write something like: http_access deny
 extensions !limit. From all the help you gave me I learned that plain
 "limit" is not functional here. The deny of !limit is because when
 there is just one connection I can't block the page.

Re: [squid-users] limiting connections

2012-03-27 Thread Carlos Manuel Trepeu Pupo
On Mon, Mar 26, 2012 at 5:45 PM, Amos Jeffries squ...@treenet.co.nz wrote:
 On 27.03.2012 10:13, Carlos Manuel Trepeu Pupo wrote:

 On Sat, Mar 24, 2012 at 6:31 PM, Amos Jeffries squ...@treenet.co.nz
 wrote:

 On 25/03/2012 7:23 a.m., Carlos Manuel Trepeu Pupo wrote:

 On Thu, Mar 22, 2012 at 10:00 PM, Amos Jeffries wrote:


 On 23/03/2012 5:42 a.m., Carlos Manuel Trepeu Pupo wrote:


 I need to limit each user to just one connection when downloading
 files with specific extensions, but I don't know how to say "just one
 connection to each file" rather than "just one connection to every
 file with this extension".

 i.e:
 www.google.com #all the connections that are required
 www.any.domain.com/my_file.rar #just one connection to that file
 www.other.domain.net/other_file.iso #just one connection to this file
 www.other_domain1.com/other_file1.rar #just one connection to that
 file

 I hope you understand me and can help me, I have my boss hurrying me
 !!!



 There is no easy way to test this in Squid.

 You need an external_acl_type helper which gets given the URI and
 decides whether it is permitted or not. That decision can be made by
 querying Squid cache manager for the list of active_requests and
 seeing if the URL appears more than once.


 Hello Amos, following your instructions I made this external_acl_type
 helper:

 #!/bin/bash
 result=`squidclient -h 192.168.19.19 mgr:active_requests | grep -c "$1"`
 if [ $result -eq 0 ]
 then
 echo 'OK'
 else
 echo 'ERR'
 fi

 # If I find the same URI then I deny it. I made a few tests and it
 worked for me. The problem is when I add the rule to squid. I did this:

 acl extensions url_regex /etc/squid3/extensions
 external_acl_type one_conn %URI /home/carlos/script
 acl limit external one_conn

 # where extensions have:



 \.(iso|avi|wav|mp3|mp4|mpeg|swf|flv|mpg|wma|ogg|wmv|asx|asf|deb|rpm|exe|zip|tar|tgz|rar|ppt|doc|tiff|pdf)$

 http_access deny extensions limit


 So when I run squid3 -k reconfigure, squid stops working

 What could be happening???



 * The helper needs to be running in a constant loop.
 You can find an example


 http://bazaar.launchpad.net/~squid/squid/3.2/view/head:/helpers/url_rewrite/fake/url_fake_rewrite.sh
 although that is a re-writer, and you do need to keep the OK/ERR for
 an external ACL.


 Sorry, this is my first helper; I do not understand the meaning of
 running in a constant loop, and in the example I see something like
 what I do. Making some tests I found that without this line:
 result=`squidclient -h 192.168.19.19 mgr:active_requests | grep -c "$1"`
 the helper does not crash. It doesn't work either, but it does not
 crash, so I consider this line is in some way the problem.



 Squid starts helpers then uses the STDIN channel to pass them a series
 of requests, reading the STDOUT channel for the results. The helper,
 once started, is expected to continue until an EOL/close/terminate
 signal is received on its STDIN.

 Your helper is exiting without being asked to by Squid after only one
 request. That is logged by Squid as a crash.




 * eq 0 - there should always be 1 request matching the URL, which is
 the request you are testing to see if it's 1 or not. You want to deny
 the case where there are *2* requests in existence.


 This is true, but the way I saw it was: if the URL does not exist, it
 can't be duplicated. I don't think that was wrong !!


 It can't not exist. Squid is already servicing the request you are
 asking about.

 Like this:

  receive HTTP request -> (count=1)
   - test ACL (count=1 -> OK)
   - done (count=0)

  receive "a" HTTP request (count=1)
   - test ACL (count=1 -> OK)
  receive "b" HTTP request (count=2)
   - test ACL (count=2 -> ERR)
   - reject "b" (count=1)
  done "a" (count=0)

With your explanation and code from Eliezer Croitoru I made this:

#!/bin/bash

while read line; do
   result=`squidclient -h 192.168.19.19 mgr:active_requests | grep -c "$line"`

   echo "$line" >> /home/carlos/guarda   # -> add this line to see in a file the $URI passed to the helper

   if [ $result -eq 1 ]   # -> with your great explanation, I changed this to 1
   then
   echo 'OK'
   else
   echo 'ERR'
   fi
done

It looks like it's going to work, but here is what is still missing.
1- The echo "$line" >> /home/carlos/guarda does not save anything to the file.
2- When I return 'OK' I can't write the rule in my .conf the way I
wrote it before; I have to write something like: http_access deny
extensions !limit. From all the help you gave me I learned that plain
"limit" is not functional here. The deny of !limit is because when
there is just one connection I can't block the page.
3- With the script just as Eliezer typed it, the page with the URL to
download stays loading infinitely.

So, I have less work now. Can you help me??
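
For reference, a corrected sketch of the helper above, assuming the same
host and file paths used in this thread. The append redirect (">>") is
restored, the URL is quoted, and the trace file must be writable by the
user Squid runs the helper as (which may explain problem 1-):

#!/bin/bash
# Sketch only: same logic as above, with the shell issues fixed.
while read -r line; do
    echo "$line" >> /home/carlos/guarda    # debug trace of each URI received
    result=`squidclient -h 192.168.19.19 mgr:active_requests | grep -c "$line"`
    if [ "$result" -eq 1 ]; then
        echo 'OK'    # only this request is in flight
    else
        echo 'ERR'   # the same URL already has another active request
    fi
done

Wired up, as discussed above, with: http_access deny extensions !limit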







 * ensure you have manager requests from localhost not going through the
 ACL test.


 I was doing this wrong: localhost was going through the ACL, but
 I just changed it !!! The problem persists. What can I do???

Re: [squid-users] limiting connections

2012-03-27 Thread Eliezer Croitoru

On 27/03/2012 17:27, Carlos Manuel Trepeu Pupo wrote:

On Mon, Mar 26, 2012 at 5:45 PM, Amos Jeffries squ...@treenet.co.nz  wrote:

On 27.03.2012 10:13, Carlos Manuel Trepeu Pupo wrote:


On Sat, Mar 24, 2012 at 6:31 PM, Amos Jeffries squ...@treenet.co.nz
wrote:


On 25/03/2012 7:23 a.m., Carlos Manuel Trepeu Pupo wrote:


On Thu, Mar 22, 2012 at 10:00 PM, Amos Jeffries wrote:



On 23/03/2012 5:42 a.m., Carlos Manuel Trepeu Pupo wrote:



I need to limit each user to just one connection when downloading
files with specific extensions, but I don't know how to say "just one
connection to each file" rather than "just one connection to every
file with this extension".

i.e:
www.google.com #all the connections that are required
www.any.domain.com/my_file.rar #just one connection to that file
www.other.domain.net/other_file.iso #just one connection to this file
www.other_domain1.com/other_file1.rar #just one connection to that
file

I hope you understand me and can help me, I have my boss hurrying me
!!!




There is no easy way to test this in Squid.

You need an external_acl_type helper which gets given the URI and
decides whether it is permitted or not. That decision can be made by
querying Squid cache manager for the list of active_requests and
seeing if the URL appears more than once.



Hello Amos, following your instructions I made this external_acl_type
helper:

#!/bin/bash
result=`squidclient -h 192.168.19.19 mgr:active_requests | grep -c "$1"`
if [ $result -eq 0 ]
then
echo 'OK'
else
echo 'ERR'
fi

# If I find the same URI then I deny it. I made a few tests and it
worked for me. The problem is when I add the rule to squid. I did this:

acl extensions url_regex /etc/squid3/extensions
external_acl_type one_conn %URI /home/carlos/script
acl limit external one_conn

# where extensions have:



\.(iso|avi|wav|mp3|mp4|mpeg|swf|flv|mpg|wma|ogg|wmv|asx|asf|deb|rpm|exe|zip|tar|tgz|rar|ppt|doc|tiff|pdf)$

http_access deny extensions limit


So when I run squid3 -k reconfigure, squid stops working

What could be happening???




* The helper needs to be running in a constant loop.
You can find an example


http://bazaar.launchpad.net/~squid/squid/3.2/view/head:/helpers/url_rewrite/fake/url_fake_rewrite.sh
although that is a re-writer, and you do need to keep the OK/ERR for
an external ACL.



Sorry, this is my first helper; I do not understand the meaning of
running in a constant loop, and in the example I see something like
what I do. Making some tests I found that without this line:
result=`squidclient -h 192.168.19.19 mgr:active_requests | grep -c "$1"`
the helper does not crash. It doesn't work either, but it does not
crash, so I consider this line is in some way the problem.




Squid starts helpers then uses the STDIN channel to pass them a series
of requests, reading the STDOUT channel for the results. The helper,
once started, is expected to continue until an EOL/close/terminate
signal is received on its STDIN.

Your helper is exiting without being asked to by Squid after only one
request. That is logged by Squid as a crash.






* eq 0 - there should always be 1 request matching the URL, which is
the request you are testing to see if it's 1 or not. You want to deny
the case where there are *2* requests in existence.



This is true, but the way I saw it was: if the URL does not exist, it
can't be duplicated. I don't think that was wrong !!



It can't not exist. Squid is already servicing the request you are
asking about.

Like this:

  receive HTTP request -> (count=1)
   - test ACL (count=1 -> OK)
   - done (count=0)

  receive "a" HTTP request (count=1)
   - test ACL (count=1 -> OK)
  receive "b" HTTP request (count=2)
   - test ACL (count=2 -> ERR)
   - reject "b" (count=1)
  done "a" (count=0)


With your explanation and code from Eliezer Croitoru I made this:

#!/bin/bash

while read line; do
result=`squidclient -h 192.168.19.19 mgr:active_requests | grep -c "$line"`

echo "$line" >> /home/carlos/guarda   # -> add this line to see in a file the $URI passed to the helper

if [ $result -eq 1 ]   # -> with your great explanation, I changed this to 1
then
echo 'OK'
else
echo 'ERR'
fi
done

It looks like it's going to work, but here is what is still missing.
1- The echo "$line" >> /home/carlos/guarda does not save anything to the file.
2- When I return 'OK' I can't write the rule in my .conf the way I
wrote it before; I have to write something like: http_access deny
extensions !limit. From all the help you gave me I learned that plain
"limit" is not functional here. The deny of !limit is because when
there is just one connection I can't block the page.
3- With the script just as Eliezer typed it, the page with the URL to
download stays loading infinitely.

So, I have less work now. Can you help me??



1. The first is that squidclient -h 192.168.19.19 mgr:active_requests
can take a while in some cases.
The first time I tried to run the command it took a couple of minutes
for squid to send the ...

Re: [squid-users] limiting connections

2012-03-26 Thread Carlos Manuel Trepeu Pupo
On Sat, Mar 24, 2012 at 6:31 PM, Amos Jeffries squ...@treenet.co.nz wrote:
 On 25/03/2012 7:23 a.m., Carlos Manuel Trepeu Pupo wrote:

 On Thu, Mar 22, 2012 at 10:00 PM, Amos Jeffries wrote:

 On 23/03/2012 5:42 a.m., Carlos Manuel Trepeu Pupo wrote:

 I need to limit each user to just one connection when downloading
 files with specific extensions, but I don't know how to say "just one
 connection to each file" rather than "just one connection to every
 file with this extension".

 i.e:
 www.google.com #all the connections that are required
 www.any.domain.com/my_file.rar #just one connection to that file
 www.other.domain.net/other_file.iso #just one connection to this file
 www.other_domain1.com/other_file1.rar #just one connection to that file

 I hope you understand me and can help me, I have my boss hurrying me !!!


 There is no easy way to test this in Squid.

 You need an external_acl_type helper which gets given the URI and decides
 whether it is permitted or not. That decision can be made by querying
 Squid cache manager for the list of active_requests and seeing if the URL
 appears more than once.

 Hello Amos, following your instructions I made this external_acl_type
 helper:

 #!/bin/bash
 result=`squidclient -h 192.168.19.19 mgr:active_requests | grep -c "$1"`
 if [ $result -eq 0 ]
 then
 echo 'OK'
 else
 echo 'ERR'
 fi

 # If I find the same URI then I deny it. I made a few tests and it
 worked for me. The problem is when I add the rule to squid. I did this:

 acl extensions url_regex /etc/squid3/extensions
 external_acl_type one_conn %URI /home/carlos/script
 acl limit external one_conn

 # where extensions have:

 \.(iso|avi|wav|mp3|mp4|mpeg|swf|flv|mpg|wma|ogg|wmv|asx|asf|deb|rpm|exe|zip|tar|tgz|rar|ppt|doc|tiff|pdf)$

 http_access deny extensions limit


 So when I run squid3 -k reconfigure, squid stops working

 What could be happening???


 * The helper needs to be running in a constant loop.
 You can find an example
 http://bazaar.launchpad.net/~squid/squid/3.2/view/head:/helpers/url_rewrite/fake/url_fake_rewrite.sh
 although that is a re-writer, and you do need to keep the OK/ERR for an
 external ACL.

Sorry, this is my first helper; I do not understand the meaning of
running in a constant loop, and in the example I see something like
what I do. Making some tests I found that without this line:
result=`squidclient -h 192.168.19.19 mgr:active_requests | grep -c "$1"`
the helper does not crash. It doesn't work either, but it does not
crash, so I consider this line is in some way the problem.


 * eq 0 - there should always be 1 request matching the URL, which is
 the request you are testing to see if it's 1 or not. You want to deny
 the case where there are *2* requests in existence.

This is true, but the way I saw it was: if the URL does not exist, it
can't be duplicated. I don't think that was wrong !!


 * ensure you have manager requests from localhost not going through the
 ACL test.

I was doing this wrong: localhost was going through the ACL, but
I just changed it !!! The problem persists. What can I do???



 Amos



Re: [squid-users] limiting connections

2012-03-26 Thread Amos Jeffries

On 27.03.2012 10:13, Carlos Manuel Trepeu Pupo wrote:
On Sat, Mar 24, 2012 at 6:31 PM, Amos Jeffries squ...@treenet.co.nz 
wrote:

On 25/03/2012 7:23 a.m., Carlos Manuel Trepeu Pupo wrote:


On Thu, Mar 22, 2012 at 10:00 PM, Amos Jeffries wrote:


On 23/03/2012 5:42 a.m., Carlos Manuel Trepeu Pupo wrote:


I need to limit each user to just one connection when downloading
files with specific extensions, but I don't know how to say "just one
connection to each file" rather than "just one connection to every
file with this extension".

i.e:
www.google.com #all the connections that are required
www.any.domain.com/my_file.rar #just one connection to that file
www.other.domain.net/other_file.iso #just one connection to this file
www.other_domain1.com/other_file1.rar #just one connection to
that file


I hope you understand me and can help me, I have my boss hurrying 
me !!!



There is no easy way to test this in Squid.

You need an external_acl_type helper which gets given the URI and
decides whether it is permitted or not. That decision can be made by
querying Squid cache manager for the list of active_requests and
seeing if the URL appears more than once.


Hello Amos, following your instructions I made this external_acl_type
helper:

#!/bin/bash
result=`squidclient -h 192.168.19.19 mgr:active_requests | grep -c "$1"`

if [ $result -eq 0 ]
then
echo 'OK'
else
echo 'ERR'
fi

# If I find the same URI then I deny it. I made a few tests and it
worked for me. The problem is when I add the rule to squid. I did
this:


acl extensions url_regex /etc/squid3/extensions
external_acl_type one_conn %URI /home/carlos/script
acl limit external one_conn

# where extensions have:


\.(iso|avi|wav|mp3|mp4|mpeg|swf|flv|mpg|wma|ogg|wmv|asx|asf|deb|rpm|exe|zip|tar|tgz|rar|ppt|doc|tiff|pdf)$

http_access deny extensions limit


So when I run squid3 -k reconfigure, squid stops working

What could be happening???



* The helper needs to be running in a constant loop.
You can find an example

http://bazaar.launchpad.net/~squid/squid/3.2/view/head:/helpers/url_rewrite/fake/url_fake_rewrite.sh
although that is a re-writer, and you do need to keep the OK/ERR for
an external ACL.


Sorry, this is my first helper; I do not understand the meaning of
running in a constant loop, and in the example I see something like
what I do. Making some tests I found that without this line:
result=`squidclient -h 192.168.19.19 mgr:active_requests | grep -c "$1"`
the helper does not crash. It doesn't work either, but it does not
crash, so I consider this line is in some way the problem.



Squid starts helpers then uses the STDIN channel to pass them a series
of requests, reading the STDOUT channel for the results. The helper,
once started, is expected to continue until an EOL/close/terminate
signal is received on its STDIN.

Your helper is exiting without being asked to by Squid after only one
request. That is logged by Squid as a crash.
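
In other words, the expected shape is a long-lived read loop: one
request line in on STDIN, exactly one OK/ERR verdict line out on
STDOUT, exiting only when STDIN closes. A bare-bones sketch:

#!/bin/bash
# Skeleton of the loop Squid expects from an external ACL helper.
while read -r request; do
    echo 'OK'   # a real helper decides OK or ERR per request here
done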






* eq 0 - there should always be 1 request matching the URL, which is
the request you are testing to see if it's 1 or not. You want to deny
the case where there are *2* requests in existence.


This is true, but the way I saw it was: if the URL does not exist, it
can't be duplicated. I don't think that was wrong !!


It can't not exist. Squid is already servicing the request you are
asking about.


Like this:

 receive HTTP request -> (count=1)
  - test ACL (count=1 -> OK)
  - done (count=0)

 receive "a" HTTP request (count=1)
  - test ACL (count=1 -> OK)
 receive "b" HTTP request (count=2)
  - test ACL (count=2 -> ERR)
  - reject "b" (count=1)
 done "a" (count=0)
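
Expressed as a shell test, that counting implies this threshold (a
sketch; $result holds the grep -c count from the helper above):

# The request under test always matches once, so >1 means a duplicate.
if [ "$result" -gt 1 ]; then
    echo 'ERR'   # another request for the same URL is already active
else
    echo 'OK'    # only the request being tested matches
fi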





* ensure you have manager requests from localhost not going through
the ACL test.


I was doing this wrong: localhost was going through the ACL, but
I just changed it !!! The problem persists. What can I do???


which problem?


Amos


Re: [squid-users] limiting connections

2012-03-26 Thread Eliezer Croitoru

On 26/03/2012 23:13, Carlos Manuel Trepeu Pupo wrote:


#!/bin/bash
result=`squidclient -h 192.168.19.19 mgr:active_requests | grep -c "$1"`
if [ $result -eq 0 ]
then
echo 'OK'
else
echo 'ERR'
fi

the code should be something like this:

#!/bin/bash
while read line; do
result=`squidclient -h 192.168.19.19 mgr:active_requests | grep -c "$line"`
if [ $result -eq 0 ]
then
echo 'OK'
else
echo 'ERR'
fi
done

but as I was looking at mgr:active_requests I noticed that squid
responds very slowly, and it can take a while to get an answer from it
sometimes.
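
As a side note, a helper like this can be exercised by hand, outside
Squid, by piping request lines into it (the script path is assumed
from this thread):

# Hypothetical smoke test: feed one URL and expect OK or ERR on stdout.
printf 'http://www.any.domain.com/my_file.rar\n' | /home/carlos/script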


Regards,
Eliezer





# If I find the same URI then I deny it. I made a few tests and it
worked for me. The problem is when I add the rule to squid. I did this:

acl extensions url_regex /etc/squid3/extensions
external_acl_type one_conn %URI /home/carlos/script
acl limit external one_conn

# where extensions have:

\.(iso|avi|wav|mp3|mp4|mpeg|swf|flv|mpg|wma|ogg|wmv|asx|asf|deb|rpm|exe|zip|tar|tgz|rar|ppt|doc|tiff|pdf)$

http_access deny extensions limit


So when I run squid3 -k reconfigure, squid stops working

What could be happening???



* The helper needs to be running in a constant loop.
You can find an example
http://bazaar.launchpad.net/~squid/squid/3.2/view/head:/helpers/url_rewrite/fake/url_fake_rewrite.sh
although that is a re-writer, and you do need to keep the OK/ERR for an
external ACL.


Sorry, this is my first helper; I do not understand the meaning of
running in a constant loop, and in the example I see something like
what I do. Making some tests I found that without this line:
result=`squidclient -h 192.168.19.19 mgr:active_requests | grep -c "$1"`
the helper does not crash. It doesn't work either, but it does not
crash, so I consider this line is in some way the problem.



* eq 0 - there should always be 1 request matching the URL, which is
the request you are testing to see if it's 1 or not. You want to deny
the case where there are *2* requests in existence.


This is true, but the way I saw it was: if the URL does not exist, it
can't be duplicated. I don't think that was wrong !!



* ensure you have manager requests from localhost not going through the ACL
test.


I was doing this wrong: localhost was going through the ACL, but
I just changed it !!! The problem persists. What can I do???




Amos




--
Eliezer Croitoru
https://www1.ngtech.co.il
IT consulting for Nonprofit organizations
eliezer at ngtech.co.il


Re: [squid-users] limiting connections

2012-03-24 Thread Carlos Manuel Trepeu Pupo
On Thu, Mar 22, 2012 at 10:00 PM, Amos Jeffries squ...@treenet.co.nz wrote:
 On 23/03/2012 5:42 a.m., Carlos Manuel Trepeu Pupo wrote:

 I need to limit each user to just one connection when downloading
 files with specific extensions, but I don't know how to say "just one
 connection to each file" rather than "just one connection to every
 file with this extension".

 i.e:
 www.google.com #all the connections that are required
 www.any.domain.com/my_file.rar #just one connection to that file
 www.other.domain.net/other_file.iso #just one connection to this file
 www.other_domain1.com/other_file1.rar #just one connection to that file

 I hope you understand me and can help me, I have my boss hurrying me !!!


 There is no easy way to test this in Squid.

 You need an external_acl_type helper which gets given the URI and decides
 whether it is permitted or not. That decision can be made by querying Squid
 cache manager for the list of active_requests and seeing if the URL appears
 more than once.

Hello Amos, following your instructions I made this external_acl_type helper:

#!/bin/bash
result=`squidclient -h 192.168.19.19 mgr:active_requests | grep -c "$1"`
if [ $result -eq 0 ]
then
echo 'OK'
else
echo 'ERR'
fi

# If I find the same URI then I deny it. I made a few tests and it
worked for me. The problem is when I add the rule to squid. I did this:

acl extensions url_regex /etc/squid3/extensions
external_acl_type one_conn %URI /home/carlos/script
acl limit external one_conn

# where extensions have:
\.(iso|avi|wav|mp3|mp4|mpeg|swf|flv|mpg|wma|ogg|wmv|asx|asf|deb|rpm|exe|zip|tar|tgz|rar|ppt|doc|tiff|pdf)$

http_access deny extensions limit


So when I run squid3 -k reconfigure, squid stops working

What could be happening???

This is my squid log:
Mar 24 09:25:04 test squid[28075]: helperHandleRead: unexpected read
from one_conn #1, 3 bytes 'OK '
Mar 24 09:25:04 test squid[28075]: helperHandleRead: unexpected read
from one_conn #2, 3 bytes 'OK '
Mar 24 09:25:04 test squid[28075]: WARNING: one_conn #1 (FD 15) exited
Mar 24 09:25:04 test squid[28075]: WARNING: one_conn #2 (FD 16) exited
Mar 24 09:25:04 test squid[28075]: CACHEMGR: unknown@192.168.19.19
requesting 'active_requests'
Mar 24 09:25:04 test squid[28075]: helperHandleRead: unexpected read
from one_conn #3, 3 bytes 'OK '
Mar 24 09:25:04 test squid[28075]: WARNING: one_conn #3 (FD 24) exited
Mar 24 09:25:04 test squid[28075]: helperHandleRead: unexpected read
from one_conn #4, 4 bytes 'ERR '
Mar 24 09:25:04 test squid[28075]: WARNING: one_conn #4 (FD 27) exited
Mar 24 09:25:04 test squid[28075]: Too few one_conn processes are running
Mar 24 09:25:04 test squid[28075]: storeDirWriteCleanLogs: Starting...
Mar 24 09:25:04 test squid[28075]: WARNING: Closing open FD   12
Mar 24 09:25:04 test squid[28075]:   Finished.  Wrote 25613 entries.
Mar 24 09:25:04 test squid[28075]:   Took 0.00 seconds (7740404.96 entries/sec).
Mar 24 09:25:04 test squid[28075]: The one_conn helpers are crashing
too rapidly, need help!



 Amos



Re: [squid-users] limiting connections

2012-03-24 Thread Amos Jeffries

On 25/03/2012 7:23 a.m., Carlos Manuel Trepeu Pupo wrote:

On Thu, Mar 22, 2012 at 10:00 PM, Amos Jeffries wrote:

On 23/03/2012 5:42 a.m., Carlos Manuel Trepeu Pupo wrote:

I need to limit each user to just one connection when downloading
files with specific extensions, but I don't know how to say "just one
connection to each file" rather than "just one connection to every
file with this extension".

i.e:
www.google.com #all the connections that are required
www.any.domain.com/my_file.rar #just one connection to that file
www.other.domain.net/other_file.iso #just one connection to this file
www.other_domain1.com/other_file1.rar #just one connection to that file

I hope you understand me and can help me, I have my boss hurrying me !!!


There is no easy way to test this in Squid.

You need an external_acl_type helper which gets given the URI and decides
whether it is permitted or not. That decision can be made by querying Squid
cache manager for the list of active_requests and seeing if the URL appears
more than once.

Hello Amos, following your instructions I made this external_acl_type helper:

#!/bin/bash
result=`squidclient -h 192.168.19.19 mgr:active_requests | grep -c "$1"`
if [ $result -eq 0 ]
then
echo 'OK'
else
echo 'ERR'
fi

# If I find the same URI then I deny it. I made a few tests and it
worked for me. The problem is when I add the rule to squid. I did this:

acl extensions url_regex /etc/squid3/extensions
external_acl_type one_conn %URI /home/carlos/script
acl limit external one_conn

# where extensions have:
\.(iso|avi|wav|mp3|mp4|mpeg|swf|flv|mpg|wma|ogg|wmv|asx|asf|deb|rpm|exe|zip|tar|tgz|rar|ppt|doc|tiff|pdf)$

http_access deny extensions limit


So when I run squid3 -k reconfigure, squid stops working

What could be happening???


* The helper needs to be running in a constant loop.
You can find an example 
http://bazaar.launchpad.net/~squid/squid/3.2/view/head:/helpers/url_rewrite/fake/url_fake_rewrite.sh
although that is a re-writer, and you do need to keep the OK/ERR for
an external ACL.


* eq 0 - there should always be 1 request matching the URL, which is
the request you are testing to see if it's 1 or not. You want to deny
the case where there are *2* requests in existence.


* ensure you have manager requests from localhost not going through the
ACL test.



Amos



[squid-users] limiting connections

2012-03-22 Thread Carlos Manuel Trepeu Pupo
I need to limit each user to just one connection when downloading
files with specific extensions, but I don't know how to say "just one
connection to each file" rather than "just one connection to every
file with this extension".

i.e:
www.google.com #all the connections that are required
www.any.domain.com/my_file.rar #just one connection to that file
www.other.domain.net/other_file.iso #just one connection to this file
www.other_domain1.com/other_file1.rar #just one connection to that file

I hope you understand me and can help me, I have my boss hurrying me !!!


Re: [squid-users] limiting connections

2012-03-22 Thread Amos Jeffries

On 23/03/2012 5:42 a.m., Carlos Manuel Trepeu Pupo wrote:

I need to limit each user to just one connection when downloading
files with specific extensions, but I don't know how to say "just one
connection to each file" rather than "just one connection to every
file with this extension".

i.e:
www.google.com #all the connections that are required
www.any.domain.com/my_file.rar #just one connection to that file
www.other.domain.net/other_file.iso #just one connection to this file
www.other_domain1.com/other_file1.rar #just one connection to that file

I hope you understand me and can help me, I have my boss hurrying me !!!


There is no easy way to test this in Squid.

You need an external_acl_type helper which gets given the URI and 
decides whether it is permitted or not. That decision can be made by 
querying Squid cache manager for the list of active_requests and seeing 
if the URL appears more than once.
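
For example, the active request list can be inspected from a shell; a
sketch, assuming squidclient is installed and the manager interface is
reachable on localhost (the exact report format varies by Squid
version):

# List in-flight requests known to Squid; each entry includes a uri line.
squidclient -h 127.0.0.1 mgr:active_requests | grep '^uri'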


Amos



[squid-users] Limiting connections per user - not per IP

2010-03-21 Thread David Parks
I expect a lot of users from the same IP (NAT). Is there a way to limit
concurrent connections by authenticated user rather than just by IP?
(acl maxconn appears to do it only by IP.)
Thx,
David



Re: [squid-users] Limiting connections per user - not per IP

2010-03-21 Thread Amos Jeffries

David Parks wrote:

I expect a lot of users from the same IP (NAT). Is there a way to
limit concurrent connections by authenticated user rather than just
by IP? (acl maxconn appears to do it only by IP.) Thx, David



Not directly.

You can hack something up with a combo of max_user_ip and maxconn.
http://www.squid-cache.org/Doc/config/acl/
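
A sketch of that combo; the acl names and numeric limits here are
assumptions, not recommendations:

# Cap each login to one source IP, then cap connections per IP.
acl one_ip max_user_ip -s 1
acl toomany maxconn 10
http_access deny one_ip
http_access deny toomany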

Or you can roll your own custom logic into an external_acl_type helper.

Amos
--
Please be using
  Current Stable Squid 2.7.STABLE8 or 3.0.STABLE25
  Current Beta Squid 3.1.0.18