Re: [squid-users] How to limit duration of SSL connections
Right now I don't have access to Squid's logs, but I remember there wasn't enough usable information in them. I would block SSL connections to numeric IPs, but there are some websites (Hotmail among others) that clients connect to under the same pattern. I'll post to the list again when I have access.log on my PC.

Amos Jeffries wrote:
> Sounds a bit like Skype. What does access.log show for one of these
> connections? i.e. "CONNECT 1.2.3.4:443 HTTP/1.0"?
> http://wiki.squid-cache.org/KnowledgeBase/FilteringChat
>
> No, it takes as long as the client's web browser needs it to take. I've
> had sessions with my bank last in excess of an hour at times.
>
> Looks like you want client_lifetime, but take note of the WARNING...
> http://www.squid-cache.org/Versions/v3/3.0/cfgman/client_lifetime.html
>
> Amos
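A minimal squid.conf sketch of the numeric-IP blocking described above. The regex ACL is an assumption about what "CONNECT to a bare IP" traffic looks like in Squid, not a directive taken from this thread, and the whitelist line is a hypothetical placeholder for sites such as Hotmail that legitimately connect by IP:

```
# Sketch only: deny CONNECT requests whose destination is a bare
# dotted-quad IP address rather than a hostname.
acl numeric_host dstdom_regex ^[0-9]+\.[0-9]+\.[0-9]+\.[0-9]+$
acl CONNECT method CONNECT

# Sites like Hotmail that legitimately CONNECT by IP would need a
# whitelist ACL evaluated first (addresses not shown here):
# http_access allow CONNECT whitelisted_ips   # hypothetical ACL

http_access deny CONNECT numeric_host
```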
Re: [squid-users] How to limit duration of SSL connections
> Hi friends:
>
> I'm running Squid on a server running Debian Etch for a customer of
> mine. I'm using sarg to generate reports on each user behind Squid
> accessing the Internet. There are a lot of restrictions on websites not
> related to work, such as online music, web chats, MSN, Yahoo, hi5, and
> other sites good for wasting time. All of my rules block them perfectly,
> except that some users somehow connect to "random" IP addresses on port
> 443 using (I assume) SSL tunnels.
>
> Those connections are too long: they last 1 minute, 1 hour, even 5 or 8
> hours, as I see in my sarg reports.
>
> I was working with a bash script that parses access.log and detects
> those IP addresses so I can block them later, but the same users always
> find different IP addresses to "bypass" Squid. I believe they're using
> some kind of tunneling software like hopster, ultrasurfer, freegate, or
> who knows what!

Sounds a bit like Skype. What does access.log show for one of these connections? i.e. "CONNECT 1.2.3.4:443 HTTP/1.0"?

http://wiki.squid-cache.org/KnowledgeBase/FilteringChat

> I'm not allowing any traffic to pass through my firewall; users can only
> reach the Internet through Squid.
>
> Is there a way to detect this kind of tunneling software? I was thinking
> of limiting the duration of an SSL connection, since a normal SSL
> request over HTTPS takes just a few seconds, right?

No, it takes as long as the client's web browser needs it to take. I've had sessions with my bank last in excess of an hour at times.

> Is Squid able to limit how long an SSL connection can be?

Looks like you want client_lifetime, but take note of the WARNING...
http://www.squid-cache.org/Versions/v3/3.0/cfgman/client_lifetime.html

> Thanks,... and sorry .. My English isn't good

Amos
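For reference, client_lifetime is set in squid.conf as below. This is a sketch: the 2-hour value is only an illustrative example, and the WARNING in the linked documentation about cutting off legitimate long-lived connections applies:

```
# squid.conf -- maximum lifetime of any client connection.
# The Squid default is 1 day; "2 hours" below is only an example.
# WARNING (per the Squid docs): lowering this can terminate
# legitimate long-lived sessions, such as large downloads or the
# hour-long banking sessions mentioned above.
client_lifetime 2 hours
```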
[squid-users] How to limit duration of SSL connections
Hi friends:

I'm running Squid on a server running Debian Etch for a customer of mine. I'm using sarg to generate reports on each user behind Squid accessing the Internet. There are a lot of restrictions on websites not related to work, such as online music, web chats, MSN, Yahoo, hi5, and other sites good for wasting time. All of my rules block them perfectly, except that some users somehow connect to "random" IP addresses on port 443 using (I assume) SSL tunnels.

Those connections are too long: they last 1 minute, 1 hour, even 5 or 8 hours, as I see in my sarg reports.

I was working with a bash script that parses access.log and detects those IP addresses so I can block them later, but the same users always find different IP addresses to "bypass" Squid. I believe they're using some kind of tunneling software like hopster, ultrasurfer, freegate, or who knows what!

I'm not allowing any traffic to pass through my firewall; users can only reach the Internet through Squid.

Is there a way to detect this kind of tunneling software? I was thinking of limiting the duration of an SSL connection, since a normal SSL request over HTTPS takes just a few seconds, right? Is Squid able to limit how long an SSL connection can be?

Thanks,... and sorry .. My English isn't good
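The access.log-parsing script mentioned above isn't shown in the thread; a minimal sketch of the idea, assuming Squid's default native log format (field 2 = elapsed milliseconds, field 3 = client IP, field 6 = request method, field 7 = destination host:port) and an illustrative threshold, might look like:

```shell
#!/bin/sh
# Sketch: list long-lived CONNECT tunnels from a Squid access.log.
# Assumes the default "native" log format; the threshold and log
# path are illustrative, not values taken from this thread.
THRESHOLD_MS=600000                      # 10 minutes, in milliseconds
LOG="${1:-/var/log/squid/access.log}"

# Print elapsed time, client IP, and destination for every CONNECT
# request that lasted longer than the threshold.
awk -v t="$THRESHOLD_MS" \
    '$6 == "CONNECT" && $2 > t { print $2, $3, $7 }' "$LOG"
```

The printed host:port values could then be fed into a deny ACL, though as described above the users tend to find new IP addresses faster than they can be blocked.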