Fwd: [squid-users] TCP_Denied for when requesting IP as URL over SSL using squid proxy server.

2009-12-08 Thread kevin band
I didn't realise I'd sent this directly to Amos, I meant to reply to
the mailing list.


-- Forwarded message --
From: kevin band kdb...@gmail.com
Date: 2009/12/7
Subject: Re: [squid-users] TCP_Denied for when requesting IP as URL
over SSL using squid proxy server.
To: Amos Jeffries squ...@treenet.co.nz


Hi Amos,

Thanks for the reply, I'm happy to accept what you say, but is there
anything specific that tells you that it's the remote web-server
rather than the squid-proxy that's rejecting the connection?

Regarding dstdomain: yes, I am familiar with that, but it doesn't meet
our needs in this instance, because there are multiple Marks and
Spencer domains that we need to allow access to, and they seem to
create a new one every few weeks.
We've been asked to set up a rule that wild-cards anything for
marksandspencer. They have a wide variety of formats in their URLs,
e.g. www.marksandspencer.com, suppliers.marksandspencer.com,
suppliers.marksandspencercate.com, the regex rule was the best
compromise.
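As an aside, that unanchored regex is broader than it may look. A quick check in Python (whose re module treats these patterns much as Squid's regex ACLs do) shows it also matches hosts that are not Marks and Spencer domains at all; the hostnames below are made-up examples:

```python
import re

# The pattern from regex_marksandspencer.txt, case-insensitive as with -i.
pattern = re.compile(r".*marksandspencer.*com", re.IGNORECASE)

# Hosts it is meant to match:
assert pattern.search("www.marksandspencer.com") is not None
assert pattern.search("suppliers.marksandspencercate.com") is not None

# Being unanchored, it also matches hosts nobody intended to allow,
# e.g. a sub-domain planted under someone else's .com domain:
assert pattern.search("marksandspencer.example.com") is not None
assert pattern.search("fakemarksandspencer.unrelated-site.com") is not None
```

Anchoring the pattern narrows this somewhat, though listing each domain explicitly with dstdomain avoids the problem entirely.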

Thanks again.

Kevin.

2009/12/7 Amos Jeffries squ...@treenet.co.nz:
 kevin band wrote:

 Hi,

 I'm hoping somebody can help me here, because I'm at a loss about what
 to do next.

 Basically we have squid running as a proxy server to restrict access
 to just those sites which we've included in our ACLs.
 I have noticed recently that it isn't handling HTTPS requests properly
 if the URL contains an IP address instead of a domain name.

 The reason this is a particular problem is that although the users can
 connect to the page using the domain name, something within that
 domain is then forwarding requests to the same web-server using its IP
 address.
 I'm sure I have my ACLs set up correctly because squid will forward
 the request using either URL if I send the requests using HTTP.  It
 then times out on the web-server because it only allows https, but at
 least the request is being forwarded to the web-server rather than
 being denied in squid.

 The remote web server(s) is rejecting the connections. Probably because the
 SSL certificates require a domain name as part of their authentication
 validation.

 It's probably a broken client browser or maybe the website itself sending
 funky page URLs with the raw-IP inside. If you care you need to find out
 which and complain to whoever made the broken bits. Squid is just an
 innocent middleman here.


 Here's an extract from the logs that might explain it better :-

    158.41.4.44 - - [04/Dec/2009:15:56:47 +] GET
 http://stpaccess.marksandspencer.com/ HTTP/1.1 504 1024 TCP_MISS:NONE
    158.41.4.44 - - [04/Dec/2009:15:57:02 +] CONNECT
 stpaccess.marksandspencer.com:443 HTTP/1.0 200 7783 TCP_MISS:DIRECT
    158.41.4.44 - - [04/Dec/2009:16:01:53 +] GET
 http://63.130.82.113/Citrix/MetaFrameXP/default/login.asp HTTP/1.1
 504 1064 TCP_MISS:NONE
    158.41.4.44 - - [04/Dec/2009:16:03:13 +] CONNECT
 63.130.82.113:443 HTTP/1.0 403 980 TCP_DENIED:NONE


 And config extracts:

    acl SSL_ports port 443 563 444
    acl Safe_ports port 80 8002 23142 5481 5181 5281 5381 5481 5581
 5400 5500       # http
    acl Safe_ports port 23142       # OPEL project
    acl Safe_ports port 21          # ftp
    acl Safe_ports port 443 444 563 # https, snews

    acl CONNECT method CONNECT

    acl regex_ms dstdom_regex   -i
 /home/security/regex_marksandspencer.txt
    acl urlregex_mands url_regex -i
 /home/security/regex_marksandspencer_ip.txt
    acl mands_allowed_nets  src  /home/security/mands_allowed_nets.txt

    http_access allow manager localhost
    http_access deny manager
    http_access deny !Safe_ports
    http_access deny CONNECT !SSL_ports

    http_access allow regex_ms  mands_allowed_nets
    http_access allow urlregex_mands mands_allowed_nets
    http_access deny all

 There are actually a lot more ACLs than this, but these are the only
 ones I think are relevant.

 relevant extracts from files linked to ACLs:
  regex_marksandspencer.txt
      .*marksandspencer.*com

  regex_marksandspencer_ip.txt
      .*.63.130.82.113


 Thanks for any help.

 Kevin.

 Kevin, meet dstdomain:

  acl markandspencer dstdomain .marksandspencer.com 63.130.82.113
  http_access allow markandspencer mands_allowed_nets

 10x or more faster than regex. Matching marksandspencer.com, all sub-domains
 and the raw-IP address form.

 Amos
 --
 Please be using
  Current Stable Squid 2.7.STABLE7 or 3.0.STABLE20
  Current Beta Squid 3.1.0.15



Fwd: [squid-users] TCP_Denied for when requesting IP as URL over SSL using squid proxy server.

2009-12-08 Thread kevin band
Forwarded to mailing list


-- Forwarded message --
From: kevin band kdb...@gmail.com
Date: 2009/12/7
Subject: Re: [squid-users] TCP_Denied for when requesting IP as URL
over SSL using squid proxy server.
To: Amos Jeffries squ...@treenet.co.nz


 Taking a much closer look now I change my mind. It probably is Squid
 rejecting the requests...  I see the IP address regex is wrong, forcing only
 IPs from the range 163.*/8 to be permitted.

 Note on regex patterns in Squid:
  a prefix .* is assumed when the pattern does not start with ^
  a trailing .* is assumed when the pattern does not end with $

  regex_marksandspencer_ip.txt
     .*.63.130.82.113


Hmm, now I am confused and I'm not sure I understand your last comment
about only IPs in the range 163.*/8 being permitted.  I was fairly sure
that the regex rule was correct, because squid is allowing it through
to the web-server provided I submit the request via HTTP.  It only
gets the denied message when I try it via HTTPS.


Re: [squid-users] TCP_Denied for when requesting IP as URL over SSL using squid proxy server.

2009-12-08 Thread kevin band
I've managed to get this working, but I'm not happy because in doing
so I've created a different issue.

My problems all started a few weeks ago when I was trying to tighten
up the rules.
Basically we have two squid proxy servers which are supposed to
contain the same configuration.
I noticed that on one of the servers, there was an extra rule that
wasn't there on the other :-

http_access allow CONNECT SSL_ports

The net effect of this rule was that anyone requesting any URL via
https would be allowed through the squid proxy server regardless of
the settings in my whitelist files.

I've now put this rule back in place and can now get to the
63.130.82.113 address using https.
The problem is that I can now get to any URL via https, even though I
have rules in place which are supposed to only allow access to the
websites that I have put into my whitelist files.

Is this a bug in the way squid is handling the CONNECT method?

Any suggestions as to how I can tighten things up again but still
allow through the 63.130.82.113 request via https?  As I've said
before the rules work OK for http.
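For the record, this isn't a bug in Squid's CONNECT handling: http_access rules are evaluated top to bottom and the first matching rule wins, so a blanket `allow CONNECT SSL_ports` placed ahead of the whitelist rules lets every HTTPS request through before the whitelist is ever consulted. A sketch of an ordering that keeps HTTPS restricted to the whitelist, reusing the ACL names from the config above (and assuming the IP pattern in regex_marksandspencer_ip.txt actually matches the bare IP):

```
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports

# The whitelist rules also cover CONNECT requests: the destination
# host (or raw IP) of the CONNECT is matched against these ACLs.
http_access allow regex_ms  mands_allowed_nets
http_access allow urlregex_mands mands_allowed_nets

# No blanket "http_access allow CONNECT SSL_ports" -- anything not
# whitelisted is denied, for HTTP and HTTPS alike.
http_access deny all
```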


Re: [squid-users] TCP_Denied for when requesting IP as URL over SSL using squid proxy server.

2009-12-08 Thread Amos Jeffries

kevin band wrote:

I've managed to get this working, but I'm not happy because in doing
so I've created a different issue.

My problems all started a few weeks ago when I was trying to tighten
up the rules.
Basically we have two squid proxy servers which are supposed to
contain the same configuration.
I noticed that on one of the servers, there was an extra rule that
wasn't there on the other :-

http_access allow CONNECT SSL_ports

The net effect of this rule was that anyone requesting any URL via
https would be allowed through the squid proxy server regardless of
the settings in my whitelist files.

I've now put this rule back in place and I now can get to the
63.130.82.113 address using https.
The problem is that I can now get to any URL via https, even though I
have rules in place which are supposed to only allow access to the
websites that I have put into my whitelist files.

Is this a bug in the way squid is handling the CONNECT method?

Any suggestions as to how I can tighten things up again but still
allow through the 63.130.82.113 request via https?  As I've said
before the rules work OK for http.


The IP pattern you had was off:
  .*.63.130.82.113

Redux:

  ** regex assumes all patterns not beginning with ^ have an implicit 
.* prefix.

Therefore:  .*.63.130.82.113   ==  .63.130.82.113

 ** regex '.' means any character.

Therefore:  .63.130.82.113  == 
[a-zA-Z0-9\.]63[a-zA-Z0-9\.]130[a-zA-Z0-9\.]82[a-zA-Z0-9\.]113


 ** you have that pattern seeking IP addresses
Therefore:  .63.130.82.113  ==  [0-9\.]63\.130\.82\.113


IMO you need to write the regex as:   ^63\.130\.82\.113

I'm not sure why the raw-IP got through in regular requests. Possibly 
some other pattern or ACL matched and permitted it.
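The difference is easy to demonstrate outside Squid. The sketch below uses Python's re module, which treats these unanchored patterns the same way for this purpose:

```python
import re

broken = re.compile(r".*.63.130.82.113")   # pattern from the ACL file
fixed  = re.compile(r"^63\.130\.82\.113")  # anchored, dots escaped

# The broken pattern requires *some* character before "63", so the
# bare IP used in the CONNECT request never matches...
assert broken.search("63.130.82.113") is None
# ...while an unrelated address such as 163.130.82.113 does.
assert broken.search("163.130.82.113") is not None

# The anchored pattern matches exactly the intended address:
assert fixed.search("63.130.82.113") is not None
assert fixed.search("163.130.82.113") is None
```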


Amos
--
Please be using
  Current Stable Squid 2.7.STABLE7 or 3.0.STABLE20
  Current Beta Squid 3.1.0.15


Re: [squid-users] TCP_Denied for when requesting IP as URL over SSL using squid proxy server.

2009-12-08 Thread kevin band
No, the point is, when the rule :
http_access allow CONNECT SSL_ports
is in the configuration, ALL SSL requests are permitted, regardless of
any other restrictions.  HTTP is restricted correctly.

2009/12/8 Amos Jeffries squ...@treenet.co.nz:
 kevin band wrote:

 I've managed to get this working, but I'm not happy because in doing
 so I've created a different issue.

 My problems all started a few weeks ago when I was trying to tighten
 up the rules.
 Basically we have two squid proxy servers which are supposed to
 contain the same configuration.
 I noticed that on one of the servers, there was an extra rule that
 wasn't there on the other :-

    http_access allow CONNECT SSL_ports

 The net effect of this rule was that anyone requesting any URL via
 https would be allowed through the squid proxy server regardless of
 the settings in my whitelist files.

 I've now put this rule back in place and I now can get to the
 63.130.82.113 address using https.
 The problem is that I can now get to any URL via https, even though I
 have rules in place which are supposed to only allow access to the
 websites that I have put into my whitelist files.

 Is this a bug in the way squid is handling the CONNECT method?

 Any suggestions as to how I can tighten things up again but still
 allow through the 63.130.82.113 request via https?  As I've said
 before the rules work OK for http.

 The IP pattern you had was off:
  .*.63.130.82.113

 Redux:

  ** regex assumes all patterns not beginning with ^ have an implicit .*
 prefix.
 Therefore:  .*.63.130.82.113   ==  .63.130.82.113

  ** regex '.' means any character.

 Therefore:  .63.130.82.113  ==
 [a-zA-Z0-9\.]63[a-zA-Z0-9\.]130[a-zA-Z0-9\.]82[a-zA-Z0-9\.]113

  ** you have that pattern seeking IP addresses
 Therefore:  .63.130.82.113  ==  [0-9\.]63\.130\.82\.113


 IMO you need to write the regex as:   ^63\.130\.82\.113

 I'm not sure why the raw-IP got through in regular requests. Possibly some
 other pattern or ACL matched and permitted it.

 Amos
 --
 Please be using
  Current Stable Squid 2.7.STABLE7 or 3.0.STABLE20
  Current Beta Squid 3.1.0.15



Re: Fwd: [squid-users] TCP_Denied for when requesting IP as URL over SSL using squid proxy server.

2009-12-08 Thread Chris Robertson

kevin band wrote:

Hi Amos,

Thanks for the reply, I'm happy to accept what you say, but is there
anything specific that tells you that it's the remote web-server
rather than the squid-proxy that's rejecting the connection?

Regarding, dstdomain, yes I am familiar with that, but it doesn't meet
our needs in this instance, because there are multiple marks and
spencers domains that we need to allow access to, and they seem to
create a new one every few weeks.
We've been asked to setup a rule that wild-cards anything for
marksandspencer. They have a wide variety of formats in their URLs,
e.g. www.marksandspencer.com, suppliers.marksandspencer.com,
suppliers.marksandspencercate.com, the regex rule was the best
compromise.
  


For what it's worth, dstdomain has the capability of performing 
wildcard matches (by using a leading period).


acl marksandspencer dstdomain .marksandspencer.com 
.marksandspencercate.com 63.130.82.113


will match the domains you mentioned above, as well as the IP address.
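To make the leading-dot semantics concrete, here is a small Python sketch. This is not Squid's actual implementation, just an illustration of how dstdomain values are matched (the look-alike host at the end is a made-up example):

```python
# ".example.com" matches example.com itself and any sub-domain of it;
# a value without a leading dot is an exact match only.
def dstdomain_matches(acl_value: str, host: str) -> bool:
    host = host.lower()
    v = acl_value.lower()
    if v.startswith("."):
        return host == v[1:] or host.endswith(v)
    return host == v

acl = [".marksandspencer.com", ".marksandspencercate.com", "63.130.82.113"]

assert any(dstdomain_matches(v, "www.marksandspencer.com") for v in acl)
assert any(dstdomain_matches(v, "suppliers.marksandspencercate.com") for v in acl)
assert any(dstdomain_matches(v, "63.130.82.113") for v in acl)
# A look-alike domain does not match:
assert not any(dstdomain_matches(v, "marksandspencer.com.evil.net") for v in acl)
```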


Thanks again.

Kevin.
  


Chris



[squid-users] TCP_Denied for when requesting IP as URL over SSL using squid proxy server.

2009-12-07 Thread kevin band
Hi,

I'm hoping somebody can help me here, because I'm at a loss about what
to do next.

Basically we have squid running as a proxy server to restrict access
to just those sites which we've included in our ACLs.
I have noticed recently that it isn't handling HTTPS requests properly
if the URL contains an IP address instead of a domain name.

The reason this is a particular problem is that although the users can
connect to the page using the domain name, something within that
domain is then forwarding requests to the same web-server using its IP
address.
I'm sure I have my ACLs set up correctly because squid will forward
the request using either URL if I send the requests using HTTP.  It
then times out on the web-server because it only allows https, but at
least the request is being forwarded to the web-server rather than
being denied in squid.

Here's an extract from the logs that might explain it better :-

158.41.4.44 - - [04/Dec/2009:15:56:47 +] GET
http://stpaccess.marksandspencer.com/ HTTP/1.1 504 1024 TCP_MISS:NONE
158.41.4.44 - - [04/Dec/2009:15:57:02 +] CONNECT
stpaccess.marksandspencer.com:443 HTTP/1.0 200 7783 TCP_MISS:DIRECT
158.41.4.44 - - [04/Dec/2009:16:01:53 +] GET
http://63.130.82.113/Citrix/MetaFrameXP/default/login.asp HTTP/1.1
504 1064 TCP_MISS:NONE
158.41.4.44 - - [04/Dec/2009:16:03:13 +] CONNECT
63.130.82.113:443 HTTP/1.0 403 980 TCP_DENIED:NONE


And config extracts:

acl SSL_ports port 443 563 444
acl Safe_ports port 80 8002 23142 5481 5181 5281 5381 5481 5581
5400 5500   # http
acl Safe_ports port 23142   # OPEL project
acl Safe_ports port 21  # ftp
acl Safe_ports port 443 444 563 # https, snews

acl CONNECT method CONNECT

acl regex_ms dstdom_regex   -i /home/security/regex_marksandspencer.txt
acl urlregex_mands url_regex -i
/home/security/regex_marksandspencer_ip.txt
acl mands_allowed_nets  src  /home/security/mands_allowed_nets.txt

http_access allow manager localhost
http_access deny manager
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports

http_access allow regex_ms  mands_allowed_nets
http_access allow urlregex_mands mands_allowed_nets
http_access deny all

There are actually a lot more ACLs than this, but these are the only
ones I think are relevant.

relevant extracts from files linked to ACLs:
  regex_marksandspencer.txt
  .*marksandspencer.*com

  regex_marksandspencer_ip.txt
  .*.63.130.82.113


Thanks for any help.

Kevin.


Re: [squid-users] TCP_Denied for when requesting IP as URL over SSL using squid proxy server.

2009-12-07 Thread Amos Jeffries

kevin band wrote:

Hi,

I'm hoping somebody can help me here, because I'm at a loss about what
to do next.

Basically we have squid running as a proxy server to restrict access
to just those sites which we've included in our ACLs.
I have noticed recently that it isn't handling HTTPS requests properly
if the URL contains an IP address instead of a domain name.

The reason this is a particular problem is that although the users can
connect to the page using the domain name, something within that
domain is then forwarding requests to the same web-server using its IP
address.
I'm sure I have my ACLs set up correctly because squid will forward
the request using either URL if I send the requests using HTTP.  It
then times out on the web-server because it only allows https, but at
least the request is being forwarded to the web-server rather than
being denied in squid.


The remote web server(s) is rejecting the connections. Probably because 
the SSL certificates require a domain name as part of their 
authentication validation.


It's probably a broken client browser or maybe the website itself 
sending funky page URLs with the raw-IP inside. If you care you need to 
find out which and complain to whoever made the broken bits. Squid is 
just an innocent middleman here.




Here's an extract from the logs that might explain it better :-

158.41.4.44 - - [04/Dec/2009:15:56:47 +] GET
http://stpaccess.marksandspencer.com/ HTTP/1.1 504 1024 TCP_MISS:NONE
158.41.4.44 - - [04/Dec/2009:15:57:02 +] CONNECT
stpaccess.marksandspencer.com:443 HTTP/1.0 200 7783 TCP_MISS:DIRECT
158.41.4.44 - - [04/Dec/2009:16:01:53 +] GET
http://63.130.82.113/Citrix/MetaFrameXP/default/login.asp HTTP/1.1
504 1064 TCP_MISS:NONE
158.41.4.44 - - [04/Dec/2009:16:03:13 +] CONNECT
63.130.82.113:443 HTTP/1.0 403 980 TCP_DENIED:NONE


And config extracts:

acl SSL_ports port 443 563 444
acl Safe_ports port 80 8002 23142 5481 5181 5281 5381 5481 5581
5400 5500   # http
acl Safe_ports port 23142   # OPEL project
acl Safe_ports port 21  # ftp
acl Safe_ports port 443 444 563 # https, snews

acl CONNECT method CONNECT

acl regex_ms dstdom_regex   -i /home/security/regex_marksandspencer.txt
acl urlregex_mands url_regex -i
/home/security/regex_marksandspencer_ip.txt
acl mands_allowed_nets  src  /home/security/mands_allowed_nets.txt

http_access allow manager localhost
http_access deny manager
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports

http_access allow regex_ms  mands_allowed_nets
http_access allow urlregex_mands mands_allowed_nets
http_access deny all

There are actually a lot more ACLs than this, but these are the only
ones I think are relevant.

relevant extracts from files linked to ACLs:
  regex_marksandspencer.txt
  .*marksandspencer.*com

  regex_marksandspencer_ip.txt
  .*.63.130.82.113


Thanks for any help.

Kevin.


Kevin, meet dstdomain:

  acl markandspencer dstdomain .marksandspencer.com 63.130.82.113
  http_access allow markandspencer mands_allowed_nets

10x or more faster than regex. Matching marksandspencer.com, all 
sub-domains and the raw-IP address form.


Amos
--
Please be using
  Current Stable Squid 2.7.STABLE7 or 3.0.STABLE20
  Current Beta Squid 3.1.0.15