Re: [squid-users] Transparent proxy. router + dedicated server

2008-03-11 Thread Amos Jeffries

Rafal Ramocki wrote:

Amos Jeffries writes:

Rafal Ramocki wrote:

Amos Jeffries wrote:

Hello,

I have a problem with my Squid setup. For quite a long time I've been using
Squid 2.6 STABLE-17. I decided to switch to Squid 3.0, but there is a
problem.

My configuration is:

large network - nat router (linux) - router (hardware ATM) - internet
                      \                   /
                             squid

Most of the traffic is NAT'ed on the NAT router and forwarded to the border
hardware ATM router. HTTP traffic (port 80) is DNAT'ed to the machine with
Squid. That setup has worked fine until now, but after switching to 3.0 I
get the following error message:

ERROR
The requested URL could not be retrieved

While trying to retrieve the URL: /

The following error was encountered:

 * Invalid URL

Here are a few directives from my configuration file.

http_port 80 transparent
icp_port 0
htcp_port 0
tcp_outgoing_address X.X.X.X
dns_nameservers X.X.X.X


I have been working on this for quite a long time and have been googling,
but I have only found information about single-server setups. Even the
Squid FAQ covers only that configuration.

Please help ;)


What you have is in no way transparent, and as long as DNAT was used it
never has been.


That setup worked for me for about 4 years. Transparent for me means no
configuration in browsers.



Transparent interception is done with REDIRECT (netfilter) or TPROXY-2
when squid sits on the NAT box with the full NAT tables available to it.


It is not possible in my case. My network is 3000+ nodes and both machines
are under heavy load; I just can't place Squid, filtering and traffic
control on one single machine. I also don't want to place Squid behind the
router, as that setup is less fault-tolerant.


Using DNAT to another box isolates squid from the information it needs to
work transparently,


The funny thing is that Squid never needed that information ;)


but it can still be faked with a semi-open proxy
config.

 

You need the following for starters:

  # cope with standard web requests ...
  http_port 80 vhost
  # SECURE the access
  acl localnet src 10.0.0.0/8
  http_access deny !localnet

** alter the ACL to contain the IP ranges you are intercepting.


I've already tried a similar configuration, and I've now tried once more.
The result is:


ERROR
The requested URL could not be retrieved

While trying to retrieve the URL: http://www.debian.org/

The following error was encountered:

* Unable to forward this request at this time.

This request could not be forwarded to the origin server or to any 
parent caches. The most likely cause for this error is that:


* The cache administrator does not allow this cache to make 
direct connections to origin servers, and

* All configured parent caches are currently unreachable.

In cache.log I have:

2008/03/10 09:16:53| Failed to select source for 'http://www.debian.org/'
2008/03/10 09:16:53|   always_direct = 0
2008/03/10 09:16:53|   never_direct = 0
2008/03/10 09:16:53|   timedout = 0


I think that in this setup the cache_peer* directives are mandatory. But I
can't define the whole internet that way ;)


No, not mandatory. The semi-open-proxy config should be letting internal
requests out in your setup.





NP: broken non-HTTP-compliant client software will still get that error
page, but compliant software like most browsers will get through okay.


That is OK for me. I also want to ensure that only HTTP traffic is
transmitted on port 80, and not, for example, P2P.


Any ideas? Because I'm running out. :)


This second problem you are now hitting (Unable to forward) shows a 
problem making general web requests.


 - check that the squid box is able to make outbound port-80 requests.
   (particularly without looping back through itself!)


Yes it can. When I configure the proxy in a browser it works fine. I have a
similar test environment: in the same configuration, when I'm redirecting it
works fine, and when I configure the browser to use Squid it works fine. It
only fails when I'm DNAT'ing from the other machine.



 - check that the squid box can resolve the domain it's fetching


The Squid box can resolve DNS. Squid itself should resolve too; the
configuration is the same as in the Squid I'm currently using
(2.6 STABLE 17).




In which case it looks like either something else in the config is blocking
that request, or there is a transport-layer problem.


If you set:   debug_options ALL,5

What shows up in the preceding 150 or so cache.log lines before the 'Failed
to select source for' entry?
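
For example (a sketch, assuming the default log location
/var/log/squid/cache.log -- adjust to your cache_log setting):

  # add to squid.conf, then reload:
  #   debug_options ALL,5
  squid -k reconfigure

  # reproduce one failing request, then pull the context lines:
  grep -B 150 "Failed to select source" /var/log/squid/cache.log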



Amos
--
Please use Squid 2.6STABLE17+ or 3.0STABLE1+
There are serious security advisories out on all earlier releases.


Re: [squid-users] Transparent proxy. router + dedicated server

2008-03-10 Thread Amos Jeffries

Rafal Ramocki wrote:

Amos Jeffries wrote:

Hello,

I have a problem with my Squid setup. For quite a long time I've been using
Squid 2.6 STABLE-17. I decided to switch to Squid 3.0, but there is a
problem.

My configuration is:

large network - nat router (linux) - router (hardware ATM) - internet
                      \                   /
                             squid

Most of the traffic is NAT'ed on the NAT router and forwarded to the border
hardware ATM router. HTTP traffic (port 80) is DNAT'ed to the machine with
Squid. That setup has worked fine until now, but after switching to 3.0 I
get the following error message:

ERROR
The requested URL could not be retrieved

While trying to retrieve the URL: /

The following error was encountered:

 * Invalid URL

Here are a few directives from my configuration file.

http_port 80 transparent
icp_port 0
htcp_port 0
tcp_outgoing_address X.X.X.X
dns_nameservers X.X.X.X


I have been working on this for quite a long time and have been googling,
but I have only found information about single-server setups. Even the
Squid FAQ covers only that configuration.

Please help ;)


What you have is in no way transparent, and as long as DNAT was used it
never has been.


That setup worked for me for about 4 years. Transparent for me means no
configuration in browsers.



Transparent interception is done with REDIRECT (netfilter) or TPROXY-2
when squid sits on the NAT box with the full NAT tables available to it.


It is not possible in my case. My network is 3000+ nodes and both machines
are under heavy load; I just can't place Squid, filtering and traffic
control on one single machine. I also don't want to place Squid behind the
router, as that setup is less fault-tolerant.



Using DNAT to another box isolates squid from the information it needs to
work transparently, 


The funny thing is that Squid never needed that information ;)


but it can still be faked with a semi-open proxy
config.

 

You need the following for starters:

  # cope with standard web requests ...
  http_port 80 vhost
  # SECURE the access
  acl localnet src 10.0.0.0/8
  http_access deny !localnet

** alter the ACL to contain the IP ranges you are intercepting.


I've already tried a similar configuration, and I've now tried once more.
The result is:


ERROR
The requested URL could not be retrieved

While trying to retrieve the URL: http://www.debian.org/

The following error was encountered:

* Unable to forward this request at this time.

This request could not be forwarded to the origin server or to any 
parent caches. The most likely cause for this error is that:


* The cache administrator does not allow this cache to make direct 
connections to origin servers, and

* All configured parent caches are currently unreachable.

In cache.log I have:

2008/03/10 09:16:53| Failed to select source for 'http://www.debian.org/'
2008/03/10 09:16:53|   always_direct = 0
2008/03/10 09:16:53|   never_direct = 0
2008/03/10 09:16:53|   timedout = 0


I think that in this setup the cache_peer* directives are mandatory. But I
can't define the whole internet that way ;)


No, not mandatory. The semi-open-proxy config should be letting internal
requests out in your setup.





NP: broken non-HTTP-compliant client software will still get that error
page, but compliant software like most browsers will get through okay.


That is OK for me. I also want to ensure that only HTTP traffic is
transmitted on port 80, and not, for example, P2P.


Any ideas? Because I'm running out. :)


This second problem you are now hitting (Unable to forward) shows a 
problem making general web requests.


 - check that the squid box is able to make outbound port-80 requests.
   (particularly without looping back through itself!)

 - check that the squid box can resolve the domain it's fetching
   (a quick test of both checks is sketched below)
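
For example, from the squid box itself (a sketch only; www.debian.org is
just the example URL from your error page, and unset any http_proxy
environment variable first so the test goes direct):

  # outbound port-80 fetch straight from the squid box
  wget -S -O /dev/null http://www.debian.org/

  # name resolution using the resolver squid is configured with
  host www.debian.org

If either of those fails, or the fetch loops back through squid itself,
that is the problem to fix first.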


Amos
--
Please use Squid 2.6STABLE17+ or 3.0STABLE1+
There are serious security advisories out on all earlier releases.


Re: [squid-users] Transparent proxy. router + dedicated server

2008-03-10 Thread Rafal Ramocki

Amos Jeffries writes:

Rafal Ramocki wrote:

Amos Jeffries wrote:

Hello,

I have a problem with my Squid setup. For quite a long time I've been using
Squid 2.6 STABLE-17. I decided to switch to Squid 3.0, but there is a
problem.

My configuration is:

large network - nat router (linux) - router (hardware ATM) - internet
                      \                   /
                             squid

Most of the traffic is NAT'ed on the NAT router and forwarded to the border
hardware ATM router. HTTP traffic (port 80) is DNAT'ed to the machine with
Squid. That setup has worked fine until now, but after switching to 3.0 I
get the following error message:

ERROR
The requested URL could not be retrieved

While trying to retrieve the URL: /

The following error was encountered:

 * Invalid URL

Here are a few directives from my configuration file.

http_port 80 transparent
icp_port 0
htcp_port 0
tcp_outgoing_address X.X.X.X
dns_nameservers X.X.X.X


I have been working on this for quite a long time and have been googling,
but I have only found information about single-server setups. Even the
Squid FAQ covers only that configuration.

Please help ;)


What you have is in no way transparent, and as long as DNAT was used it
never has been.


That setup worked for me for about 4 years. Transparent for me means no
configuration in browsers.



Transparent interception is done with REDIRECT (netfilter) or TPROXY-2
when squid sits on the NAT box with the full NAT tables available to it.


It is not possible in my case. My network is 3000+ nodes and both machines
are under heavy load; I just can't place Squid, filtering and traffic
control on one single machine. I also don't want to place Squid behind the
router, as that setup is less fault-tolerant.


Using DNAT to another box isolates squid from the information it needs to
work transparently,


The funny thing is that Squid never needed that information ;)


but it can still be faked with a semi-open proxy
config.

 

You need the following for starters:

  # cope with standard web requests ...
  http_port 80 vhost
  # SECURE the access
  acl localnet src 10.0.0.0/8
  http_access deny !localnet

** alter the ACL to contain the IP ranges you are intercepting.


I've already tried a similar configuration, and I've now tried once more.
The result is:


ERROR
The requested URL could not be retrieved

While trying to retrieve the URL: http://www.debian.org/

The following error was encountered:

* Unable to forward this request at this time.

This request could not be forwarded to the origin server or to any 
parent caches. The most likely cause for this error is that:


* The cache administrator does not allow this cache to make direct 
connections to origin servers, and

* All configured parent caches are currently unreachable.

In cache.log I have:

2008/03/10 09:16:53| Failed to select source for 'http://www.debian.org/'
2008/03/10 09:16:53|   always_direct = 0
2008/03/10 09:16:53|   never_direct = 0
2008/03/10 09:16:53|   timedout = 0


I think that in this setup the cache_peer* directives are mandatory. But I
can't define the whole internet that way ;)


No, not mandatory. The semi-open-proxy config should be letting internal
requests out in your setup.





NP: broken non-HTTP-compliant client software will still get that error
page, but compliant software like most browsers will get through okay.


That is OK for me. I also want to ensure that only HTTP traffic is
transmitted on port 80, and not, for example, P2P.


Any ideas? Because I'm running out. :)


This second problem you are now hitting (Unable to forward) shows a 
problem making general web requests.


 - check that the squid box is able to make outbound port-80 requests.
   (particularly without looping back through itself!)


Yes it can. When I configure the proxy in a browser it works fine. I have a
similar test environment: in the same configuration, when I'm redirecting it
works fine, and when I configure the browser to use Squid it works fine. It
only fails when I'm DNAT'ing from the other machine.



 - check that the squid box can resolve the domain it's fetching


The Squid box can resolve DNS. Squid itself should resolve too; the
configuration is the same as in the Squid I'm currently using
(2.6 STABLE 17).


--
Rafal Ramocki




[squid-users] Transparent proxy. router + dedicated server

2008-03-09 Thread Rafal Ramocki

Hello,

I have a problem with my Squid setup. For quite a long time I've been using
Squid 2.6 STABLE-17. I decided to switch to Squid 3.0, but there is a
problem.


My configuration is:

large network - nat router (linux) - router (hardware ATM) - internet
                      \                   /
                             squid

Most of the traffic is NAT'ed on the NAT router and forwarded to the border
hardware ATM router. HTTP traffic (port 80) is DNAT'ed to the machine with
Squid. That setup has worked fine until now, but after switching to 3.0 I
get the following error message:


ERROR
The requested URL could not be retrieved

While trying to retrieve the URL: /

The following error was encountered:

* Invalid URL

Here are a few directives from my configuration file.

http_port 80 transparent
icp_port 0
htcp_port 0
tcp_outgoing_address X.X.X.X
dns_nameservers X.X.X.X
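
For completeness, the rule on the NAT router that does the redirection is
roughly of this form (a sketch only -- eth0 and 10.0.0.2 stand in for the
real LAN interface and the squid machine's address):

  # on the nat router: send clients' outbound port-80 traffic to the squid box
  iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 80 \
    -j DNAT --to-destination 10.0.0.2:80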


I have been working on this for quite a long time and have been googling,
but I have only found information about single-server setups. Even the
Squid FAQ covers only that configuration.


Please help ;)


--
Rafal Ramocki


Re: [squid-users] Transparent proxy. router + dedicated server

2008-03-09 Thread Amos Jeffries
 Hello,

 I have a problem with my Squid setup. For quite a long time I've been using
 Squid 2.6 STABLE-17. I decided to switch to Squid 3.0, but there is a
 problem.

 My configuration is:

 large network - nat router (linux) - router (hardware ATM) - internet
                       \                   /
                              squid

 Most of the traffic is NAT'ed on the NAT router and forwarded to the border
 hardware ATM router. HTTP traffic (port 80) is DNAT'ed to the machine with
 Squid. That setup has worked fine until now, but after switching to 3.0 I
 get the following error message:

 ERROR
 The requested URL could not be retrieved

 While trying to retrieve the URL: /

 The following error was encountered:

  * Invalid URL

 Here are a few directives from my configuration file.

 http_port 80 transparent
 icp_port 0
 htcp_port 0
 tcp_outgoing_address X.X.X.X
 dns_nameservers X.X.X.X


 I have been working on this for quite a long time and have been googling,
 but I have only found information about single-server setups. Even the
 Squid FAQ covers only that configuration.

 Please help ;)

What you have is in no way transparent, and as long as DNAT was used it
never has been.

Transparent interception is done with REDIRECT (netfilter) or TPROXY-2
when squid sits on the NAT box with the full NAT tables available to it.
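
On the NAT box that is roughly the following (a sketch only; eth0 and port
3128 are example values, and squid would then listen with "http_port 3128
transparent" rather than on port 80):

  # intercept clients' port-80 traffic into a squid running on this same box
  iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 80 \
    -j REDIRECT --to-ports 3128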

Using DNAT to another box isolates squid from the information it needs to
work transparently, but it can still be faked with a semi-open proxy
config.

You need the following for starters:

  # cope with standard web requests ...
  http_port 80 vhost
  # SECURE the access
  acl localnet src 10.0.0.0/8
  http_access deny !localnet

** alter the ACL to contain the IP ranges you are intercepting.
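
Put together with an explicit allow and a final catch-all deny (my usual
pattern rather than anything squid forces on you), that section looks
roughly like:

  http_port 80 vhost
  acl localnet src 10.0.0.0/8
  # only the intercepted client ranges may use this proxy
  http_access deny !localnet
  http_access allow localnet
  # refuse anything that falls through
  http_access deny all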

NP: broken non-HTTP-compliant client software will still get that error
page, but compliant software like most browsers will get through okay.

Amos