Re: [squid-users] Compiling last 3.3.0.2 or 3.2.8

2013-03-03 Thread Jorge Bastos

Hi Andy,

Ya, perfect. The error wasn't explicit (for me), but yes, I was using 
headers for Berkeley DB 3.x; I updated to the 5.3 headers and now it's perfect!



Thanks,

On 2013-03-03 21:37, Andrew Beverley wrote:


On Sun, 2013-03-03 at 14:38 +0000, Jorge Bastos wrote:

Howdy, When trying to compile the latest 3.3.0.2 I get the information 
below; I also tried other versions like 3.2.8 with the same problem. Is 
this a library that needs to be updated?


I suspect that you are using an out of date and/or wrong version of
Berkeley DB. Any idea what version you are using? Compatibility for
V1.85 was removed a while ago (see commit 11806), but any version from
the last few years should work.

Andy


[squid-users] Compiling last 3.3.0.2 or 3.2.8

2013-03-03 Thread Jorge Bastos

Howdy,

When trying to compile the latest 3.3.0.2 I get the information below; I 
also tried other versions like 3.2.8 with the same problem.

Is this a library that needs to be updated?

Thanks in advance,
Jorge,

---
make[3]: Entering directory 
`/usr/local/src/squid/squid-3.3.2-20130303-r12509/helpers/external_acl/session'
g++ -DHAVE_CONFIG_H  -I../../.. -I../../../include -I../../../lib 
-I../../../src -I../../../include  -Wall -Wpointer-arith 
-Wwrite-strings -Wcomments -Werror -pipe -D_REENTRANT -g -O2 -std=c++0x 
-MT ext_session_acl.o -MD -MP -MF .deps/ext_session_acl.Tpo -c -o 
ext_session_acl.o ext_session_acl.cc

ext_session_acl.cc: In function 'void init_db()':
ext_session_acl.cc:80:74: error: invalid conversion from 'int' to 
'DBTYPE' [-fpermissive]

ext_session_acl.cc:80:74: error: too many arguments to function
ext_session_acl.cc:88:72: error: invalid conversion from 'int' to 
'DBTYPE' [-fpermissive]

ext_session_acl.cc:88:72: error: too many arguments to function
make[3]: *** [ext_session_acl.o] Error 1
make[3]: Leaving directory 
`/usr/local/src/squid/squid-3.3.2-20130303-r12509/helpers/external_acl/session'

make[2]: *** [all-recursive] Error 1
make[2]: Leaving directory 
`/usr/local/src/squid/squid-3.3.2-20130303-r12509/helpers/external_acl'

make[1]: *** [all-recursive] Error 1
make[1]: Leaving directory 
`/usr/local/src/squid/squid-3.3.2-20130303-r12509/helpers'

make: *** [all-recursive] Error 1



RE: [squid-users] Cannot access an website, www.asus.com

2012-01-21 Thread Jorge Bastos
Hi Amos,


> Start with figuring out which server is failing. Your Squid access.log
> entry for the request should contain the IP address of the server which
> was contacted to fetch the response.
> 
> Read Error means the connection got setup at the TCP level but then no
> data packets for that connection got through. Usually PMTU black hole
> problems, Window Scaling problems, ECN problems. TCP and ICMP level
> things like that breaking the packet management.

Yap, updated the kernel (which I needed to do for other reasons anyway) and it's working now!
Thanks for the explanation!

Jorge,



[squid-users] Cannot access an website, www.asus.com

2012-01-20 Thread Jorge Bastos
Howdy mates,
I have a problem with a website. Not that I really like ASUS... in fact their
hardware makes me sick, but I found this problem when a client gave me a PC
with an ASUS motherboard to fix.

When I try to access I have:

---
ERROR
The requested URL could not be retrieved

The following error was encountered while trying to retrieve the URL:
http://www.asus.com/

Read Error

The system returned: (104) Connection reset by peer

An error condition occurred while reading data from the network. Please
retry your request.

Your cache administrator is webmaster.

Generated Fri, 20 Jan 2012 17:50:55 GMT by localhost (squid/3.1.18)
---

Squid is acting as a transparent proxy: http_port 192.168.1.1:8080 intercept

How could I debug this?

Thanks in advance,
Jorge Bastos,
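One way to start debugging from the Squid side is to raise log verbosity for the relevant debug sections in squid.conf and watch cache.log while reproducing the failure. This is only a sketch; the section numbers (11 = HTTP, 17 = request forwarding) follow the conventional Squid debug-section numbering:

```
# Temporarily raise cache.log detail for HTTP and forwarding,
# keep everything else at the default level.
debug_options ALL,1 11,3 17,3
```

Remember to revert it afterwards; level 3 logging is noisy on a busy proxy.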



Re: [squid-users] Add top information to all webpages (like godaddy AD)

2011-10-11 Thread Jorge Bastos

Hi all,

Thanks for your answers.

Well, I have no C/C++ skills :(

The Apache footer module would be a good solution, but in this case I can't
use it in combination with squid.

I thought there might be some easy way of doing this...

Well, I'll keep searching to see if I can find anything.

If anyone knows something, please tell me.
Thanks in advance,

Jorge,


On 11.10.2011 17:23, Luis Daniel Lucio Quiroz wrote:


2011/10/11 Hasanen AL-Bana :

you don't want to create an iframe for every request; some requests will
be queries made by AJAX and you don't want to touch those... it is better
to add it only to the home page. On Tue, Oct 11, 2011 at 2:43 PM, Ed W
wrote:


So you could be smarter and instead inject some javascript which
checks if you are in a frameset and if not creates one. This of
course has some subtleties with ajax... Ed W On 11/10/2011 12:28,
Hasanen AL-Bana wrote:


I believe yes! But it will cause lots of trouble with pages like
facebook & gmail. You can redirect all requests to a url_rewriter
script: squid will pass the requested URL to the script, and the
script must generate a page with two iframes; the first iframe will
hold the ad, and the second iframe goes below the first one and will
contain the originally requested page. But think of the problems you
will face, because squid will add that to every request, which will
break the whole page; hence the script must be smart enough to
process only root pages like index.php, index.html. On Tue, Oct
11, 2011 at 2:20 PM, Jorge Bastos wrote:


Howdy, I'd like to do something that I don't know if it's possible
somehow. I have squid configured as transparent, and I'd like to add
information on top of every page the user visits, like an ad. Is this
possible? For example, GoDaddy has this on the free hosting they
provide. Thanks in advance, Jorge Bastos,


This is a tricky one.

If you have C/C++ skills you could program a c-icap module.

There is an easier way, but it isn't Squid's: I remember Apache has a
footer-page module that does exactly what you want; you can use it in
combination with Squid to add your code.

LD

http://www.twitter.com/ldlq




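Hasanen's url_rewriter idea above can be sketched in a few lines. This is only an illustration under assumptions: the ad-wrapper URL is hypothetical, and the `302:` answer format follows the classic url_rewrite_program redirect convention; a real helper would loop over stdin, answering Squid one line per request.

```python
AD_PAGE = "http://ads.example.local/frame.php"  # hypothetical two-iframe wrapper page

def rewrite(line: str) -> str:
    """Answer one url_rewrite_program request line from Squid.

    Only top-level page loads ("/", "/index.html", "/index.php") get
    redirected to the wrapper, so AJAX calls and embedded resources
    pass through untouched, as the thread recommends.
    """
    url = line.split()[0]  # first token of the helper line is the URL
    path = url.split("/", 3)[3] if url.count("/") >= 3 else ""
    if path in ("", "index.html", "index.php"):
        return "302:%s?orig=%s" % (AD_PAGE, url)
    return ""  # empty answer: leave the URL unchanged
```

A production helper would wrap this in a `for line in sys.stdin:` loop and flush stdout after every answer.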


[squid-users] Add top information to all webpages (like godaddy AD)

2011-10-11 Thread Jorge Bastos

Howdy,

I'd like to do something that I don't know if it's possible somehow.
I have squid configured as transparent, and I'd like to add information 
on top of every page the user visits, like an ad.


Is this possible?
For example Godaddy has this on the free hosting they provide.

Thanks in advance,
Jorge Bastos,


[squid-users] Limit an ACL to allow only 30m each day

2011-04-21 Thread Jorge Bastos
Howdy,

I've been checking the time condition, but I didn't find what I need,
and I also don't know if it's possible.

I'd like to set up an ACL to block facebook.com, allowing a budget of 300
seconds per day, on working days.
I saw that with the time condition I can restrict it to working days
using the "D" abbreviation, as well as the time range 09:00-18:00.

Is there a way, combining those two options, to block the user once he
has accessed facebook.com for more than 300 seconds in a day,
displaying a configurable page or the standard squid error page?

Thanks,
Jorge,
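For reference, the schedule half of this can be written with the stock `time` ACL; this sketch uses "D" (weekdays) as mentioned above. The 300-seconds-per-day counter itself is beyond the built-in ACLs and would need an external_acl_type helper that tracks per-client usage:

```
# Sketch only: blocks outright during working hours.
# A per-day usage quota would need an external ACL helper.
acl facebook dstdomain .facebook.com
acl workhours time D 09:00-18:00
http_access deny facebook workhours
deny_info http://your.server/quota.html facebook
```

The `deny_info` line (with an assumed URL) is how the "configurable page" part would be attached to the ACL.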



[squid-users] Cannot view a specific webpage, www.rpt.pt

2010-03-27 Thread Jorge Bastos
Howdy people,

I cannot view a webpage via squid. I think (but I'm not sure) that it
doesn't use any port other than the HTTP default.
Can you check the possible reason? Is it a squid problem, or does the page
use something other than HTTP?

The link is:

http://ww1.rtp.pt/blogs/programas/jornaldatarde/?2-parte-do-Jornal-da-Tarde-de-2010-03-24.rtp&post=7307


Thanks,
Jorge,



[squid-users] Exiting due to failures

2009-08-11 Thread Jorge Bastos
Howdy,
For the first time, something like this has happened to me.
What can these failures be? I mean, what type of failures?

Jorge,

--
Aug 11 12:57:01 cisne squid[28669]: Squid Parent: child process 9216 exited
due to signal 6
Aug 11 12:57:04 cisne squid[28669]: Squid Parent: child process 9448 started
Aug 11 12:57:06 cisne squid[28669]: Squid Parent: child process 9448 exited
due to signal 6
Aug 11 12:57:09 cisne squid[28669]: Squid Parent: child process 9451 started
Aug 11 12:57:12 cisne squid[28669]: Squid Parent: child process 9451 exited
due to signal 6
Aug 11 12:57:15 cisne squid[28669]: Squid Parent: child process 9454 started
Aug 11 12:57:20 cisne squid[28669]: Squid Parent: child process 9454 exited
due to signal 6
Aug 11 12:57:23 cisne squid[28669]: Squid Parent: child process 9457 started
Aug 11 12:57:26 cisne squid[28669]: Squid Parent: child process 9457 exited
due to signal 6
Aug 11 12:57:29 cisne squid[28669]: Squid Parent: child process 9460 started
Aug 11 12:57:31 cisne squid[28669]: Squid Parent: child process 9460 exited
due to signal 6
Aug 11 12:57:31 cisne squid[28669]: Exiting due to repeated, frequent
failures
Aug 11 13:02:57 cisne squid[9519]: Squid Parent: child process 9521 started



RE: [squid-users] Squid on transparent proxy for 443 request

2009-04-27 Thread Jorge Bastos
> Concerns?
>  1) transparent interception ==  man-in-middle attack.
>  2) private details of clients are opened to you and anyone who gets
> access to the middle machine.
>  3) clients may be made aware by the security systems involved that you
> are attacking them.
> 
> The only semi-legitimate arguments towards doing it in the first place
> is
> for anti-virus scanning etc. Which adequate server or client AV systems
> make useless anyway. All other control measures are human rights
> violations of privacy, which is illegal in most parts of the world.

I understand, but, for example, my ISP does transparent proxying; it has an
option to disable it, but by default it's on. When talking about illegal
stuff, in a big company like a national ISP, nobody will do anything about
it.
Transparent proxying, for me, has only one purpose: IM blocking!



RE: [squid-users] Squid on transparent proxy for 443 request

2009-04-26 Thread Jorge Bastos
Oh, I see.
I won't bother then; it was just for an experiment.
But anyway, since I'm only passing port 80 traffic through squid, I want to
add 443 traffic as well.
What aspects do I have to be concerned about, and how do I activate
transparent mode for 443?



> -Original Message-
> From: Amos Jeffries [mailto:squ...@treenet.co.nz]
> Sent: domingo, 26 de Abril de 2009 1:56
> To: Jorge Bastos
> Cc: squid-users@squid-cache.org
> Subject: Re: [squid-users] Squid on transparent proxy for 443 request
> 
> Jorge Bastos wrote:
> > Hi there,
> > What are the concerns that I need to have to make squid act as a
> transparent
> > proxy on port 443?
> > I need to catch the data that is being sent from a website that works
> under
> > https, is it possible?
> >
> > Right now I only use it for standard port 80.
> >
> 
> Not possible. HTTPS guarantees the client can see 100% of the machines
> for itself to the source.
> 
> One user has recently pointed out that redirecting HTTPS URL's to a
> local domain reverse-proxied by Squid might work though. The client
> believes and accepts Squid credentials as its proper destination site
> and Squid handles decryption->re-encryption going HTTPS to the remote
> site.
> 
> That is very similar to how SSLBump works with CONNECT requests in 3.1.
> But may get past the invalid certificate issues.
> 
> Amos
> --
> Please be using
>Current Stable Squid 2.7.STABLE6 or 3.0.STABLE14
>Current Beta Squid 3.1.0.7
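The CONNECT-bumping Amos refers to looks roughly like this in the 3.1 series. This is a sketch only: the option spelling changed across 3.x releases, and the certificate path is an assumption:

```
# Squid 3.1-era SSL bump sketch (certificate path is hypothetical;
# the client must trust this CA or it will see certificate warnings)
http_port 3128 sslBump cert=/etc/squid/proxy-ca.pem
ssl_bump allow all
```

Even with this, the man-in-the-middle concerns quoted above still apply in full.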




[squid-users] Squid on transparent proxy for 443 request

2009-04-25 Thread Jorge Bastos
Hi there,
What are the concerns that I need to have to make squid act as a transparent
proxy on port 443?
I need to catch the data that is being sent from a website that works under
https; is it possible?

Right now I only use it for standard port 80.



RE: [squid-users] Web Messengers

2009-04-24 Thread Jorge Bastos
> It's only a plague if you are not billing clients for their bandwidth
> usage :)
> 
> http://wiki.squid-cache.org/ConfigExamples/Chat
> 
> If you know of other clients that go through Squid and can be
> identified, please let me know what and how the ID is done.

Hi Amos,
For the MSN case and its web messengers, will the mime-type handle them
all?
---
acl msn url_regex -i gateway.dll
acl msnd dstdomain messenger.msn.com gateway.messenger.hotmail.com
acl msn1 req_mime_type ^application/x-msn-messenger$
---
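For completeness, a sketch of how those three ACLs are typically combined; the `http_access` lines are an assumption on my part (the wiki's Chat ConfigExamples page linked above is the authoritative source):

```
acl msn url_regex -i gateway.dll
acl msnd dstdomain messenger.msn.com gateway.messenger.hotmail.com
acl msn1 req_mime_type ^application/x-msn-messenger$
http_access deny msn
http_access deny msnd
http_access deny msn1
```

The mime-type ACL only catches clients that actually send that Content-Type, so the URL and domain ACLs are still needed for the web-based messengers.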



[squid-users] Web Messengers

2009-04-23 Thread Jorge Bastos
Hi guys,

I'm starting to investigate how to block a plague: web messengers.
Every day there are more web messengers available, and it's almost impossible
to keep adding them as iptables rules, or in squid, to block them.
Is there a way to block them all in squid?

Jorge,



RE: [squid-users] Initial webpage before surfing on squid

2009-04-14 Thread Jorge Bastos
> Whoops.  Try...
> 
> external_acl_type session ttl=14400 negative_ttl=0 children=1
> concurrency=200 %SRC /usr/lib/squid3/squid_session -t 14400

Well, it seems to be working fine now :)
The only problem is that I get an "access denied" on the first access from
each machine.
My conf:

external_acl_type session ttl=14400 negative_ttl=0 children=1
concurrency=200 %SRC /usr/lib/squid3/squid_session -t 14400
acl session external session
##acl A dstdomain 195.23.114.74
##acl B urlpath_regex /inicial.php
##http_access allow A B
http_access deny !session
deny_info http://195.23.114.74/inicial.php?url=%s session

The commented lines were suggested by Amos, but even if I uncomment
them, I still get the access denied on the first access, instead of the
"deny_info" page I've provided.

What am I missing here?
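For reference, a sketch of the ordering being discussed: the allow for the splash page has to come before the session deny, otherwise the redirect target is itself denied. The ACL names are illustrative (the thread calls them A and B):

```
external_acl_type session ttl=14400 negative_ttl=0 children=1 concurrency=200 %SRC /usr/lib/squid3/squid_session -t 14400
acl session external session
acl splash_host dstdomain 195.23.114.74
acl splash_page urlpath_regex ^/inicial\.php
http_access allow splash_host splash_page
http_access deny !session
deny_info http://195.23.114.74/inicial.php?url=%s session
```

Since `http_access` rules are evaluated top to bottom, uncommented allow lines placed after the `deny !session` line would never be reached.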


> It's based on whatever you send to it.  In my initial suggestion, I
> used
> the URI as the key.  I forgot that we really only need to send the
> requested URI to the deny_info page.

Well, I understand this part clearly now; I also see that these variables
are in my squid.conf, and I understand them.


> 
> > For the session, there's any tool that I can see the active sessions
> and the
> > rest of information about them, ETA time etc etc?
> >
> 
> Probably not.  But you have the source to the helper, so perhaps you
> can commission such a tool...

Well, I'll take a look once this is working fine :)

Jorge,




RE: [squid-users] Initial webpage before surfing on squid

2009-04-13 Thread Jorge Bastos
> Damn! :)
> What can i do? If i write to the squid_session, will he answer? I saw
> sometimes that he post here on the list.

I mean, write to the squid_session creator, Henrik



RE: [squid-users] Initial webpage before surfing on squid

2009-04-13 Thread Jorge Bastos
> > Well, I was doing some search and didn't find any tools for this.
> > Where to start?
> >
> 
> Past my knowledge level now. :)
> 
> Amos

Damn! :)
What can I do? If I write to the squid_session author, will he answer? I've
seen him post here on the list sometimes.



RE: [squid-users] Initial webpage before surfing on squid

2009-04-13 Thread Jorge Bastos
> Great!, now that works perfectly! And all sites entered are redirecting
> me
> first to the page I specified.
> Now I need to understand a few things and symptoms.
> When I open a browser window, I digit http://kernel.org, and I get
> redirected to http://1.1.1../inicial.php, and then I click the button
> I've
> made, and continue surfing in kernel.org page, I then go to the address
> bar,
> and digit www.samba.org, I'm redirected again to the page to
> http://1.1.1../inicial.php .
> 
> Maybe it is not creating the session at all?
> When the session is created, is created by squid based on IP ADDR, or
> it's a
> coockie on the client's browser?
> 
> For the session, there's any tool that I can see the active sessions
> and the
> rest of information about them, ETA time etc etc?

Well, I searched around and didn't find any tools for this.
Where should I start?



RE: [squid-users] Initial webpage before surfing on squid

2009-04-12 Thread Jorge Bastos
> acl A dstdomain 195.23.114.74
> acl B urlpath_regex /inicial.php
> 
> http_access allow A B
> http_access deny !session

Great, now that works perfectly! All the sites I enter redirect me
first to the page I specified.
Now I need to understand a few things and symptoms.
When I open a browser window and type http://kernel.org, I get
redirected to http://1.1.1../inicial.php; then I click the button I made
and continue surfing on the kernel.org page. I then go to the address bar
and type www.samba.org, and I'm redirected again to
http://1.1.1../inicial.php.

Maybe it is not creating the session at all?
When the session is created, is it created by squid based on the IP address,
or is it a cookie in the client's browser?

For the sessions, is there any tool where I can see the active sessions and
the rest of the information about them, remaining time, etc.?

Jorge,



RE: [squid-users] Initial webpage before surfing on squid

2009-04-12 Thread Jorge Bastos
> That passes a 302:http://195.23.114.74/inicial.php?url=...  back to the
> client.
> 
> If the client then requests from Squid:
>http://195.23.114.74/inicial.php?url=...
> 
> and if squid is not configured to unconditionally accept the
> "http://195.23.114.74/inicial.php"; requests this will result in the
> client being sent:
> 302:http://195.23.114.74/inicial.php?url=http://195.23.114.74/inicial.p
> hp?url=


Amos,
How do I tell squid to accept it then?
Forgive me, but I'm very new at this and don't know how to do it. The
squid_session manpage also doesn't mention anything about that.



RE: [squid-users] Initial webpage before surfing on squid

2009-04-12 Thread Jorge Bastos
 
> Try this:
> 
>echo rawurldecode($_GET['url']);
> 
> 
> Its a little weird that the redirect URL is being added as the sub-URI.
> Are you sure your http_access are permitting access to the splash URI
> before checking the session handler?
> 
> Amos

Hi Amos,
Well, rawurldecode() returned the same value.
As for the question you're asking: forgive me, I don't know how to answer;
the directives for the session setup are:

---
external_acl_type session ttl=14400 negative_ttl=0 children=1
concurrency=200 %URI /usr/lib/squid3/squid_session -t 14400
acl session external session
http_access deny !session
deny_info http://195.23.114.74/inicial.php?url=%s session
---

What can be wrong here?



RE: [squid-users] Initial webpage before surfing on squid

2009-04-11 Thread Jorge Bastos
Chris,
I was doing some tests to see the value that is passed by the %s variable,
and the value that goes to the output is:

---
http://195.23.114.74/inicial.php?url=http%3A%2F%2F195.23.114.74%2Finicial.ph
p%3Furl%3Dhttp%253A%252F%252F195.23.114.74%252Finicial.php%253Furl%253Dhttp%
25253A%25252F%25252F195.23.114.74%25252Finicial.php%25253Furl%25253Dhttp%252
5253A%2525252F%2525252Fhotmail.com%2525252F
---
deny_info http://195.23.114.74/inicial.php?url=%s session
---

The code inside inicial.php is:



Shouldn't I get the clean value of the site I entered first,
"http://hotmail.com"?
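The nested blob above is exactly what repeated re-encoding produces: each trip through `deny_info ...?url=%s` wraps the previous redirect URL one more time. A small Python sketch (splash URL taken from the thread) reproduces the nesting and shows that peeling the layers off does recover the original site:

```python
from urllib.parse import quote, unquote

splash = "http://195.23.114.74/inicial.php?url="
url = "http://hotmail.com/"

# Each redirect pass re-encodes the previous URL, nesting one layer deeper,
# which is why the archive shows %25, %2525, %252525 and so on.
for _ in range(3):
    url = splash + quote(url, safe="")

# Decoding layer by layer recovers the originally requested site.
while url.startswith(splash):
    url = unquote(url[len(splash):])

print(url)  # prints: http://hotmail.com/
```

In other words, one rawurldecode() is not enough once the splash page has been bounced through more than once; the real fix discussed in the thread is to let squid accept the splash URL so the loop never happens.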



RE: [squid-users] Initial webpage before surfing on squid

2009-04-11 Thread Jorge Bastos
> Hi Chris,
> Well yes, I was very confused why was it complaining about AUTH, and in
> the
> squid_session manpage where's nothing that mentions %URI didn't touch
> that.
> 
> So far it seems to working good, I'm being redirected to the page I
> specified, now I'm going to create the page to read the query string,
> and
> then report success or not.
> Thanks a lot Chris :)
> 
> Jorge,

Well, now I have another issue: for example, if I enter www.google.com in
the browser, it bypasses the initial webpage, but if I enter
www.hotmail.com it shows the initial webpage.
What can it be?

When I use "-d /etc/squid_session", even www.hotmail.com, which works
above, doesn't; I go directly to the page I requested.

My conf:

external_acl_type session ttl=14400 negative_ttl=0 children=1
concurrency=200 %URI /usr/lib/squid3/squid_session -
t 14400 -d /etc/squid_session.txt
acl session external session
http_access deny !session
deny_info http://www.lacticoop.pt?url=%s session



RE: [squid-users] Initial webpage before surfing on squid

2009-04-11 Thread Jorge Bastos
> This should probably read...
> 
> external_acl_type session ttl=14400 negative_ttl=0 children=1
> concurrency=200 %URI /usr/lib/squid3/squid_session
> 
> since you are intending to "remember" the original URI requested.
> Using %LOGIN in an external_acl_type declaration implies you are using
> authentication.
> 
> > acl session external session
> > http_access deny !session
> > deny_info http://your.server/bannerpage?url=%s session
> > ---
> >
> 
> Chris

Hi Chris,
Well yes, I was very confused about why it was complaining about AUTH, and
since nothing in the squid_session manpage mentions %URI, I hadn't touched
that.

So far it seems to be working well: I'm being redirected to the page I
specified. Now I'm going to create the page that reads the query string, and
then I'll report success or not.
Thanks a lot, Chris :)

Jorge,



RE: [squid-users] Initial webpage before surfing on squid

2009-04-10 Thread Jorge Bastos
> Ops!
> Forgive me Chris,
> There's the squid_session on debian too, didn't saw it, sorry.
> I now need a bit of help on the webpage with this parameters:
> ---
> deny_info http://your.server/bannerpage?url=%s session
> 
>Then  set  up  http://your.server/bannerpage to display a
> session
> startup page and
>then redirect the user back to the requested URL given in the
> url
> query parameter.
> ---
> Is there an example for this?

Apart from needing the example for this page, and how to redirect the user
to the page he requested, it's failing with:

Starting Squid HTTP Proxy 3.0: squid32009/04/10 18:17:00| Can't use proxy
auth because no authentication schemes are fully configured.
FATAL: ERROR: Invalid ACL: acl session external session

My config (I know there's something wrong; I just don't know what):

---
external_acl_type session ttl=14400 negative_ttl=0 children=1
concurrency=200 %LOGIN /usr/lib/squid3/squid_session
acl session external session
http_access deny !session
deny_info http://your.server/bannerpage?url=%s session
---



RE: [squid-users] Initial webpage before surfing on squid

2009-04-10 Thread Jorge Bastos
> Jorge Bastos wrote:
> > Chris,
> > Where can I find the documentation for this helper?
> > I'm surfing squid's page and don't see it :S or am I blind?
> >
> > Jorge,
> 
> There is a man page included with the Squid source.  Under the source
> directory it's helpers/external_acl/session/squid_session.8
> 
> Chris

(Sorry if this is double-posted; messages don't seem to be reaching the
destination)

Oops! Forgive me, Chris.
The squid_session helper is in Debian too; I didn't see it, sorry.
I now need a bit of help with the webpage for these parameters:
---
deny_info http://your.server/bannerpage?url=%s session

   Then  set  up  http://your.server/bannerpage to display a session
startup page and
   then redirect the user back to the requested URL given in the url
query parameter.
---
Is there an example for this?



RE: [squid-users] Initial webpage before surfing on squid

2009-04-10 Thread Jorge Bastos

> > There is a man page included with the Squid source.  Under the source
> > directory it's helpers/external_acl/session/squid_session.8
> >
> > Chris
> 
> Chris,
> Well I was looking at it right now.
> And, I think I have another problem, I use the Debian Packages, so, can
> o
> compile this from source and copy the fresh compiled helper and use it?
> 
> I can see that the docs are there, nice.
> Just need to know if I can compile this and do as I said, can i?

Oops! Forgive me, Chris.
The squid_session helper is in Debian too; I didn't see it, sorry.
I now need a bit of help with the webpage for these parameters:
---
deny_info http://your.server/bannerpage?url=%s session

   Then  set  up  http://your.server/bannerpage to display a session
startup page and
   then redirect the user back to the requested URL given in the url
query parameter.
---
Is there an example for this?



RE: [squid-users] Initial webpage before surfing on squid

2009-04-10 Thread Jorge Bastos
> > There is a man page included with the Squid source.  Under the source
> > directory it's helpers/external_acl/session/squid_session.8
> >
> > Chris
> 
> Chris,
> Well I was looking at it right now.
> And, I think I have another problem, I use the Debian Packages, so, can
> o
> compile this from source and copy the fresh compiled helper and use it?
> 
> I can see that the docs are there, nice.
> Just need to know if I can compile this and do as I said, can i?

OK, I compiled 3.1.0.7, and the helper seems to be working.
--
cisne:/usr/local/src/squid/squid-3.1.0.7/helpers/external_acl/session#
./squid_session -h
./squid_session: invalid option -- 'h'
Usage: ./squid_session [-t session_timeout] [-b dbpath] [-a]
-t sessiontimeout   Idle timeout after which sessions will be
forgotten
-b dbpath   Path where persistent session database will
be kept
-a  Active mode requiring LOGIN argument to
start a session
cisne:/usr/local/src/squid/squid-3.1.0.7/helpers/external_acl/session#
--

Can I use it against the Squid 3.0.STABLE13 from Debian?




RE: [squid-users] Initial webpage before surfing on squid

2009-04-10 Thread Jorge Bastos
> 
> There is a man page included with the Squid source.  Under the source
> directory it's helpers/external_acl/session/squid_session.8
> 
> Chris

Chris,
Well, I was looking at it right now.
And I think I have another problem: I use the Debian packages, so can I
compile this from source, copy the freshly compiled helper over, and use it?

I can see that the docs are there, nice.
I just need to know whether I can compile it and do as I said. Can I?



RE: [squid-users] Initial webpage before surfing on squid

2009-04-09 Thread Jorge Bastos
Chris,
Where can I find the documentation for this helper?
I'm browsing squid's site and don't see it :S Or am I blind?

Jorge,

> -Original Message-----
> From: Jorge Bastos [mailto:mysql.jo...@decimal.pt]
> Sent: quinta-feira, 9 de Abril de 2009 21:37
> To: crobert...@gci.net; squid-users@squid-cache.org
> Subject: RE: [squid-users] Initial webpage before surfing on squid
> 
> Hi Chris,
> Thank you, going to search for that and study and post anything else I
> need.
> The 1st problem was that I didn't knew for what to search.
> 
> Jorge,
> 
> > -Original Message-
> > From: crobert...@gci.net [mailto:crobert...@gci.net]
> > Sent: quinta-feira, 9 de Abril de 2009 21:06
> > To: squid-users@squid-cache.org
> > Subject: Re: [squid-users] Initial webpage before surfing on squid
> >
> > Jorge Bastos wrote:
> > > Hi there people,
> > >
> > > I need to do a special setup, that I need help on it.
> > > What I need to be done is, when the user open's its browser, on the
> > first
> > > access, whether it's the homepage, or the user entering an website,
> I
> > want
> > > to show them a webpage with info, and below give them an "continue
> > surfing"
> > > link to leave this intro webpage.
> > >
> >
> > Up to this point, the session helper (which is included with Squid
> > 2.6+)
> > fits the bill exactly:
> > http://www.squid-cache.org/mail-archive/squid-users/200808/0272.html.
> >
> > > If the user closes and re-open's the browser, the behavior should
> be
> > the
> > > same.
> > >
> >
> > With this additional requirement...
> >
> > > I saw that this could be done using squid, I just have no clue how
> > to.
> > > Can you guys give a hand on this?
> > >
> >
> > You might be able to do it with a short idle_timeout on the
> > session_helper and a session cookie set by the acceptable use policy
> > (AUP) page.  If the idle_timeout is reached and the cookie is
> tendered
> > by the browser, the AUP page just 302's the request to the
> destination.
> > If the idle_timeout is reached and the cookie is not passed by the
> > browser, the AUP page is displayed, the session cookie is set and a
> > continue link is made available.
> >
> > > Jorge,
> > >
> >
> > Chris




RE: [squid-users] Initial webpage before surfing on squid

2009-04-09 Thread Jorge Bastos
Hi Chris,
Thank you; I'm going to search for that, study it, and post anything else I
need.
The first problem was that I didn't know what to search for.

Jorge,

> -Original Message-
> From: crobert...@gci.net [mailto:crobert...@gci.net]
> Sent: quinta-feira, 9 de Abril de 2009 21:06
> To: squid-users@squid-cache.org
> Subject: Re: [squid-users] Initial webpage before surfing on squid
> 
> Jorge Bastos wrote:
> > Hi there people,
> >
> > I need to do a special setup, that I need help on it.
> > What I need to be done is, when the user open's its browser, on the
> first
> > access, whether it's the homepage, or the user entering an website, I
> want
> > to show them a webpage with info, and below give them an "continue
> surfing"
> > link to leave this intro webpage.
> >
> 
> Up to this point, the session helper (which is included with Squid
> 2.6+)
> fits the bill exactly:
> http://www.squid-cache.org/mail-archive/squid-users/200808/0272.html.
> 
> > If the user closes and re-open's the browser, the behavior should be
> the
> > same.
> >
> 
> With this additional requirement...
> 
> > I saw that this could be done using squid, I just have no clue how
> to.
> > Can you guys give a hand on this?
> >
> 
> You might be able to do it with a short idle_timeout on the
> session_helper and a session cookie set by the acceptable use policy
> (AUP) page.  If the idle_timeout is reached and the cookie is tendered
> by the browser, the AUP page just 302's the request to the destination.
> If the idle_timeout is reached and the cookie is not passed by the
> browser, the AUP page is displayed, the session cookie is set and a
> continue link is made available.
> 
> > Jorge,
> >
> 
> Chris
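Chris's cookie scheme can be sketched as a tiny handler. Everything here is illustrative (cookie name, page text, function shape); it just shows the two branches he describes: if the session cookie is tendered, 302 straight back to the requested URL; otherwise show the AUP page, set the cookie, and offer a continue link.

```python
from urllib.parse import parse_qs

COOKIE = "aup_seen"  # hypothetical session-cookie name

def aup_response(query: str, cookies: dict):
    """Return (status, headers, body) for the splash/AUP page."""
    dest = parse_qs(query).get("url", ["/"])[0]
    if cookies.get(COOKIE) == "1":
        # Cookie tendered by the browser: bounce straight to the destination.
        return 302, {"Location": dest}, ""
    # First visit of this browser session: show policy, set cookie.
    headers = {"Set-Cookie": COOKIE + "=1", "Content-Type": "text/html"}
    body = '<p>Acceptable use policy...</p><a href="%s">Continue surfing</a>' % dest
    return 200, headers, body
```

Because it is a session cookie (no expiry set), closing and reopening the browser drops it, which gives the "show the page again on every new browser session" behavior asked for above.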



[squid-users] Initial webpage before surfing on squid

2009-04-09 Thread Jorge Bastos
Hi there people,

I need to do a special setup, and I need help with it.
What I need is: when the user opens his browser, on the first access,
whether it's the homepage or the user entering a website, I want to show
them a webpage with info, and below it give them a "continue surfing" link
to leave this intro page.
If the user closes and re-opens the browser, the behavior should be the
same.
I saw that this could be done using squid; I just have no clue how.
Can you guys give me a hand with this?

Jorge,



RE: [squid-users] WebSite Access Problem

2009-03-23 Thread Jorge Bastos
Perfect,
Working!
Thanks,

> -Original Message-
> From: Amos Jeffries [mailto:squ...@treenet.co.nz]
> Sent: segunda-feira, 23 de Março de 2009 11:31
> To: Jorge Bastos
> Cc: crobert...@gci.net; squid-users@squid-cache.org
> Subject: Re: [squid-users] WebSite Access Problem
> 
> Jorge Bastos wrote:
> > Amos,
> > So in squid for now, to access the website directly will be:
> >
> > --
> > acl broken dstdomain www.interponto.com
> > request_header_access Accept-Encoding deny broken
> > --
> >
> > Correct?
> 
> If the problem is chunks and your using Squid-3 yes.
> 
> For Squid-2 it _should_ be decoded, but the ACL just in that case is
> only "header_access ".
> 
> Amos
> 
> >
> >> -Original Message-
> >> From: Jorge Bastos [mailto:mysql.jo...@decimal.pt]
> >> Sent: segunda-feira, 23 de Março de 2009 10:14
> >> To: 'Amos Jeffries'
> >> Cc: crobert...@gci.net; squid-users@squid-cache.org
> >> Subject: RE: [squid-users] WebSite Access Problem
> >>
> >> Going to check Amos,
> >> Thanks, if I need something else I'll let you guys know.
> >>
> >> Jorge,
> >>
> >>> -Original Message-
> >>> From: Amos Jeffries [mailto:squ...@treenet.co.nz]
> >>> Sent: domingo, 22 de Março de 2009 23:37
> >>> To: Jorge Bastos
> >>> Cc: crobert...@gci.net; squid-users@squid-cache.org
> >>> Subject: RE: [squid-users] WebSite Access Problem
> >>>
> >>>>> to the site with a statically assigned window size (ip route add
> >>>>> 83.240.222.162/32 via $GATEWAY window 65535).  If disabling
> window
> >>>> No good.
> >>>> Well, FF3 complains about this:
> >>>> ---
> >>>> Content Encoding Error
> >>>> The page you are trying to view cannot be shown because it uses an
> >>> invalid
> >>>> or unsupported form of compression.
> >>>> The page you are trying to view cannot be shown because it uses an
> >>> invalid
> >>>> or unsupported form of compression.
> >>>>
> >>>> * Please contact the website owners to inform them of this
> >>> problem.
> >>>> ---
> >>>>
> >>>> FF2 just don't show the page, and IE show's only that piece of
> >> code.
> >>>> Even that this is a webpage problem, could squid be changing the
> >>> encoding?
> >>>> Jorge,
> >>>>
> >>> Ah sounds like IIS-6.0 or FrontPage bites again.
> >>>
> >>> http://squidproxy.wordpress.com/2008/04/29/chunked-decoding/
> >>>
> >>> Amos
> >>>
> >
> >
> 
> 
> --
> Please be using
>Current Stable Squid 2.7.STABLE6 or 3.0.STABLE13
>Current Beta Squid 3.1.0.6



RE: [squid-users] WebSite Access Problem

2009-03-23 Thread Jorge Bastos
Amos,
So in squid for now, to access the website directly will be:

--
acl broken dstdomain www.interponto.com
request_header_access Accept-Encoding deny broken
--

Correct?

> -Original Message-
> From: Jorge Bastos [mailto:mysql.jo...@decimal.pt]
> Sent: segunda-feira, 23 de Março de 2009 10:14
> To: 'Amos Jeffries'
> Cc: crobert...@gci.net; squid-users@squid-cache.org
> Subject: RE: [squid-users] WebSite Access Problem
> 
> Going to check Amos,
> Thanks, if I need something else I'll let you guys know.
> 
> Jorge,
> 
> > -Original Message-
> > From: Amos Jeffries [mailto:squ...@treenet.co.nz]
> > Sent: domingo, 22 de Março de 2009 23:37
> > To: Jorge Bastos
> > Cc: crobert...@gci.net; squid-users@squid-cache.org
> > Subject: RE: [squid-users] WebSite Access Problem
> >
> > >> to the site with a statically assigned window size (ip route add
> > >> 83.240.222.162/32 via $GATEWAY window 65535).  If disabling window
> > >
> > > No good.
> > > Well, FF3 complains about this:
> > > ---
> > > Content Encoding Error
> > > The page you are trying to view cannot be shown because it uses an
> > invalid
> > > or unsupported form of compression.
> > > The page you are trying to view cannot be shown because it uses an
> > invalid
> > > or unsupported form of compression.
> > >
> > > * Please contact the website owners to inform them of this
> > problem.
> > > ---
> > >
> > > FF2 just don't show the page, and IE show's only that piece of
> code.
> > > Even that this is a webpage problem, could squid be changing the
> > encoding?
> > >
> > > Jorge,
> > >
> >
> > Ah sounds like IIS-6.0 or FrontPage bites again.
> >
> > http://squidproxy.wordpress.com/2008/04/29/chunked-decoding/
> >
> > Amos
> >
> 




RE: [squid-users] WebSite Access Problem

2009-03-23 Thread Jorge Bastos
Going to check Amos,
Thanks, if I need something else I'll let you guys know.

Jorge,

> -Original Message-
> From: Amos Jeffries [mailto:squ...@treenet.co.nz]
> Sent: domingo, 22 de Março de 2009 23:37
> To: Jorge Bastos
> Cc: crobert...@gci.net; squid-users@squid-cache.org
> Subject: RE: [squid-users] WebSite Access Problem
> 
> >> to the site with a statically assigned window size (ip route add
> >> 83.240.222.162/32 via $GATEWAY window 65535).  If disabling window
> >
> > No good.
> > Well, FF3 complains about this:
> > ---
> > Content Encoding Error
> > The page you are trying to view cannot be shown because it uses an
> invalid
> > or unsupported form of compression.
> > The page you are trying to view cannot be shown because it uses an
> invalid
> > or unsupported form of compression.
> >
> > * Please contact the website owners to inform them of this
> problem.
> > ---
> >
> > FF2 just don't show the page, and IE show's only that piece of code.
> > Even that this is a webpage problem, could squid be changing the
> encoding?
> >
> > Jorge,
> >
> 
> Ah sounds like IIS-6.0 or FrontPage bites again.
> 
> http://squidproxy.wordpress.com/2008/04/29/chunked-decoding/
> 
> Amos
> 




RE: [squid-users] WebSite Access Problem

2009-03-22 Thread Jorge Bastos
> to the site with a statically assigned window size (ip route add
> 83.240.222.162/32 via $GATEWAY window 65535).  If disabling window

No good.
Well, FF3 complains about this:
---
Content Encoding Error
The page you are trying to view cannot be shown because it uses an invalid
or unsupported form of compression.
The page you are trying to view cannot be shown because it uses an invalid
or unsupported form of compression.

* Please contact the website owners to inform them of this problem.
---

FF2 just doesn't show the page, and IE shows only that piece of code.
Even though this is a webpage problem, could Squid be changing the encoding?

Jorge,



RE: [squid-users] WebSite Access Problem

2009-03-22 Thread Jorge Bastos
cisne:~# sysctl net.ipv4.tcp_window_scaling=0
error: 'net.ipv4.tcp_window_scaling=0' is an unknown key
cisne:~#
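[That key does exist on most kernels; older sysctl builds need the -w flag to
set a value, and writing to /proc directly also works. A sketch, to be run as
root:]

---
sysctl -w net.ipv4.tcp_window_scaling=0
# or, equivalently:
echo 0 > /proc/sys/net/ipv4/tcp_window_scaling
---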


> -Original Message-
> From: Michael Spiegle [mailto:m...@nauticaltech.com]
> Sent: sexta-feira, 20 de Março de 2009 23:05
> To: Jorge Bastos
> Cc: crobert...@gci.net; squid-users@squid-cache.org
> Subject: Re: [squid-users] WebSite Access Problem
> 
> How about the following?
> 
> sysctl net.ipv4.tcp_window_scaling=0
> 
> 
> Mike
> 
> Jorge Bastos wrote:
> > Sorry,
> > I completely miss that.
> > I was reading, and I don't have that kernel command on the squid
> server:
> >
> > cisne:~# cat /proc/sys/net/ipv4/tcp_default_win_scale
> > cat: /proc/sys/net/ipv4/tcp_default_win_scale: No such file or
> directory
> > cisne:~#
> >
> > More ideias?
> > PS: the problem is in all machines on the local network.
> >
> >> -Original Message-
> >> From: crobert...@gci.net [mailto:crobert...@gci.net]
> >> Sent: sexta-feira, 20 de Março de 2009 20:53
> >> To: squid-users@squid-cache.org
> >> Subject: Re: [squid-users] WebSite Access Problem
> >>
> >> Jorge Bastos wrote:
> >>>> Squid 2.7STABLE6.  Firefox 3.0.7 on Windows XP.  Works fine for
> me.
> >>>>
> >>>> Check http://wiki.squid-cache.org/KnowledgeBase/BrokenWindowSize
> >>>>
> >>>> Chris
> >>>>
> >>> Well it doesn't for me.
> >>> Either IE or FF.
> >>>
> >>> :(
> >>>
> >>> Using direct connection it works.
> >>>
> >> Did you read the linked article?  It often describes the issue in
> >> situations like this.
> >>
> >> Chris
> >
> >




RE: [squid-users] WebSite Access Problem

2009-03-20 Thread Jorge Bastos
Sorry,
I completely missed that.
I was reading it, but I don't have that kernel setting on the Squid server:

cisne:~# cat /proc/sys/net/ipv4/tcp_default_win_scale
cat: /proc/sys/net/ipv4/tcp_default_win_scale: No such file or directory
cisne:~#

Any more ideas?
PS: the problem occurs on all machines on the local network.

> -Original Message-
> From: crobert...@gci.net [mailto:crobert...@gci.net]
> Sent: sexta-feira, 20 de Março de 2009 20:53
> To: squid-users@squid-cache.org
> Subject: Re: [squid-users] WebSite Access Problem
> 
> Jorge Bastos wrote:
> >> Squid 2.7STABLE6.  Firefox 3.0.7 on Windows XP.  Works fine for me.
> >>
> >> Check http://wiki.squid-cache.org/KnowledgeBase/BrokenWindowSize
> >>
> >> Chris
> >>
> >
> > Well it doesn't for me.
> > Either IE or FF.
> >
> > :(
> >
> > Using direct connection it works.
> >
> 
> Did you read the linked article?  It often describes the issue in
> situations like this.
> 
> Chris



RE: [squid-users] WebSite Access Problem

2009-03-20 Thread Jorge Bastos
> Squid 2.7STABLE6.  Firefox 3.0.7 on Windows XP.  Works fine for me.
> 
> Check http://wiki.squid-cache.org/KnowledgeBase/BrokenWindowSize
> 
> Chris

Well, it doesn't work for me,
in either IE or FF.

:(

Using direct connection it works.



RE: [squid-users] WebSite Access Problem

2009-03-20 Thread Jorge Bastos
Well, no idea.
The website and server are not mine.

Or could the server be blocking requests made through Squid, based on a
header that Squid adds?
The site is ASP; it may be something like that. It's just an idea.

How do I hide the headers so that destination servers won't know that I'm
connecting through Squid?
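[For reference, Squid has directives to suppress the headers that give a
proxy away; a minimal squid.conf sketch (directive availability depends on
the Squid version):]

---
# hide the headers that identify the request as proxied
via off
forwarded_for off
---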



> -Original Message-
> From: John Doe [mailto:jd...@yahoo.com]
> Sent: sexta-feira, 20 de Março de 2009 16:19
> To: squid-users@squid-cache.org
> Subject: Re: [squid-users] WebSite Access Problem
> 
> 
> From: Jorge Bastos 
> > When I try to access:
> > http://www.interponto.com/
> > Using "3.0.STABLE13-1" on Debian SID
> > I get a blank page on IE or FF, if I try to see what's on the code, I
> have:
> 
> Maybe because the server sends sends back: "Cache-control: private"...?
> 
> JD
> 
> 
> 




[squid-users] WebSite Access Problem

2009-03-20 Thread Jorge Bastos
Hi people,
When I try to access:

http://www.interponto.com/

Using "3.0.STABLE13-1" on Debian SID

I get a blank page in IE and FF; if I look at the page source, I see:

--




--


What could the problem be: Squid or the website?
Accessing without Squid works fine, which tells me it is something to do
with Squid.
Can you guys try to access it through Squid and see what happens?


Jorge



RE: [squid-users] HDD Configuration Recommendations

2008-09-24 Thread Jorge Bastos
Forgot to say: for that you'll need one more disk.

> -Original Message-
> From: Chris Nighswonger [mailto:[EMAIL PROTECTED]
> Sent: quarta-feira, 24 de Setembro de 2008 18:18
> To: Squid Users
> Subject: [squid-users] HDD Configuration Recommendations
> 
> Hi all,
>   I'm preparing to move my squid to new hardware. I have two 500GB
> SATA HDD's in the new box which will be used to store squid's cache
> on. Any suggestions on the best raid config for these guys so as to
> maximize performance?
> 
> Regards,
> Chris
> 
> --
> Christopher Nighswonger
> Faculty Member
> Network & Systems Director
> Foundations Bible College & Seminary
> www.foundations.edu
> www.fbcradio.org



RE: [squid-users] HDD Configuration Recommendations

2008-09-24 Thread Jorge Bastos
I'd advise RAID 5, whether it's software or hardware RAID.


> -Original Message-
> From: Chris Nighswonger [mailto:[EMAIL PROTECTED]
> Sent: quarta-feira, 24 de Setembro de 2008 18:18
> To: Squid Users
> Subject: [squid-users] HDD Configuration Recommendations
> 
> Hi all,
>   I'm preparing to move my squid to new hardware. I have two 500GB
> SATA HDD's in the new box which will be used to store squid's cache
> on. Any suggestions on the best raid config for these guys so as to
> maximize performance?
> 
> Regards,
> Chris
> 
> --
> Christopher Nighswonger
> Faculty Member
> Network & Systems Director
> Foundations Bible College & Seminary
> www.foundations.edu
> www.fbcradio.org



RE: [squid-users] Response on non-existing dns name

2008-09-19 Thread Jorge Bastos
Hum,
I use it as a transparent proxy, so users don't need to set the proxy in the
browser; I simply forward traffic using iptables. So this means I won't be
able to show the custom error page, correct?



> -Original Message-
> From: Leonardo Rodrigues Magalhães [mailto:[EMAIL PROTECTED]
> Sent: sexta-feira, 19 de Setembro de 2008 0:09
> To: Jorge Bastos
> Cc: 'ML squid'
> Subject: Re: [squid-users] Response on non-existing dns name
> 
> 
> ERR_DNS_FAIL
> 
> it's already there in your errors directory 
> 
> but that wont work if your connections are being transparently
> intercepted. On this situation, own client machine tries to resolve DNS
> and if it cant, the local browser shows it's default error message. If
> browsers are configured to use a proxy (squid in your case), then
> browsers wont try to solve anything and will simply forward the query
> to
> the proxy. So, in this case, proxy can show some error on DNS FAIL or
> several other situations.
> 
> 
> Jorge Bastos escreveu:
> > Hi,
> > I'd like to show users a custom page when they type a domain that is
> > non-existent, is this possible?
> >
> >
> 
> --
> 
> 
>   Atenciosamente / Sincerily,
>   Leonardo Rodrigues
>   Solutti Tecnologia
>   http://www.solutti.com.br
> 
>   Minha armadilha de SPAM, NÃO mandem email
>   [EMAIL PROTECTED]
>   My SPAMTRAP, do not email it
> 
> 
> 
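[As Leonardo notes, with a configured (non-intercepted) proxy the page shown
on DNS failure is the ERR_DNS_FAIL template from the errors directory. One
way to serve a customized copy is to point Squid at your own template
directory; the path below is an example:]

---
# squid.conf: use customized error templates, including ERR_DNS_FAIL
error_directory /etc/squid/errors/custom
---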




[squid-users] Response on non-existing dns name

2008-09-18 Thread Jorge Bastos
Hi,
I'd like to show users a custom page when they type a domain that doesn't
exist. Is this possible?

Thanks in advance,
Jorge



RE: [squid-users] wccp working config example

2008-09-11 Thread Jorge Bastos
Olá Nuno :P

As a transparent proxy it applies the ACLs; I have it working, for example,
to block MSN with an ACL, among other things.
In fact I've never used Squid in any way other than as a transparent proxy,
and it has always worked.


> -Original Message-
> From: Nuno Silva [mailto:[EMAIL PROTECTED]
> Sent: quinta-feira, 11 de Setembro de 2008 8:43
> To: Amos Jeffries
> Cc: Dan Letkeman; squid-users
> Subject: RE: [squid-users] wccp working config example
> 
> Another question... should I see the requests from users in the
> access.log? (Because I'm not.)
> I'm trying to filter the requests by category (no porn, no gambling, no
> streaming, and so on), but it seems that if I use Squid as a
> transparent proxy it doesn't apply the ACLs in squid.conf (whereas if I
> use
> Squid as my configured proxy, it works).
> 
> 
> Best regards,
> Nuno Silva
> 
> 
> -Original Message-
> From: Amos Jeffries [mailto:[EMAIL PROTECTED]
> Sent: quarta-feira, 10 de Setembro de 2008 15:26
> To: Nuno Silva
> Cc: Dan Letkeman; squid-users
> Subject: Re: [squid-users] wccp working config example
> 
> Nuno Silva wrote:
> > Amos.
> >
> > Thank you very much, it started working, i was missing the 'iptables
> -t
> > nat -A POSTROUTING -j MASQUERADE'... what is the purpose of that?
> 
> Normally to NAT traffic coming in you have to NAT the responses back to
> the right places, but it gets tricky very quickly so someone created
> MASQUERADE to unwind all NAT bindings automatically on response
> packets.
> 
> I'm not sure about speed, but its easy to configure.
> 
> >
> > And regarding the output of iptables -t filter -L *:
> >
> > iptables: No chain/target/match by that name
> >
> > Should the output be other?
> 
> Weird, I'd expect a list same as for the -t nat you gave earlier, but
> never mind. I thought maybe there was a forwarding rule or policy
> blocking things. Since MASQUERADE fixed it, we don't need to look any
> further.
> 
> >
> > Best regards and many many many many many many many thanks!
> > Nuno Silva
> >
> 
> Welcome.
> 
> Amos
> --
> Please use Squid 2.7.STABLE4 or 3.0.STABLE8
> 
> 
> No virus found in this incoming message.
> Checked by AVG - http://www.avg.com
> Version: 8.0.169 / Virus Database: 270.6.19/1663 - Release Date:
> 10-09-2008 6:00




RE: [squid-users] ACL named "all"

2008-07-26 Thread Jorge Bastos
Thanks, both.

In fact there was an "acl all" line left over in the config file; it had been
there since the 2.x days. I think the 2.x default config defined "acl all".

OK, it's solved :)

Jorge



> -Original Message-
> From: Amos Jeffries [mailto:[EMAIL PROTECTED]
> Sent: sábado, 26 de Julho de 2008 6:11
> To: Leonardo Rodrigues Magalhães
> Cc: ML squid
> Subject: Re: [squid-users] ACL named "all"
> 
> Leonardo Rodrigues Magalhães wrote:
> >
> >
> > Jorge Bastos escreveu:
> >> Hi people,
> >>
> >> Since first 3.0 version i've noticed this:
> >> 2008/07/25 21:56:24| WARNING: '0.0.0.0/0.0.0.0' is a subnetwork of
> >> '192.168.1.0/255.255.255.0'
> >> 2008/07/25 21:56:24| WARNING: because of this
> >> '192.168.1.0/255.255.255.0' is
> >> ignored to keep splay tree searching predictable
> >> 2008/07/25 21:56:24| WARNING: You should probably remove
> >> '0.0.0.0/0.0.0.0'
> >> from the ACL named 'all'
> >>
> >> But now saw on the STABLE8 version changelog:
> >> - Update Release Notes: 'all' ACL is built-in since
> 3.0.STABLE1
> >>
> >> So, how should I remote this warning?
> >>
> >>
> >
> >in squid 3.0 the 'all' acl is built-in. So if you try to define it
> in
> > your squid.conf, than you'll be redefining an already defined ACL.
> >
> >How to remove the warning ?? simply remove the 'acl all src
> > 0.0.0.0/0.0.0.0' line from your squid.conf !!! Defining this ACL is
> no
> > longer necessary in squid 3.0 STABLE1 and newers.
> >
> 
> Adding to that ... It looks like whomever configured your squid used
> 'all' (whole internet) when they really mean local-network. This has
> serious security implications, which is part of why its now built-in.
> 
> In addition to removing the all ACL definition from your squid.conf.
> You
> in particular need to audit your config access lines to make sure they
> still perform according to your policies.
> 
> Amos
> --
> Please use Squid 2.7.STABLE3 or 3.0.STABLE8



[squid-users] ACL named "all"

2008-07-25 Thread Jorge Bastos
Hi people,

Since the first 3.0 version I've noticed this:
2008/07/25 21:56:24| WARNING: '0.0.0.0/0.0.0.0' is a subnetwork of
'192.168.1.0/255.255.255.0'
2008/07/25 21:56:24| WARNING: because of this '192.168.1.0/255.255.255.0' is
ignored to keep splay tree searching predictable
2008/07/25 21:56:24| WARNING: You should probably remove '0.0.0.0/0.0.0.0'
from the ACL named 'all'

But now I saw this in the STABLE8 changelog:
- Update Release Notes: 'all' ACL is built-in since 3.0.STABLE1

So, how do I remove this warning?

Jorge



RE: [squid-users] NO_CACHE

2008-04-28 Thread Jorge Bastos
For example, for my local network 192.168.1.0/24:

acl all_cache src 192.168.1.0/24
no_cache deny all_cache
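
[One note: no_cache is a deprecated spelling; on Squid 2.6 and later the
directive is simply cache, so an equivalent form would be:]

---
acl all_cache src 192.168.1.0/24
cache deny all_cache
---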



> -Original Message-
> From: Tiago Durante [mailto:[EMAIL PROTECTED]
> Sent: segunda-feira, 28 de Abril de 2008 15:37
> To: squid-users@squid-cache.org
> Subject: [squid-users] NO_CACHE
> 
> Hi all!
> 
> I'm trying to use this function but until now I couldn't obtain any
> success in my tests.
> I no even will put here the tests that I already made because,
> actually, I can't remember exactly what I already did. =(
> 
> There is anybody using it to don't cache pages? Can I see an example?
> 
> I'm using Squid 2.6.STABLE14...
> 
> Tks a lot!
> 
> 
> --
> Tiago Durante
> 
> ,.,.,.,.,.,.,.,.,.,.,.,.,.,.,.,.,.,.,.,.,.,.,.,.,.,.,.,.,.,.,.,.,.,
> Perseverance is the hard work you do after you
> get tired of doing the hard work you already did.
> -- Newt Gingrich



RE: [squid-users] HowTO "ReWrite" Destination IP?

2008-04-22 Thread Jorge Bastos
Thank you all, guys, but I found a better solution: the dnsmasq "alias"
feature, which does this:
---
alias=195.23.123.71,192.168.1.221
---
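
[For comparison, the NAT approach Henrik suggested would look roughly like
this on the gateway, using the same addresses as the dnsmasq line above;
sketch only:]

---
# rewrite traffic addressed to the public IP to the internal server
iptables -t nat -A PREROUTING -d 195.23.123.71 -j DNAT --to-destination 192.168.1.221
---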




> -Original Message-
> From: Henrik Nordstrom [mailto:[EMAIL PROTECTED]
> Sent: terça-feira, 22 de Abril de 2008 23:23
> To: Jorge Bastos
> Subject: RE: [squid-users] HowTO "ReWrite" Destination IP?
> 
> tis 2008-04-22 klockan 17:12 +0100 skrev Jorge Bastos:
> > An option, but good, i'd had to do that for each new site and for the
> > hundred already installed.
> > No other way? There's no DNS server that can do this?
> 
> Map IP addresses on the fly in DNS responses, without knowing the
> relevant sites?
> 
> No, I don't think so, but it's easily done using NAT on the server..
> 
> Regards
> Henrik
> 
> 
> 




RE: [squid-users] HowTO "ReWrite" Destination IP?

2008-04-22 Thread Jorge Bastos
An option, but not a good one: I'd have to do that for each new site, and for
the hundreds already installed.
Is there no other way? No DNS server that can do this?


> -Original Message-
> From: Henrik Nordstrom [mailto:[EMAIL PROTECTED]
> Sent: terça-feira, 22 de Abril de 2008 16:56
> To: Jorge Bastos
> Cc: squid-users@squid-cache.org
> Subject: Re: [squid-users] HowTO "ReWrite" Destination IP?
> 
> You can add the web site with ip Y.Y.Y.Y in /etc/hosts
> 
> tis 2008-04-22 klockan 14:40 +0100 skrev Jorge Bastos:
> > Hi,
> > I use Squid as a transparent proxy/interceptor and i'd like to do the
> > following.
> > When a request comes and squid resolves it to the IP X.X.X.X, I'd
> like to
> > change that IP to Y.Y.Y.Y
> >
> > Is this possible?
> > The reason it, the IP X.X.X.X has a QoS policy applied and IP
> Y.Y.Y.Y, that
> > way, I can access the webserver the maximum speed of the webserver.
> >
> > Thanks in advanced,
> > Jorge




[squid-users] HowTO "ReWrite" Destination IP?

2008-04-22 Thread Jorge Bastos
Hi,
I use Squid as a transparent proxy/interceptor and I'd like to do the
following: when a request comes in and Squid resolves it to the IP X.X.X.X,
I'd like to change that IP to Y.Y.Y.Y.

Is this possible?
The reason is that the IP X.X.X.X has a QoS policy applied and Y.Y.Y.Y does
not; that way, I can access the webserver at its maximum speed.

Thanks in advance,
Jorge



RE: [squid-users] client ip's

2008-04-10 Thread Jorge Bastos
In fact I have 3 NICs.

Yes, the two interfaces I showed in the route printout are defined in
/etc/network/interfaces.




> -Original Message-
> From: julian julian [mailto:[EMAIL PROTECTED]
> Sent: quinta-feira, 10 de Abril de 2008 15:47
> To: Jorge Bastos
> Cc: squid
> Subject: RE: [squid-users] client ip's
> 
> Jorge: have you set the network properly? Are you
> using 192.168.x.x net. The network parameter must be
> wrote in
> ../ifcfg-eth0 and ../ifcfg-eth1 file (because I
> suspect that you have two nics). The route command
> shows some aspect of your network configuration.
> 
> Julián
> 
> --- Jorge Bastos <[EMAIL PROTECTED]> wrote:
> 
> > Hum I got some news on this,
> >
> > I don't know why my system started to give me this
> > information:
> >
> > Kernel IP routing table
> > Destination Gateway Genmask
> > Flags Metric RefUse Iface
> > 192.168.1.0 *   255.255.255.0   U
> >  0  00 eth0
> > 192.168.0.0 *   255.255.255.0   U
> >  0  00 eth1
> > default localhost   0.0.0.0 UG
> >  0  00 eth1
> >
> > Kernel IP routing table
> > Destination Gateway Genmask
> > Flags Metric RefUse Iface
> > 192.168.1.0 0.0.0.0 255.255.255.0   U
> >  0  00 eth0
> > 192.168.0.0 0.0.0.0 255.255.255.0   U
> >  0  00 eth1
> > 0.0.0.0 192.168.0.254   0.0.0.0 UG
> >  0  00 eth1
> >
> >
> > The fact is that the hosts file is correct:
> >
> > cisne:~# cat /etc/hosts
> > 127.0.0.1   localhost
> >
> > I only have this there
> >
> > I know this is not squid related but if you guys can
> > give me a hand.
> > I have no idea why is it resolving 192.168.0.254 to
> > localhost.
> >
> >
> >
> >
> >
> > > -Original Message-
> > > From: Jorge Bastos [mailto:[EMAIL PROTECTED]
> > > Sent: sábado, 5 de Abril de 2008 21:23
> > > To: 'Henrik Nordstrom'
> > > Cc: 'Amos Jeffries'; squid-users@squid-cache.org
> > > Subject: RE: [squid-users] client ip's
> > >
> > > This already worked with some of the 3.0 versions.
> > > Gonna try to play with my iptables rules and let
> > you guys know.
> > >
> > >
> > >
> > >
> > > > -Original Message-
> > > > From: Henrik Nordstrom
> > [mailto:[EMAIL PROTECTED]
> > > > Sent: sábado, 5 de Abril de 2008 19:38
> > > > To: Jorge Bastos
> > > > Cc: 'Amos Jeffries'; squid-users@squid-cache.org
> > > > Subject: RE: [squid-users] client ip's
> > > >
> > > > lr 2008-04-05 klockan 14:24 +0100 skrev Jorge
> > Bastos:
> > > >
> > > > > I updated to last STABLE-4 on debian, but this
> > still happens this
> > > > way.
> > > > > What can I do more?
> > > >
> > > > Good question.
> > > >
> > > > One thing you can try is to downgrade to
> > Squid-2.6. If that shows the
> > > > same symptoms the problem is not within Squid
> > but most likely in your
> > > > firewall ruleset or something else relevant to
> > how the connections
> > > end
> > > > up at your Squid.
> > > >
> > > > Regards
> > > > Henrik
> > >
> >
> >
> >
> 
> 
> __
> Do You Yahoo!?
> Tired of spam?  Yahoo! Mail has the best spam protection around
> http://mail.yahoo.com



RE: [squid-users] client ip's

2008-04-10 Thread Jorge Bastos
Hum, I have some news on this.

I don't know why, but my system started giving me this information:

Kernel IP routing table
Destination     Gateway         Genmask         Flags Metric Ref  Use Iface
192.168.1.0     *               255.255.255.0   U     0      0    0   eth0
192.168.0.0     *               255.255.255.0   U     0      0    0   eth1
default         localhost       0.0.0.0         UG    0      0    0   eth1

Kernel IP routing table
Destination     Gateway         Genmask         Flags Metric Ref  Use Iface
192.168.1.0     0.0.0.0         255.255.255.0   U     0      0    0   eth0
192.168.0.0     0.0.0.0         255.255.255.0   U     0      0    0   eth1
0.0.0.0         192.168.0.254   0.0.0.0         UG    0      0    0   eth1


The fact is that the hosts file is correct:

cisne:~# cat /etc/hosts
127.0.0.1   localhost

I only have this there

I know this is not Squid-related, but maybe you guys can give me a hand:
I have no idea why it is resolving 192.168.0.254 to localhost.





> -Original Message-
> From: Jorge Bastos [mailto:[EMAIL PROTECTED]
> Sent: sábado, 5 de Abril de 2008 21:23
> To: 'Henrik Nordstrom'
> Cc: 'Amos Jeffries'; squid-users@squid-cache.org
> Subject: RE: [squid-users] client ip's
> 
> This already worked with some of the 3.0 versions.
> Gonna try to play with my iptables rules and let you guys know.
> 
> 
> 
> 
> > -Original Message-
> > From: Henrik Nordstrom [mailto:[EMAIL PROTECTED]
> > Sent: sábado, 5 de Abril de 2008 19:38
> > To: Jorge Bastos
> > Cc: 'Amos Jeffries'; squid-users@squid-cache.org
> > Subject: RE: [squid-users] client ip's
> >
> > lr 2008-04-05 klockan 14:24 +0100 skrev Jorge Bastos:
> >
> > > I updated to last STABLE-4 on debian, but this still happens this
> > way.
> > > What can I do more?
> >
> > Good question.
> >
> > One thing you can try is to downgrade to Squid-2.6. If that shows the
> > same symptoms the problem is not within Squid but most likely in your
> > firewall ruleset or something else relevant to how the connections
> end
> > up at your Squid.
> >
> > Regards
> > Henrik
> 




RE: [squid-users] client ip's

2008-04-05 Thread Jorge Bastos
This already worked with some of the 3.0 versions.
Gonna try to play with my iptables rules and let you guys know.




> -Original Message-
> From: Henrik Nordstrom [mailto:[EMAIL PROTECTED]
> Sent: sábado, 5 de Abril de 2008 19:38
> To: Jorge Bastos
> Cc: 'Amos Jeffries'; squid-users@squid-cache.org
> Subject: RE: [squid-users] client ip's
> 
> lr 2008-04-05 klockan 14:24 +0100 skrev Jorge Bastos:
> 
> > I updated to last STABLE-4 on debian, but this still happens this
> way.
> > What can I do more?
> 
> Good question.
> 
> One thing you can try is to downgrade to Squid-2.6. If that shows the
> same symptoms the problem is not within Squid but most likely in your
> firewall ruleset or something else relevant to how the connections end
> up at your Squid.
> 
> Regards
> Henrik




RE: [squid-users] client ip's

2008-04-05 Thread Jorge Bastos
People,

I updated to the latest STABLE-4 on Debian, but this still happens.
What more can I do?

Jorge   

> -Original Message-
> From: Jorge Bastos [mailto:[EMAIL PROTECTED]
> Sent: quinta-feira, 3 de Abril de 2008 9:56
> To: 'Amos Jeffries'
> Cc: 'Henrik Nordstrom'; squid-users@squid-cache.org
> Subject: RE: [squid-users] client ip's
> 
> Hum, the last one's on debian.
> They were 3.0 PRE-X, but don't remember the number.
> 
> 
> 
> 
> > -Original Message-
> > From: Amos Jeffries [mailto:[EMAIL PROTECTED]
> > Sent: quinta-feira, 3 de Abril de 2008 6:08
> > To: Jorge Bastos
> > Cc: 'Henrik Nordstrom'; squid-users@squid-cache.org
> > Subject: Re: [squid-users] client ip's
> >
> > Jorge Bastos wrote:
> > > The rule I use to redirect traffic from 80 to 8080 is:
> > > I must remember, this was working before 3.0 stable1 or stable2
> (not
> > using
> > > stable2), I just saw this was happening now.
> >
> > What version did you upgrade from?
> >
> > >
> > > iptables -t nat -A PREROUTING -s 192.168.1.0/24 -p tcp --dport 80 -
> j
> > DNAT
> > > --to-destination 192.168.1.1:8080
> > >
> >
> > If squid is running on this same box I would recommend the REDIRECT
> > target instead of DNAT. It's less work for the kernel.
> >
> > The other possible issue is that you have your redirection rule at
> the
> > start of the NAT tables. The matching rule to allow squid traffic out
> > is
> > near the end.
> >
> > Even if you keep DNAT, they should be in this order:
> >
> > # allow squid traffic out okay.
> > iptables -t nat _A PREROUTING -s 192.168.1.1 -p tcp --dport 80 -j
> > ACCEPT
> > # redirect all other web traffic into squid.
> > iptables -t nat -A PREROUTING -s 192.168.1.0/24 -p tcp --dport 80 -j
> > REDIRECT --to-port 8080
> >
> > >
> > > cisne:~# iptables-save -t nat
> > > # Generated by iptables-save v1.4.0 on Wed Apr  2 17:12:25 2008
> > > *nat
> > > :PREROUTING ACCEPT [35:1650]
> > > :POSTROUTING ACCEPT [10307:1367320]
> > > :OUTPUT ACCEPT [66427:4357431]
> > > -A PREROUTING -d 193.164.158.105/32 -j DROP
> > > -A PREROUTING -i eth1 -p tcp -m tcp --dport 5111 -j DNAT --to-
> > destination
> > > 192.168.1.11:5900
> > > -A PREROUTING -i eth1 -p tcp -m tcp --dport 5901 -j DNAT --to-
> > destination
> > > 192.168.1.2:5900
> > > -A PREROUTING -i eth1 -p tcp -m tcp --dport 5969 -j DNAT --to-
> > destination
> > > 192.168.1.3:5900
> > > -A PREROUTING -i eth1 -p tcp -m tcp --dport 3389 -j DNAT --to-
> > destination
> > > 192.168.1.204:3389
> > > -A PREROUTING -s 192.168.1.0/24 -p tcp -m tcp --dport 80 -j DNAT
> > > --to-destination 192.168.1.1:8080
> > > -A PREROUTING -p gre -j ACCEPT
> > > -A PREROUTING -p icmp -j ACCEPT
> > > -A PREROUTING -p ah -j ACCEPT
> > > -A PREROUTING -p udp -m udp --dport 53 -j ACCEPT
> > > -A PREROUTING -p udp -m udp --dport 500 -j ACCEPT
> > > -A PREROUTING -p udp -m udp --dport 1723 -j ACCEPT
> > > -A PREROUTING -p udp -j ACCEPT
> > > -A PREROUTING -p tcp -m tcp --dport 20 -j ACCEPT
> > > -A PREROUTING -p tcp -m tcp --dport 21 -j ACCEPT
> > > -A PREROUTING -p tcp -m tcp --dport 22 -j ACCEPT
> > > -A PREROUTING -p tcp -m tcp --dport 23 -j ACCEPT
> > > -A PREROUTING -p tcp -m tcp --dport 25 -j ACCEPT
> > > -A PREROUTING -p tcp -m tcp --dport 43 -j ACCEPT
> > > -A PREROUTING -p tcp -m tcp --dport 79 -j ACCEPT
> > > -A PREROUTING -p tcp -m tcp --dport 123 -j ACCEPT
> > > -A PREROUTING -p tcp -m tcp --dport 143 -j ACCEPT
> > > -A PREROUTING -p tcp -m tcp --dport 443 -j ACCEPT
> > > -A PREROUTING -d 80.172.172.34/32 -p tcp -m tcp --dport 444 -j
> ACCEPT
> > > -A PREROUTING -p tcp -m tcp --dport 1723 -j ACCEPT
> > > -A PREROUTING -p tcp -m tcp --dport 1863 -j ACCEPT
> > > -A PREROUTING -p tcp -m tcp --dport 3306 -j ACCEPT
> > > -A PREROUTING -p tcp -m tcp --dport 3389 -j ACCEPT
> > > -A PREROUTING -d 80.172.172.34/32 -p tcp -m tcp --dport 5000 -j
> > ACCEPT
> > > -A PREROUTING -p tcp -m tcp --dport 5190 -j ACCEPT
> > > -A PREROUTING -p tcp -m tcp --dport 5900 -j ACCEPT
> > > -A PREROUTING -p tcp -m tcp --dport 5901 -j ACCEPT
> > > -A PREROUTING -p tcp -m tcp --dport 6667 -j ACCEPT
> > > -A PREROUTING -s 192.168.1.0/24 -d 192.168.1.206/32 -p tcp -m tcp -
> -
> &

RE: [squid-users] limiting upload bandwidth? pls

2008-04-03 Thread Jorge Bastos
The best way to do this is with QoS.
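
[A minimal Linux tc sketch of what that shaping could look like; the
interface name, port number and rates here are made-up examples, not a
tested setup:]

---
# egress (upload) shaping on the WAN interface
tc qdisc add dev eth1 root handle 1: htb default 10
tc class add dev eth1 parent 1: classid 1:10 htb rate 100mbit
tc class add dev eth1 parent 1: classid 1:20 htb rate 1mbit ceil 1mbit
# send traffic for the application's port (e.g. 5555) to the slow class
tc filter add dev eth1 parent 1: protocol ip u32 match ip dport 5555 0xffff flowid 1:20
---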


> -Original Message-
> From: Tvrtko Majstorović [mailto:[EMAIL PROTECTED]
> Sent: quinta-feira, 3 de Abril de 2008 13:08
> To: squid-users@squid-cache.org
> Subject: [squid-users] limiting upload bandwidth? pls
> 
> Hi,
> 
> I need to limit 'upload' bandwidth on my network but for just one
> particular application that uses 'x' port. I know it has to be done
> using delay pools, but just don't know how to configure squid. Can
> someone please tell me what must I do to achieve this? Do I need
> transparent proxy for this? Short explanation would be nice, and maybe
> some configuration snippets. If someone has a link to this kinda
> tutorial it would be nice.
> I'm new to Squid, and I did a basic configuration, but I'm starting to
> get confused a little bit with all those options.
> 
> P.S I dont need to limit download, just upload bandwidth.
> 
> Thanks in Advance,
> Tvrtko.



RE: [squid-users] client ip's

2008-04-03 Thread Jorge Bastos
Hum, the last ones on Debian.
They were 3.0 PRE-X; I don't remember the exact number.




> -Original Message-
> From: Amos Jeffries [mailto:[EMAIL PROTECTED]
> Sent: quinta-feira, 3 de Abril de 2008 6:08
> To: Jorge Bastos
> Cc: 'Henrik Nordstrom'; squid-users@squid-cache.org
> Subject: Re: [squid-users] client ip's
> 
> Jorge Bastos wrote:
> > The rule I use to redirect traffic from 80 to 8080 is:
> > I must remember, this was working before 3.0 stable1 or stable2 (not
> using
> > stable2), I just saw this was happening now.
> 
> What version did you upgrade from?
> 
> >
> > iptables -t nat -A PREROUTING -s 192.168.1.0/24 -p tcp --dport 80 -j
> DNAT
> > --to-destination 192.168.1.1:8080
> >
> 
> If squid is running on this same box I would recommend the REDIRECT
> target instead of DNAT. It's less work for the kernel.
> 
> The other possible issue is that you have your redirection rule at the
> start of the NAT tables. The matching rule to allow squid traffic out
> is
> near the end.
> 
> Even if you keep DNAT, they should be in this order:
> 
> # allow squid traffic out okay.
> iptables -t nat -A PREROUTING -s 192.168.1.1 -p tcp --dport 80 -j
> ACCEPT
> # redirect all other web traffic into squid.
> iptables -t nat -A PREROUTING -s 192.168.1.0/24 -p tcp --dport 80 -j
> REDIRECT --to-port 8080
> 
> >
> > cisne:~# iptables-save -t nat
> > # Generated by iptables-save v1.4.0 on Wed Apr  2 17:12:25 2008
> > *nat
> > :PREROUTING ACCEPT [35:1650]
> > :POSTROUTING ACCEPT [10307:1367320]
> > :OUTPUT ACCEPT [66427:4357431]
> > -A PREROUTING -d 193.164.158.105/32 -j DROP
> > -A PREROUTING -i eth1 -p tcp -m tcp --dport 5111 -j DNAT --to-
> destination
> > 192.168.1.11:5900
> > -A PREROUTING -i eth1 -p tcp -m tcp --dport 5901 -j DNAT --to-
> destination
> > 192.168.1.2:5900
> > -A PREROUTING -i eth1 -p tcp -m tcp --dport 5969 -j DNAT --to-
> destination
> > 192.168.1.3:5900
> > -A PREROUTING -i eth1 -p tcp -m tcp --dport 3389 -j DNAT --to-
> destination
> > 192.168.1.204:3389
> > -A PREROUTING -s 192.168.1.0/24 -p tcp -m tcp --dport 80 -j DNAT
> > --to-destination 192.168.1.1:8080
> > -A PREROUTING -p gre -j ACCEPT
> > -A PREROUTING -p icmp -j ACCEPT
> > -A PREROUTING -p ah -j ACCEPT
> > -A PREROUTING -p udp -m udp --dport 53 -j ACCEPT
> > -A PREROUTING -p udp -m udp --dport 500 -j ACCEPT
> > -A PREROUTING -p udp -m udp --dport 1723 -j ACCEPT
> > -A PREROUTING -p udp -j ACCEPT
> > -A PREROUTING -p tcp -m tcp --dport 20 -j ACCEPT
> > -A PREROUTING -p tcp -m tcp --dport 21 -j ACCEPT
> > -A PREROUTING -p tcp -m tcp --dport 22 -j ACCEPT
> > -A PREROUTING -p tcp -m tcp --dport 23 -j ACCEPT
> > -A PREROUTING -p tcp -m tcp --dport 25 -j ACCEPT
> > -A PREROUTING -p tcp -m tcp --dport 43 -j ACCEPT
> > -A PREROUTING -p tcp -m tcp --dport 79 -j ACCEPT
> > -A PREROUTING -p tcp -m tcp --dport 123 -j ACCEPT
> > -A PREROUTING -p tcp -m tcp --dport 143 -j ACCEPT
> > -A PREROUTING -p tcp -m tcp --dport 443 -j ACCEPT
> > -A PREROUTING -d 80.172.172.34/32 -p tcp -m tcp --dport 444 -j ACCEPT
> > -A PREROUTING -p tcp -m tcp --dport 1723 -j ACCEPT
> > -A PREROUTING -p tcp -m tcp --dport 1863 -j ACCEPT
> > -A PREROUTING -p tcp -m tcp --dport 3306 -j ACCEPT
> > -A PREROUTING -p tcp -m tcp --dport 3389 -j ACCEPT
> > -A PREROUTING -d 80.172.172.34/32 -p tcp -m tcp --dport 5000 -j
> ACCEPT
> > -A PREROUTING -p tcp -m tcp --dport 5190 -j ACCEPT
> > -A PREROUTING -p tcp -m tcp --dport 5900 -j ACCEPT
> > -A PREROUTING -p tcp -m tcp --dport 5901 -j ACCEPT
> > -A PREROUTING -p tcp -m tcp --dport 6667 -j ACCEPT
> > -A PREROUTING -s 192.168.1.0/24 -d 192.168.1.206/32 -p tcp -m tcp --
> dport
> >  -j ACCEPT
> > -A PREROUTING -d 192.168.1.1/32 -p tcp -m tcp --dport 8080 -j ACCEPT
> > -A PREROUTING -i eth1 -p tcp -m tcp --dport 30106 -j DNAT --to-
> destination
> > 192.168.1.224:30106
> > -A PREROUTING -s 192.168.1.0/24 -p tcp -m tcp --dport 62500:63500
> > --tcp-flags FIN,SYN,RST,ACK SYN -j ACCEPT
> > -A PREROUTING -j DROP
> > -A POSTROUTING -o eth1 -j MASQUERADE
> > COMMIT
> > # Completed on Wed Apr  2 17:12:26 2008
> >
> > -Original Message-
> > From: Henrik Nordstrom [mailto:[EMAIL PROTECTED]
> > Sent: quarta-feira, 2 de Abril de 2008 11:42
> > To: Jorge Bastos
> > Cc: squid-users@squid-cache.org
> > Subject: RE: [squid-users] client ip's
> >
> > WHat do your iptables NAT rules look like?
> >
> > iptables-save -t nat
> >
> > ons 200

RE: [squid-users] client ip's

2008-04-02 Thread Jorge Bastos
The rule I use to redirect traffic from 80 to 8080 is below.
I should mention that this was working before 3.0 stable1 or stable2 (I'm not
using stable2); I only just noticed it happening now.

iptables -t nat -A PREROUTING -s 192.168.1.0/24 -p tcp --dport 80 -j DNAT
--to-destination 192.168.1.1:8080
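When Squid runs on the gateway box itself, the same interception can be done with REDIRECT instead of DNAT (a sketch using the subnet and port from the rule above):

```shell
# Redirect LAN web traffic into the local Squid on port 8080.
# REDIRECT rewrites the destination to the box's own address,
# which is less work for the kernel than a full DNAT.
iptables -t nat -A PREROUTING -s 192.168.1.0/24 -p tcp --dport 80 \
    -j REDIRECT --to-port 8080
```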


cisne:~# iptables-save -t nat
# Generated by iptables-save v1.4.0 on Wed Apr  2 17:12:25 2008
*nat
:PREROUTING ACCEPT [35:1650]
:POSTROUTING ACCEPT [10307:1367320]
:OUTPUT ACCEPT [66427:4357431]
-A PREROUTING -d 193.164.158.105/32 -j DROP
-A PREROUTING -i eth1 -p tcp -m tcp --dport 5111 -j DNAT --to-destination
192.168.1.11:5900
-A PREROUTING -i eth1 -p tcp -m tcp --dport 5901 -j DNAT --to-destination
192.168.1.2:5900
-A PREROUTING -i eth1 -p tcp -m tcp --dport 5969 -j DNAT --to-destination
192.168.1.3:5900
-A PREROUTING -i eth1 -p tcp -m tcp --dport 3389 -j DNAT --to-destination
192.168.1.204:3389
-A PREROUTING -s 192.168.1.0/24 -p tcp -m tcp --dport 80 -j DNAT
--to-destination 192.168.1.1:8080
-A PREROUTING -p gre -j ACCEPT
-A PREROUTING -p icmp -j ACCEPT
-A PREROUTING -p ah -j ACCEPT
-A PREROUTING -p udp -m udp --dport 53 -j ACCEPT
-A PREROUTING -p udp -m udp --dport 500 -j ACCEPT
-A PREROUTING -p udp -m udp --dport 1723 -j ACCEPT
-A PREROUTING -p udp -j ACCEPT
-A PREROUTING -p tcp -m tcp --dport 20 -j ACCEPT
-A PREROUTING -p tcp -m tcp --dport 21 -j ACCEPT
-A PREROUTING -p tcp -m tcp --dport 22 -j ACCEPT
-A PREROUTING -p tcp -m tcp --dport 23 -j ACCEPT
-A PREROUTING -p tcp -m tcp --dport 25 -j ACCEPT
-A PREROUTING -p tcp -m tcp --dport 43 -j ACCEPT
-A PREROUTING -p tcp -m tcp --dport 79 -j ACCEPT
-A PREROUTING -p tcp -m tcp --dport 123 -j ACCEPT
-A PREROUTING -p tcp -m tcp --dport 143 -j ACCEPT
-A PREROUTING -p tcp -m tcp --dport 443 -j ACCEPT
-A PREROUTING -d 80.172.172.34/32 -p tcp -m tcp --dport 444 -j ACCEPT
-A PREROUTING -p tcp -m tcp --dport 1723 -j ACCEPT
-A PREROUTING -p tcp -m tcp --dport 1863 -j ACCEPT
-A PREROUTING -p tcp -m tcp --dport 3306 -j ACCEPT
-A PREROUTING -p tcp -m tcp --dport 3389 -j ACCEPT
-A PREROUTING -d 80.172.172.34/32 -p tcp -m tcp --dport 5000 -j ACCEPT
-A PREROUTING -p tcp -m tcp --dport 5190 -j ACCEPT
-A PREROUTING -p tcp -m tcp --dport 5900 -j ACCEPT
-A PREROUTING -p tcp -m tcp --dport 5901 -j ACCEPT
-A PREROUTING -p tcp -m tcp --dport 6667 -j ACCEPT
-A PREROUTING -s 192.168.1.0/24 -d 192.168.1.206/32 -p tcp -m tcp --dport
 -j ACCEPT
-A PREROUTING -d 192.168.1.1/32 -p tcp -m tcp --dport 8080 -j ACCEPT
-A PREROUTING -i eth1 -p tcp -m tcp --dport 30106 -j DNAT --to-destination
192.168.1.224:30106
-A PREROUTING -s 192.168.1.0/24 -p tcp -m tcp --dport 62500:63500
--tcp-flags FIN,SYN,RST,ACK SYN -j ACCEPT
-A PREROUTING -j DROP
-A POSTROUTING -o eth1 -j MASQUERADE
COMMIT
# Completed on Wed Apr  2 17:12:26 2008

-Original Message-
From: Henrik Nordstrom [mailto:[EMAIL PROTECTED] 
Sent: quarta-feira, 2 de Abril de 2008 11:42
To: Jorge Bastos
Cc: squid-users@squid-cache.org
Subject: RE: [squid-users] client ip's

What do your iptables NAT rules look like?

iptables-save -t nat

ons 2008-04-02 klockan 09:18 +0100 skrev Jorge Bastos:
> Transparent proxy
> 
> Squid running on: 8080
> And I forward 80 => 8080 (squid) => web
> 
> My iptables rules are intact, I believe it was from 3.0 stable 1 or 2 that
> this started to happen.
> 
> 
> 
> 
> > -Original Message-
> > From: Henrik Nordstrom [mailto:[EMAIL PROTECTED]
> > Sent: quarta-feira, 2 de Abril de 2008 0:12
> > To: Jorge Bastos
> > Cc: squid-users@squid-cache.org
> > Subject: RE: [squid-users] client ip's
> > 
> > tis 2008-04-01 klockan 12:29 +0100 skrev Jorge Bastos:
> > > No, just squid himself.
> > 
> > As a plain proxy, or playing with NAT?
> > 
> > Regards
> > Henrik
> 




RE: [squid-users] client ip's

2008-04-02 Thread Jorge Bastos
Transparent proxy

Squid running on: 8080
And I forward 80 => 8080 (squid) => web

My iptables rules are intact, I believe it was from 3.0 stable 1 or 2 that
this started to happen.




> -Original Message-
> From: Henrik Nordstrom [mailto:[EMAIL PROTECTED]
> Sent: quarta-feira, 2 de Abril de 2008 0:12
> To: Jorge Bastos
> Cc: squid-users@squid-cache.org
> Subject: RE: [squid-users] client ip's
> 
> tis 2008-04-01 klockan 12:29 +0100 skrev Jorge Bastos:
> > No, just squid himself.
> 
> As a plain proxy, or playing with NAT?
> 
> Regards
> Henrik




RE: [squid-users] client ip's

2008-04-01 Thread Jorge Bastos
No, just Squid itself.




> -Original Message-
> From: Henrik Nordstrom [mailto:[EMAIL PROTECTED]
> Sent: terça-feira, 1 de Abril de 2008 10:22
> To: Jorge Bastos
> Cc: squid-users@squid-cache.org
> Subject: Re: [squid-users] client ip's
> 
> 
> tis 2008-04-01 klockan 10:07 +0100 skrev Jorge Bastos:
> > Hi,
> >
> > My squid always report "localhost" on the client's IP.
> > What can I do to correct this? Only started to happen with the last
> 3.0
> > stable2.
> 
> are you using dansguardian or another filtering proxy infront of your
> Squid?
> 
> Regards
> Henrik




[squid-users] client ip's

2008-04-01 Thread Jorge Bastos
Hi,

My Squid always reports "localhost" as the client's IP.
What can I do to correct this? It only started to happen with the latest 3.0
stable2.


---
1207040749.939436 localhost TCP_MISS/200 1528 GET
http://library.gnome.org/skin/tab_right.png - DIRECT/209.132.176.176
image/png





RE: [squid-users] Re: squid meetup dinner/drinks

2008-03-01 Thread Jorge Bastos
Hi guys,
I know the Victoria places on the map; I've been there once!
Too bad I'm not there; I was on vacation!




-Original Message-
From: Kinkie [mailto:[EMAIL PROTECTED] 
Sent: sábado, 1 de Março de 2008 17:35
To: Robert Collins
Cc: Squid Users; Squid Developers
Subject: [squid-users] Re: squid meetup dinner/drinks

On Sat, Mar 1, 2008 at 4:26 PM, Robert Collins <[EMAIL PROTECTED]> wrote:
> Hi,
> The squid meetup has been going well :).

This is great news! It makes me all the more sorry for not being there with you.
I'm in Garmisch-Partenkirchen (DE), but with you in spirit.

Have fun, and happy squidding!

-- 
/kinkie



RE: [squid-users] Squid Quota

2007-12-11 Thread Jorge Bastos
Interesting.
Is there a website for this?


-Original Message-
From: Cassiano Martin [mailto:[EMAIL PROTECTED] 
Sent: terça-feira, 11 de Dezembro de 2007 17:06
To: squid-users@squid-cache.org
Subject: [squid-users] Squid Quota

Hi All!

I wrote a Squid quota daemon (sorry admins, if this is not the right
place to announce it!) and it's working, though still in the testing stage.
It's a Squid redirector with a MySQL DB as backend, plus a log reader that
feeds the DB with information. You can set how much a user, or an IP, can
use, in MB per day.


I'd like someone to help me with the project, as I don't have much free
time. Anyone interested, please contact me. I'd be glad. :-)
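A classic Squid redirector is just a program that reads one request per line on stdin and answers with a replacement URL, or a blank line for "unchanged". A minimal sketch of the quota idea in Python, with an in-memory dict standing in for the MySQL backend the daemon actually uses (QUOTA_MB, DENY_URL, and the usage dict are illustrative assumptions, not the project's real names):

```python
#!/usr/bin/env python3
# Sketch of a Squid redirector enforcing a per-user daily quota.
# The real daemon uses MySQL; here a dict fed by a log reader stands in.
import sys

QUOTA_MB = 100   # allowed MB per user per day (assumed value)
DENY_URL = "http://localhost/quota-exceeded.html"  # placeholder block page
usage_mb = {}    # user -> MB used today

def handle(line):
    # Classic redirector input: URL client_ip/fqdn user method [...]
    parts = line.split()
    if len(parts) < 3:
        return ""                    # malformed line: pass through unchanged
    url, user = parts[0], parts[2]
    if usage_mb.get(user, 0) >= QUOTA_MB:
        return DENY_URL              # over quota: redirect to the block page
    return ""                        # blank reply = leave the URL unchanged

if __name__ == "__main__":
    for line in sys.stdin:
        sys.stdout.write(handle(line.rstrip("\n")) + "\n")
        sys.stdout.flush()           # Squid needs unbuffered replies
```

Squid would launch this via its redirector/url-rewrite setting, one helper process per configured child.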


Thanks!



RE: [squid-users] Ferramenta de administraçao centra lizada

2007-11-22 Thread Jorge Bastos
Carlos,

In English!!!
(Em inglês)


-Original Message-
From: Carlos Bispo [mailto:[EMAIL PROTECTED] 
Sent: quinta-feira, 22 de Novembro de 2007 19:27
To: squid-users@squid-cache.org
Subject: [squid-users] Ferramenta de administraçao centralizada

Hi everyone,
  I need a centralized administration tool for Squid, so that when I
update an access list or create a user on the network, it updates all the
proxy servers on the network. This is because I have 3 branch offices, each
with its own Internet access and its own server, and updating and managing
those servers is difficult even with a manager at each one. I use Webmin to
administer the lists and controls via the web. I would like to apply the
lists on the head-office server and have it replicate them to all the
branches.
   Does anyone know of such a tool? ...or will it have to be shell
script after all... heheh

Regards to all,
-- 
-- 
Carlos Bispo
[EMAIL PROTECTED]




-- 
Carlos Bispo
91096385
[EMAIL PROTECTED]
Messenger: [EMAIL PROTECTED]



RE: [squid-users] squid3 WindowsUpdate failed

2007-11-17 Thread Jorge Bastos
Alex,
For now I'm going to consider this fixed.
With the Debian 3.0.RC1-2 package, Luigi added the resume patch as I requested
and it seems to work; I may have done the test wrong the other time.
I'll watch this for a few days and warn you if I notice anything.
I don't know about feedback from other users.



-Original Message-
From: Alex Rousskov [mailto:[EMAIL PROTECTED] 
Sent: terça-feira, 6 de Novembro de 2007 15:19
To: Jorge Bastos
Cc: squid-users@squid-cache.org
Subject: RE: [squid-users] squid3 WindowsUpdate failed


On Tue, 2007-11-06 at 09:24 +, Jorge Bastos wrote:
> Alex,
> The only ACL i have in squid.conf is:
> 
> ---
> acl all_cache src 0.0.0.0/0.0.0.0
> no_cache deny all_cache
> ---

OK, thanks.

> I'm one of the people who's having this problems.
> Now I'm using 3.0.PRE6 until this is fixed.

Can you help us troubleshoot the problem? Can you run the latest Squid3
daily snapshot and collect full debugging (debug_options ALL,9) logs
when Windows Update is malfunctioning?

Thank you,

Alex.

> -Original Message-
> From: Alex Rousskov [mailto:[EMAIL PROTECTED] 
> Sent: segunda-feira, 5 de Novembro de 2007 16:31
> To: Amos Jeffries
> Cc: John Mok; squid-users@squid-cache.org
> Subject: Re: [squid-users] squid3 WindowsUpdate failed
> 
> On Sun, 2007-11-04 at 19:30 +1300, Amos Jeffries wrote:
> > I have just had the opportunity to do WU on a customers box and
> > managed to reproduce one of the possible WU failures.
> > 
> > This one was using WinXP, and the old WindowsUpdate (NOT 
> > MicrosoftUpdate, teht remains untested). With squid configured to
> > permit 
> > client access to:
> > 
> > # Windows Update / Microsoft Update
> > #
> > redir.metaservices.microsoft.com
> > images.metaservices.microsoft.com
> > c.microsoft.com
> > windowsupdate.microsoft.com
> > #
> > # WinXP / Win2k
> > .update.microsoft.com
> > download.windowsupdate.com
> > # Win Vista
> > .download.windowsupdate.com
> > # Win98
> > wustat.windows.com
> > crl.microsoft.com
> > 
> > AND also CONNECT access to www.update.microsoft.com:443
> > 
> > PROBLEM:
> >The client box detects a needed update,
> >then during the "Download Updates" phase it says "...failed!" and
> > stops.
> > 
> > CAUSE:
> > 
> > This was caused by a bug in squid reading the ACL:
> >download.windowsupdate.com
> >   ...
> >.download.windowsupdate.com
> > 
> >   - squid would detect that download.windowsupdate.com was a
> > subdomain 
> > of .download.windowsupdate.com  and .download.windowsupdate.com would
> > be 
> > culled off the ACL as unneeded.
> > 
> >   - That culled bit held the wildcard letting v4.download.* and 
> > www.download.* be retrieved later in the process.
> > 
> >   - BUT, specifying JUST .download.windowsupdate.com would cause 
> > download.windowsupdate.com/fubar to FAIL under the same circumstances.
> > 
> > during the WU process requests for application at 
> > www.download.windowsupdate.com/fubar and K/Q updates at 
> > v(3|4|5).download.windowsupdate.com/fubar2
> > would result in a 403 and thus the FAIL.
> > 
> > 
> > SOLUTION:
> >   Changing the wildcard match to an explicit for fixes this and WU 
> > succeeds again.
> > OR,
> >   Changing the wildcard to .windowsupdate.com also fixes the problem
> > for this test.
> 
> Can other folks experiencing Windows Update troubles with Squid3 confirm
> that their setup does not have the same ACL problem?
> 
> In general, if we do not find a way to get more information about the
> Windows Update problem, we would have to assume it does not exist in
> most environments and release Squid3 STABLE "as is". If you want the
> problem fixed before the stable Squid3 release, please help us reproduce
> or debug the problem.
> 
> Thank you,
> 
> Alex.
> 
> 




RE: [squid-users] squid3 WindowsUpdate failed

2007-11-07 Thread Jorge Bastos
I can help; both parties have an interest in this, me and the Squid project.
I asked the Squid3 Debian package maintainer to upload the daily sources
again for testing.
I don't want to compile from source since I always use binaries.
I'll let you know when I have news.
If you ask the maintainer to upload the daily sources again as well, that
may reinforce my request and he may do it.




-Original Message-
From: Alex Rousskov [mailto:[EMAIL PROTECTED] 
Sent: terça-feira, 6 de Novembro de 2007 15:19
To: Jorge Bastos
Cc: squid-users@squid-cache.org
Subject: RE: [squid-users] squid3 WindowsUpdate failed


On Tue, 2007-11-06 at 09:24 +, Jorge Bastos wrote:
> Alex,
> The only ACL i have in squid.conf is:
> 
> ---
> acl all_cache src 0.0.0.0/0.0.0.0
> no_cache deny all_cache
> ---

OK, thanks.

> I'm one of the people who's having this problems.
> Now I'm using 3.0.PRE6 until this is fixed.

Can you help us troubleshoot the problem? Can you run the latest Squid3
daily snapshot and collect full debugging (debug_options ALL,9) logs
when Windows Update is malfunctioning?

Thank you,

Alex.

> -Original Message-
> From: Alex Rousskov [mailto:[EMAIL PROTECTED] 
> Sent: segunda-feira, 5 de Novembro de 2007 16:31
> To: Amos Jeffries
> Cc: John Mok; squid-users@squid-cache.org
> Subject: Re: [squid-users] squid3 WindowsUpdate failed
> 
> On Sun, 2007-11-04 at 19:30 +1300, Amos Jeffries wrote:
> > I have just had the opportunity to do WU on a customers box and
> > managed to reproduce one of the possible WU failures.
> > 
> > This one was using WinXP, and the old WindowsUpdate (NOT 
> > MicrosoftUpdate, teht remains untested). With squid configured to
> > permit 
> > client access to:
> > 
> > # Windows Update / Microsoft Update
> > #
> > redir.metaservices.microsoft.com
> > images.metaservices.microsoft.com
> > c.microsoft.com
> > windowsupdate.microsoft.com
> > #
> > # WinXP / Win2k
> > .update.microsoft.com
> > download.windowsupdate.com
> > # Win Vista
> > .download.windowsupdate.com
> > # Win98
> > wustat.windows.com
> > crl.microsoft.com
> > 
> > AND also CONNECT access to www.update.microsoft.com:443
> > 
> > PROBLEM:
> >The client box detects a needed update,
> >then during the "Download Updates" phase it says "...failed!" and
> > stops.
> > 
> > CAUSE:
> > 
> > This was caused by a bug in squid reading the ACL:
> >download.windowsupdate.com
> >   ...
> >.download.windowsupdate.com
> > 
> >   - squid would detect that download.windowsupdate.com was a
> > subdomain 
> > of .download.windowsupdate.com  and .download.windowsupdate.com would
> > be 
> > culled off the ACL as unneeded.
> > 
> >   - That culled bit held the wildcard letting v4.download.* and 
> > www.download.* be retrieved later in the process.
> > 
> >   - BUT, specifying JUST .download.windowsupdate.com would cause 
> > download.windowsupdate.com/fubar to FAIL under the same circumstances.
> > 
> > during the WU process requests for application at 
> > www.download.windowsupdate.com/fubar and K/Q updates at 
> > v(3|4|5).download.windowsupdate.com/fubar2
> > would result in a 403 and thus the FAIL.
> > 
> > 
> > SOLUTION:
> >   Changing the wildcard match to an explicit for fixes this and WU 
> > succeeds again.
> > OR,
> >   Changing the wildcard to .windowsupdate.com also fixes the problem
> > for this test.
> 
> Can other folks experiencing Windows Update troubles with Squid3 confirm
> that their setup does not have the same ACL problem?
> 
> In general, if we do not find a way to get more information about the
> Windows Update problem, we would have to assume it does not exist in
> most environments and release Squid3 STABLE "as is". If you want the
> problem fixed before the stable Squid3 release, please help us reproduce
> or debug the problem.
> 
> Thank you,
> 
> Alex.
> 
> 




RE: [squid-users] squid3 WindowsUpdate failed

2007-11-06 Thread Jorge Bastos
On my machine it's 3.0-PRE6 and 3.0-RC1.



-Original Message-
From: Adrian Chadd [mailto:[EMAIL PROTECTED] 
Sent: terça-feira, 6 de Novembro de 2007 10:02
To: Jorge Bastos
Cc: squid-users@squid-cache.org
Subject: Re: [squid-users] squid3 WindowsUpdate failed

On Tue, Nov 06, 2007, Jorge Bastos wrote:
> Alex,
> The only ACL i have in squid.conf is:
> 
> ---
> acl all_cache src 0.0.0.0/0.0.0.0
> no_cache deny all_cache
> ---
> 
> I'm one of the people who's having this problems.
> Now I'm using 3.0.PRE6 until this is fixed.

So wait - Squid-3.0.PRE6 works but Squid-3.0.PRE7 with exactly the same
configuration file doesn't?



Adrian


-- 
- Xenion - http://www.xenion.com.au/ - VPS Hosting - Commercial Squid
Support -
- $25/pm entry-level VPSes w/ capped bandwidth charges available in WA -



RE: [squid-users] squid3 WindowsUpdate failed

2007-11-06 Thread Jorge Bastos
Alex,
The only ACL i have in squid.conf is:

---
acl all_cache src 0.0.0.0/0.0.0.0
no_cache deny all_cache
---

I'm one of the people having these problems.
Now I'm using 3.0.PRE6 until this is fixed.



-Original Message-
From: Alex Rousskov [mailto:[EMAIL PROTECTED] 
Sent: segunda-feira, 5 de Novembro de 2007 16:31
To: Amos Jeffries
Cc: John Mok; squid-users@squid-cache.org
Subject: Re: [squid-users] squid3 WindowsUpdate failed

On Sun, 2007-11-04 at 19:30 +1300, Amos Jeffries wrote:
> I have just had the opportunity to do WU on a customers box and
> managed to reproduce one of the possible WU failures.
> 
> This one was using WinXP, and the old WindowsUpdate (NOT 
> MicrosoftUpdate, teht remains untested). With squid configured to
> permit 
> client access to:
> 
> # Windows Update / Microsoft Update
> #
> redir.metaservices.microsoft.com
> images.metaservices.microsoft.com
> c.microsoft.com
> windowsupdate.microsoft.com
> #
> # WinXP / Win2k
> .update.microsoft.com
> download.windowsupdate.com
> # Win Vista
> .download.windowsupdate.com
> # Win98
> wustat.windows.com
> crl.microsoft.com
> 
> AND also CONNECT access to www.update.microsoft.com:443
> 
> PROBLEM:
>The client box detects a needed update,
>then during the "Download Updates" phase it says "...failed!" and
> stops.
> 
> CAUSE:
> 
> This was caused by a bug in squid reading the ACL:
>download.windowsupdate.com
>   ...
>.download.windowsupdate.com
> 
>   - squid would detect that download.windowsupdate.com was a
> subdomain 
> of .download.windowsupdate.com  and .download.windowsupdate.com would
> be 
> culled off the ACL as unneeded.
> 
>   - That culled bit held the wildcard letting v4.download.* and 
> www.download.* be retrieved later in the process.
> 
>   - BUT, specifying JUST .download.windowsupdate.com would cause 
> download.windowsupdate.com/fubar to FAIL under the same circumstances.
> 
> during the WU process requests for application at 
> www.download.windowsupdate.com/fubar and K/Q updates at 
> v(3|4|5).download.windowsupdate.com/fubar2
> would result in a 403 and thus the FAIL.
> 
> 
> SOLUTION:
>   Changing the wildcard match to an explicit for fixes this and WU 
> succeeds again.
> OR,
>   Changing the wildcard to .windowsupdate.com also fixes the problem
> for this test.

Can other folks experiencing Windows Update troubles with Squid3 confirm
that their setup does not have the same ACL problem?

In general, if we do not find a way to get more information about the
Windows Update problem, we would have to assume it does not exist in
most environments and release Squid3 STABLE "as is". If you want the
problem fixed before the stable Squid3 release, please help us reproduce
or debug the problem.

Thank you,

Alex.





RE: [squid-users] squid3 WindowsUpdate failed

2007-11-01 Thread Jorge Bastos
I've updated Squid with the resume fix, and WU is still not working.

---
squid3 (3.0.RC1-2) unstable; urgency=low

  * debian/patches/08-resume-http.dpatch
- Added upstream patch fixing failure to resume downloads
---

Any idea?



-Original Message-
From: Amos Jeffries [mailto:[EMAIL PROTECTED] 
Sent: quarta-feira, 31 de Outubro de 2007 2:43
To: Reinhard Haller
Cc: squid-users@squid-cache.org
Subject: Re: [squid-users] squid3 WindowsUpdate failed

> Amos Jeffries schrieb:
>> John Mok wrote:
>>> Hi,
>>>
>>> I am using Squid3 nightly built 20071026 running on Ubuntu 6.06 LTS
>>> with the compilation options :-
>>>
>>> ./configure --with-pthreads --enable-icap-client
>>>
>>> I tried both (i) the configurations with default option, or (ii)
>>> icap-enabled options, the Windows client failed to get WindowsUpdate
>>> (see the following log).
>>>
>>
>> Where is this failure you speak of?
>>   The log you posted showed a proper link to WindowsUpdate, with all
>> the static content coming from cache (TCP_HIT/TCP_IMS_HIT) and the
>> dynamic pages and updates being brought in from M$ (TCP_MISS)
>>
>> If your client got the custom M$ "Windows Update failed" page.
>> Then I suspect you have overlooked a M$ nasty:
>>   WU requires an HTTPS 'validation' test.
>>   You MUST permit an HTTP 'CONNECT' request to 65.55.184.125:443.
>>   (the IPA being that of www.update.microsoft.com from your current
>> location)
>>
>> This bypass needs to be made on your firewall. WU will NOT always
>> attempt it through the configured proxy :-(
>>
>> The best you can do is bypass it at the FW and also configure the
>> proxy manually in IE, then run "proxycfg -u" in command line on the
>> windows box, and hope that the particular box update level will use
>> the proxy for it.
>>
>>
>> Amos
> Sorry Amos,
>
> the problem exists! It appeared with squid 3.0 RC1. After the downgrade
> to 2.6 (urlgroup is missing in 3.0) I don't  have any problem with WU.
> The bypass in the firewall is not needed for proper operation.

Lucky you, looks like you have a good up-to-date user base then :). Mine
have trouble in WinXP SP1 and some earlier versions of the ActiveX WU'er
they call MicrosoftUpdate. GenuineAdvantage my a*%#.

>
> Reinhard
>

IIRC others earlier found that WU used range requests to speed downloads.
I have never confirmed this myself.

Anyway, a bug has just been found in 3.0.RC1 that caused certain range
requests to close prematurely.
http://www.squid-cache.org/bugs/show_bug.cgi?id=2116

A fix has been incorporated in the next daily snapshot. If anyone is
having this problem with 3.0.RC1, please give the 31 Oct or later
snapshots a try and see if that fixes your problem.
If its still present it will need to be reported as a bug with traces, etc.
Thank you.

Amos





RE: [squid-users] squid3 WindowsUpdate failed

2007-10-31 Thread Jorge Bastos
I forgive you.
But just this once :P



-Original Message-
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] 
Sent: quarta-feira, 31 de Outubro de 2007 11:09
To: [EMAIL PROTECTED]
Cc: squid-users@squid-cache.org
Subject: RE: [squid-users] squid3 WindowsUpdate failed

As I said, just me being lazy,, sorry about that! 

-Original Message-
From: Jorge Bastos [mailto:[EMAIL PROTECTED] 
Sent: 31 October 2007 10:43
To: squid-users@squid-cache.org
Subject: RE: [squid-users] squid3 WindowsUpdate failed

Not an option, at least for me.
I don't think that's the way to resolve things.
This problem with WU may happen with other things too, and the correct way is
to fix this, not to use other software.



-Original Message-
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED]
Sent: quarta-feira, 31 de Outubro de 2007 10:29
To: [EMAIL PROTECTED]; [EMAIL PROTECTED]
Cc: squid-users@squid-cache.org
Subject: RE: [squid-users] squid3 WindowsUpdate failed

Hi all..

I know that this is not the answer that you are looking for, but why not
just install a WSUS server internally? Then point all your clients to it
via AD policy. (me being a lazy bugger here!)

Jay

-Original Message-
From: Amos Jeffries [mailto:[EMAIL PROTECTED]
Sent: 31 October 2007 02:43
To: Reinhard Haller
Cc: squid-users@squid-cache.org
Subject: Re: [squid-users] squid3 WindowsUpdate failed

> Amos Jeffries schrieb:
>> John Mok wrote:
>>> Hi,
>>>
>>> I am using Squid3 nightly built 20071026 running on Ubuntu 6.06 LTS 
>>> with the compilation options :-
>>>
>>> ./configure --with-pthreads --enable-icap-client
>>>
>>> I tried both (i) the configurations with default option, or (ii) 
>>> icap-enabled options, the Windows client failed to get WindowsUpdate 
>>> (see the following log).
>>>
>>
>> Where is this failure you speak of?
>>   The log you posted showed a proper link to WindowsUpdate, with all 
>> the static content coming from cache (TCP_HIT/TCP_IMS_HIT) and the 
>> dynamic pages and updates being brought in from M$ (TCP_MISS)
>>
>> If your client got the custom M$ "Windows Update failed" page.
>> Then I suspect you have overlooked a M$ nasty:
>>   WU requires an HTTPS 'validation' test.
>>   You MUST permit an HTTP 'CONNECT' request to 65.55.184.125:443.
>>   (the IPA being that of www.update.microsoft.com from your current
>> location)
>>
>> This bypass needs to be made on your firewall. WU will NOT always 
>> attempt it through the configured proxy :-(
>>
>> The best you can do is bypass it at the FW and also configure the 
>> proxy manually in IE, then run "proxycfg -u" in command line on the 
>> windows box, and hope that the particular box update level will use 
>> the proxy for it.
>>
>>
>> Amos
> Sorry Amos,
>
> the problem exists! It appeared with squid 3.0 RC1. After the 
> downgrade to 2.6 (urlgroup is missing in 3.0) I don't  have any 
> problem
with WU.
> The bypass in the firewall is not needed for proper operation.

Lucky you, looks like you have a good up-to-date user base then :). Mine
have trouble in WinXP SP1 and some earlier versions of the ActiveX WU'er
they call MicrosoftUpdate. GenuineAdvantage my a*%#.

>
> Reinhard
>

IIRC others earlier found that WU used range requests to speed downloads.
I have never confirmed this myself.

Anyway, a bug has just been found in 3.0.RC1 that caused certain range
requests to close prematurely.
http://www.squid-cache.org/bugs/show_bug.cgi?id=2116

A fix has been incorporated in the next daily snapshot. If anyone is
having this problem with 3.0.RC1, please give the 31 Oct or later
snapshots a try and see if that fixes your problem.
If its still present it will need to be reported as a bug with traces,
etc.
Thank you.

Amos



__

This message (including any attachments) is confidential and may be
privileged. It is intended for use by the addressee only. If you have
received it by mistake please notify the sender by return e-mail and
delete this message from your system. Any unauthorised use or
dissemination of this message in whole or in part is strictly prohibited.
Please note that e-mails are susceptible to change. LeasePlan Corporation
N.V. (including its group companies) shall not be responsible nor liable
for the proper and complete transmission of the information contained in
this communication nor for any delay in its receipt or damage to your
system. LeasePlan Corporation N.V. (or its group companies) does not
guarantee the confidentiality of this message, nor that the integrity of
this communication has been maintained nor that this communication is free
of viruses, interceptions 

RE: [squid-users] squid3 WindowsUpdate failed

2007-10-31 Thread Jorge Bastos
Not an option, at least for me.
I don't think that's the way to resolve things.
This problem with WU may happen with other things too, and the correct way is
to fix this, not to use other software.



-Original Message-
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] 
Sent: quarta-feira, 31 de Outubro de 2007 10:29
To: [EMAIL PROTECTED]; [EMAIL PROTECTED]
Cc: squid-users@squid-cache.org
Subject: RE: [squid-users] squid3 WindowsUpdate failed

Hi all..

I know that this is not the answer that you are looking for, but why not
just install a WSUS server internally? Then point all your clients to it
via AD policy. (me being a lazy bugger here!)

Jay

-Original Message-
From: Amos Jeffries [mailto:[EMAIL PROTECTED] 
Sent: 31 October 2007 02:43
To: Reinhard Haller
Cc: squid-users@squid-cache.org
Subject: Re: [squid-users] squid3 WindowsUpdate failed

> Amos Jeffries schrieb:
>> John Mok wrote:
>>> Hi,
>>>
>>> I am using Squid3 nightly built 20071026 running on Ubuntu 6.06 LTS 
>>> with the compilation options :-
>>>
>>> ./configure --with-pthreads --enable-icap-client
>>>
>>> I tried both (i) the configurations with default option, or (ii) 
>>> icap-enabled options, the Windows client failed to get WindowsUpdate 
>>> (see the following log).
>>>
>>
>> Where is this failure you speak of?
>>   The log you posted showed a proper link to WindowsUpdate, with all 
>> the static content coming from cache (TCP_HIT/TCP_IMS_HIT) and the 
>> dynamic pages and updates being brought in from M$ (TCP_MISS)
>>
>> If your client got the custom M$ "Windows Update failed" page.
>> Then I suspect you have overlooked a M$ nasty:
>>   WU requires an HTTPS 'validation' test.
>>   You MUST permit an HTTP 'CONNECT' request to 65.55.184.125:443.
>>   (the IPA being that of www.update.microsoft.com from your current
>> location)
>>
>> This bypass needs to be made on your firewall. WU will NOT always 
>> attempt it through the configured proxy :-(
>>
>> The best you can do is bypass it at the FW and also configure the 
>> proxy manually in IE, then run "proxycfg -u" in command line on the 
>> windows box, and hope that the particular box update level will use 
>> the proxy for it.
>>
>>
>> Amos
> Sorry Amos,
>
> the problem exists! It appeared with squid 3.0 RC1. After the 
> downgrade to 2.6 (urlgroup is missing in 3.0) I don't  have any problem
with WU.
> The bypass in the firewall is not needed for proper operation.

Lucky you, looks like you have a good up-to-date user base then :). Mine
have trouble in WinXP SP1 and some earlier versions of the ActiveX WU'er
they call MicrosoftUpdate. GenuineAdvantage my a*%#.

>
> Reinhard
>

IIRC others earlier found that WU used range requests to speed downloads.
I have never confirmed this myself.

Anyway, a bug has just been found in 3.0.RC1 that caused certain range
requests to close prematurely.
http://www.squid-cache.org/bugs/show_bug.cgi?id=2116

A fix has been incorporated in the next daily snapshot. If anyone is
having this problem with 3.0.RC1, please give the 31 Oct or later
snapshots a try and see if that fixes your problem.
If it's still present, it will need to be reported as a bug with traces,
etc.
Thank you.

Amos
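Amos's advice above boils down to letting the WU HTTPS 'validation' CONNECT through. A minimal squid.conf sketch of that idea follows; it is illustrative only, not a tested configuration. The ACL name `wu_validate` is invented here, the IP is the one quoted in this thread and varies by location, and the stock squid.conf's predefined `acl CONNECT method CONNECT` is assumed to exist:

```
# Hypothetical squid.conf fragment: permit the Windows Update HTTPS
# 'validation' CONNECT request through the proxy.
# The IP below is the one Amos quotes for www.update.microsoft.com;
# resolve it from your own location and adjust.
acl wu_validate dst 65.55.184.125/32
http_access allow CONNECT wu_validate
```

Note this only covers requests that actually go through the proxy; the firewall-level bypass Amos describes is still needed separately, since WU does not always use the configured proxy.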



__

This message (including any attachments) is confidential and may be
privileged. It is intended for use by the addressee only. If you have
received it by mistake please notify the sender by return e-mail and delete
this message from your system. Any unauthorised use or dissemination of
this message in whole or in part is strictly prohibited. Please note that
e-mails are susceptible to change. LeasePlan Corporation N.V. (including
its group companies) shall not be responsible nor liable for the proper and
complete transmission of the information contained in this communication
nor for any delay in its receipt or damage to your system. LeasePlan
Corporation N.V. (or its group companies) does not guarantee the
confidentiality of this message, nor that the integrity of this
communication has been maintained nor that this communication is free of
viruses, interceptions or interference."
__



RE: [squid-users] squid3 WindowsUpdate failed

2007-10-30 Thread Jorge Bastos
That's what I did: downgraded to 2.6 stable.



-Original Message-
From: Reinhard Haller [mailto:[EMAIL PROTECTED] 
Sent: terça-feira, 30 de Outubro de 2007 8:38
To: squid-users@squid-cache.org
Subject: Re: [squid-users] squid3 WindowsUpdate failed

Amos Jeffries schrieb:
> John Mok wrote:
>> Hi,
>>
>> I am using Squid3 nightly built 20071026 running on Ubuntu 6.06 LTS 
>> with the compilation options :-
>>
>> ./configure --with-pthreads --enable-icap-client
>>
>> I tried both (i) the configurations with default option, or (ii) 
>> icap-enabled options, the Windows client failed to get WindowsUpdate 
>> (see the following log).
>>
>
> Where is this failure you speak of?
>   The log you posted showed a proper link to WindowsUpdate, with all 
> the static content coming from cache (TCP_HIT/TCP_IMS_HIT) and the 
> dynamic pages and updates being brought in from M$ (TCP_MISS)
>
> If your client got the custom M$ "Windows Update failed" page.
> Then I suspect you have overlooked a M$ nasty:
>   WU requires an HTTPS 'validation' test.
>   You MUST permit an HTTP 'CONNECT' request to 65.55.184.125:443.
>   (the IPA being that of www.update.microsoft.com from your current 
> location)
>
> This bypass needs to be made on your firewall. WU will NOT always 
> attempt it through the configured proxy :-(
>
> The best you can do is bypass it at the FW and also configure the 
> proxy manually in IE, then run "proxycfg -u" in command line on the 
> windows box, and hope that the particular box update level will use 
> the proxy for it.
>
>
> Amos
Sorry Amos,

the problem exists! It appeared with squid 3.0 RC1. After the downgrade 
to 2.6 (urlgroup is missing in 3.0) I don't  have any problem with WU.
The bypass in the firewall is not needed for proper operation.

Reinhard




RE: [squid-users] Problem with 3.0 RC1

2007-10-28 Thread Jorge Bastos
Chris, check the new thread with subject "squid3 WindowsUpdate failed";
I believe that's the same problem I am having.



-Original Message-
From: Chris Robertson [mailto:[EMAIL PROTECTED] 
Sent: sexta-feira, 26 de Outubro de 2007 21:34
To: squid-users@squid-cache.org
Subject: Re: [squid-users] Problem with 3.0 RC1

Jorge Bastos wrote:
> Chris,
>
> Not Found
> The requested URL /Versions/v3/3.0/cfgman/debug_options was not found on
> this server.
>
> Apache/1.3.37 Server at www.squid-cache.org Port 80
>
>   

G.  That should have been:

http://www.squid-cache.org/Versions/v3/3.0/cfgman/debug_options.html

Chris





RE: [squid-users] Problem with 3.0 RC1

2007-10-26 Thread Jorge Bastos
Chris,

Not Found
The requested URL /Versions/v3/3.0/cfgman/debug_options was not found on
this server.

Apache/1.3.37 Server at www.squid-cache.org Port 80



-Original Message-
From: Chris Robertson [mailto:[EMAIL PROTECTED] 
Sent: sexta-feira, 26 de Outubro de 2007 1:25
To: squid-users@squid-cache.org
Subject: Re: [squid-users] Problem with 3.0 RC1

Jorge Bastos wrote:
> > -Original Message- From: Alex Rousskov
> > [mailto:[EMAIL PROTECTED] Sent: segunda-feira, 22
> > de Outubro de 2007 17:25 To: Jorge Bastos Cc:
> > squid-users@squid-cache.org Subject: Re: [squid-users] Problem with
> > 3.0 RC1
> >
> > On Mon, 2007-10-22 at 11:56 +0100, Jorge Bastos wrote:
> >
> >> I've updated from 3.0 PRE6 where everything worked OK, but with
> >> 3.0 RC1, for example, I have several machines in the network
> >> that are installed with new licenses of Windows XP, and the
> >> automatic download starts but just stays at 0%.
> >>
> >> I reverted to 3.0 PRE6 and it works like a charm.
> >
> > If you get no responses here, I would suggest that you try the
> > daily Squid3 snapshot.
> >
> > If the problem is still there, collect cache.log with full
> > debugging enabled (debug_options ALL,9 in squid.conf) while
> > reproducing the problem. Once you collect the information, please
> > file a Squid bug report with your compressed cache.log and the
> > above description.
> >
> > Thank you,
> >
> > Alex.
>  I thought about that... but I didn't want to compile it since I use
>  the Debian binaries.

You don't have to recompile to turn log debugging on.  It's a squid.conf 
directive: http://www.squid-cache.org/Versions/v3/3.0/cfgman/debug_options.

Chris



RE: [squid-users] Problem with 3.0 RC1

2007-10-22 Thread Jorge Bastos
I thought about that... but I didn't want to compile it since I use the
Debian binaries.

Let's see if someone knows this issue.


-Original Message-
From: Alex Rousskov [mailto:[EMAIL PROTECTED] 
Sent: segunda-feira, 22 de Outubro de 2007 17:25
To: Jorge Bastos
Cc: squid-users@squid-cache.org
Subject: Re: [squid-users] Problem with 3.0 RC1

On Mon, 2007-10-22 at 11:56 +0100, Jorge Bastos wrote:

> I've updated from 3.0 PRE6 where everything worked OK, but with 3.0 RC1,
> for example, I have several machines in the network that are installed
> with new licenses of Windows XP, and the automatic download starts but
> just stays at 0%.
> 
> I reverted to 3.0 PRE6 and it works like a charm.

If you get no responses here, I would suggest that you try the daily
Squid3 snapshot.

If the problem is still there, collect cache.log with full debugging
enabled (debug_options ALL,9 in squid.conf) while reproducing the
problem. Once you collect the information, please file a Squid bug
report with your compressed cache.log and the above description.

Thank you,

Alex.
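Alex's debugging suggestion is a one-line squid.conf change. A minimal sketch (the `debug_options` directive is standard Squid configuration; the comments are mine):

```
# squid.conf: enable full debugging, all sections at maximum verbosity.
# Extremely noisy -- enable only while reproducing the problem, then revert.
debug_options ALL,9
```

The change can be applied without a restart by running `squid -k reconfigure`; remember that ALL,9 logging grows cache.log very quickly, so compress the log before attaching it to a bug report, as Alex asks.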





[squid-users] Problem with 3.0 RC1

2007-10-22 Thread Jorge Bastos
Hi guys,
Just signed up to the list.
I've updated from 3.0 PRE6 where everything worked OK, but with 3.0 RC1, for
example, I have several machines in the network that are installed with new
licenses of Windows XP, and the automatic download starts but just stays at
0%.

I reverted to 3.0 PRE6 and it works like a charm.

Is this known already?

Jorge