[squid-users] Squid Accelerator mode: HTTP/1.0 and 'defaultsite' header

2010-03-18 Thread Riccardo Castellani
I'm using Squid in accelerator mode (Version 2.7.STABLE3).
I'm using this config:

http_port 72.43.22.19:80 accel vhost
cache_peer 10.1.1.2 parent 8283 0 no-query originserver Name=ITS no-digest
acl LIST dstdomain pages.example.com
http_access allow LIST
cache_peer_access ITS allow LIST
cache_peer_access ITS deny all


Let me describe my environment:

We serve this public site "mysite.example.com" (at IP
72.43.22.19:80) where users can view 4 links. If you hover the mouse over
these 4 links you can see:

1st link: http://pages.example.com/mkLista.do?code=A
2nd link: http://pages.example.com/mkLista.do?code=B
3rd link: http://pages.example.com/mkLista.do?code=C
4th link: http://pages.example.com/mkLista.do?code=D

'pages.example.com' resolves to the IP address of Squid, that is 72.43.22.19;
this is a way to route requests to the Squid accelerator (Squid is on both the
external and the internal network), which accelerates to 10.1.1.2 (the internal
server).
I'd like to accelerate ONLY these 4 links, but I have no 'defaultsite';
in fact pages.example.com points to the 'Apache Tomcat' default page, and the
accelerated server serves only these 4 objects.
I read that HTTP/1.0 requests don't send a 'Host' header, so if I omit
'defaultsite', those clients will get an "Invalid request" error.

I can't understand whether I have to insert defaultsite=pages.example.com in
this case or not.
Right now I'm working fine without this option, but I have doubts about whether
requests from HTTP/1.0 clients can be accelerated.





RE: [squid-users] Squid cache_dir failed - can squid survive?

2010-03-18 Thread Henrik Nordström
Thu 2010-03-18 at 06:16 + GIGO . wrote:
> Dear henrik,
>  
> If you have only one physical machine, what is the best strategy for
> minimizing the downtime: rebuild the cache directory again, or
> start using Squid without the cache directory? I assume we
> have to reinstall the Squid software? Please guide

The approach I proposed earlier, with two Squid processes running in
cooperation, will make the service survive automatically for as long as the
system disk is working.

If using just one process, then making Squid stop trying to use the
cache is as simple as removing the cache_dir specifications from
squid.conf and starting Squid again. You do not need to reinstall unless
the system/OS partition has been damaged. This change to squid.conf can
easily be automated with a little script if you want.
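
[A minimal sketch, in shell, of the kind of helper script Henrik alludes to;
the file paths and the use of sed are assumptions, not part of his mail:]

  #!/bin/sh
  # Comment out every cache_dir line so Squid falls back to memory-only caching,
  # keeping a backup of the original configuration.
  sed -i.bak 's/^[[:space:]]*cache_dir/#cache_dir/' /etc/squid/squid.conf
  squid -k parse      # sanity-check the edited configuration
  squid -k shutdown   # stop the instance that was using the failed cache_dir
  squid               # start it again without any cache_dir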

Regards
Henrik





Re: [squid-users] Squid cache_dir failed - can squid survive?

2010-03-18 Thread Henrik Nordström
Thu 2010-03-18 at 17:25 +1100 Ivan . wrote:
> I wonder about the value of http cache, when the majority of high
> volume sites used in the corporate environment are dynamic.
> http://www.mnot.net/cache_docs/

Hit ratios have not declined that much in the last decade. It's still
around a 25-30% byte hit ratio, and significantly more in request hit
ratio.

While it's true that a lot of the HTML content is more dynamic than
before, there is also a lot more inlined content such as images etc. which
is plain static, caches just fine, and makes up the majority
of the traffic.

> How is the no-cache HTTP header handled by Squid?

By default, as if the response is not cacheable. Somewhat stricter than
the specifications require, but more in line with what web authors
expect when using this directive.

Regards
Henrik



Re: [squid-users] purge command not working

2010-03-18 Thread Amos Jeffries

jayesh chavan wrote:

Hi,
   I am not specifying anything about the port in my command. My command is:
squidclient.exe -m PURGE http://www.yourtargetwebsite.com/



Which is precisely your problem.
  http://www.squid-cache.org/Versions/v3/3.1/manuals/squidclient
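
[A hedged example of the point being made; the proxy hostname and port below
are placeholders for wherever your Squid is actually listening:]

  squidclient -h 127.0.0.1 -p 3128 -m PURGE http://www.yourtargetwebsite.com/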

Amos
--
Please be using
  Current Stable Squid 2.7.STABLE8 or 3.0.STABLE25
  Current Beta Squid 3.1.0.18


Re: [squid-users] Squid cache_dir failed - can squid survive?

2010-03-18 Thread Ivan .
I noticed an improvement when I disabled it, which may have something
to do with my cache settings, but I tried a number of config combos
without much success

2010/3/18 Henrik Nordström :
> Thu 2010-03-18 at 17:25 +1100 Ivan . wrote:
>> I wonder about the value of http cache, when the majority of high
>> volume sites used in the corporate environment are dynamic.
>> http://www.mnot.net/cache_docs/
>
> Hit ratios have not declined that much in the last decade. It's still
> around a 25-30% byte hit ratio, and significantly more in request hit
> ratio.
>
> While it's true that a lot of the HTML content is more dynamic than
> before, there is also a lot more inlined content such as images etc. which
> is plain static, caches just fine, and makes up the majority
> of the traffic.
>
>> How is the no-cache HTTP header handled by Squid?
>
> By default, as if the response is not cacheable. Somewhat stricter than
> the specifications require, but more in line with what web authors
> expect when using this directive.
>
> Regards
> Henrik
>
>


Re: [squid-users] Squid Accelerator mode: HTTP/1.0 and 'defaultsite' header

2010-03-18 Thread Amos Jeffries

Riccardo Castellani wrote:

I'm using Squid in accelerator mode (Version 2.7.STABLE3).
I'm using this config:

http_port 72.43.22.19:80 accel vhost
cache_peer 10.1.1.2 parent 8283 0 no-query originserver Name=ITS no-digest
acl LIST dstdomain pages.example.com
http_access allow LIST
cache_peer_access ITS allow LIST
cache_peer_access ITS deny all


Let me describe my environment:

We serve this public site "mysite.example.com" (at IP
72.43.22.19:80) where users can view 4 links. If you hover the mouse over
these 4 links you can see:

1st link: http://pages.example.com/mkLista.do?code=A
2nd link: http://pages.example.com/mkLista.do?code=B
3rd link: http://pages.example.com/mkLista.do?code=C
4th link: http://pages.example.com/mkLista.do?code=D

'pages.example.com' resolves to the IP address of Squid, that is 72.43.22.19;
this is a way to route requests to the Squid accelerator (Squid is on both the
external and the internal network), which accelerates to 10.1.1.2 (the internal
server).
I'd like to accelerate ONLY these 4 links, but I have no 'defaultsite';
in fact pages.example.com points to the 'Apache Tomcat' default page, and the
accelerated server serves only these 4 objects.
I read that HTTP/1.0 requests don't send a 'Host' header, so if I omit
'defaultsite', those clients will get an "Invalid request" error.

I can't understand whether I have to insert defaultsite=pages.example.com in
this case or not.
Right now I'm working fine without this option, but I have doubts about whether
requests from HTTP/1.0 clients can be accelerated.



Most clients these days will send a Host header regardless of their HTTP
version. defaultsite is completely optional; in your case, if you omit it,
broken clients will get the Squid "invalid request" error page instead of the
Tomcat front page.
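
[A minimal sketch, not part of Amos's reply, of what adding the option would
look like on the http_port line already shown above:]

  # map Host-less (typically old HTTP/1.0) requests to pages.example.com
  # instead of returning the "Invalid request" error page
  http_port 72.43.22.19:80 accel vhost defaultsite=pages.example.com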


Amos
--
Please be using
  Current Stable Squid 2.7.STABLE8 or 3.0.STABLE25
  Current Beta Squid 3.1.0.18


[squid-users] Cancelled downloads

2010-03-18 Thread CASALI COMPUTERS - Michele Brodoloni
Hello,
is it possible to stop Squid from continuing to download a file when a user
stops the download from his browser?
If a user starts a 1 GB web download and then hits "cancel", Squid
doesn't mind and continues downloading until it finishes, and this is a
waste of bandwidth.

Is there a solution for this behavior?
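
[For readers with the same question, a hedged sketch, not part of the original
mail, of the squid.conf directives that usually govern this behaviour; the
values are examples only:]

  quick_abort_min 0 KB   # never keep fetching just because little data is left
  quick_abort_max 0 KB   # abort the server-side fetch once the client has gone
  quick_abort_pct 95     # except when 95% of the object has already arrived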

Thanks



[squid-users] Cache_effective_user issue

2010-03-18 Thread GIGO .

I installed Ubuntu Server under the "u-admin" account. I have set the directive
cache_effective_user to "squidadmin". However, my Squid process keeps running
as the b-admin account, and my cache_effective_user never comes into play.
 
As per a guide, Squid will only switch to the defined user
if it is run as root, but I am unable to start it as root. I logged in as root
via sudo -s (running it as root) and tried to start Squid, but
permission was denied. For the same reason, I guess, my startup scripts kept
failing.
 
I have created the squidadmin account as nologin/noshell for security purposes
and feel that the Squid process should be run under this account. Please
advise: am I thinking right?
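
[A hedged illustration, not part of this mail, of the usual arrangement: the
unprivileged user is named in squid.conf and the daemon is started as root so
it can drop privileges; the init-script path/name is an assumption:]

  cache_effective_user squidadmin     # in squid.conf
  sudo /etc/init.d/squid start        # started as root, then runs as squidadmin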
 
 
 
 
please help me out.
 
regards,
 

  

[squid-users] Squid3 issues

2010-03-18 Thread Gmail

Hello everyone,
I have been trying for nearly 5 weeks now to get this piece of software to
work. I have tried several versions, and I have tried it on several platforms;
all I got from it is frustration. I know that some people would say what a
fantastic piece of software it is.


I have used many software packages and compiled stuff for years, and never
had an experience such as this one. It's a package full of headaches, and
problem after problem. And to be honest, the feedback I get always blames
other things. Why can't you people just admit that Squid doesn't work at
all, and that you are not providing any help whatsoever, as if you expect
everyone to be an expert?


I also don't like the attitude of some people, talking to you as if you're
an "idiot", while in fact you follow their suggestions to the letter and yet
it doesn't work. Instead of blaming the operating systems and blaming people
for not knowing how to use it, why can't you try and do something that works
for a change? I have wasted nearly 5 weeks, day in day out, sometimes staying
up till 3 or 4 am, trying desperately to get this thing working.


For instance, if I compile with no options, I know that somewhere down the
line I am going to find out that I needed this or that; if I compile it with
some options, I get errors that don't make any sense. Examples:


I uninstalled the version that was packaged with Ubuntu Hardy. I am trying
to compile it so I won't have the same problem with the file descriptors. I
followed exactly the suggestions in the configure --help menu, yet I am
getting an error

like "Compile cannot create executable", or something to that effect.

Not to mention that when I tried to run it, it didn't forward any requests. I
have followed all of the configuration examples and people's suggestions, and
never could forward any request to my backend server.


After three weeks I managed to get my clients to have access to the
internet, but many applications didn't work, such as Yahoo, MSN, Steam and
so on, and when I ask for help, nobody has an answer, including some members
of the team.


Yes, I can hear some arguments saying "but we are volunteers". True, but you
either do something that works or you don't.


If I needed help for, say, UnrealIRCd or any other program, I know I can get
help, and their documentation does what it says on the tin: you follow their
instructions and you will get it to work exactly as they say.


With squid, "it doesn't work" is all I am getting. I don't even believe
that it works now, to be honest. I am sorry, I am not moaning, but it's true. I
have been on many forums for weeks and all I could see were problems people
are facing with every version of squid, and very few solutions are given,
and after you fix one problem 10 others pop up somewhere else. I certainly
don't want to spend my life fixing things and bashing my head trying to find a
solution; I want something that works, but unfortunately it doesn't.


I am just going to try something else somewhere else.
Thanks all the same to anyone who tried to help, but this is not for me;
life is too short to waste any more of my time trying to get something
that doesn't work, "working".


If anybody can prove me wrong:

Regards
All the best to everyone 



[squid-users] Squid proxy Setup in fail-over mode

2010-03-18 Thread GIGO .

How do I set up a Squid proxy to run in fail-over mode? Any guide?
 
regards,
 
Bilal Aslam
  

Re: [squid-users] Squid proxy Setup in fail-over mode

2010-03-18 Thread Luis Daniel Lucio Quiroz
On Thursday 18 March 2010 07:35:04, GIGO . wrote:
> How do I set up a Squid proxy to run in fail-over mode? Any guide?
> 
> regards,
> 
> Bilal Aslam
> 
Look for WPAD


Re: [squid-users] Squid3 issues

2010-03-18 Thread Nyamul Hassan

Your email is one long whine without much substance at all.  I have been a
member of this list for over 3 years now, and have been using Squid for a year
and a half.  During all this time, I have always found this list to be
hospitable and helpful.

If you don't like the software, then don't use it.  It's not costing you
anything.  That being said, I have almost never found any attitude from any
person on the list that says RTFM.  Even when someone asks about obvious
things, someone is kind enough to point to the right direction.

We run over 4 Squid proxies (running 2.7.STABLE7) on commodity
hardware, and their performance has been more than satisfactory to us.
However, we run all of them on CentOS 5+.  Last year, even the commercial
vendor Bluecoat could not give us a strong enough reason to show that their
product performed any better than Squid to justify the cost differential.


If it is file descriptors that are creating problems, then you need to read
the OS docs on how to increase that on the OS side.  On CentOS, running
"ulimit -n" shows how many FDs are allowed by the OS.


As for Squid, a simple recompile with the "--with-maxfd=" flag worked
like a charm for me.  Using "squid -v" is always handy to get the existing 
compile-time flags first.
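
[A hedged illustration of those two steps, not from the original mail; the FD
count is only an example value:]

  squid -v                            # show the configure flags the binary was built with
  ./configure --with-maxfd=65536      # the Squid 2.x flag mentioned above; add your other flags, then rebuild
  ulimit -n 65536                     # and raise the OS-side limit before starting Squid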


Whatever your frustration at this point, whining over at the forum, and
blaming everyone else and saying "admit that squid does not work", is pretty 
lame.


Oh, by the way, I searched my email archives of the Squid Mailing List, with 
your "email id", and it turned out there is only one email from you, and 
that was only 15 hours ago, within which there have been 4 email responses 
already.  You did not even reply to one of them saying what did not go as 
suggested.


Regards
HASSAN



- Original Message - 
From: "Gmail" 

To: 
Sent: Thursday, March 18, 2010 19:30
Subject: [squid-users] Squid3 issues



Hello everyone,
I have been trying for nearly 5 weeks now to get this piece of software to
work, I have tried several versions, I have tried it on several platforms,
all I got from it is frustration, I know that some people would say what a
fantastic piece of software.

I have used many softwares, packages, compiled stuff for years, never ever
had an experience such as this one, it's a package full of headaches, and
problem after problem, And to be honest the feedback I get is always
blaming other things, why can't you people just admit that Squid doesn't
work at all, and you are not providing any help whatsoever, as if you
expect everyone to be an expert.

I also don't like the attitude of some people, talking to you as if you're
an "idiot", while in fact you follow their suggestions to the letter and
yet it doesn't work, instead of blaming the operating systems and blaming
people for not knowing how to use it, why can't you try and do something
that works for a change, I have wasted nearly 5 weeks day in day out
sometimes I stayed til 3 or 4 am trying desperately to get this thing
working.

For instance if I compile with no options I know that somewhere down the
line I am going to find out that I needed this or that, if I compile it
with some options I get errors that don't make any sense, examples.

I uninstalled the version that was packaged with Ubuntu hardy, I am trying
to compile it so I won't have the same problem, with the file descriptors,
I followed exactly the suggestions in the configure --help menu, yet I am
getting an error,
like Compile cannot create executable, or something to that effect.

Not to mention when I tried to run it, it didn't forward any requests, I
have followed all of the configuration examples and people's suggestions,
never could forward any request to my backend server.

After three weeks I managed to get my clients to have access to the
internet, and many applications didn't work, such as Yahoo, Msn, Steam and
so on, when I ask for help, nobody has an answer including some members of
the team.

Yes I can hear some arguments, saying but we are volunteers, true, but you
either do something that works or don't.

If I needed help for say, Unrealircd or any other program I know I can get
help, and their documentaion, does what it says on the tin. you follow
their instructions, you will get it to work exactly as they say.

With squid, it doesn't work, that's all I am getting, I don't even believe
that it works now to be honest, I am sorry I am not moaning but it's true,
I have been on many forums for weeks and all I could see were problems
people are facing with any version of squid , and no solutions are given
very few and after you fix one problem 10 others pop up somewhere else I
certainly don't want to spend my life fixing and bashing my head trying to
find a solution, I want something that works, but unfortunately it
doesn't.

I am just going to try something else somewhere else,
Thanks all the same for anyone who tried to help, but this is not for me,
life is too short to waste anymore of my time, in try

Re: [squid-users] Squid proxy Setup in fail-over mode

2010-03-18 Thread Diego Woitasen
2010/3/18 GIGO . :
>
> How to setup squid proxy to run in fail-over mode? Any guide.
>
> regards,
>
> Bilal Aslam
>


Use Heartbeat+LVS. Squid doesn't have any special requirements to run
in fail-over mode; just run Squid on two servers and get load balancing
with LVS and failover with Heartbeat. There is a lot of documentation
about this on the web.

-- 
Diego Woitasen
XTECH


Re: [squid-users] Squid proxy Setup in fail-over mode

2010-03-18 Thread fedorischev
In the message of Thursday 18 March 2010 17:14:08, Diego Woitasen wrote:

> Use Heartbeat+LVS. Squid doesn't have any special requirements to run
> in fail-over mode; just run Squid on two servers and get load balancing
> with LVS and failover with Heartbeat. There is a lot of documentation
> about this on the web.

I think WPAD is a better solution. In any case, it requires more than one
Squid server.

WBR


AW: [squid-users] Squid3 issues

2010-03-18 Thread Zeller, Jan
I also say thanks to the squid team for their excellent work & support!
We're running 4 squid-3.0 servers for our whole campus, including c-icap from
Tsantilas Christos.
At the beginning we had some implementation problems of course, but that is
normal (mostly due to Layer 8 issues).
But to summarize: we're really, really satisfied with this nice piece of
software, even if I still don't know how it really works.
What would you use instead of squid? Bluecoat Inc.? Or maybe mod_proxy /
mod_cache?

Kind regards

Jan
Universität Bern


[squid-users] HTTPS Proxy Question

2010-03-18 Thread Sheahan, John

Does Squid actually proxy HTTPS connections or does it just tunnel it?

The reason I ask is that if you install a Blue Coat proxy, it requires a
certificate from the Blue Coat box to be installed on all HTTPS clients,
because they say it is a "true" HTTPS proxy that does man-in-the-middle, and
Squid does not?


Re: [squid-users] Cache_effective_user issue

2010-03-18 Thread Matthew Morgan

GIGO . wrote:

I installed Ubuntu Server under the "u-admin" account. I have set the directive
cache_effective_user to "squidadmin". However, my Squid process keeps running
as the b-admin account, and my cache_effective_user never comes into play.
 
As per a guide, Squid will only switch to the defined user if it is run as
root, but I am unable to start it as root. I logged in as root via sudo -s
(running it as root) and tried to start Squid, but permission was denied. For
the same reason, I guess, my startup scripts kept failing.
 
I have created the squidadmin account as nologin/noshell for security purposes
and feel that the Squid process should be run under this account. Please
advise: am I thinking right?
 
 
 
 
please help me out.
 
regards,
 

  		 	   		  

Two questions:

Did you install squid via apt, or did you compile from source?

Can you paste your squidadmin line from /etc/passwd?


Re: [squid-users] HTTPS Proxy Question

2010-03-18 Thread Denys Fedorysychenko
On Thursday 18 March 2010 17:36:09 Sheahan, John wrote:
> Does Squid actually proxy HTTPS connections or does it just tunnel it?
> 
> The reason I ask is that if you install a Blue Coat proxy, it requires a
>  certificate to be installed from the Blue Coat box on all HTTPS clients
>  because they say it is "true" HTTPS proxy and does man in the middle and
>  Squid does not?
> 

Squid has the same mode, the same man-in-the-middle mode.
Keywords: "squid wildcard certificate".
It would probably be good to tell them that they are quite unprofessional,
because either they lie or they just don't know what open source can offer.

Btw, this "true" mode is a huge security threat in some cases.


RE: [squid-users] HTTPS Proxy Question

2010-03-18 Thread Sheahan, John
If Squid is configured to use the "squid wildcard certificate", does this mean 
that all of the HTTPS clients have to manually accept this certificate in order 
to proxy HTTPS through squid?

thanks

-Original Message-
From: Denys Fedorysychenko [mailto:nuclear...@nuclearcat.com] 
Sent: Thursday, March 18, 2010 11:44 AM
To: squid-users@squid-cache.org
Cc: Sheahan, John
Subject: Re: [squid-users] HTTPS Proxy Question

On Thursday 18 March 2010 17:36:09 Sheahan, John wrote:
> Does Squid actually proxy HTTPS connections or does it just tunnel it?
> 
> The reason I ask is that if you install a Blue Coat proxy, it requires a
>  certificate to be installed from the Blue Coat box on all HTTPS clients
>  because they say it is "true" HTTPS proxy and does man in the middle and
>  Squid does not?
> 

Squid has the same mode, the same man-in-the-middle mode.
Keywords: "squid wildcard certificate".
It would probably be good to tell them that they are quite unprofessional,
because either they lie or they just don't know what open source can offer.

Btw, this "true" mode is a huge security threat in some cases.


RE: [squid-users] Squid cache_dir failed - can squid survive?

2010-03-18 Thread GIGO .

Is it possible to run two instances/processes of Squid on the same physical
machine, that is, one with a cache and the other in proxy-only mode? Is that
what you mean? How?


> From: hen...@henriknordstrom.net
> To: gi...@msn.com
> CC: gina...@gmail.com; squid-users@squid-cache.org
> Date: Thu, 18 Mar 2010 09:54:34 +0100
> Subject: RE: [squid-users] Squid cache_dir failed - can squid survive?
>
> Thu 2010-03-18 at 06:16 + GIGO . wrote:
>> Dear henrik,
>>
>> If you have only one physical machine, what is the best strategy for
>> minimizing the downtime: rebuild the cache directory again, or
>> start using Squid without the cache directory? I assume we
>> have to reinstall the Squid software? Please guide
>
> The approach I proposed earlier, with two Squid processes running in
> cooperation, will make the service survive automatically for as long as the
> system disk is working.
>
> If using just one process, then making Squid stop trying to use the
> cache is as simple as removing the cache_dir specifications from
> squid.conf and starting Squid again. You do not need to reinstall unless
> the system/OS partition has been damaged. This change to squid.conf can
> easily be automated with a little script if you want.
>
> Regards
> Henrik
>
>
> 

Re: [squid-users] HTTPS Proxy Question

2010-03-18 Thread K K
See: http://wiki.squid-cache.org/Features/SslBump

On Thu, Mar 18, 2010 at 11:54 AM, Sheahan, John
 wrote:
> If Squid is configured to use the "squid wildcard certificate", does this 
> mean that all of the HTTPS clients have to manually accept this certificate 
> in order to proxy HTTPS through squid?

Same issues as with Blue Coat and "SSL Intercept".  Some tunneled
protocols and a few websites will fail when intercepted, so you must
have provisions to make exceptions (e.g. "ssl_bump deny broken_sites")

Generally you would have the clients pre-loaded with your private CA
certificate, for MSIE you can do this by GPO, for some other
browsers/OS you do have to manually load the CA certificate, once.
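
[A hedged squid.conf sketch of the SslBump feature described on the wiki page
above; the certificate path and the ACL contents are assumptions, and the
option spelling varies by version (sslBump in the 3.1 betas, ssl-bump later):]

  http_port 3128 sslBump cert=/etc/squid/ssl/proxyCA.pem
  acl broken_sites dstdomain .example-bank.invalid
  ssl_bump deny broken_sites     # tunnel these untouched, as in Kevin's example
  ssl_bump allow all             # intercept and re-encrypt everything else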

Kevin


[squid-users] Reverse Proxy SSL Options

2010-03-18 Thread Dean Weimer
I am trying to set up a reverse proxy to serve multiple websites;
everything is working fine except that so far in the testing process I
have discovered that it is not passing the PCI scans that we are
required to pass.
 
We have multiple websites using a certificate that has subject
alternative names set, to use SSL for the multiple domains.  That part is
working fine, and traffic will pass through showing valid
certificates.  However, I need to stop it from answering with weak
ciphers and SSLv2 in order to pass the scans.
 
I found the sslproxy_options and sslproxy_cipher directives, and I would
assume that these are what I would use to fix this problem.  However,
there is nothing in the documentation that says where to place these in
the configuration file or what arguments they accept.
 
It would be greatly appreciated if someone could direct me to some
documentation on how to set these options.
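
[A hedged sketch, not a reply from the list: for the accepting side of a
reverse proxy the protocol and cipher restrictions usually go on the https_port
line itself (sslproxy_* applies to Squid's outgoing SSL connections); the
certificate paths and cipher string below are assumptions:]

  https_port 443 accel vhost cert=/etc/squid/site.pem key=/etc/squid/site.key options=NO_SSLv2 cipher=ALL:!aNULL:!eNULL:!LOW:!EXP:!SSLv2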
 
Thanks
Dean Weimer


Re: [squid-users] Squid3 issues

2010-03-18 Thread Nyamul Hassan

Your rant can be summarized as follows:

1.  You are using an OS (and version) which (according to you) has poor
documentation or other info.  Ubuntu / Debian are very popular Linux
flavours.  As someone who claims to have been a Linux administrator for
"several years", haven't you ever come across the FD issue in any situation?
Strange!


2.  You are a single man catering to "many" servers, so you don't want to
"waste time" on software that you could not get running on an OS as in
#1.  The "we" in my email meant my company.  I personally oversee
all the installations we have, with no one else as a helping hand.


3.  You find a lot of people complaining that they can't run squid.  But, 
you ignore a lot of other people (like myself and Jan who responded to your 
post) who are saying this is a brilliant piece of software.


Instead of ranting, I would suggest you change your attitude, and start 
laying down the problems that you are facing.  Someone from the community 
will always get back to you, as they have for me in the past.


I have no intention of starting a flame war here.  I just want you to calm 
down, and assure you, Squid in itself is a brilliant piece of code. 
Remember, this is the same software that serves Wikipedia, and that speaks a 
lot about how stable this software is.


Also, Squid 3.0 is still under active development, and is not suitable for 
all scenarios.  We use 2.7 because we use it as a forward proxy, and many 
features available in the 2.x branch have not yet been fully migrated to the 
3.x branch.  So, 2.7 suits our scenario more.  Perhaps you can also mention 
what your scenario is.


As for OS, I've seen some people say FreeBSD is one of the best OS for 
Squid.  But, we ourselves are pretty happy with CentOS 5.x.  So, find out 
what works for you.  Ubuntu / Debian are also very popular Linux flavours, 
so I think you need to search some more about how to increase FDs.


I hope you find solutions to your woes, and come to use Squid to your 
favour.


Regards
HASSAN



- Original Message - 
From: "Gmail" 

To: "Nyamul Hassan" 
Sent: Friday, March 19, 2010 00:01
Subject: Re: [squid-users] Squid3 issues



Hello,
I did say that some people would disagree; I know that there are people
who might find it brilliant.
I am not moaning, I was stating facts. You're talking about version
2.7 or whatever you're using;

I am talking about version 3.0++.

All of the examples don't make any sense. I have followed them to the
letter and yet I still got a lot of issues. As for the emails, I did reply to
both of them, and twice,
and I even sent another email asking another question. If it works for you,
then good for you; I am glad that some people like it.
But for me it didn't work, no matter what I tried. I am using Ubuntu,
and that's another thing: you will find certain things documented in detail
for most OSes but not for Ubuntu or even Debian, and if you do find anything
it's always for the older versions, and as you know most things from v2.0++
are not recognised in version 3.0.


I have been a webmaster for many years and I have used many Linux distros;
I have compiled, installed and run countless programmes.
I also code in Java and many other scripting languages; I am not exactly
a novice.


If you think the programme is so brilliant, then don't take my word for
it; just check out the forums, mailing lists etc. and you will see how many
people have been having difficulties with squid since it started.
It hasn't got any better; I had a go a few years back and had the same
problems back then.
If you do like it, good for you. As for telling me that if I don't like it I
shouldn't use it: yes, if we had another option, but we don't, and it's
not as good as people claim it to be. The truth is, people don't have a
choice or an alternative.


If you're happy spending hours every day solving one problem after another,
be my guest, but I hardly have the time to muck around with useless
software. People should be able to use and run it without having to become
experts.
The same thing applies to the Linux community; that's why most people
can't be bothered to have Linux in their homes, even though deep down they
know that Microsoft isn't reliable.


And you're talking in terms of "we", meaning you have more than one person to
run whatever you're running; as for me, I am running everything all by myself,
from the webservers, to the clients, to the DBs, to the chat servers, to
the commercial websites, all by myself.
So I don't have the time to waste on one program that is supposed to be
compiled, installed and run without any difficulties, without even a readme
on how to install it, unless you run ./configure --help in order to find
the list of options, most of which are not recognised, and so on and so
forth.


On one hand solving problems caused by Squid, and then solving problems on
the system itself in order for it to recognise it, not to mention the huge
amount of errors you'll get when you try

Re: [squid-users] Unable connect to site FTP through client FireFTP or similar (Filezilla)

2010-03-18 Thread Mariel Sebedio


I looked at this software a while ago, but I saw that it has not been
developed recently.


Thanks, I will learn about it.



Amos Jeffries wrote:

On Wed, 17 Mar 2010 15:30:55 -0200, Mariel Sebedio 
wrote:
  
Hello, I have RHEL 5.4 and squid 3.0.STABLE19 and I need to connect an FTP
client through the proxy server.
When I connect with a web browser it is OK, but when I use
FireFTP or another FTP client the connection fails with a timeout
(reported by the client).
I looked at access.log and in the second case the attempt was not
registered...


Any ideas?



Only the FAQ answer ... "Squid is an HTTP proxy not an FTP proxy".

Meaning that Squid accepts HTTP connections not FTP ones.
If you need to proxy FTP traffic take a look at "frox".

Amos


  



--
Lic. Mariel Sebedio
Division Computos y Sistemas
Tel (02944)-445400 int 2307
INVAP S.E. - www.invap.com.ar



RE: [squid-users] Squid cache_dir failed - can squid survive?

2010-03-18 Thread Henrik Nordström
Thu 2010-03-18 at 16:58 + GIGO . wrote:
> Is it possible to run two instances/processes of Squid on the same
> physical machine, that is, one with a cache and the other in proxy-only mode?
> Is that what you mean? How?

Yes.

See wiki.
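
[A hedged sketch of the usual two-instance arrangement, not part of Henrik's
reply; the file names and division of roles are assumptions, and each instance
needs its own http_port, PID file and logs:]

  squid -f /etc/squid/squid-frontend.conf   # no cache_dir, talks to the clients
  squid -f /etc/squid/squid-cache.conf      # holds the cache_dir, used as a parent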

Regards
Henrik



Re: [squid-users] Squid3 issues

2010-03-18 Thread Nyamul Hassan
Please outline your scenario in detail.  If you are facing problems
with FDs, then I take it that you already have a running instance,
but that your load is high enough to require more than the default 1024
FDs.

Did you get your OS limits changed to more than 1024?  I modify my
servers to 65536 whenever I'm running Squid on them.

Please outline your problem in more details, so that we can help you.

Regards
HASSAN




On Fri, Mar 19, 2010 at 1:37 AM, Gmail  wrote:
>
> Hello again,
> I am not ranting, I was merely expressing my opinion, well I never said I was 
> an expert when it comes to the proxies I have never used them, this is my 
> first attempt, yes Debian and Debian based distros are very very popular I 
> have used many distros and by far I must admit that the debian is a fantastic 
> OS, now it comes to tastes, some like FreeBSD, some like OpenSuse, some like 
> Centos5, and so on I have seen many people using Fedora, but I wasn't 
> impressed by it when I used it 2 years ago. anyway as I said it's a matter of 
> taste.
>
> You say I couldn't get it running yes, when I follow the instruction to the 
> letter and I mean to the letter, I have changed several times the config, to 
> just forward requests to the backend server "As a test" first the back end 
> server is running on virtualhosts, that was the reason, why I decided to 
> tackle Squid or a (proxy sever if you like).
>
> I took people's word for it, and I tried it. All I could get it to do is
> allow some http clients to access the internet; when I try to visit any of
> the websites, all I get is the "front page on the proxy itself". No matter
> what I did, either I get access denied etc., or I get invalid URL, all of the
> standard error messages, and finally I got to the default page of the
> Apache on the proxy itself, but not one request was forwarded to the backend
> webserver.
>
> I am not mentioning other apps that don't work with Squid, just a few, such
> as MSN, Steam, uTorrent and so on... and then I got the warnings that my
> cache is running out of file descriptors; no other programme ever did this to
> my servers.
>
> I found squid extremely picky, extremely demanding, what I am saying is, 1024 
> descriptors should be more than enough for it to run and considering the fact 
> that nothing else is running on that machine, it was a dedicated machine just 
> for squid3.0
> I got to the point where I couldn't even open the syslog because the buffer 
> limit was exceeded, and that was a couple days of me just trying testing it, 
> I managed to get the Utorrent working in the end, I had to use pidgin instead 
> of MSN for the clients because It was impossible to get MSN or Yahoo to 
> connect I haven't tried Skype though, but some people don't like Pidgin, they 
> prefer either MSN or Yahoo anyway that wasn't a big deal.
> All I am saying is I found that Squid is very very demanding indeed, and if I 
> can't use it the way I like, what's the point?
> Anyway, I have decided not to use it and leave it for people who are happy 
> with it and wish them good luck with it.
> I haven't ignored anybody, I have always replied and to tell you the truth, I 
> have asked questions before and I was ignored, and that's fine all I was 
> asking if they had a decent documentation with clear examples.
> The examples I read all made no sense to me, there are better ways of giving 
> good examples.
> For example, they ask you to use a parent, a sibling etc. what if you don't 
> have any of these??
>
> Anyway I don't know why you took it so personally, All I am saying here in 
> simple terms, when you write a program don't expect every person to know what 
> you're on about, make your example as simple as possible that anybody can 
> understand, people don't need to be experts in order to use that's all, and 
> if that offended and you can't take a bit of criticism than I can't help you.
>
> When I write a program and I get criticised I will listen and ask how would 
> they like to be and I will explain why I did it the way I did it.
> Simple :-)
>
> Take care mate, we're going nowhere with this, thanks anyway for your replies
> Regards
> Adam
>
> - Original Message - From: "Nyamul Hassan" 
> To: 
> Sent: Thursday, March 18, 2010 6:28 PM
> Subject: Re: [squid-users] Squid3 issues
>
>
>> Your rant can be summarized as follows:
>>
>> 1.  You are using a OS (and version) which (according to you) has poor 
>> documentation or other info.  Ubuntu / Debian are very popular Linux 
>> flavours.  As someone who claims to have been a Linux administrator for 
>> "several years", haven't you ever come across the FD issue in any situation? 
>> Strange!
>>
>> 2.  You are a single man catering to "many" servers, so you don't want to 
>> "waste time" with a software that you could not get running on an OS as in 
>> #1.  The "we" in my email was meant to be my company.  I personally see over 
>> all the installations we have.  No one else as a helpin

[squid-users] Extending Squid

2010-03-18 Thread noor nashid
Dear All,

I want to modify Squid in the following way:

- Firstly I want to use gzip compression for texts

- Later I shall go for Image Compression using ImageMagick Library

- And thirdly I want to download a whole page in Squid, and then I
want to transfer that to the other side (i.e. to the browser/client)
using a single HTTP pipelined connection. Suppose we are browsing
http://www.squid-cache.org/ . Here  I shall gather all objects of the
website and then I shall respond back to the client using a single TCP
connection. The point here is to not open a TCP connection for each
object.

For the first two I shall use the eCAP interface for content
adaptation. But what is your suggestion for implementing the third one?

My intention is to make Squid aware of HTTP compression, with the pipelining
feature enabled. I think these two features will make Squid much
stronger.

If you have any suggestion regarding this issue, please respond.

Thanks

Nashid
Postgrad Researcher
Mobile & Internet Systems Laboratory (MISL)
University College Cork (UCC)
Cork, Ireland


Re: [squid-users] Squid Accelerator mode: HTTP/1.0 and 'defaultsite' header

2010-03-18 Thread Riccardo Castellani
Most clients these days will send a Host header regardless of their HTTP
version. defaultsite is completely optional; in your case, if you omit it,
broken clients will get the squid "invalid request" error page instead of the
tomcat front page


If I insert 'defaultsite', I think that, for HTTP/1.0 clients:

the Host header (http://pages.example.com) will be present in the request, but
I think the HTTP packet contains the "GET command" with the complete URL (e.g.
http://pages.example.com/mkLista.do?code=A), so they will be able to ask for
the correct URL.

Why do you say "... instead of tomcat front page"?
The "Tomcat front" page appears only if you request http://pages.example.com.






- Original Message - 
From: "Amos Jeffries" 

To: 
Sent: Thursday, March 18, 2010 12:16 PM
Subject: Re: [squid-users] Squid Accelerator mode: HTTP/1.0 and 
'defaultsite' header




Riccardo Castellani wrote:

I'm using Squid in accelerator mode (Version 2.7.STABLE3).
I'm using this config:

http_port 72.43.22.19:80 accel vhost
cache_peer 10.1.1.2 parent 8283 0 no-query originserver Name=ITS 
no-digest

acl LIST dstdomain pages.example.com
http_access allow LIST
cache_peer_access ITS allow LIST
cache_peer_access ITS deny all


Let me describe my environment:

We serve this public site "mysite.example.com" (at IP
72.43.22.19:80) where users can view 4 links. If you hover the mouse over
these 4 links you can see:

1st link: http://pages.example.com/mkLista.do?code=A
2nd link: http://pages.example.com/mkLista.do?code=B
3rd link: http://pages.example.com/mkLista.do?code=C
4th link: http://pages.example.com/mkLista.do?code=D

'pages.example.com' resolves to the IP address of Squid, that is 72.43.22.19;
this is a way to route requests to the Squid accelerator (Squid is on both the
external and the internal network), which accelerates to 10.1.1.2 (the internal
server).
I'd like to accelerate ONLY these 4 links, but I have no 'defaultsite';
in fact pages.example.com points to the 'Apache Tomcat' default page, and the
accelerated server serves only these 4 objects.
I read that HTTP/1.0 requests don't send a 'Host' header, so if I omit
'defaultsite', those clients will get an "Invalid request" error.

I can't understand whether I have to insert defaultsite=pages.example.com in
this case or not.
Right now I'm working fine without this option, but I have doubts about whether
requests from HTTP/1.0 clients can be accelerated.



Most clients these days will send a Host header regardless of their HTTP
version. defaultsite is completely optional; in your case, if you omit it,
broken clients will get the Squid "invalid request" error page instead of the
Tomcat front page.


Amos
--
Please be using
  Current Stable Squid 2.7.STABLE8 or 3.0.STABLE25
  Current Beta Squid 3.1.0.18 




Re: [squid-users] Squid3 issues

2010-03-18 Thread Nyamul Hassan
So, do you want to use proxy in an ISP like setup?  Or in a Web
Hosting like setup?

Regards
HASSAN




On Fri, Mar 19, 2010 at 2:25 AM, Gmail  wrote:
> Ok I'll try and clarify it (thanks btw)
> I am running 3 websites on one single machine and have been for few years,
> then the load started to grow, then I decided to have a go at a proxy
> server:
> I was actually putting off for a couple of years, simply because I am very
> restricted time wise
> I have as I said 3 different websites running on one single machine in a
> vhost mode
>
> three websites with three different domain names.
>
> Let's say 1) example.com, example.net, example.org all pointing eventually
> to the same IP address
> as I said it worked perfectly but it started to slow down a bit as the load
> gets too much for one machine to handle.
> On top of that I run other servers on different machines, such as Chat
> servers (IRC, Flash, DigiChat) , and various other applications.
>
> Now, I am using this machine as a proxy server (reverse proxy server) and a
> router at the same time using iptables, and I use another machine as a
> DNS/DHCP servers, all configured and working fine indeed no problems at all.
>
> Now, I really struggled to get the clients on my network to have access to
> the internet, I mean just to browse the net, I did in the end, but every
> single example I followed not a single one worked for me, I don't know how
> many forums and articles I read.
> I have applied so many examples no luck.
>
> So basically no requests were passed to the backend server, all I wanted is
> to get those requests forwarded to the web-server and if that works then I
> will add three more machines as backend servers and each machine will hold
> one website with it's DB and so on..
>
> That was my plan anyway, And I found myself in ever decreasing circle going
> around in circle, following some people's examples and nothing worked, I
> tried to find information for example about, how to setup a cache parent,
> sibbling and so on, not a single word about, I even read O'reilly's
> articles.
>
>
> In those examples for instance they mention a parent in order to forward a
> request, without telling you how to set a parent, and if you don't have a
> parent, does that mean you can't use a proxy server, and If I had a parent
> where would it be? and how to decide which one is the parent and which one
> is the child etc.. NO indication not a single word, "they expect" you to
> know all that as if you spent all you life working on their project, it
> never occured to them that maybe some people won't know what is a parent or
> how to set it up and so on..
>
>
> I can go on like this for a whole night, I know you're trying to help but to
> be perfectly honest I am put off by this whole thing, I don't think I want
> to use Squid at all, I reached a saturation point now.
>
> You see I know even if I get the thing off the ground now, I am sure in a
> few weeks time it will whinge at me or even in a few days time.
>
> Maybe one day if I have the time I can look into it in more details and take
> the time to understand first it's concept and the way it works, it seems to
> have it's own logic.
>
> If not I will just have to either purchase a software that does a similar
> thing or use Apache as a proxy server and see how it goes.
>
> I just want to thank you for your time and your effort in trying to help
>
> Best regards
> Adam
>
> - Original Message - From: "Nyamul Hassan" 
> To: "Squid Users" 
> Sent: Thursday, March 18, 2010 7:49 PM
> Subject: Re: [squid-users] Squid3 issues
>
>
> Please outline your scenario in detail.  If you are facing problems
> about FDs, then I take it that you already have a running instance,
> but that your load is quite high to require more than the default 1024
> FDs.
>
> Did you get your OS limits changed to more than 1024?  I modify my
> servers to 65536 whenever I'm running Squid on them.
>
> Please outline your problem in more details, so that we can help you.
>
> Regards
> HASSAN
>
>
>
>
> On Fri, Mar 19, 2010 at 1:37 AM, Gmail  wrote:
>>
>> Hello again,
>> I am not ranting, I was merely expressing my opinion, well I never said I
>> was an expert when it comes to the proxies I have never used them, this is
>> my first attempt, yes Debian and Debian based distros are very very popular
>> I have used many distros and by far I must admit that the debian is a
>> fantastic OS, now it comes to tastes, some like FreeBSD, some like OpenSuse,
>> some like Centos5, and so on I have seen many people using Fedora, but I
>> wasn't impressed by it when I used it 2 years ago. anyway as I said it's a
>> matter of taste.
>>
>> You say I couldn't get it running yes, when I follow the instruction to
>> the letter and I mean to the letter, I have changed several times the
>> config, to just forward requests to the backend server "As a test" first the
>> back end server is running on virtualhosts, that was the reason, why I
>> decided to

[squid-users] Grant access to tunnel HTTP requests via scripts

2010-03-18 Thread Carlos Lopez
Hi all,

I'm new to Squid and I was wondering if it is possible to tunnel HTTP requests
from authenticated users and then, via a script, block/allow access to HTTPS
addresses depending on the result of the script (the allowed/blocked sites
should be queried from a DB). Let's say:

user1 and user2

user1 has access to check Yahoo, Hotmail and Google mail only, and to check
bank transactions on only one specific site, so he/she may need the HTTPS port
to be open (HTTPS and HTTP are blocked on the firewall), but at the same time
do some filtering, to restrict him/her from browsing, for example, adult sites.

user2 gets access only to browse through the HTTP port, also with some filters
via a script (for example, blocking access to webchat links).

Is it possible to do with squid?
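
[A hedged sketch, not an answer from this thread, of how a script-driven,
per-user decision is usually wired in via an external ACL helper; the helper
path, refresh interval and ACL names are assumptions:]

  external_acl_type db_check ttl=60 %LOGIN %DST /usr/local/bin/check_site.sh
  acl authenticated proxy_auth REQUIRED
  acl allowed_by_db external db_check
  http_access allow authenticated allowed_by_db
  http_access deny all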

Thanks for your help.

Carlos.


  



[squid-users] media center and squid -- telling squid to pass 'direct' to allow http1.1?

2010-03-18 Thread Linda Walsh

Has anyone gotten Windows media center to work through squid?

I just tried it and whenever it got to 'content', I saw lots of
"bad-gateway" messages right after the HTTP/1.0 returned by squid.  Just
before that, I saw a bunch of SSDP requests looking for HTTP/1.1 -- but the
only thing it got back was an HTTP/1.0.

The remote server kept sending back "bad gateway" -- after about 30
attempts, the client gave up and returned "Video not available at this
time."...

So what I'm wondering is if it is possible to have squid not cache
attempts 

From there, I eventually (on the player) got "Video is not available at
this time".

So I was wondering if it was possible to set up some sort of ACL-type list
to tell squid to pass through requests requiring 1.1 so they wouldn't fail
-- they wouldn't be cached, but better not cached than complete failure.

Is this possible or has anyone done this?

Thanks in advance!..

Linda






Re: [squid-users] Java not working behind squid

2010-03-18 Thread Thomas Klein

Amos Jeffries wrote:

On Wed, 17 Mar 2010 23:21:44 +0100, Thomas Klein
 wrote:
  

Truth Seeker wrote:



http_access deny !AuthorizedUsers

... performs authentication. Which was your problem with Java...

order is important!

So does it mean I need to put them as the following:

### For JAVA
acl Java browser Java/1.4 Java/1.5 Java/1.6
acl testnet src 192.168.7.0/24
acl testnet src 192.168.8.0/24
http_access allow testnet Java
 
http_access deny !AuthorizedUsers





Yes, when I modified it as above, it's working fine.

Now another doubt: will this solve the issues related to all the Java
sites?

  
  

Hi there,

I actually also have the problem that Java applications are in no way
able to get a working connection to the internet, and this workaround with
the example of http://www.dailyfx.com/ doesn't work for me in any case.

My test user matches the acl "gruppe_vollzugriff". I'm using
2.7.STABLE3-4.1 on Debian Lenny with squidGuard 1.4. I also use NTLM
auth against an AD.


If I do it in this way:

acl gruppe_standarduser external wbinfo_group Proxygruppe-Standarduser
acl gruppe_vollzugriff external wbinfo_group Proxygruppe-Vollzugriff
acl gruppe_azubis external wbinfo_group Proxygruppe-Azubis
acl gruppe_test external wbinfo_group Proxygruppe-test
acl Java browser Java/1.4 Java/1.5 Java/1.6
acl localnet src 172.1.0.0/19
...
http_access allow localnet Java
http_access allow gruppe_azubis erlaubte_seiten_azubis
http_access allow gruppe_standarduser
http_access allow gruppe_test
http_access allow gruppe_vollzugriff
http_access deny all

I get in access.log the following:
1268863619.997 13 172.1.0.128 TCP_MISS/404 0 CONNECT http:443 - 
DIRECT/- -
1268863620.008  3 172.1.0.128 TCP_MISS/404 0 CONNECT http:443 - 
DIRECT/- -
1268863620.022  3 172.1.0.128 TCP_MISS/404 0 CONNECT http:443 - 
DIRECT/- -
1268863620.034  3 172.1.0.128 TCP_MISS/404 0 CONNECT http:443 - 
DIRECT/- -



If i modify the order of the http_access line in this way:

acl gruppe_standarduser external wbinfo_group Proxygruppe-Standarduser
acl gruppe_vollzugriff external wbinfo_group Proxygruppe-Vollzugriff
acl gruppe_azubis external wbinfo_group Proxygruppe-Azubis
acl gruppe_test external wbinfo_group Proxygruppe-test
acl Java browser Java/1.4 Java/1.5 Java/1.6
acl localnet src 172.1.0.0/19
...
http_access allow gruppe_azubis erlaubte_seiten_azubis
http_access allow gruppe_standarduser
http_access allow gruppe_test
http_access allow gruppe_vollzugriff
http_access allow localnet Java
http_access deny all

I get the following output in the log:
1268864049.866  8 172.1.0.128 TCP_DENIED/407 1867 CONNECT 
balancer.netdania.com:443 - NONE/- text/html
1268864049.900  6 172.1.0.128 TCP_DENIED/407 1841 CONNECT 
balancer.netdania.com:443 - NONE/- text/html
1268864049.914  4 172.1.0.128 TCP_DENIED/407 1867 CONNECT 
balancer.netdania.com:443 - NONE/- text/html
1268864049.927  6 172.1.0.128 TCP_DENIED/407 1841 CONNECT 
balancer.netdania.com:443 - NONE/- text/html
1268864049.940  4 172.1.0.128 TCP_DENIED/407 1867 CONNECT 
balancer.netdania.com:443 - NONE/- text/html
1268864049.965 15 172.1.0.128 TCP_DENIED/407 1841 CONNECT 
balancer.netdania.com:443 - NONE/- text/html
1268864049.979  4 172.1.0.128 TCP_DENIED/407 1867 CONNECT 
balancer.netdania.com:443 - NONE/- text/html
1268864049.989  6 172.1.0.128 TCP_DENIED/407 1841 CONNECT 
balancer.netdania.com:443 - NONE/- text/html



As I described, Java isn't able to get a working connection to the
internet. What's wrong in my case? I would be glad if you have a hint
for me.



There is some form of deny line happening outside the set you showed,
which blocks the first configuration from working. The Java auth problem
blocks the second.

Amos

  
Thank you for your hint - I'm using squidGuard, and this seems to be the
problem. If I comment out the following line from squid.conf, Java works
fine:

url_rewrite_program /root/squidGuard -c /etc/squid/squidGuard.conf

Ok so far - I'm now a step closer, but I'm afraid that's not the
solution, because if I disable the content filter from squidGuard, my
boss will kill me ;)


I checked the squidGuard logfiles, but there is nothing to find about
authentication and so on; only the database updates are being logged.
Because the AD authentication from squidGuard did not work, I'm pulling,
with "net rpc group members", every 10 minutes all members of the
necessary AD groups into a local file for each access group in the
squidGuard database directory, and squidGuard looks into these files to
find the usernames there.

This works so far for general internet access, but Java seems to get
into trouble with this. It is also strange that squidGuard does not
log any information about authentication or anything about the
filtering in its logfiles - I don't know if that's ok?!?


H

[squid-users] Re: Squid3 issues

2010-03-18 Thread Linda Walsh
Gmail wrote:
> I have used many softwares, packages, compiled stuff for years, never
> ever had an experience such as this one, it's a package full of
> headaches, and problem after problem, And to be honest the feedback I
> get is always blaming other things, why can't you people just admit that
> Squid doesn't work at all, and you are not providing any help
> whatsoever, as if you expect everyone to be an expert.

 I've only seen one post by you on this list -- and that was about
increasing your Linux file descriptors at process start time in Linux --
not something in the squid software, but something you do in Linux
before you call squid.  It ***SHOULD*** be in your squid's
/etc/init.d/squid startup script -- you should see a command "ulimit -n
".

I have "ulimit -n 4096" in my squid's rc script.

It is a builtin in the "bash" shell.  I don't know where else it is
documented, but if you use the Linux-standard shell, "bash", it should
just work.  "-n" sets the number of open file descriptors.
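
[A hedged sketch of where such a line typically sits; the script path, FD count
and start command are assumptions:]

  #!/bin/sh
  # excerpt from an init script such as /etc/init.d/squid
  ulimit -n 4096        # raise the per-process FD limit first
  /usr/sbin/squid       # then start squid in that same shell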


> I uninstalled the version that was packaged with Ubuntu hardy, I am
> trying to compile it so I won't have the same problem, with the file
> descriptors, I followed exactly the suggestions in the configure --help
> menu, yet I am getting an error, like Compile cannot create executable,
> or something to that effect.

Maybe you should try a distribution where it 1) is known to work, or
2) already has a pre-compiled binary.

Try opensuse.org. It's what I use.  It works flawlessly out of the
box. (from http://www.opensuse.org/en/).

Everyone will have their favorite and tell you how well it works.
That one is mine (for the nonce).  I've been using it for several years.  The
fact that they have received seed money from Microsoft also means that
they have worked to add support for the new Vista/Win7 networking stacks,
which support various advanced device functionality (either a pain in the
ass or a bonus, depending on whether or not you have such equipment and
want it to work).

The fact that it is in there doesn't mean you can't turn it off and
delete it (which I did).  Now I am working to turn it back on as I get some
win-media enabled devices on my network.  (My new TV speaks those protocols --
but doesn't work over squid!)  My new Blu-ray DVD player (Sony), however,
used proxy autodetect (http://wpad/wpad.dat) and worked through my squid
proxy on the first try!  ...I was quite pleased with that.

So

> After three weeks I managed to get my clients to have access to the
> internet, and many applications didn't work, such as Yahoo, Msn, Steam
> and so on, when I ask for help, nobody has an answer including some
> members of the team.
-
Some of these are problems where you have to contact the application
writers and get them to use HTTP PROXIES -- because they IGNORE your
HTTP_PROXY settings and attempt to go direct.

This is due to no fault of squid, but of the misbehaving applications.
The only way to proxy them would be to use a transparent proxy, which is both
a pain and maybe not worth the bother, as you have to let them connect to
any address at port "whatever"; not all use port 80.

Worse -- not all use TCP -- some use UDP, which squid doesn't handle at
all.  In those cases, all you can do is set up NAT on your firewall and let
them talk through it.  Not great for security, but the writers of those
apps don't care about your security -- just their apps.  So you conform to
them or you don't run their apps -- nothing to do with squid.


None of this has anything to do with squid people -- nearly all your
problems are with the apps you are running -- they write their apps NOT to
work with proxies.  When they do that -- they are not going to work with
squid. 

Only well-behaved apps that work through some proxy (ANY PROXY!) will
work with squid.  Those that are ill behaved are just poorly behaved
children that refuse to 'get with the program'...

Whatcha gonna do?


> If anybody can prove me wrong:

Consider yourself "proven wrong"... you are pointing your fingers in the
wrong place.

*peace*, Linda




[squid-users] Resolved: Re: [squid-users] Time-based oddity that I can't quite nail down...

2010-03-18 Thread Kurt Buff
On Thu, Nov 12, 2009 at 17:49, Amos Jeffries  wrote:
> Kurt Buff wrote:
> 
>>
>>> By testing by running a browser on the proxy server itself you can
>>> identify if the problem is Squid or something outside Squid.
>>
>> Unless you think lynx (or some other text-based browser, or using
>> wget/curl/other) is sufficient, I am leery of doing this. I'd have to
>> install all the bits to do this, including X and a browser, and all
>> that. I'm all up for learning, though, so your suggestion on how to do
>> this is very warmly welcomed.
>
> The squidclient tool fits into the niche gap between telnet and lynx.
> Allowing a simple input of the request URL and optional other details and
> producing a dump of the results. It has no large dependencies.
>
> To test through the proxy:
>  squidclient http://example.com/
>
> To test without proxy:
>  squidclient -h example.com -p 80 /
>
>
>>
>> I was unable to do any testing in the past few days - a new child is
>> taking me away from working longer hours...
>>
>> I hope to do some work on it this weekend.
>>
>> Kurt
>
> Congratulations!
>
> Amos

Sorry for the very late followup, but I thought I'd outline what happened:

For diagnostics, I finally ended up using cURL on three different
FreeBSD boxes: One is the router for our DS3 and outside our firewall,
and two are internal (used for various network monitoring/management
functions) with one of those using our Active Directory DNS servers,
and the other using our ISPs DNS servers. This allowed me to test
several things at once - inside firewall, outside firewall, internal
DNS servers used and external DNS servers used.

I selected from the squid logs a set of around 2100 URLs from a single
day (basically, any url with 'www' in it), and pared them down to the
bare http://FQDN url, and used those as a comparison set across all
three machines for multiple runs.

Several nights (and days!) I ran the following command at the same
time (using at(1)) on each machine:

 /usr/local/bin/curl -K /root/urls.txt >> /root/DATE-curl-MachineName.out

Each entry in urls.txt file looks like the following:

 url = "http://h30333.www3.hp.com";
 -s
 -w = "%{url_effective}\t%{time_total}\t%{time_namelookup}\n"
 -o = /dev/null

I narrowed it down to that set of options because I quickly figured
out that squid wasn't the issue, after routing the requests through it as a
proxy in urls.txt with the '-x' option.
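
For reference, a urls.txt entry routed through the proxy would have looked
something like this (the proxy address here is a placeholder, not the one
used in the tests):

 url = "http://h30333.www3.hp.com"
 -s
 proxy = "http://192.168.0.1:3128"
 -w = "%{url_effective}\t%{time_total}\t%{time_namelookup}\n"
 -o = /dev/null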

I had suspected it was the firewall and its name resolution times,
which is why I included the '\t%{time_namelookup}' parameter, but that
was not the issue.

So, to end the suspense - it was the firewall's http(s) proxy, with
off-hours results for cURL behind the firewall at over 20 seconds
average page load time across the URL set, with some taking as long as
360+ seconds. I did no analysis of which pages were worst, nor why.

Applying patches to bring it current for the particular rev of the
software fixed the problem completely.

It was a weird and frustrating problem, but I feel pretty good about solving it.

Kurt


[squid-users] error libcap2 --

2010-03-18 Thread Ariel
Hello... can someone please help me with this error? I have been
struggling with it for more than a week and only just realized it is asking me for this:

on CentOS 5.4 i386, kernel 2.6.30, iptables 1.4.5

it asks for libcap2 and libcap2-dev, but they do not exist in CentOS
5.3, and I am following this guide to install it:
http://www.eu.squid-cache.org/mail-archive/squid-users/200906/0602.html
Does anyone have a way to fix this?




[squid-users] Requests through proxy take 4x+ longer than direct to the internet

2010-03-18 Thread David Parks
Hi, I set up a dev instance of squid on my windows system.

I've configured 2 browsers (Chrome & Firefox), chrome direct to the
internet, firefox through the locally running instance of squid.

I expected similar response times from the two browsers, but I consistently
see firefox (configured to proxy through squid) takes 4x+ longer.

Below are the logs showing response times from a hit on yahoo.com; the
Chrome browser opened the page in under ~2 seconds.

I have used the windows binaries of squid and configured digest password
authentication, everything else (other than default port) is left as default
in the config file.

After doing a packet capture I noted the following behavior:

   - When going through the proxy: 9 GET requests are made, and 9 HTTP
responses are received in a reasonable time period (<2sec)
   - After the 9th HTTP response is sent, there is a 4 second delay until
the next GET request is made
   - Then 6 GET requests are made, and 6 HTTP responses are received in a
reasonable amount of time.
   - After the 6th GET request in this second group there is a 5 second
delay until the next GET request is made.
   - This pattern repeats itself when the proxy is in use.
   - This pattern does not occur when I am not connected through the proxy.

Any thoughts on this behavior?

Thanks much,
David


Yahoo example log:

1268958646.966    417 127.0.0.1 TCP_MISS/301 602 GET http://yahoo.com/ test
DIRECT/67.195.160.76 text/html
1268958652.263   5289 127.0.0.1 TCP_MISS/302 748 GET http://www.yahoo.com/
test DIRECT/209.191.122.70 text/html
1268958658.997   6726 127.0.0.1 TCP_MISS/200 38900 GET http://mx.yahoo.com/?
test DIRECT/209.191.122.70 text/html
1268958664.895   5132 127.0.0.1 TCP_MISS/200 1616 GET
http://d.yimg.com/a/i/ww/met/pa_icons/gmail_22_052809.gif test
DIRECT/189.254.81.8 image/gif
1268958664.908   5142 127.0.0.1 TCP_MISS/200 1118 GET
http://d.yimg.com/a/i/ww/met/pa_icons/glamout_22_012010.gif test
DIRECT/189.254.81.8 image/gif
1268958666.087   6140 127.0.0.1 TCP_MISS/200 32906 GET
http://l.yimg.com/br.yimg.com/i/img2/200911/111609_fp_movil.swf test
DIRECT/189.254.81.35 application/x-shockwave-flash







[squid-users] Squid not caching anything

2010-03-18 Thread jayesh chavan
Hi,
 My squid is working but not caching anything. What is the
problem? Whenever I use the purge command for any URL, it replies 404 Not
Found.
Regards,
   Jayesh


Re: [squid-users] Re: Squid3 issues

2010-03-18 Thread Amos Jeffries

Linda Walsh wrote:

Gmail wrote:

I have used many softwares, packages, compiled stuff for years, never
ever had an experience such as this one, it's a package full of
headaches, and problem after problem, And to be honest the feedback I
get is always blaming other things, why can't you people just admit that
Squid doesn't work at all, and you are not providing any help
whatsoever, as if you expect everyone to be an expert.


 I've only seen one post by you on this list -- and that was about



"Gmail" (Adam?),
  I think most of the problem communicating with us is that your 
replies outlining the problems are going to individual people, not to 
the list itself. Those of us here who might be able to help with the 
secondary problems are not even hearing about them.


What I've seen is:
  you post a problem description, somebody posts a solution that 
_should_ work under some circumstances and could act as a pointer for 
further fixes or research if you understand it right.
 Then there is no further response from you, which in these parts indicates 
you are happy with the solution and have moved on to other problems at your 
workplace.

 The rest of us make that assumption and move on to other people's problems.


To fix this breakdown in communication:

 If you are using the gmail interface there are advanced reply 
options that need to be set up. If you can do "Reply-To List" or 
"Reply-To All" the list should start getting the mails (check that the 
list address 'squid-users' is in the recipient set before sending 
anyway, just to be sure).


 Other mailers tend to have those reply-to-all features somewhere as 
well, and more easily available.




increasing your linux file descriptors at process start time in linux
-- not something in the squid software -- but something you do in linux
before you call squid.  It *** SHOULD*** be in your squid's
/etc/init.d/squid startup script. --  you should see a command "ulimit -n
".

I have "ulimit -n 4096" in my squid's rc script.

It is a builtin in the "bash" script.  I don't know where else it is
documented, but if you use the linux-standard shell, "bash", it should
just work.  "-n" sets the number of open file descriptors.



FWIW, myself or Luigi of Debian are the contact people for Squid 
problems on Ubuntu.


In Ubuntu and other Debian-derived OSes it seems to be limited by both 
ulimit and the setting in /proc/sys/fs/file-max.


The "squid" package from 2.7+ alters /proc/sys/fs/file-max as needed, 
and provides a max_fd configuration option for run-time settings. Plus 
adding a new "SQUID_MAXFD=1024" in /etc/default/squid increases the 
global limit set into /proc.


The "squid3" package does not alter /proc, but changing ulimit in 
/etc/init.d/squid3 can allow up to the /proc amount of FD to be used.
 The early 3.0 packages were built with an absolute max of 1024 FD, I 
think the newer ones are built with a higher limit
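
Pulling those knobs together, a hedged sketch of where each limit lives on
Debian/Ubuntu (the values are examples only, not recommendations):

  # /etc/default/squid   (the "squid" 2.7+ package)
  SQUID_MAXFD=4096

  # /etc/init.d/squid3   (the "squid3" package) -- raise the shell limit
  # before squid is started
  ulimit -n 4096

  # kernel-wide ceiling that both of the above are bounded by
  cat /proc/sys/fs/file-max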





I uninstalled the version that was packaged with Ubuntu hardy, I am
trying to compile it so I won't have the same problem, with the file
descriptors, I followed exactly the suggestions in the configure --help
menu, yet I am getting an error, like Compile cannot create executable,
or something to that effect.


Maybe you should try a distribution where it is 1) known to work, or
2) already has a pre-compiled binary.


Linda,
 Ubuntu Hardy is one such. But the old packages have a low FD limit built in.

"Gmail",
  Regarding your earlier complaints which Nyamul Hassan kindly 
forwarded back to the list for the rest of us to see...


 Yes, we know squid-3.0 (particularly the early releases) was very 
problematic. These problems have mostly been fixed over the last few 
years as people reported them. You seem to have been stuck with an old 
OS distribution release and thus an old, non-changeable squid version.


 If you are not tied to the LTS support, I would suggest trying an 
upgrade to Ubuntu Jaunty or Karmic. The Hardy "squid3" package has a lot 
of known and fixed issues.


 Yes, I read your reply to Nyamul indicating you were trying to build 
your own. Squid 3.x is mostly developed on Debian and Ubuntu. Your build 
problems are a mystery.  Self-builds usually fail due to wanting 
features built but not having the development libraries needed. If you 
want to continue the self-build route we can help, but will need to know 
exactly what the error messages are that you face.


I'm awaiting your response to Nyamul Hassan's last question before 
commenting on the config details for your setup.


Amos
--
Please be using
  Current Stable Squid 2.7.STABLE8 or 3.0.STABLE25
  Current Beta Squid 3.1.0.18


Re: [squid-users] Squid proxy Setup in fail-over mode

2010-03-18 Thread Amos Jeffries

GIGO . wrote:

How to setup squid proxy to run in fail-over mode? Any guide.
 


There is no such mode in Squid.

As the other respondents have said so far, to have fail-over from your 
users' perspective when squid dies you need multiple squid instances and 
some load balancer setup (WPAD counts as a load balancer).


To have squid perform fail-over between multiple web servers where 
data is sourced, you need do nothing in particular. This is how squid is 
designed to work.
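
As an illustration of that origin-server fail-over, a minimal reverse-proxy
sketch (the addresses and peer names below are placeholders, not from this
thread):

  http_port 80 accel defaultsite=www.example.com
  cache_peer 192.168.0.10 parent 80 0 no-query originserver round-robin name=web1
  cache_peer 192.168.0.11 parent 80 0 no-query originserver round-robin name=web2
  # if one origin stops answering, Squid marks that peer dead and sends
  # requests to the remaining one until it recovers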


Amos
--
Please be using
  Current Stable Squid 2.7.STABLE8 or 3.0.STABLE25
  Current Beta Squid 3.1.0.18


Re: [squid-users] Squid Accelerator mode: HTTP/1.0 and 'defaultsite' header

2010-03-18 Thread Amos Jeffries

Riccardo Castellani wrote:
Most clients these days will do so regardless of their version. 
defaultsite is completely optional, in your case if you omit it broken 
clients will get the squid "invalid request" error page instead of 
tomcat front page


If I insert 'defaultsite', I think that for HTTP/1.0 clients:

the Host header (pages.example.com) is present in the request, but I 
think the HTTP packet contains the "GET command" with the complete URL (e.g. 
http://pages.example.com/mkLista.do?code=A), so they will be able to ask for 
the correct URL.

Why do you say "... instead of tomcat front page"?
The "Tomcat front" page appears only when you request http://pages.example.com.



HTTP standards require clients to send the Host: header.
If they do not, squid looks for a configured defaultsite= and uses that 
instead; if neither is present the client gets an error page.


When defaultsite is set, squid will use it and pass the broken 
request on to tomcat, resulting in the tomcat response for whatever URL 
was requested.
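
For HTTP/1.0 clients that genuinely omit the Host header, a hedged sketch of
the combination being discussed (only the defaultsite value comes from this
thread; the port and other details are assumptions):

  # vhost       -> use the Host: header to pick the site when it is present
  # defaultsite -> site assumed for requests that arrive without a Host: header
  http_port 80 accel vhost defaultsite=pages.example.com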


Amos
--
Please be using
  Current Stable Squid 2.7.STABLE8 or 3.0.STABLE25
  Current Beta Squid 3.1.0.18


Re: [squid-users] Cancelled downloads

2010-03-18 Thread Amos Jeffries

CASALI COMPUTERS - Michele Brodoloni wrote:

Hello,
is it possible to stop squid from continuing to download a file when a user stops the 
download from his browser?
If a user initiates a 1GB web download and then hits “cancel”, squid 
doesn’t mind and continues to download until it finishes, and this is a 
waste of bandwidth.

Is there a solution for this behavior?



This is the default behaviour of Squid.

Check your configuration settings for:
 http://www.squid-cache.org/Doc/config/quick_abort_max/
 http://www.squid-cache.org/Doc/config/quick_abort_min/
 http://www.squid-cache.org/Doc/config/quick_abort_pct/
 http://www.squid-cache.org/Doc/config/range_offset_limit/
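
For example, a hedged squid.conf snippet (values are illustrative, not
recommendations) that makes Squid abort the server-side fetch as soon as the
client disconnects:

  quick_abort_min 0 KB
  quick_abort_max 0 KB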


Amos
--
Please be using
  Current Stable Squid 2.7.STABLE8 or 3.0.STABLE25
  Current Beta Squid 3.1.0.18


Fwd: [squid-users] Squid3 issues

2010-03-18 Thread Nyamul Hassan
Hi,

As a normal courtesy on regular mailing lists, it is more appropriate
to use your "regular name", rather than just "GMail".  The answers on
this list still come from humans, and it's always nice to know the
name of the person we're communicating with.

Also, in one of your emails, you said that you had a FD problem, which
can only happen if you have a working Squid, which is processing a lot
of requests.  Please confirm if that is correct.

And, if you're seeing this, then I believe you have already read
Amos's post.  I'm forwarding this to the list.  I'm more of a "forward
proxy" guy, so the more adept members of the list would be more
helpful in your scenario.

Regards
HASSAN




-- Forwarded message --
From: Gmail 
Date: Fri, Mar 19, 2010 at 3:29 AM
Subject: Re: [squid-users] Squid3 issues
To: Nyamul Hassan 


I'd rather use it in a hosting-like setup, considering I have other
clients, not only the webservers;
so, if it's possible, which I believe it is, I'd like to use it as a hosting setup.
Thanks

Let me give you a quick insight of my network

All my machines run Ubuntu Hardy 8; my network is based on 192.1.1.0/24
1) DNS / DHCP   Examples (192.168.1.1)
2) Router (Squid) Proxy    (192.168.1.4)
3) Webserver  xxx.xxx.x. 5
4) Webserver   xxx.xxx.x.6
5) Webserver  xxx.xxx.x 7
6) IRC Server xxx.xxx.110
7) Digichat 100% (java) / Flash Servers xxx.xxx.x 112
8) Windows XP clients range 192.168.1.3 - 192.168.1.2 - 192.168.1.8 -
192.168.1.111 - 192.168.1.113
Other machines are not connected yet
The above are just examples
Two network switches

Hope that helps
Thanks



- Original Message - From: "Nyamul Hassan" 
To: "Squid Users" 
Sent: Thursday, March 18, 2010 9:05 PM
Subject: Re: [squid-users] Squid3 issues


So, do you want to use the proxy in an ISP-like setup?  Or in a
web-hosting-like setup?

Regards
HASSAN




On Fri, Mar 19, 2010 at 2:25 AM, Gmail  wrote:
>
> Ok I'll try and clarify it (thanks btw)
> I am running 3 websites on one single machine and have been for few years,
> then the load started to grow, then I decided to have a go at a proxy
> server:
> I was actually putting off for a couple of years, simply because I am very
> restricted time wise
> I have as I said 3 different websites running on one single machine in a
> vhost mode
>
> three websites with three different domain names.
>
> Let's say 1) example.com, example.net, example.org all pointing eventually
> to the same IP address
> as I said it worked perfectly but it started to slow down a bit as the load
> gets too much for one machine to handle.
> On top of that I run other servers on different machines, such as Chat
> servers (IRC, Flash, DigiChat) , and various other applications.
>
> Now, I am using this machine as a proxy server (reverse proxy server) and a
> router at the same time using iptables, and I use another machine as a
> DNS/DHCP servers, all configured and working fine indeed no problems at all.
>
> Now, I really struggled to get the clients on my network to have access to
> the internet, I mean just to browse the net, I did in the end, but every
> single example I followed not a single one worked for me, I don't know how
> many forums and articles I read.
> I have applied so many examples no luck.
>
> So basically no requests were passed to the backend server, all I wanted is
> to get those requests forwarded to the web-server and if that works then I
> will add three more machines as backend servers and each machine will hold
> one website with it's DB and so on..
>
> That was my plan anyway, And I found myself in ever decreasing circle going
> around in circle, following some people's examples and nothing worked, I
> tried to find information for example about, how to setup a cache parent,
> sibbling and so on, not a single word about, I even read O'reilly's
> articles.
>
>
> In those examples for instance they mention a parent in order to forward a
> request, without telling you how to set a parent, and if you don't have a
> parent, does that mean you can't use a proxy server, and If I had a parent
> where would it be? and how to decide which one is the parent and which one
> is the child etc.. NO indication not a single word, "they expect" you to
> know all that as if you spent all you life working on their project, it
> never occured to them that maybe some people won't know what is a parent or
> how to set it up and so on..
>
>
> I can go on like this for a whole night, I know you're trying to help but to
> be perfectly honest I am put off by this whole thing, I don't think I want
> to use Squid at all, I reached a saturation point now.
>
> You see I know even if I get the thing off the ground now, I am sure in a
> few weeks time it will whinge at me or even in a few days time.
>
> Maybe one day if I have the time I can look into it in more details and take
> the time to understand first it's concept and the way it works, it seems to
> have it's own logic.
>
> If not I will just have to either purchase a so

Re: [squid-users] error libcap2 --

2010-03-18 Thread Amos Jeffries

Ariel wrote:

Hello... can someone please help me with this error? I have been
struggling with it for more than a week and only just realized it is asking me for this:

on CentOS 5.4 i386, kernel 2.6.30, iptables 1.4.5

it asks for libcap2 and libcap2-dev, but they do not exist in CentOS
5.3, and I am following this guide to install it:
http://www.eu.squid-cache.org/mail-archive/squid-users/200906/0602.html
Does anyone have a way to fix this?



What error?

As I understand it, "libcap2" is a piece of system software, not an error.

Could you please clarify what problem you have hit?


Amos
--
Please be using
  Current Stable Squid 2.7.STABLE8 or 3.0.STABLE25
  Current Beta Squid 3.1.0.18