RE: [squid-users] Can squid acted as a application SSL proxy

2008-11-27 Thread 李春

Thanks for your reply.

> Date: Fri, 28 Nov 2008 16:19:36 +1300
> From: [EMAIL PROTECTED]
> To: [EMAIL PROTECTED]
> CC: [EMAIL PROTECTED]; squid-users@squid-cache.org
> Subject: Re: [squid-users] Can squid acted as a application SSL proxy
> 
> 李春 wrote:
>> Thanks for your help.
>> But I am sorry, you may have mistaken my meaning entirely.
>> I do not need the HTTP proxy and cache functionality of squid.
>> I just wonder whether squid can receive the client's SSL connection (or
>> packets), decrypt it, and transfer the data without SSL to the server as
>> a transparent layer.
>> squid using SSL may be like this: 
>> 
>> http data
>> 
>> SSL
>> 
>> TCP/IP
>> 
>> But I wonder if the squid can act like this
>> 
>> my application data
>> 
>> SSL
>> 
>> TCP/IP
>> 
>> Thanks very much.
>> yours,
>> Pickup.Li
>> 
> 
> You seem to misunderstand the network layering concept.
> 
> You want something that connects to clients using HTTPS (HTTP/SSL) and
> connects them to your application using plain HTTP?
> 
> The name for such configuration is "reverse proxy".
> http://wiki.squid-cache.org/SquidFaq/ReverseProxy
> 
> Only the front listening port is configured with https_port instead of
> http_port.
> 
Yes. I want to build a reverse proxy with squid,
and I have managed to build it with "https_port" in my environment.
But my client is not a web browser but an application.


> caching is optional.
> 
> The action of wrapping/unwrapping SSL requires a proxy of some type,
> sometimes called a tunnel agent.
> 
Yes, that is exactly it. I just want the SSL wrapping/unwrapping proxy
and wonder whether squid can be configured as one.
If any other open source project specializes in this, please let me know.
I really appreciate your help.
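For reference, a minimal squid.conf sketch of this kind of reverse proxy (the hostname, certificate paths, backend address and port below are placeholders, and it assumes the traffic inside the SSL connection is HTTP, since squid only proxies HTTP):

  # terminate SSL on the front port
  https_port 443 accel cert=/etc/squid/example.crt key=/etc/squid/example.key defaultsite=app.example.com
  # forward plain HTTP to the backend
  cache_peer 192.0.2.10 parent 8080 0 no-query originserver name=appserver
  acl app_site dstdomain app.example.com
  cache_peer_access appserver allow app_site
  http_access allow app_site
  http_access deny all

Clients connect with SSL to port 443; squid unwraps the SSL and passes plain HTTP to 192.0.2.10:8080.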
 

> Amos
> 
> 
> 
>> 
>> 
>> 
>>> Date: Thu, 27 Nov 2008 15:54:08 +0100
>>> From: [EMAIL PROTECTED]
>>> To: squid-users@squid-cache.org
>>> Subject: Re: [squid-users] Can squid acted as a application SSL proxy
>>>
>>> On 27.11.08 09:45, 李春 wrote:
>>>
>>> Please configure your mailer to wrap lines below 80 characters per line.
>>>
 I have a client/server application program and want to add an SSL module to
 it to secure the data transferred on the network. I wonder whether I can
 use squid as an SSL proxy between client and server. Squid would be
 configured as a reverse proxy and located in the application server's
 environment. The client and squid communicate over an SSL connection,
 like this:

 Server <--(no SSL)--> Squid <--(SSL)--> client

 I know squid can act as a web proxy like this using "https_port", but I am
 curious whether I can make use of squid like this.
>>> Yes, that's what https_port is for. Just properly configure squid as reverse
>>> proxy.
>>>
>>> -- 
>>> Matus UHLAR - fantomas, [EMAIL PROTECTED] ; http://www.fantomas.sk/
>>> Warning: I wish NOT to receive e-mail advertising to this address.
>>> Varovanie: na tuto adresu chcem NEDOSTAVAT akukolvek reklamnu postu.
>>> I wonder how much deeper the ocean would be without sponges. 
> 
> 
> -- 
> Please be using
> Current Stable Squid 2.7.STABLE5 or 3.0.STABLE10
> Current Beta Squid 3.1.0.2


Re: [squid-users] 2 squid server

2008-11-27 Thread ░▒▓ ɹɐzǝupɐɥʞ ɐzɹıɯ ▓▒░
Yes, I found the problem:
it's because Server A (squid) uses a round-robin parent to server B,
which has HAVP and squid on one machine :(

Thanks for the support and info.

On Fri, Nov 28, 2008 at 10:25 AM, Amos Jeffries <[EMAIL PROTECTED]> wrote:
> ░▒▓ ɹɐzǝupɐɥʞ ɐzɹıɯ ▓▒░ wrote:
>>
>> Hi all,
>> I have a problem here.
>>
>> server A 192.168.222.111 squid port 2210
>> server B 192.168.222.100 squid port 2012
>>
>> When I put this line on Server A as the first line:
>> cache_peer 192.168.222.100 parent 2012 0 no-query no-digest default
>>
>> and I put this line on server B as the first line:
>> cache_peer 192.168.222.111 sibling 2012 0 no-query no-digest default
>>
>> and after that:
>> acl manager proto cache_object
>> acl all src 0.0.0.0/0.0.0.0
>> acl localhost src 127.0.0.1
>> acl SSL_ports port 443 563
>> acl Safe_ports port 21 80 81 53 143 2443 443 563 70 210 1025-65535
>> acl Safe_ports port 280
>> acl Safe_ports port 488
>> acl Safe_ports port 591
>> acl Safe_ports port 777
>> acl CONNECT method CONNECT
>>
>> I put:
>> cache_peer_access 192.168.222.111 allow all
>>
>> Somehow it doesn't work;
>> at server B's log it says 192.168.222.111 DENIED.
>>
>> My questions:
>> 1. Why does it say denied?
>> 2. If I have user rules on server A, will those rules still work, or
>> must I put the rules again on server B?
>>
>> ~~~ it's urgent, but not life-or-death ^^
>
>  http://wiki.squid-cache.org/SquidFaq/ReverseProxy
>
> note the config examples for standard reverse proxy.
>
> there is no reason why server at 192.168.222.100 or Squid should be asked to
> supply http://192.168.222.111/  requests in the first place.
>
> Amos
> --
> Please be using
>  Current Stable Squid 2.7.STABLE5 or 3.0.STABLE10
>  Current Beta Squid 3.1.0.2
>





Re: [squid-users] ICAP help

2008-11-27 Thread malmeida
Thanks Christos,

Found my error: a spelling mistake, "downloads" instead of "download".

But how come eicar.com and eicar.com.txt didn't have any problem? The problem
was only with compressed files, I guess because they need to be downloaded
and scanned.

One more question:
is it possible to scan (download) any HTTPS request?

//Remy 


On Thu, 27 Nov 2008 23:20:17 +0200, Christos Tsantilas
<[EMAIL PROTECTED]> wrote:
> Hi Remy,
> 
>   OK so squid use the ICAP server and probably the squid part of your 
> configuration is OK.
> 
> Please look on both squid logs and icap server logs for error messages. 
> Should exist something in the logs which explains the reason of the
error.
> 
> Also look in your c-icap configuration. For example, has the c-icap 
> server write access to all directories in which is trying to write? The 
> /var/tmp and /tmp/download/ directories in your case. (Also c-icap has 
> its own mailing list probably you should ask here..)
> 
> --
>Christos
> 
> [EMAIL PROTECTED] wrote:
>> Thanks Christos,
>> 
>> after purging it form squid cache it work fine able to scan.
>> But now another problem when I try to download a zip virus file
>> http://www.eicar.org/download/eicar_com.zip
>> 
>> ERROR in the browser
>> The following error was encountered while trying to retrieve the URL:
>> http://www.eicar.org/download/eicar_com.zip
>> 
>> ICAP protocol error.
>> 
>> The system returned: [No Error]
>> 
>> This means that some aspect of the ICAP communication failed.
>> 
>> Some possible problems are:
>> 
>> *
>> 
>>   The ICAP server is not reachable.
>> *
>> 
>>   An Illegal response was received from the ICAP server.
>> 
>> 
>> //Remy
>> 
>>



Re: [squid-users] Cache_dir more than 10GB

2008-11-27 Thread Adrian Chadd
2008/9/29 Amos Jeffries <[EMAIL PROTECTED]>:

>  Squid-2 has issues with handling of very large individual files being
> somewhat slow.

Only if you have an insanely large cache_mem and
maximum_object_size_in_memory setting. Very large individual files on
disk are handled just as efficiently across all Squid versions.

If it's kept low, it performs just fine.
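For example (the values below are illustrative only, not a recommendation from this thread); the point is to keep the in-memory object ceiling modest while still allowing large objects on disk:

  # memory cache kept modest
  cache_mem 256 MB
  maximum_object_size_in_memory 128 KB
  # large objects are still cacheable, but go to disk
  maximum_object_size 200 MB
  cache_dir aufs /cache/squid 50000 64 256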




Adrian


Re: [squid-users] tuning an overloaded server

2008-11-27 Thread Adrian Chadd
Gah, the way they work is really quite simple.

* ufs does the disk io at the time the request happens. It used to try
using select/poll on the disk fds from what I can gather in the deep,
deep dark history of CVS but that was probably so the disk io happened
in the next IO loop so recursion was avoided.

* aufs operations push requests into a global queue which are then
dequeued by the aio helper threads as they become free. The aio helper
threads do the particular operation (open, close, read, write, unlink)
and then push the results into a queue so the main squid thread can
handle the callbacks at a later time.

* diskd operations push requests into a per storedir queue which is
then dequeued in order, one operation at a time, by the diskd helper.
The diskd helper does the normal IO operations (open, close, read,
write, unlink) and holds all the disk filedescriptors (ie, the main
squid process doesn't hold open the disk FDs; they're just given
"handles".) The diskd processes do the operation and then queue the
result back to the main squid process which handles the callbacks at a
later time.

AUFS works great where the system threads allow for concurrent
blocking syscalls. This meant Linux (linuxthreads being just
processes) and Solaris in particular worked great. The BSDs used
userland threads via a threading library which "wrapped" syscalls to
try and be non-blocking. This wouldn't work for disk operations and so
a disk operation stalled all threads in a given process. diskd, as far
as I can gather (Duane would know better!) came into existence to
solve a particular problem or two, and one of those problems was the
lack of scalable disk IO available in the BSDs.

FreeBSD in particular has since grown a "real" threading library which
supports disk IO happening across threads quite fine.

The -big- difference right now is how the various disk buffer cache
and VM systems handle IO. By default, the AUFS support in Squid only
uses the aio helper threads for a small subset of the operations. This
may work great under Linux but operations such as write() and close()
block under FreeBSD (think 'writing out metadata', for example) and
this mostly gives rise to the notion of Linux "being better" by most
people who haven't studied the problem in depth. :)
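As a concrete illustration of the two store types being compared (the sizes and paths are placeholders):

  # aufs: threaded asynchronous I/O; works well where concurrent blocking
  # syscalls are cheap (e.g. Linux)
  cache_dir aufs /cache1/squid 20000 16 256

  # diskd: external helper processes do the disk I/O; historically the
  # better choice on the BSDs
  cache_dir diskd /cache1/squid 20000 16 256 Q1=64 Q2=72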

hope that helps,



Adrian

2008/11/27 Amos Jeffries <[EMAIL PROTECTED]>:
> B. Cook wrote:
>>
>> On Nov 22, 2008, at 7:30 AM, Amos Jeffries wrote:
>>
>> 8< -- snip -- >8
>>

>>>
>>> That said BSD family of systems get more out of diskd than aufs in
>>> current Squid.
>>>
>
>>> --
>>> Please be using
>>>  Current Stable Squid 2.7.STABLE5 or 3.0.STABLE10
>>>  Current Beta Squid 3.1.0.2
>>
>> Hello,
>>
>> Sorry to bother..
>>
>> so even in any FreeBSD (6.3, 7.0, etc..) diskd is still better than aufs?
>>
>> and if so,
>>
>> http://wiki.squid-cache.org/Features/DiskDaemon
>>
>> this page talks about 2.4
>>
>> and I can't seem to find an aufs page.. I can find coss, but coss has been
>> removed from 3.0..
>>
>> so again, diskd should be what FreeBSD users use?  As well as the kernel
>> additions?  Even on 6.3 and 7.0 machines amd64 and i386 alike?
>
> Yes. We have some circumstantial info that leads us to believe it's probably a
> bug in the way Squid uses AUFS and the underlying implementation differences
> between FreeBSD and Linux. We have not yet had anyone investigate deeply and
> correct the issue. So it's still there in all Squid releases.
>
>
>>
>> Thanks in advance..
>>
>> (I would think a wiki page on an OS would be very useful.. common configs
>> for linux 2.x and BSD, etc.. )
>>
>> Many people are not as versed in squid as the developers, and giving them
>> guidelines to follow would probably make it easier for them to use.. imho.
>>
>> They don't understand coss vs aufs vs diskd vs ufs.. ;)
>
> We are trying to get there :). It's hard for just a few people and
> non-experts in many areas at that. So if anyone has good knowledge of how
> AUFS works jump in with a feature page analysis.
>
> What we have so far in the way of config help is explained at
> http://wiki.squid-cache.org/SquidFaq/ConfiguringSquid#head-ad11ea76c4876a92aa1cf8fb395e7efd3e1993d5
>
> Amos
> --
> Please be using
>  Current Stable Squid 2.7.STABLE5 or 3.0.STABLE10
>  Current Beta Squid 3.1.0.2
>
>


Re: [squid-users] Disk space over limit Warning

2008-11-27 Thread Amos Jeffries

Wilson Hernandez - MSD, S. A. wrote:

Yes. I did run squid -z and it created all the directories.

Paul Bertain wrote:

Hi Wilson,

Did you run "squid -z" after changing your settings?  For themto take 
effect, I believe you need to run "squid -a" again.


Paul



On Nov 28, 2008, at 15:53, "Wilson Hernandez - MSD, S. A." 
<[EMAIL PROTECTED]> wrote:



Hello;

I currently have a network with about 30 users and my swap space 
tends to fill up quite quickly. I increased the swap three weeks ago 
from:


#cache_dir ufs /var/log/squid/cache 5000 16 256 to
cache_dir ufs /var/log/squid/cache 1 255 255

Now, I'm getting the same warning:

2008/11/27 13:59:00| WARNING: Disk space over limit: 10241036 KB > 
1024 KB


If I leave it as is, will I have problems in the future, or what should I
change it to? What is a safe size for this?


Thank you in advance for all your help.







Check your garbage collection settings.

http://www.squid-cache.org/Doc/config/cache_replacement_policy/
http://www.squid-cache.org/Doc/config/cache_swap_high/
http://www.squid-cache.org/Doc/config/cache_swap_low/
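For example, a conservative sketch of those settings (the numbers are illustrative, not taken from this thread):

  # start replacing objects once the cache passes 90% full,
  # and replace more aggressively above 95%
  cache_swap_low 90
  cache_swap_high 95
  # optionally pick a replacement policy suited to the workload
  cache_replacement_policy heap LFUDA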


Amos
--
Please be using
  Current Stable Squid 2.7.STABLE5 or 3.0.STABLE10
  Current Beta Squid 3.1.0.2


Re: [squid-users] 2 squid server

2008-11-27 Thread Amos Jeffries

░▒▓ ɹɐzǝupɐɥʞ ɐzɹıɯ ▓▒░ wrote:

Hi all,
I have a problem here.

server A 192.168.222.111 squid port 2210
server B 192.168.222.100 squid port 2012

When I put this line on Server A as the first line:
cache_peer 192.168.222.100 parent 2012 0 no-query no-digest default

and I put this line on server B as the first line:
cache_peer 192.168.222.111 sibling 2012 0 no-query no-digest default

and after that:
acl manager proto cache_object
acl all src 0.0.0.0/0.0.0.0
acl localhost src 127.0.0.1
acl SSL_ports port 443 563
acl Safe_ports port 21 80 81 53 143 2443 443 563 70 210 1025-65535
acl Safe_ports port 280
acl Safe_ports port 488
acl Safe_ports port 591
acl Safe_ports port 777
acl CONNECT method CONNECT

I put:
cache_peer_access 192.168.222.111 allow all

Somehow it doesn't work;
at server B's log it says 192.168.222.111 DENIED.

My questions:
1. Why does it say denied?
2. If I have user rules on server A, will those rules still work, or
must I put the rules again on server B?

~~~ it's urgent, but not life-or-death ^^


 http://wiki.squid-cache.org/SquidFaq/ReverseProxy

note the config examples for standard reverse proxy.

there is no reason why server at 192.168.222.100 or Squid should be 
asked to supply http://192.168.222.111/  requests in the first place.
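As a side note, the DENIED entries on server B usually mean that server B's http_access rules never admit requests arriving from server A. A hedged sketch of the kind of lines server B would need (the ACL name is invented here):

  acl serverA src 192.168.222.111
  http_access allow serverA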


Amos
--
Please be using
  Current Stable Squid 2.7.STABLE5 or 3.0.STABLE10
  Current Beta Squid 3.1.0.2


Re: [squid-users] Can squid acted as a application SSL proxy

2008-11-27 Thread Amos Jeffries
李春 wrote:
> Thanks for you help.
> But I am sorry you may mistook my meaning entirely.
> I do not need the http proxy and cache functionality of squid.
> I just wander that if the squid can receive the client SSL connetion( or 
> packages)
> , decode it and tranfer the data with no SSL to the server as a transparent 
> layer.
> squid using SSL may be like this: 
> 
> http data
> 
> SSL
> 
> TCP/IP
> 
> But I wonder if the squid can act like this
> 
> my application data
> 
> SSL
> 
> TCP/IP
> 
> Thanks very much.
> yours,
> Pickup.Li
>  

You seem to misunderstand the network layering concept.

You want something that connects to clients using HTTPS (HTTP/SSL) and
connects them to your application using plain HTTP?

The name for such configuration is "reverse proxy".
  http://wiki.squid-cache.org/SquidFaq/ReverseProxy

Only the front listening port is configured with https_port instead of
http_port.

caching is optional.

The action of wrapping/unwrapping SSL requires a proxy of some type,
sometimes called a tunnel agent.

Amos



>  
> 
> 
>> Date: Thu, 27 Nov 2008 15:54:08 +0100
>> From: [EMAIL PROTECTED]
>> To: squid-users@squid-cache.org
>> Subject: Re: [squid-users] Can squid acted as a application SSL proxy
>>
>> On 27.11.08 09:45, 李春 wrote:
>>
>> Please configure your mailer to wrap lines below 80 characters per line.
>>
>>> I have a client/server application program and want to add SSL module to
>>> it to secure the data transferring on the network. I wander that if I can
>>> use the squid as a SSL proxy between client and server. The squid will
>>> configurated as a reserve proxy and located in the application server's
>>> environment. The client and squid contact with SSL connection. Just like
>>> this:
>>> <-(no SSL)-- <-(SSL)--
>>> Server Squid client
>>> --(no SSL)-> --(SSL)->
>>>
>>> I know squid can act as web proxy like this using "https_port". But I am 
>>> curious that if I can make use of squid like this.
>> Yes, that's what https_port is for. Just properly configure squid as reverse
>> proxy.
>>
>> -- 
>> Matus UHLAR - fantomas, [EMAIL PROTECTED] ; http://www.fantomas.sk/
>> Warning: I wish NOT to receive e-mail advertising to this address.
>> Varovanie: na tuto adresu chcem NEDOSTAVAT akukolvek reklamnu postu.
>> I wonder how much deeper the ocean would be without sponges. 


-- 
Please be using
  Current Stable Squid 2.7.STABLE5 or 3.0.STABLE10
  Current Beta Squid 3.1.0.2


Re: [squid-users] I need help to find my error

2008-11-27 Thread Amos Jeffries

Mariel Sebedio wrote:

Hello, I have a squid-2.6.STABLE16-2.fc8 on RHEL 5.1

I need to access this Macromedia Flash video, but my squid
configuration does not permit it.


This is the URL:
http://wireless.agilent.com/vcentral/viewvideo.aspx?vid=349


When I test the page without squid, the page opens connections to
ports 80 and 1935.


When I connect through squid, the page does not play the video, but the
access.log shows no errors for the page.


I am copying my access.log and my squid ports configuration.

Thanks a lot for your help.

Mariel



Squid does not handle non-HTTP streaming content on port 1935.

I'd put this down to a broken stream server denying your stream request 
because it comes from a different IP than the initial HTTP video 
front-end requests.


The only workaround is to selectively not use the proxy for such requests.

Amos
--
Please be using
  Current Stable Squid 2.7.STABLE5 or 3.0.STABLE10
  Current Beta Squid 3.1.0.2


Re: [squid-users] Can I Force Connections To All or Some Sites To Traverse using HTTP 1.1?

2008-11-27 Thread Amos Jeffries

Matus UHLAR - fantomas wrote:

On 26.11.08 10:57, [EMAIL PROTECTED] wrote:

Please set up your mailer to wrap lines below 80 characters per line.


I have a proxy-to-proxy setup (without ICP) and it is working wonderfully
with the exception of cases whereby IE users attempt to connect to a
remote Citrix server.  The odd thing is that the errors encountered do not
seem to happen at all when users use Firefox.

When IE initiates traffic to the Citrix site, it uses HTTP 1.0, somewhere
along the way, the Citrix site (or other Proxy, which I have no control
nor ability to see into) returns HTTP 1.1 traffic.  At this point, the 1.1
traffic arrives back at my proxy, which converts it back to the original HTTP
1.0 format before it passes the traffic back to the IE user.


Server must not return HTTP/1.1 traffic for HTTP/1.0 request. If it does,
it's broken.


So it appears that my Squid proxy tries to convert to HTTP 1.0, but only
for IE sessions as Firefox users never have these issues, also Firefox
uses 1.1 anyhow, thus not requiring any conversions.

Any thoughts?  Is there any way to force or preserve the HTTP protocol
version to 1.1 on all connections, or preferably on a destination basis?


Squid is not HTTP/1.1 server, it's only HTTP/1.0. So, it does not convert
HTTP/1.1 requests. 


Squid has many hacks to cope with broken servers, HTTP/1.1 responders 
included.


This appears to be a bug in IE, in that it does not include such hacks to
cope with a 1.1 response to its 1.0-mode requests.


Amos
--
Please be using
  Current Stable Squid 2.7.STABLE5 or 3.0.STABLE10
  Current Beta Squid 3.1.0.2


Re: [squid-users] Squid Logging

2008-11-27 Thread Amos Jeffries

Ressa wrote:

Hi,

I was wondering whether I can make squid log its activities to a
database server (such as MySQL or something), and whether there are any
tools that can provide information from that database.


Thanks


't would be a rarity. Especially as Squid does not natively support 
database logging and the daemon is still a relatively new feature.


Amos
--
Please be using
  Current Stable Squid 2.7.STABLE5 or 3.0.STABLE10
  Current Beta Squid 3.1.0.2


Re: [squid-users] Exchange ActiveSync and squid reverse proxy

2008-11-27 Thread Amos Jeffries

Koopmann, Jan-Peter wrote:

Hi,

we are using squid as reverse proxy for Outlook RPC over HTTPS without
any problems. Today some iPhone users/customers wanted to use Exchange
ActiveSync as well so I decided to "simply" allow
/Microsoft-Server-ActiveSync/* as well and hoped all is well. Far from
it. The Exchange ActiveSync account on the iPhone can be setup without
problems and verifies. Sending E-Mails is no problem. However receiving
e-mails and e-mail push does not work at all. Whenever the user tries to
receive mails I see this in cache.log:

2008/11/25 11:28:26| The request OPTIONS
https://outlook.test.de:443/Microsoft-Server-ActiveSync is ALLOWED,
because it matched 'url_allow'
2008/11/25 11:28:26| The reply for OPTIONS
https://outlook.test.de/Microsoft-Server-ActiveSync is ALLOWED, because
it matched 'all'
2008/11/25 11:28:26| Invalid chunk header '
'
2008/11/25 11:28:26| clientWriteComplete: Object aborted
2008/11/25 11:28:27| The request POST
https://outlook.test.de:443/Microsoft-Server-ActiveSync?User=testuser&De
viceId=Appl88843DYCY7H&DeviceType=iPhone&Cmd=FolderSync is ALLOWED,
because it matched 'url_allow'
2008/11/25 11:28:27| clientReadBody: start fd=12 body_size=13
in.offset=13 cb=0x8088430 req=0x888c000
2008/11/25 11:28:27| clientProcessBody: start fd=12 body_size=13
in.offset=13 cb=0x8088430 req=0x888c000
2008/11/25 11:28:27| clientProcessBody: end fd=12 size=13 body_size=0
in.offset=0 cb=0x8088430 req=0x888c000
2008/11/25 11:28:27| The reply for POST
https://outlook.test.de/Microsoft-Server-ActiveSync?User=testuser&Device
Id=Appl88843DYCY7H&DeviceType=iPhone&Cmd=FolderSync is ALLOWED, because
it matched 'all'
2008/11/25 11:28:27| Invalid chunk header '
'
2008/11/25 11:28:27| clientWriteComplete: Object aborted

This seems to be the root of the problem but how do I fix it if I can
fix it at all? Customer is running squid-2.7.4 against Exchange 2003.


Any help greatly appreciated.

Regards,
  JP


This appears to be squid not handling a chunked-encoding situation properly.
Please report this as a bug.

If you intended to have Squid simulate an HTTP/1.1 service you have
problems. Otherwise you can use the Accept-Encoding workaround
(basically, deny that header on those requests).
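A rough sketch of that workaround for Squid 2.7 (the ACL name is invented here, the URL path is the one from the logs above, and later Squid 3.x releases call the directive request_header_access):

  acl activesync urlpath_regex -i ^/Microsoft-Server-ActiveSync
  # strip Accept-Encoding from requests to the ActiveSync URLs
  header_access Accept-Encoding deny activesync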


Amos
--
Please be using
  Current Stable Squid 2.7.STABLE5 or 3.0.STABLE10
  Current Beta Squid 3.1.0.2


Re: [squid-users] Change squid binary in flight

2008-11-27 Thread Amos Jeffries

Lluis Ribes wrote:

Hi,

I've finally resolved my problem. I compiled version 3 STABLE10 with
--with-filedescriptors=8192 and --prefix=/opt/squid. After this, I shut
down Squid, ran "make install", and squid was installed in the same Debian
default location (/opt/squid/), but only the binary: my config file
didn't change :)

The last step wasn't configuring the max_filedescriptors parameter in
squid.conf, because that parameter doesn't work, I don't know why... the
last step was to write the line "ulimit -n 8192" at the beginning of the
startup script (/etc/init.d/squid).


For the record: the max_filedescriptors option is not ported to Squid-3 yet.

Amos
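In practice the workaround described above boils down to two pieces (the values and paths are the ones used in this thread):

  # at build time
  ./configure --with-filedescriptors=8192 --prefix=/opt/squid

  # near the top of /etc/init.d/squid, before squid is started
  ulimit -n 8192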



I started squid (/etc/init.d/squid) and now I'm running with 8192 file
descriptors!

Thanks all for your help!

Lluís Ribes

-Mensaje original-
De: Amos Jeffries [mailto:[EMAIL PROTECTED] 
Enviado el: viernes, 21 de noviembre de 2008 4:19

Para: Kinkie
CC: Lluis Ribes; squid-users@squid-cache.org
Asunto: Re: [squid-users] Change squid binary in flight


On Thu, Nov 20, 2008 at 7:10 PM, Lluis Ribes <[EMAIL PROTECTED]> wrote:

Dear Squid Folks,

I have a Squid 3.0Stable1 running in a server. This version was
installed
with apt-get Debian package utility. So, it has worked fine until now,
where
I have file descriptor problems.

I saw that my installation has 1024 files as max_filedescriptor, I think
not
much. I want to change it, but the parameter max_filedescriptor in
squid.conf doesn't work (I receive an error message about unknown
parameter).

Are you sure? It may be a runtime limitation; please check that
there's a 'ulimit -n 8192' line in the squid startup script (replace
8192 with your desired limit).


So, I think that the only way is to recompile with the file descriptor flag:

./configure --with-filedescriptors=8192 --prefix=/opt/squid
--with-openssl
--enable-ssl --disable-internal-dns --enable-async-io
--enable-storeio=ufs,diskd

Ok, I compiled Squid 3.0Stable10. So my question is:

Could I directly replace the Debian binary version that is running
nowadays with the binary generated by my compilation process and located
in $SQUID_SOURCE/src/squid? I have to avoid loss of service for my web.

the debian package may have different configure options; if you miss
some configuration option your configuration file may be incompatible
with your new binary.
you may want to run 'squid -v' to check that your new configure
options are compatible with the previous ones.

You may also want to keep your old binary around, to be able to roll
back in case of problems.



Indeed. There is at least one patch needed to make squid log correctly in
Debian. http://wiki.squid-cache.org/SquidFaq/CompilingSquid (Debian
section)

IIRC the packages are built with 4096 fd by default, maybe stable1 missed
out for some reason. stable8 is available for Debian, please try that
before going to a custom build.

If you must, using the exact same configure options as your packaged build
and the logs/log patch leaves "make install" placing all binaries in the
correct places for a /etc/init.d/restart to work.

Amos





--
Please be using
  Current Stable Squid 2.7.STABLE5 or 3.0.STABLE10
  Current Beta Squid 3.1.0.2


RE: [squid-users] Can squid acted as a application SSL proxy

2008-11-27 Thread 李春

Thanks for your help.
But I am sorry, you may have mistaken my meaning entirely.
I do not need the HTTP proxy and cache functionality of squid.
I just wonder whether squid can receive the client's SSL connection (or
packets), decrypt it, and transfer the data without SSL to the server as
a transparent layer.
squid using SSL may be like this: 

http data

SSL

TCP/IP

But I wonder if the squid can act like this

my application data

SSL

TCP/IP

Thanks very much.
yours,
Pickup.Li
 
 


> Date: Thu, 27 Nov 2008 15:54:08 +0100
> From: [EMAIL PROTECTED]
> To: squid-users@squid-cache.org
> Subject: Re: [squid-users] Can squid acted as a application SSL proxy
> 
> On 27.11.08 09:45, 李春 wrote:
> 
> Please configure your mailer to wrap lines below 80 characters per line.
> 
>> I have a client/server application program and want to add SSL module to
>> it to secure the data transferring on the network. I wander that if I can
>> use the squid as a SSL proxy between client and server. The squid will
>> configurated as a reserve proxy and located in the application server's
>> environment. The client and squid contact with SSL connection. Just like
>> this:
>> <-(no SSL)-- <-(SSL)--
>> Server Squid client
>> --(no SSL)-> --(SSL)->
>> 
>> I know squid can act as web proxy like this using "https_port". But I am 
>> curious that if I can make use of squid like this.
> 
> Yes, that's what https_port is for. Just properly configure squid as reverse
> proxy.
> 
> -- 
> Matus UHLAR - fantomas, [EMAIL PROTECTED] ; http://www.fantomas.sk/
> Warning: I wish NOT to receive e-mail advertising to this address.
> Varovanie: na tuto adresu chcem NEDOSTAVAT akukolvek reklamnu postu.
> I wonder how much deeper the ocean would be without sponges. 


Re: [squid-users] Disk space over limit Warning

2008-11-27 Thread Wilson Hernandez - MSD, S. A.

Yes. I did run squid -z and it created all the directories.

Paul Bertain wrote:

Hi Wilson,

Did you run "squid -z" after changing your settings?  For themto take 
effect, I believe you need to run "squid -a" again.


Paul



On Nov 28, 2008, at 15:53, "Wilson Hernandez - MSD, S. A." 
<[EMAIL PROTECTED]> wrote:



Hello;

I currently have a network with about 30 users and my swap space tends 
to fill up quite quickly. I increased the swap three weeks ago from:


#cache_dir ufs /var/log/squid/cache 5000 16 256 to
cache_dir ufs /var/log/squid/cache 1 255 255

Now, I'm getting the same warning:

2008/11/27 13:59:00| WARNING: Disk space over limit: 10241036 KB > 
1024 KB


If I leave it as is, will I have problems in the future, or what should I
change it to? What is a safe size for this?


Thank you in advance for all your help.





--
*Wilson Hernandez*
Presidente
829.848.9595
809.766.0441
www.msdrd.com 
Conservando el medio ambiente


Re: [squid-users] Disk space over limit Warning

2008-11-27 Thread Paul Bertain

Hi Wilson,

Did you run "squid -z" after changing your settings?  For themto take  
effect, I believe you need to run "squid -a" again.


Paul



On Nov 28, 2008, at 15:53, "Wilson Hernandez - MSD, S. A."  
<[EMAIL PROTECTED]> wrote:



Hello;

I currently have a network with about 30 users and my swap space  
tends to fill up quite quickly. I increased the swap three weeks ago  
from:


#cache_dir ufs /var/log/squid/cache 5000 16 256 to
cache_dir ufs /var/log/squid/cache 1 255 255

Now, I'm getting the same warning:

2008/11/27 13:59:00| WARNING: Disk space over limit: 10241036 KB >  
1024 KB


If I leave it as is, will I have problems in the future, or what should I
change it to? What is a safe size for this?


Thank you in advance for all your help.


[squid-users] Disk space over limit Warning

2008-11-27 Thread Wilson Hernandez - MSD, S. A.

Hello;

I currently have a network with about 30 users and my swap space tends 
to fill up quite quickly. I increased the swap three weeks ago from:


#cache_dir ufs /var/log/squid/cache 5000 16 256 to
cache_dir ufs /var/log/squid/cache 1 255 255

Now, I'm getting the same warning:

2008/11/27 13:59:00| WARNING: Disk space over limit: 10241036 KB > 
1024 KB


If I leave it as is, will I have problems in the future, or what should I
change it to? What is a safe size for this?


Thank you in advance for all your help.


Re: [squid-users] tuning an overloaded server

2008-11-27 Thread Amos Jeffries

B. Cook wrote:


On Nov 22, 2008, at 7:30 AM, Amos Jeffries wrote:

8< -- snip -- >8





That said BSD family of systems get more out of diskd than aufs in 
current Squid.





--
Please be using
 Current Stable Squid 2.7.STABLE5 or 3.0.STABLE10
 Current Beta Squid 3.1.0.2


Hello,

Sorry to bother..

so even in any FreeBSD (6.3, 7.0, etc..) diskd is still better than aufs?

and if so,

http://wiki.squid-cache.org/Features/DiskDaemon

this page talks about 2.4

and I can't seem to find an aufs page.. I can find coss, but coss has 
been removed from 3.0..


so again, diskd should be what FreeBSD users use?  As well as the kernel 
additions?  Even on 6.3 and 7.0 machines amd64 and i386 alike?


Yes. We have some circumstantial info that leads us to believe it's probably
a bug in the way Squid uses AUFS and the underlying implementation
differences between FreeBSD and Linux. We have not yet had anyone investigate
deeply and correct the issue. So it's still there in all Squid releases.





Thanks in advance..

(I would think a wiki page on an OS would be very useful.. common 
configs for linux 2.x and BSD, etc.. )


Many people are not as versed in squid as the developers, and giving 
them guidelines to follow would probably make it easier for them to 
use.. imho.


They don't understand coss vs aufs vs diskd vs ufs.. ;)


We are trying to get there :). It's hard for just a few people and 
non-experts in many areas at that. So if anyone has good knowledge of 
how AUFS works jump in with a feature page analysis.


What we have so far in the way of config help is explained at 
http://wiki.squid-cache.org/SquidFaq/ConfiguringSquid#head-ad11ea76c4876a92aa1cf8fb395e7efd3e1993d5


Amos
--
Please be using
  Current Stable Squid 2.7.STABLE5 or 3.0.STABLE10
  Current Beta Squid 3.1.0.2


Re: [squid-users] ICAP help

2008-11-27 Thread Christos Tsantilas

Hi Remy,

 OK, so squid uses the ICAP server and the squid part of your
configuration is probably OK.


Please look at both the squid logs and the icap server logs for error messages.
There should be something in the logs that explains the reason for the error.


Also look at your c-icap configuration. For example, does the c-icap
server have write access to all the directories it is trying to write to? The
/var/tmp and /tmp/download/ directories in your case. (Also, c-icap has
its own mailing list; you should probably ask there.)
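A quick way to check, and if necessary grant, that write access from a shell (the "c-icap" account name is a guess; use whatever User is set in your c-icap.conf):

  # see who owns the directories and whether they are writable
  ls -ld /var/tmp /tmp/download
  # hand the download area to the c-icap runtime user (name assumed)
  chown c-icap:c-icap /tmp/download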


--
  Christos

[EMAIL PROTECTED] wrote:

Thanks Christos,

after purging it form squid cache it work fine able to scan.
But now another problem when I try to download a zip virus file
http://www.eicar.org/download/eicar_com.zip

ERROR in the browser
The following error was encountered while trying to retrieve the URL:
http://www.eicar.org/download/eicar_com.zip

ICAP protocol error.

The system returned: [No Error]

This means that some aspect of the ICAP communication failed.

Some possible problems are:

*

  The ICAP server is not reachable.
*

  An Illegal response was received from the ICAP server.


//Remy




[squid-users] Re: squid_ldap_auth and passwords in clear text

2008-11-27 Thread Markus Moeller
You might try squid_kerb_auth which uses Negotiate/Kerberos instead of NTLM 
or Negotiate/NTLM.


Markus
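A hedged squid.conf sketch of what a squid_kerb_auth setup typically looks like (the helper path, service principal, child count and ACL name are placeholders, and the helper's exact command-line options should be checked against its own documentation):

  auth_param negotiate program /usr/lib/squid/squid_kerb_auth -s HTTP/proxy.example.com
  auth_param negotiate children 10
  auth_param negotiate keep_alive on
  # require a successful Negotiate/Kerberos login
  acl kerb_users proxy_auth REQUIRED
  http_access allow kerb_users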

"Matias Chris" <[EMAIL PROTECTED]> wrote in message 
news:[EMAIL PROTECTED]

Henrik,

I have tried LDAP authentication in the past and stopped using it because
of the passwords being sent in clear text. I read about TLS, but then I
would need my DC to be a CA, and that is not feasible at the moment. So
I'm testing NTLMSSP now, but it is not being very stable, and I also read
that it is not recommended for networks with more than 200 users.

Is this the end of the road? Is there any other method I'm missing to
authenticate users against AD? Transparently?

Thanks,

On Tue, Nov 18, 2008 at 6:59 AM, Henrik Nordstrom
<[EMAIL PROTECTED]> wrote:

On fre, 2008-11-14 at 10:31 -0600, Johnson, S wrote:


I just got the squid_ldap_auth working ok on my segment but when
watching the protocol analyzer I see that the auth requests against the
AD are coming in as clear text passwords.  Is there anyway we can
encrypt the ldap domain requests?


By AD do you refer to Microsoft AD? In such case use NTLM authentication
instead of LDAP.

You can also TLS encrypt the LDAP communication, but this does not
protect the credentials sent by browsers to Squid, just the
communication squid->LDAP.

Regards
Henrik










Re: [squid-users] ICAP help

2008-11-27 Thread malmeida
Thanks Christos,

After purging it from the squid cache it works fine and is able to scan.
But now there is another problem when I try to download a zipped virus file:
http://www.eicar.org/download/eicar_com.zip

ERROR in the browser
The following error was encountered while trying to retrieve the URL:
http://www.eicar.org/download/eicar_com.zip

ICAP protocol error.

The system returned: [No Error]

This means that some aspect of the ICAP communication failed.

Some possible problems are:

*

  The ICAP server is not reachable.
*

  An Illegal response was received from the ICAP server.


//Remy


On Thu, 27 Nov 2008 21:46:15 +0200, Christos Tsantilas
<[EMAIL PROTECTED]> wrote:
> OK this is when your are using the icap-client.What about when you are 
> using squid3?
> 
> - Are you seeing any log entries in c-icap log files? Just to see if 
> squid contacts the icap server...
> 
>   - Do you see any error message in squid3 cache.log file? Maybe for a 
> reason squid can not access the icap server.
> 
>   - What are you seeing in your web browser? How are you testing your 
> configuration? If you are just trying to download the eicar.com file it 
> is probably stored in your squid cache or your web broswer cache before 
> you install the icap server. You need to remove it from your cache. Look 
> in FAQ for info: 
>
http://wiki.squid-cache.org/SquidFaq/OperatingSquid#head-f418956943bd72ee8b94390ec9df241c3d1dfd20
> Also be sure that you had delete any web browser cache before the test.
> 
> Regards,
>   Christos
> 
> 
> [EMAIL PROTECTED] wrote:
>> Test sample output
>> 
>> 
>> /usr/local/c_icap/bin# /usr/local/c_icap/bin/icap-client -f
>> /home/remy/Desktop/eicar.com.txt  -s
>> "srv_clamav?allow204=on&force=on&sizelimit=off&mode=simple"
>> ICAP server:localhost, ip:127.0.0.1, port:1344
>> 
>> 
>> 
>> 
>> 
>> 
>> VIRUS FOUND
>> 
>> You try to upload/download a file that contain the virus
>> Eicar-Test-Signature
>> This message generated by C-ICAP/060708rc1 srvClamAV/antivirus module
>>  
>> 
>> 
>> #for sample virus file test access log file of c-icap
>> tail -f /usr/local/c_icap/var/log/access.log
>> Thu Nov 27 23:09:48 2008, 127.0.0.1, 127.0.0.1, OPTIONS,
>> srv_clamav?allow204=on&force=on&sizelimit=off&mode=simple, OK
>> Thu Nov 27 23:09:48 2008, 127.0.0.1, 127.0.0.1, RESPMOD,
>> srv_clamav?allow204=on&force=on&sizelimit=off&mode=simple, OK
>> 
>> #for sample virus file test access log file of c-icap
>> tail -f /usr/local/c_icap/var/log/server.log 
>> Thu Nov 27 23:09:48 2008, general, VIRUS DETECTED:Eicar-Test-Signature.
>> Take action...
>> 
>> //Remy
>> 
>> 
>> On Thu, 27 Nov 2008 19:50:16 +0200, Christos Tsantilas
>> <[EMAIL PROTECTED]> wrote:
>>> [EMAIL PROTECTED] wrote:
 Hi Christos,

 I think I have not made my self clear

 first of all I don't have icap_class and icap_access in my squid.conf
>>> file
 since you said
>>> Your configuration should also contain something like the
> following:
>>>
>>>icap_class class_avi  service_avi
>>>icap_access class_avi allow all
 I did those changes as per you and got that message

 my problem is I have enabled icap support but some how its not work
> (not
 able to scan)
 if is use the icap-client command to test it work fine

 where is my mistake?
>>> Do you see  error messages in your squid3 server.log file?
>>> Are there any entries in  c-icap's access.log file?
>>> How are you testing it?
>>>
 //Remy



Re: [squid-users] ICAP help

2008-11-27 Thread Christos Tsantilas
OK, this is when you are using icap-client. What about when you are
using squid3?


- Are you seeing any log entries in c-icap log files? Just to see if 
squid contacts the icap server...


 - Do you see any error message in squid3 cache.log file? Maybe for a 
reason squid can not access the icap server.


 - What are you seeing in your web browser? How are you testing your
configuration? If you are just trying to download the eicar.com file, it
is probably stored in your squid cache or your web browser cache from before
you installed the icap server. You need to remove it from your cache. Look
in the FAQ for info:
http://wiki.squid-cache.org/SquidFaq/OperatingSquid#head-f418956943bd72ee8b94390ec9df241c3d1dfd20

Also be sure that you have deleted any web browser caches before the test.

Regards,
 Christos


[EMAIL PROTECTED] wrote:

Test sample output


/usr/local/c_icap/bin# /usr/local/c_icap/bin/icap-client -f
/home/remy/Desktop/eicar.com.txt  -s
"srv_clamav?allow204=on&force=on&sizelimit=off&mode=simple"
ICAP server:localhost, ip:127.0.0.1, port:1344






VIRUS FOUND

You try to upload/download a file that contain the virus
Eicar-Test-Signature
This message generated by C-ICAP/060708rc1 srvClamAV/antivirus module
 


#for sample virus file test access log file of c-icap
tail -f /usr/local/c_icap/var/log/access.log
Thu Nov 27 23:09:48 2008, 127.0.0.1, 127.0.0.1, OPTIONS,
srv_clamav?allow204=on&force=on&sizelimit=off&mode=simple, OK
Thu Nov 27 23:09:48 2008, 127.0.0.1, 127.0.0.1, RESPMOD,
srv_clamav?allow204=on&force=on&sizelimit=off&mode=simple, OK

#for sample virus file test access log file of c-icap
tail -f /usr/local/c_icap/var/log/server.log 
Thu Nov 27 23:09:48 2008, general, VIRUS DETECTED:Eicar-Test-Signature.

Take action...

//Remy


On Thu, 27 Nov 2008 19:50:16 +0200, Christos Tsantilas
<[EMAIL PROTECTED]> wrote:

[EMAIL PROTECTED] wrote:

Hi Christos,

I think I have not made my self clear

first of all I don't have icap_class and icap_access in my squid.conf

file

since you said

Your configuration should also contain something like the following:

   icap_class class_avi  service_avi
   icap_access class_avi allow all

I did those changes as per you and got that message

my problem is I have enabled icap support but some how its not work (not
able to scan)
if is use the icap-client command to test it work fine

where is my mistake?

Do you see  error messages in your squid3 server.log file?
Are there any entries in  c-icap's access.log file?
How are you testing it?


//Remy


Re: [squid-users] ICAP help

2008-11-27 Thread malmeida
Test sample output


/usr/local/c_icap/bin# /usr/local/c_icap/bin/icap-client -f
/home/remy/Desktop/eicar.com.txt  -s
"srv_clamav?allow204=on&force=on&sizelimit=off&mode=simple"
ICAP server:localhost, ip:127.0.0.1, port:1344






VIRUS FOUND

You try to upload/download a file that contain the virus
Eicar-Test-Signature
This message generated by C-ICAP/060708rc1 srvClamAV/antivirus module
 


#for sample virus file test access log file of c-icap
tail -f /usr/local/c_icap/var/log/access.log
Thu Nov 27 23:09:48 2008, 127.0.0.1, 127.0.0.1, OPTIONS,
srv_clamav?allow204=on&force=on&sizelimit=off&mode=simple, OK
Thu Nov 27 23:09:48 2008, 127.0.0.1, 127.0.0.1, RESPMOD,
srv_clamav?allow204=on&force=on&sizelimit=off&mode=simple, OK

#for sample virus file test access log file of c-icap
tail -f /usr/local/c_icap/var/log/server.log 
Thu Nov 27 23:09:48 2008, general, VIRUS DETECTED:Eicar-Test-Signature.
Take action...

//Remy


On Thu, 27 Nov 2008 19:50:16 +0200, Christos Tsantilas
<[EMAIL PROTECTED]> wrote:
> [EMAIL PROTECTED] wrote:
>> Hi Christos,
>> 
>> I think I have not made my self clear
>> 
>> first of all I don't have icap_class and icap_access in my squid.conf
> file
>> since you said
> Your configuration should also contain something like the following:
>
>icap_class class_avi  service_avi
>icap_access class_avi allow all
>> I did those changes as per you and got that message
>> 
>> my problem is I have enabled icap support but some how its not work (not
>> able to scan)
>> if is use the icap-client command to test it work fine
>> 
>> where is my mistake?
> 
> Do you see  error messages in your squid3 server.log file?
> Are there any entries in  c-icap's access.log file?
> How are you testing it?
> 
>> 
>> //Remy
>> 



Re: [squid-users] ICAP help

2008-11-27 Thread Christos Tsantilas

[EMAIL PROTECTED] wrote:

Hi Christos,

I think I have not made my self clear

first of all I don't have icap_class and icap_access in my squid.conf file
since you said

Your configuration should also contain something like the following:

   icap_class class_avi  service_avi
   icap_access class_avi allow all

I did those changes as per you and got that message

my problem is I have enabled icap support but some how its not work (not
able to scan)
if is use the icap-client command to test it work fine

where is my mistake?


Do you see  error messages in your squid3 server.log file?
Are there any entries in  c-icap's access.log file?
How are you testing it?



//Remy





Re: [squid-users] squid_ldap_auth and passwords in clear text

2008-11-27 Thread Matias Chris
Henrik,

I have tried LDAP authentication in the past and stopped using it because
of the passwords being sent in clear text. I read about TLS, but then I
would need my DC to be a CA, and that is not feasible at the moment. So
I'm testing NTLMSSP now, but it is not being very stable, and I also read
that it is not recommended for networks with more than 200 users.

Is this the end of the road? Is there any other method I'm missing to
authenticate users against AD? Transparently?

Thanks,

On Tue, Nov 18, 2008 at 6:59 AM, Henrik Nordstrom
<[EMAIL PROTECTED]> wrote:
> On fre, 2008-11-14 at 10:31 -0600, Johnson, S wrote:
>
>> I just got the squid_ldap_auth working ok on my segment but when
>> watching the protocol analyzer I see that the auth requests against the
>> AD are coming in as clear text passwords.  Is there anyway we can
>> encrypt the ldap domain requests?
>
> By AD do you refer to Microsoft AD? In such case use NTLM authentication
> instead of LDAP.
>
> You can also TLS encrypt the LDAP communication, but this does not
> protect the credentials sent by browsers to Squid, just the
> communication squid->LDAP.
>
> Regards
> Henrik
>
>
>


Re: [squid-users] NTLM Auth and not authenticated pages

2008-11-27 Thread Matias Chris
Chris,
Thanks, that pretty much cleared my doubt.
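For what it's worth, the "refrain from logging 407 responses" idea quoted below could look roughly like this. This is an untested sketch: both directives exist, but whether a reply-status ACL can be used for the logging decision depends on the Squid version.

  # hypothetical: skip logging of NTLM auth challenges
  acl auth_challenge http_status 407
  log_access deny auth_challenge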



On Wed, Nov 26, 2008 at 6:33 PM, Chris Robertson <[EMAIL PROTECTED]> wrote:
> Matias Chris wrote:
>>
>> Hello All,
>>
>> I'm currently in the process of changing the way we authenticate users
>> from LDAP to NTLMSSP. Now we are in the test phase, and while NTLM auth is
>> working fine and allowing all users that are already logged in to the AD
>> domain to access the web without being asked for their credentials, I'm
>> seeing a lot of denied attempts in the log.
>> It's like for every page visited I now have two log entries: one is
>> denied, and the other one is allowed.
>>
>
> That's due to the design of NTLM.  See
> http://devel.squid-cache.org/ntlm/client_proxy_protocol.html
>
>> Is there any way to tweak squid to avoid doing this? The AD DC is on the
>> same physical LAN.
>>
>
> I suppose you could refrain from logging 407 responses...
>
>> 1227614260.463  0 127.0.0.1 TCP_DENIED/407 2083 POST
>> http://mail.google.com/a/matiaschris.com.ar/channel/bind? - NONE/-
>> text/html
>> 1227614261.218188 127.0.0.1 TCP_MISS/200 351 POST
>> http://mail.google.com/a/matiaschris.com.ar/channel/bind? mchrist
>> DIRECT/66.102.9.18 text/html
>>
>> Any help will be much appreciated. Thanks.
>>
>
> Chris
>


Re: [squid-users] Question about Squid 3 reverse proxy and SSL

2008-11-27 Thread Tom Williams

Matus UHLAR - fantomas wrote:

On 26.11.08 17:58, Tom Williams wrote:
  

Ok, I'm adding SSL support to my Squid 3 reverse proxy configuration.

Here are the configuration directives:

http_port 8085 accel defaultsite=www.mydomain.com vhost
https_port 4433 accel cert=/etc/ssl/cert/www_mydomain_com.crt 
key=/etc/ssl/private/private.key  defaultsite=www.mydomain.com vhost
cache_peer 192.168.1.7 parent 80 0 no-query originserver login=PASS 
name=web2Accel
cache_peer 192.168.1.7 parent 443 0 no-query originserver ssl login=PASS 
name=web2SSLAccel


Here is the error I get when I try to connect:

clientNegotiateSSL: Error negotiating SSL connection on FD 13: 
error:1407609C:SSL routines:SSL23_GET_CLIENT_HELLO:http request (1/-1)


What does this error mean?



Someone apparently used HTTP on a port you have configured to be HTTPS.

Btw, why are you using ports 8085 and 4433 for the reverse proxy?
A reverse proxy should listen on 80/443 and forward requests to the real
server on a different IP/port.
  
Ah.  Now that you mention that, I believe I made that mistake myself.  I 
probably used http://blah:4433/ instead of httpS://blah:4433/.  I really 
need to get some sleep.   :(


As for the strange ports, it's because I'm currently doing testing.  
Once everything has been worked out, we will switch Squid over to using 
ports 80/443 for HTTP and HTTPS traffic.  :)


Thanks!

Peace...

Tom


Re: [squid-users] ICAP help

2008-11-27 Thread malmeida
Hi Christos,

I think I have not made myself clear.

first of all I don't have icap_class and icap_access in my squid.conf file
since you said
>>> Your configuration should also contain something like the following:
>>>
>>>icap_class class_avi  service_avi
>>>icap_access class_avi allow all
I made those changes as you suggested and got that message.

My problem is that I have enabled ICAP support but somehow it is not working
(not able to scan).
If I use the icap-client command to test, it works fine.

Where is my mistake?

//Remy
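For reference, pulling the directives discussed in this thread into one sketch (the service name and ICAP URL are the ones from the quoted squid.conf below; the adaptation_* lines are the 3.1 form suggested in the quoted reply):

  icap_enable on
  icap_preview_enable on
  icap_preview_size 128
  icap_send_client_ip on
  # RESPMOD service pointing at the local c-icap ClamAV module
  icap_service service_avi respmod_precache 1 icap://localhost:1344/srv_clamav
  adaptation_service_set class_avi service_avi
  adaptation_access class_avi allow all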

On Thu, 27 Nov 2008 08:59:18 -0500 (EST), "Christos Tsantilas"
<[EMAIL PROTECTED]> wrote:
>> Hi Christos,
>>
>> I used icap_class and icap_access  but I get this
>>
>> 2008/11/27 17:07:44| Processing Configuration
>> File: /etc/squid/squid.conf (depth 0)
>> 2008/11/27 17:07:44| WARNING: 'icap_class' is depricated. Use
>> 'adaptation_service_set' instead
>> 2008/11/27 17:07:44| WARNING: 'icap_access' is depricated. Use
>> 'adaptation_access' instead
>> 2008/11/27 17:07:44| Initializing https proxy context
> 
> You are  using squid 3.1.x .
> Just replace the icap_class and icap_access lines with the following:
> 
> adaptation_service_set  class_avi  service_avi
> adaptation_access  class_avi allow all
> 
> The icap_class and icap_access are deprecated but should work too.
> 
> --
>Christos
> 
>>
>> //Remy
>>
>> On Thu, 2008-11-27 at 07:53 -0500, Christos Tsantilas wrote:
>>> > Hi All,
>>> >
>>> > Need help on how to configure c-icap to scan http,https and ftp
>>> request
>>> >
>>> > Sample virus to test
>>> > http://www.eicar.org/download/eicar.com
>>> >
>>> > my configuration is as below
>>> > to test my setup I used the above link but it was not scanned for
>>> virus
>>> > and I was able to downloaded it nothing is working
>>> > what am i missing?
>>> > can someone help me in this?
>>> >
>>> > #squid.conf
>>> > 
>>> > icap_enable on
>>> > icap_preview_enable on
>>> > icap_preview_size 128
>>> > icap_send_client_ip on
>>> > icap_service service_avi_req reqmod_precache 0
>>> > icap://localhost:1344/srv_clamav
>>> > icap_service service_avi respmod_precache 1
>>> > icap://localhost:1344/srv_clamav
>>> >
>>>
>>> You need to define an icap_class and define access list for this
>>> icap_class
>>> Why do you need virus scan for http requests?
>>> Your configuration should also contain something like the following:
>>>
>>>icap_class class_avi  service_avi
>>>icap_access class_avi allow all
>>>
>>> Regards,
>>>Christos
>>>
>>
>>



Re: [squid-users] improve flow capacity for Squid

2008-11-27 Thread Adrian Chadd
Is that per-flow, or in total?



Adrian

2008/11/24 Ken DBA <[EMAIL PROTECTED]>:
> Hello,
>
> I was just finding the flow capacity for Squid is too limited.
> It's even hard to reach an upper limit of 150 MBits.
>
> How can I improve the flow capacity for Squid in the reverse-proxy mode?
> Thanks in advance.
>
> Ken
>
>
>
>
>


Re: [squid-users] Can I Force Connections To All or Some Sites To Traverse using HTTP 1.1?

2008-11-27 Thread Matus UHLAR - fantomas
On 26.11.08 10:57, [EMAIL PROTECTED] wrote:

Please set up your mailer to wrap lines below 80 characters per line.

> I have a proxy-to-proxy setup (without ICP) and it is working wonderfully
> with the exception of cases whereby IE users attempt to connect to a
> remote Citrix server.  The odd thing is that the errors encountered do not
> seem to happen at all when users use Firefox.
> 
> When IE initiates traffic to the Citrix site, it uses HTTP 1.0, somewhere
> along the way, the Citrix site (or other Proxy, which I have no control
> nor ability to see into) returns HTTP 1.1 traffic.  At this point, the 1.1
> traffic arrives back at my proxy, which converts it back to the original HTTP
> 1.0 format before it passes the traffic back to the IE user.

Server must not return HTTP/1.1 traffic for HTTP/1.0 request. If it does,
it's broken.

> So it appears that my Squid proxy tries to convert to HTTP 1.0, but only
> for IE sessions as Firefox users never have these issues, also Firefox
> uses 1.1 anyhow, thus not requiring any conversions.
> 
> Any thoughts?  Is there any way to force or preserve the HTTP protocol
> version to 1.1 on all connections, or preferably on a destination basis?

Squid is not HTTP/1.1 server, it's only HTTP/1.0. So, it does not convert
HTTP/1.1 requests. 
-- 
Matus UHLAR - fantomas, [EMAIL PROTECTED] ; http://www.fantomas.sk/
Warning: I wish NOT to receive e-mail advertising to this address.
Varovanie: na tuto adresu chcem NEDOSTAVAT akukolvek reklamnu postu.
Remember half the people you know are below average. 


Re: [squid-users] Can squid acted as a application SSL proxy

2008-11-27 Thread Matus UHLAR - fantomas
On 27.11.08 09:45, 李春 wrote:

Please configure your mailer to wrap lines below 80 characters per line.

> I have a client/server application program and want to add an SSL module to
> it to secure the data transferred on the network. I wonder whether I can
> use squid as an SSL proxy between client and server. Squid would be
> configured as a reverse proxy and located in the application server's
> environment. The client and squid communicate over an SSL connection,
> like this:
>
> Server <--(no SSL)--> Squid <--(SSL)--> client
>
> I know squid can act as a web proxy like this using "https_port", but I am
> curious whether I can make use of squid like this.

Yes, that's what https_port is for. Just properly configure squid as reverse
proxy.

-- 
Matus UHLAR - fantomas, [EMAIL PROTECTED] ; http://www.fantomas.sk/
Warning: I wish NOT to receive e-mail advertising to this address.
Varovanie: na tuto adresu chcem NEDOSTAVAT akukolvek reklamnu postu.
I wonder how much deeper the ocean would be without sponges. 


Re: [squid-users] squid and http 1.0 VS. http 1.1

2008-11-27 Thread Matus UHLAR - fantomas
On 27.11.08 15:02, Joar Jegleim wrote:
> I've been debugging a problem with a SOAP app (Cognos Planning) which
> breaks when being run through our squid 3.0 proxy.

> After tcpdumping the whole session and investigating with wireshark it
> seems to me that the following happens
> 1. client performs a 'GET' in HTTP 1.1 to the proxy
> 2. the proxy then performs this GET against the app server, but now it's
>  in HTTP 1.0

Yes, because squid only supports HTTP/1.0

> 3. the app server replies in HTTP 1.1 which in turn squid

The application is broken, because it must not answer in HTTP/1.1 for
HTTP/1.0 request

> 2.: I thought by configuring squid to 'always_direct' sessions to the
> app server that squid is transparent in between the client and the app
> server. As of now it seems to me that, even with bypassing squid, squid
> fiddles with the HTTP version in the GET's being performed resulting in
>  the application breaking. E.G. to make this work the application must
> be rewritten to support giving 'content length' in those GET's where
> squid gives a 411

You aren't bypassing squid with always_direct. always_direct is a squid
directive telling it not to use any parent proxies; squid is still processing
the request.

-- 
Matus UHLAR - fantomas, [EMAIL PROTECTED] ; http://www.fantomas.sk/
Warning: I wish NOT to receive e-mail advertising to this address.
Varovanie: na tuto adresu chcem NEDOSTAVAT akukolvek reklamnu postu.
You have the right to remain silent. Anything you say will be misquoted,
then used against you. 


Re: [squid-users] squid reverse-proxy for videos

2008-11-27 Thread Matus UHLAR - fantomas
On 27.11.08 14:13, Ken DBA wrote:
> We have some web servers for videos playing (the FLV format,like youtube).
> Could we deploy squid to act as a reverse-proxy for this application?
> What's the recommend configure for squid? Thanks.

Configure it as a standard reverse proxy. Avoid any techniques that
disable proxying on your web server, like changing the address, URL, etc.
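A minimal sketch of such a setup (the hostname, origin address and sizes are placeholders):

  http_port 80 accel defaultsite=videos.example.com
  cache_peer 192.0.2.20 parent 80 0 no-query originserver name=videoorigin
  cache_peer_access videoorigin allow all
  # let squid cache large FLV objects
  maximum_object_size 512 MB
  cache_dir aufs /cache/squid 100000 64 256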
-- 
Matus UHLAR - fantomas, [EMAIL PROTECTED] ; http://www.fantomas.sk/
Warning: I wish NOT to receive e-mail advertising to this address.
Varovanie: na tuto adresu chcem NEDOSTAVAT akukolvek reklamnu postu.
I feel like I'm diagonally parked in a parallel universe. 


RE: [squid-users] TCP_MISS and TCP_DENIED

2008-11-27 Thread Alex Huxham
This is basic functionality of the NTLM negotiation mechanism.
Ignore it; all logs for setups that use NTLM authentication show two denied
requests up front, then the connect.

HTH

-Original Message-
From: Tom Porch [mailto:[EMAIL PROTECTED] 
Sent: 27 November 2008 14:24
To: squid-users@squid-cache.org
Subject: [squid-users] TCP_MISS and TCP_DENIED

Hi all

I've got 2.7 on a Windows box and have configured it for NTLM
authentication so I get the username logged.
However I get TCP_MISS and TCP_DENIED logged even though access is
allowed to the web sites requested.

Is there a quick fix to get it correctly logging the requests?

Thanks
Tom


Re: [squid-users] Question about Squid 3 reverse proxy and SSL

2008-11-27 Thread Matus UHLAR - fantomas
On 26.11.08 17:58, Tom Williams wrote:
> Ok, I'm adding SSL support to my Squid 3 reverse proxy configuration.
> 
> Here are the configuration directives:
> 
> http_port 8085 accel defaultsite=www.mydomain.com vhost
> https_port 4433 accel cert=/etc/ssl/cert/www_mydomain_com.crt 
> key=/etc/ssl/private/private.key  defaultsite=www.mydomain.com vhost
> cache_peer 192.168.1.7 parent 80 0 no-query originserver login=PASS 
> name=web2Accel
> cache_peer 192.168.1.7 parent 443 0 no-query originserver ssl login=PASS 
> name=web2SSLAccel
> 
> Here is the error I get when I try to connect:
> 
> clientNegotiateSSL: Error negotiating SSL connection on FD 13: 
> error:1407609C:SSL routines:SSL23_GET_CLIENT_HELLO:http request (1/-1)
> 
> What does this error mean?

Someone apparently used plain HTTP on a port you have configured for HTTPS.

Btw, why are you using ports 8085 and 4433 for the reverse proxy?
A reverse proxy should listen on 80/443 and forward requests to the real
server on a different IP/port.
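
A sketch of the same setup on the standard ports, reusing the cert paths,
hostname and peer IP from the config you posted (treat it as an example, not a
drop-in config):

http_port 80 accel defaultsite=www.mydomain.com vhost
https_port 443 accel cert=/etc/ssl/cert/www_mydomain_com.crt key=/etc/ssl/private/private.key defaultsite=www.mydomain.com vhost
cache_peer 192.168.1.7 parent 80 0 no-query originserver login=PASS name=web2Accel
cache_peer 192.168.1.7 parent 443 0 no-query originserver ssl login=PASS name=web2SSLAccel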
-- 
Matus UHLAR - fantomas, [EMAIL PROTECTED] ; http://www.fantomas.sk/
Warning: I wish NOT to receive e-mail advertising to this address.
Varovanie: na tuto adresu chcem NEDOSTAVAT akukolvek reklamnu postu.
99 percent of lawyers give the rest a bad name. 


[squid-users] TCP_MISS and TCP_DENIED

2008-11-27 Thread Tom Porch
Hi all

I've got 2.7 on a Windows box and have configured it for NTLM authentication so 
I get the username logged.
However I get TCP_MISS and TCP_DENIED logged even though access is allowed to 
the web sites requested.

Is there a quick fix to get it correctly logging the requests?

Thanks
Tom


[squid-users] squid and http 1.0 VS. http 1.1

2008-11-27 Thread Joar Jegleim

Hi,

I've been debugging a problem with a soap app (cognos planning) which
breaks when run through our squid 3.0 proxy.

From what I've gathered so far, according to one of the developers of
the cognos application: "We use a lot of xml/soap communication which
is chunked"

And according to
http://www.nabble.com/POST-PUT-request-Content-Length-td17497369.html#a17501199

http 1.0 doesn't support chunked transfer-encoding


My access.log shows
NONE/411 4900
when the error occurs.

After tcpdumping the whole session and investigating with wireshark, it
seems to me that the following happens:
1. the client performs a 'GET' in HTTP 1.1 to the proxy
2. the proxy then performs this GET against the app server, but now it's
 in HTTP 1.0
3. the app server replies in HTTP 1.1
4. squid in turn replies in HTTP 1.0 to the client.

This leads me to the following theory: since chunking is incompatible with
HTTP 1.0, that would explain why things are breaking.

This seems to happen even when I'm configuring squid to bypass requests
to the app server:
acl bypassquid dstdomain "/usr/local/squid/etc/domainlist.bypass.squid"
always_direct allow bypassquid

(the relevant app server is listed in 'domainlist.bypass.squid')

2 questions:
1.: does anybody know if I'm totally off target about those HTTP 1.0 / 1.1
GETs being performed between client / squid / app server, and whether this
may or may not be the reason things break?

2.: I thought that by configuring squid to 'always_direct' sessions to the
app server, squid would be transparent between the client and the app
server. As of now it seems to me that, even with this bypass, squid fiddles
with the HTTP version in the GETs being performed, resulting in the
application breaking. E.g. to make this work the application would have to
be rewritten to send a 'Content-Length' in those GETs where squid currently
answers with a 411.

Thoughts on this matter, in any form, are highly appreciated.

regards
Joar Jegleim


Re: [squid-users] ICAP help

2008-11-27 Thread Christos Tsantilas
> Hi Christos,
>
> I used icap_class and icap_access  but I get this
>
> 2008/11/27 17:07:44| Processing Configuration
> File: /etc/squid/squid.conf (depth 0)
> 2008/11/27 17:07:44| WARNING: 'icap_class' is depricated. Use
> 'adaptation_service_set' instead
> 2008/11/27 17:07:44| WARNING: 'icap_access' is depricated. Use
> 'adaptation_access' instead
> 2008/11/27 17:07:44| Initializing https proxy context

You are using squid 3.1.x.
Just replace the icap_class and icap_access lines with the following:

adaptation_service_set  class_avi  service_avi
adaptation_access  class_avi allow all

The icap_class and icap_access are deprecated but should work too.
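
Pulling it together for 3.1 (service names and ICAP URL taken from your
original config), the ICAP part of squid.conf would look roughly like:

icap_enable on
icap_service service_avi_req reqmod_precache 0 icap://localhost:1344/srv_clamav
icap_service service_avi respmod_precache 1 icap://localhost:1344/srv_clamav
adaptation_service_set class_avi service_avi
adaptation_access class_avi allow all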

--
   Christos

>
> //Remy
>
> On Thu, 2008-11-27 at 07:53 -0500, Christos Tsantilas wrote:
>> > Hi All,
>> >
>> > Need help on how to configure c-icap to scan http,https and ftp
>> request
>> >
>> > Sample virus to test
>> > http://www.eicar.org/download/eicar.com
>> >
>> > my configuration is as below
>> > to test my setup I used the above link but it was not scanned for
>> virus
>> > and I was able to downloaded it nothing is working
>> > what am i missing?
>> > can someone help me in this?
>> >
>> > #squid.conf
>> > 
>> > icap_enable on
>> > icap_preview_enable on
>> > icap_preview_size 128
>> > icap_send_client_ip on
>> > icap_service service_avi_req reqmod_precache 0
>> > icap://localhost:1344/srv_clamav
>> > icap_service service_avi respmod_precache 1
>> > icap://localhost:1344/srv_clamav
>> >
>>
>> You need to define an icap_class and define access list for this
>> icap_class
>> Why do you need virus scan for http requests?
>> Your configuration should also contain something like the following:
>>
>>icap_class class_avi  service_avi
>>icap_access class_avi allow all
>>
>> Regards,
>>Christos
>>
>
>




Re: [squid-users] ICAP help

2008-11-27 Thread John Doe
> I used icap_class and icap_access  but I get this
> 
> 2008/11/27 17:07:44| Processing Configuration
> File: /etc/squid/squid.conf (depth 0)
> 2008/11/27 17:07:44| WARNING: 'icap_class' is depricated. Use
> 'adaptation_service_set' instead
> 2008/11/27 17:07:44| WARNING: 'icap_access' is depricated. Use
> 'adaptation_access' instead
> 2008/11/27 17:07:44| Initializing https proxy context

Follow squid's advice:
  icap_class => adaptation_service_set
  icap_access => adaptation_access
They should be mentioned in squid.conf.default; have a look.

JD


Re: [squid-users] ICAP help

2008-11-27 Thread Mario Remy Almeida
Hi Christos,

I used icap_class and icap_access  but I get this

2008/11/27 17:07:44| Processing Configuration
File: /etc/squid/squid.conf (depth 0)
2008/11/27 17:07:44| WARNING: 'icap_class' is depricated. Use
'adaptation_service_set' instead
2008/11/27 17:07:44| WARNING: 'icap_access' is depricated. Use
'adaptation_access' instead
2008/11/27 17:07:44| Initializing https proxy context

//Remy

On Thu, 2008-11-27 at 07:53 -0500, Christos Tsantilas wrote:
> > Hi All,
> >
> > Need help on how to configure c-icap to scan http,https and ftp request
> >
> > Sample virus to test
> > http://www.eicar.org/download/eicar.com
> >
> > my configuration is as below
> > to test my setup I used the above link but it was not scanned for virus
> > and I was able to downloaded it nothing is working
> > what am i missing?
> > can someone help me in this?
> >
> > #squid.conf
> > 
> > icap_enable on
> > icap_preview_enable on
> > icap_preview_size 128
> > icap_send_client_ip on
> > icap_service service_avi_req reqmod_precache 0
> > icap://localhost:1344/srv_clamav
> > icap_service service_avi respmod_precache 1
> > icap://localhost:1344/srv_clamav
> >
> 
> You need to define an icap_class and define access list for this icap_class
> Why do you need virus scan for http requests?
> Your configuration should also contain something like the following:
> 
>icap_class class_avi  service_avi
>icap_access class_avi allow all
> 
> Regards,
>Christos
> 



RE: [squid-users] Change squid binary in flight

2008-11-27 Thread Lluis Ribes
Hi,

I've finally resolved my problem. I compiled version 3.0 STABLE10 with
--with-filedescriptors=8192 and --prefix=/opt/squid. After this, I shut Squid
down and ran "make install"; squid was installed into the same location as my
existing installation (/opt/squid/), but only the binary was replaced - my
config file wasn't touched :)

The last step wasn't to configure the max_filedescriptors parameter in
squid.conf, because that parameter doesn't work for me (I don't know why); the
last step was to add the line "ulimit -n 8192" at the beginning of the startup
script (/etc/init.d/squid).

I started squid (/etc/init.d/squid) and now I'm running with 8192 file
descriptors!
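
Condensed, the steps were roughly (adjust the paths and the limit to your own
layout):

./configure --prefix=/opt/squid --with-filedescriptors=8192 \
  --with-openssl --enable-ssl --disable-internal-dns \
  --enable-async-io --enable-storeio=ufs,diskd
make && make install        # replaces only the binaries under /opt/squid
# then, near the top of /etc/init.d/squid:
ulimit -n 8192              # raise the fd limit before squid starts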

Thanks all for your help!

Lluís Ribes

-----Original Message-----
From: Amos Jeffries [mailto:[EMAIL PROTECTED]
Sent: Friday, 21 November 2008 4:19
To: Kinkie
CC: Lluis Ribes; squid-users@squid-cache.org
Subject: Re: [squid-users] Change squid binary in flight

> On Thu, Nov 20, 2008 at 7:10 PM, Lluis Ribes <[EMAIL PROTECTED]> wrote:
>> Dear Squid Folks,
>>
>> I have a Squid 3.0Stable1 running in a server. This version was
>> installed
>> with apt-get Debian package utility. So, it has worked fine until now,
>> where
>> I have file descriptor problems.
>>
>> I saw that my installation has 1024 files as max_filedescriptor, I think
>> not
>> much. I want to change it, but the parameter max_filedescriptor in
>> squid.conf doesn't work (I receive an error message about unknown
>> parameter).
>
> Are you sure? It may be a runtime limitation; please check that
> there's a 'ulimit -n 8192' line in the squid startup script (replace
> 8192 with your desired limit).
>
>> So, I think thah the only way is recompile with file_descriptor flag:
>>
>> ./configure --with-filedescriptors=8192 --prefix=/opt/squid
>> --with-openssl
>> --enable-ssl --disable-internal-dns --enable-async-io
>> --enable-storeio=ufs,diskd
>>
>> Ok, I compiled Squid 3.0Stable10. So my question is:
>>
>> Could I replace directly the binary that it was generated by my
>> compilation
>> process and located in $SQUID_SOURCE/src/squid with my debian binary
>> version
>> that it's running nowadays? I have to avoid lost of service of my web.
>
> the debian package may have different configure options; if you miss
> some configuration option your configuration file may be incompatible
> with your new binary.
> you may want to run 'squid -v' to check that your new configure
> options are compatible with the previous ones.
>
> You may also want to keep your old binary around, to be able to roll
> back in case of problems.
>

Indeed. There is at least one patch needed to make squid log correctly in
Debian. http://wiki.squid-cache.org/SquidFaq/CompilingSquid (Debian
section)

IIRC the packages are built with 4096 fd by default, maybe stable1 missed
out for some reason. stable8 is available for Debian, please try that
before going to a custom build.

If you must, using the exact same configure options as your packaged build
(plus the logging patch mentioned above) leaves "make install" placing all
binaries in the correct places for an /etc/init.d restart to work.

Amos




Re: [squid-users] ICAP help

2008-11-27 Thread Christos Tsantilas
> Hi All,
>
> Need help on how to configure c-icap to scan http,https and ftp request
>
> Sample virus to test
> http://www.eicar.org/download/eicar.com
>
> my configuration is as below
> to test my setup I used the above link but it was not scanned for virus
> and I was able to downloaded it nothing is working
> what am i missing?
> can someone help me in this?
>
> #squid.conf
> 
> icap_enable on
> icap_preview_enable on
> icap_preview_size 128
> icap_send_client_ip on
> icap_service service_avi_req reqmod_precache 0
> icap://localhost:1344/srv_clamav
> icap_service service_avi respmod_precache 1
> icap://localhost:1344/srv_clamav
>

You need to define an icap_class and an access list for that icap_class.
Why do you need virus scanning for HTTP requests?
Your configuration should also contain something like the following:

   icap_class class_avi  service_avi
   icap_access class_avi allow all

Regards,
   Christos



[squid-users] ICAP help

2008-11-27 Thread Mario Remy Almeida
Hi All,

Need help on how to configure c-icap to scan http, https and ftp requests.

Sample virus to test
http://www.eicar.org/download/eicar.com

My configuration is below.
To test my setup I used the above link, but it was not scanned for viruses
and I was able to download it; nothing is working.
What am I missing?
Can someone help me with this?

#squid.conf

icap_enable on
icap_preview_enable on
icap_preview_size 128
icap_send_client_ip on
icap_service service_avi_req reqmod_precache 0 icap://localhost:1344/srv_clamav
icap_service service_avi respmod_precache 1 icap://localhost:1344/srv_clamav


#c-icap.conf
+
PidFile /var/run/c-icap.pid
CommandsSocket /var/run/c-icap/c-icap.ctl
Timeout 300
KeepAlive On
MaxKeepAliveRequests 100
KeepAliveTimeout 600
StartServers 3
MaxServers 10
MinSpareThreads 10
MaxSpareThreads 20
ThreadsPerChild 10
MaxRequestsPerChild  0
Port 1344
User proxy
Group nobody
TmpDir /var/tmp
MaxMemObject 131072
ServerLog /usr/local/c_icap/var/log/server.log
AccessLog /usr/local/c_icap/var/log/access.log
DebugLevel 3
ModulesDir /usr/lib/c_icap
Module logger sys_logger.so
sys_logger.Prefix "C-ICAP:"
sys_logger.Facility local1
Logger file_logger
AclControllers default_acl
acl localsquid_respmod src 127.0.0.1 type respmod
acl localsquid_options src 127.0.0.1 type options
acl localsquid src 127.0.0.1
acl externalnet src 0.0.0.0/0.0.0.0
acl localnet_respmod src 10.200.2.0/255.255.255.0 type respmod
acl localnet_options src 10.200.2.0/255.255.255.0 type options
acl localnet src 10.200.2.0/255.255.255.0
icap_access allow localsquid_respmod
icap_access allow localsquid_options
icap_access allow localsquid
icap_access allow localnet_respmod
icap_access allow localnet_options
icap_access allow localnet
icap_access deny externalnet
icap_access log localsquid
icap_access log localnet
icap_access log externalnet
ServicesDir /usr/lib/c_icap
Service echo_module srv_echo.so
Service url_check_module srv_url_check.so
Service antivirus_module srv_clamav.so
ServiceAlias  avscan srv_clamav?allow204=on&sizelimit=off&mode=simple
srv_clamav.ScanFileTypes TEXT DATA EXECUTABLE ARCHIVE GIF JPEG MSOFFICE
srv_clamav.SendPercentData 5
srv_clamav.StartSendPercentDataAfter 2M
srv_clamav.Allow204Responces off
srv_clamav.MaxObjectSize  5M
srv_clamav.ClamAvMaxFilesInArchive 0
srv_clamav.ClamAvMaxFileSizeInArchive 100M
srv_clamav.ClamAvMaxRecLevel 5
srv_clamav.VirSaveDir /tmp/download/
# (see the get_file.pl script in the contrib dir)
srv_clamav.VirHTTPServer  "http://fortune/cgi-bin/get_file.pl?usename=%f&remove=1&file="
srv_clamav.VirUpdateTime   15
srv_clamav.VirScanFileTypes ARCHIVE EXECUTABLE

//Remy