[squid-users] squid 3.0 + POST method + reqmod

2008-11-18 Thread Philipp
Hi

I've been testing Squid's icap client (Squid 3.0Stable10) together with a
trial license of Kaspersky's kav4proxy version 5.5.51.

On specific websites I get a status 400 from the icap server when POST is
used together with icap reqmod.

Of course, one could just deny the POST method for reqmod, or run respmod
only while disabling reqmod. So there is a workaround.
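For reference, the workaround could be sketched in squid.conf like this (Squid 3.0 ICAP syntax; the service name and ICAP URI are made up, not taken from the setup described):

```
# Sketch: keep reqmod enabled for everything except POST requests
icap_enable on
acl POST_req method POST
icap_service av_req reqmod_precache 0 icap://127.0.0.1:1344/av/reqmod
icap_class av_class av_req
icap_access av_class allow !POST_req
icap_access av_class deny all
```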

The issue is reproducible on these web pages:
http://www.jobs.ch/suche/Electronic-Mechanics-Engineering/72/0 and then
select something from the 'Select region' bar.
http://www.brack.ch --> click on the 'Anmelden' button

I made packet dumps of the failed reqmod and compared them to RFC 3507.
The client's reqmod looks sane to me. I do not understand why it results
in status 400.

If interested, I can attach the dumps in a later mail.

Thanks
Philipp






[squid-users] sslBump: only bump requests to sites with invalid certificates

2008-11-23 Thread Philipp
Hi

I would like to bump only requests to sites with invalid certificates.
Sites that have valid SSL certificates should not be bumped (bump decision
based on the validity of the SSL cert).

First, I tried this ACL:
acl InvalidCert ssl_error SQUID_X509_V_ERR_DOMAIN_MISMATCH
acl InvalidCert ssl_error X509_V_ERR_UNABLE_TO_GET_ISSUER_CERT
acl InvalidCert ssl_error X509_V_ERR_CERT_NOT_YET_VALID
acl InvalidCert ssl_error X509_V_ERR_ERROR_IN_CERT_NOT_BEFORE_FIELD
acl InvalidCert ssl_error X509_V_ERR_CERT_HAS_EXPIRED
acl InvalidCert ssl_error X509_V_ERR_ERROR_IN_CERT_NOT_AFTER_FIELD
acl InvalidCert ssl_error X509_V_ERR_UNABLE_TO_GET_ISSUER_CERT_LOCALLY
ssl_bump allow InvalidCert
ssl_bump deny all

Result: Squid uses CONNECT for https.
Interpretation: 'ssl_bump deny all' always matches.


Second, I tried this ACL:
acl NoSSLError ssl_error SSL_ERROR_NONE
ssl_bump deny NoSSLError
ssl_bump allow all

Result: Squid uses CONNECT for https.
Interpretation: 'ssl_bump deny NoSSLError' always matches.


Last, I also tried "normal" ACLs such as:
acl whitelisted dstdomain .somedomain.com
ssl_bump deny whitelisted
ssl_bump allow all

This works as expected. If .somedomain.com is https, Squid uses CONNECT.
All other https sites are bumped.


I am aware that the ssl_error ACL type is not documented (at least I
could not find any documentation).
I'm trying this setup with Squid 3.1.0.2.
Can this sort of ACL (bump decision based on the validity of the cert) be
done, or is this a bug?

Thanks,
Philipp






Re: [squid-users] antivirus for squid proxy

2009-02-12 Thread Philipp
Hi

On a private basis, I've been playing (using Squid's ICAP client) with
clamav + c_icap, Avira WebGate, and Kaspersky's kav4proxy.
The best results, especially against HTML/JavaScript threats, I had with
Avira's WebGate. The worst results I had with kav4proxy.
My impression is that none of these products can be compared to a
locally installed antivirus that also checks for the threats you (and I)
want protection against.
The only thing I can suggest is to set up a Linux test box and see for
yourself.

Maybe others have different experience.

Best,
Philipp


> Hi All,
>
> Which is the best antivirus on Ubuntu Linux? I'm running a squid
> proxy server on my Ubuntu server.
> I want all the web requests to be scanned by an antivirus for any
> virus/malware infection and then
> passed to the user.
> Which antivirus package can help me with this?
>
>
> ~~
> Sameer Shinde.
> M:- +91 98204 61580
> Millions saw the apple fall, but Newton was the one who asked why.
>




Re: [squid-users] antivirus for squid proxy

2009-02-12 Thread Philipp
> Greetings,
>
>  Without trying to hijack this thread, I'm curious what you are using for
> your ICAP server?

Avira WebGate as well as kav4proxy come with their own ICAP server
implementation; you just need to configure Squid's ICAP client.
For clamav there is c_icap, an open-source project available at
sourceforge.net.

I currently run a 1-month trial from Avira. The ICAP respmod did not work,
although I set it up according to Avira's ICAP documentation; the ICAP
server always responded with status code 500.
I did not dig deeper, as I first wanted to see Avira's performance, so I
chained the two proxies.

Philipp






>  --
>  Russ
>




[squid-users] Multi-ISP / Squid 2.6 Problem going DIRECT

2007-09-18 Thread Philipp Rusch

Sorry to bother you, but I don't get it.

We have a SuSE 10.1 system and have our WWW traffic going through Squid.
Since the upgrade from 2.5 to version 2.6.STABLE5-30 (SuSE versions) we notice
that Squid is behaving strangely. After running normally for a while, Squid
seems to go "DIRECT" only, the browsers on the clients seem to hang, and
surfing is ultra slow. This happens every three or four websites we try to
access: it seems to work normally for one or two, then the next four or five
GETs are very slow again, and the cycle begins again.
In /var/logs/Squid/access.log I see that most of the connections are going
DIRECT, sometimes we get connection timeouts (110), and sometimes we
see that "somehow" a ":443" is appended to the URL lines. STRANGE.
Any hints appreciated.

Regards from Germany,
Philipp Rusch



[squid-users] - SOLVED - Multi-ISP / Squid 2.6 Problem going DIRECT

2007-09-21 Thread Philipp Rusch

SOLVED - Updating to Squid 2.6.STABLE14-8.5 and applying patches to
our firewall (Shorewall 4.0.3) did the trick for us.
Now this works flawlessly.


Regards from Germany,
Philipp Rusch






[squid-users] Squid 2.6 - access hosts outside LAN through proxy with https://a.b.c.d:8080

2007-10-04 Thread Philipp Rusch

How would I define the correct ACL and/or http_access rule
to access external hosts that are reached through an https
admin interface using port 8080?
I tried to add 8080 to the list of "SSL ports", like
acl SSL_ports port 443 563 8080
and thus allow direct CONNECTs to it with
http_access deny CONNECT !SSL_ports
(the rest is kept at the recommended defaults)

... but Squid keeps telling me that the connection is refused (111).
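For reference, a minimal sketch of the directives involved, following the default squid.conf layout (the Safe_ports list shown is an assumption):

```
acl Safe_ports port 80 21 443 563 8080
acl SSL_ports port 443 563 8080
acl CONNECT method CONNECT
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
```

Note that "(111) Connection refused" is the OS-level error for the remote end rejecting the TCP connection; a block by Squid's own ACLs would typically produce a 403 error page instead.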

Kind regards,
Philipp Rusch




[squid-users] Valid ACL-Types for cache_peer_access

2008-08-22 Thread Philipp Nyffenegger
Hello,
I'm facing a problem with selective forwarding in Squid. I'm using
cache_peer_access to divert different URLs to different scanning
engines. Most of the ACLs are of type "dstdomain". They all work
fine.

Now my Problem is as follows :

.doubleclick.net is being sent to a URL filter which blocks the whole
.doubleclick.net domain. Now I would like to have something like
"http://.*.doubleclick.net/blabla/" sent to the AV engine instead, thus
allowing access to this specific site/URL.

Whenever I add a url_regex ACL type like
"^http:\/\/.*\.doubleclick.net/blabla$" to a
cache_peer_access directive, it is never redirected
accordingly. Squid does not complain about a wrong ACL type being used or
the like.

When I change the ACL type to "dstdomain" and add ".doubleclick.net"
to it, Squid works as expected.
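For illustration, the split being attempted could be sketched like this (peer hostnames and ports are hypothetical; per-peer rules are evaluated first match wins, so the deny must precede the broader allow):

```
acl dc_whitelist url_regex ^http://.*\.doubleclick\.net/blabla
acl dc_all dstdomain .doubleclick.net

cache_peer av.example.net     parent 8080 0 no-query
cache_peer filter.example.net parent 8081 0 no-query

# the whitelisted URL goes to the AV engine, the rest of the domain to the URL filter
cache_peer_access av.example.net     allow dc_whitelist
cache_peer_access filter.example.net deny  dc_whitelist
cache_peer_access filter.example.net allow dc_all
```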

Which ACL types are valid in conjunction with "cache_peer_access"?

Any help is greatly appreciated.

Kind regards,
Philipp


[squid-users] Problem downloading large files through squid

2005-04-22 Thread Philipp Nyffenegger
Hi list,
why can't I download a file like
http://mirror.switch.ch/ftp/mirror/fedora/linux/core/3/i386/iso/FC3-i386-DVD.iso
with a size of 2.3 GB, while there is no problem downloading
another file from the same directory, such as
http://mirror.switch.ch/ftp/mirror/fedora/linux/core/3/i386/iso/FC3-i386-disc1.iso
?

The latter is a normal CD ISO image and the former is a DVD ISO
image; the DVD is much bigger than the CD. I first thought of a
filesystem limitation, so I wrote a no_cache directive, which did not
help. I assume there is a limitation on the file size. I found an
article covering a similar topic (file size), but with FTP rather than
HTTP.

How can I let my users download such large files without bypassing
squid? When I route the request directly through the Viruswall instead of
Squid, it works fine. I just have no clue how I can route requests
for large files directly to the Viruswall instead of squid.

Is there a patch or a compile option to make squid support large files?

Configuration:
OS    : Fedora Core 3 Linux
Squid : 2.5.STABLE6

The Trendmicro Viruswall is configured as a parent to the squid.
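For what it's worth, stock Squid 2.5 builds on many platforms use a 32-bit off_t, which caps transfers at 2 GB; large-file support has to be enabled at build time. A sketch (exact steps vary by platform and package):

```
# Rebuild Squid with 64-bit file offsets so >2 GB downloads can pass through
./configure --with-large-files
make
make install
```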

Any help is gratefully appreciated.

Cheers,
Philipp


[squid-users] access.log analysis

2005-08-19 Thread Philipp Snizek
Hi

I'm looking for a tool that analyses Squid's access.log the same way as,
or similarly to, Sarg.
The problem with Sarg is its quite slow performance on a P3 500 MHz with
512 MB RAM.
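In the meantime, a quick per-host request count can be pulled straight from access.log; a minimal sketch in Python, assuming Squid's native log format (the 7th whitespace-separated field is the request URL; the sample lines are made up):

```python
from collections import Counter
from urllib.parse import urlsplit

def top_hosts(lines, n=10):
    """Count requests per host from Squid native access.log lines."""
    hosts = Counter()
    for line in lines:
        fields = line.split()
        if len(fields) < 7:
            continue  # skip malformed/truncated lines
        url = fields[6]
        # CONNECT lines carry host:port instead of a full URL
        host = urlsplit(url).hostname or url.split(":")[0]
        hosts[host] += 1
    return hosts.most_common(n)

# Hypothetical sample lines in Squid's native log format
sample = [
    "1124440000.123 56 192.168.0.5 TCP_MISS/200 1024 GET http://example.com/a - DIRECT/1.2.3.4 text/html",
    "1124440001.456 12 192.168.0.5 TCP_HIT/200 512 GET http://example.com/b - NONE/- text/html",
    "1124440002.789 99 192.168.0.6 TCP_MISS/200 2048 CONNECT example.org:443 - DIRECT/5.6.7.8 -",
]
print(top_hosts(sample))  # → [('example.com', 2), ('example.org', 1)]
```

This will not replace Sarg's reports, but it avoids Sarg's per-user HTML generation, which is where most of the time goes on a slow box.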

Can you help me?

TIA
Philipp




Re: [squid-users] Error acessing a page with Underscore( _ ) in the url

2006-01-25 Thread Philipp Nyffenegger
Check the Squid FAQs. Everything is explained in detail there:

http://www.squid-cache.org/Doc/FAQ/FAQ-11.html#ss11.8
and
http://www.squid-cache.org/Doc/FAQ/FAQ-11.html#ss11.9

Cheers,
Philipp

On 1/25/06, Jacob, Stanley (GE Consumer Finance, consultant)
<[EMAIL PROTECTED]> wrote:
> Hi All,
>
> I am having a strange problem when running squid 2.5 STABLE9.
>
> When I try to access an (internal) website with _ (underscore) in the URL,
> it gives me the following error.
>
> But when I try with a non-squid proxy, I am able to access the page.
> Is there a way I can allow this character in the URL?


[squid-users] Antivirus and squid

2006-03-18 Thread Philipp Snizek
Hi

I intend to use squid as a reverse proxy combined with antivirus.

This reverse proxy will protect a webmail server. I don't want users to
upload viruses and distribute them via this webmail user interface.
It would be great if there were a squid/antivirus solution that scans
only uploaded/downloaded files, ignoring the other HTML traffic. It would
also be great if this AV solution had an API for clamav or bitdefender.
I couldn't find anything during my search this morning. Maybe some of you
have an idea?

Thanks in advance
Philipp
 



[squid-users] Strange Problems with redirector

2006-05-17 Thread Philipp Neuhaus
Hi,

I'm trying to configure Squid (2.5.STABLE12) to use squidGuard (1.2.0)
on OpenSuSE 10.1.

But it doesn't work. Squid starts up without a redirector, but configured
with a redirector (even if I use "cat") it crashes on startup.

Should I post my squid.conf on this ML?

Philipp





Re: [squid-users] restart authentication helpers

2006-05-17 Thread Philipp Neuhaus
Mark Elsen wrote:
>>
>> Hi,
>>
>> I'm using Squid 2.5.stable13 on RHEL4 with the squid_radius_auth
>> helper, and have checked Google, the squid FAQ, and the config guide.
>>
>> After a given squid_radius_auth has been running for a while it starts
>> to generate errors.
> 
> 
> What are these errors ?

The server is not here. I thought I had copied all the files onto my
notebook. OK, I just tried that config with the squid and squidGuard
versions from my Ubuntu box, and it works.

Does anybody know about bugs in the SuSE 10.1 package?


Philipp







Re: [squid-users] Strange Problems with redirector

2006-05-18 Thread Philipp Neuhaus

Hey Peter,

Peter Albrecht wrote:
> Please check if AppArmor is running. It is started by default, and there is
> a profile for Squid which limits the use of redirectors and authentication.
> To check if AppArmor is protecting Squid, do:

This is a great idea.
This is the first SuSE I installed (I didn't choose it, I'm not guilty
;-) ), and it seems to me that AppArmor is activated by default on SuSE
10.1. I will verify this the next time I'm working at the server.


Thanks
Philipp


Re: [squid-users] Allowing/Unblocking Skype with Squid

2006-06-06 Thread Philipp Nyffenegger


acl N_IPS urlpath_regex ^[0-9]+\.[0-9]+\.[0-9]+\.[0-9]+
acl connect method CONNECT

http_access allow connect N_IPS all


Why do all these tips refer to "urlpath_regex"? This is IMHO wrong.
At least it does not match at my site. There is no URL path in the
CONNECT method, IIRC.

This works fine for blocking Skype via Squid at my site:

acl CONNECT method CONNECT
acl skype url_regex ^[0-9]+\.[0-9]+\.[0-9]+\.[0-9]+
.
deny_info ERR_CLIENT_HTTPS2IP_DENIED skype
http_access deny CONNECT skype
.
.
.

Cheers,
Philipp


[squid-users] Parent proxy without ICP

2007-02-18 Thread Philipp Leusmann
Hi,

I have set up a transparent squid for my LAN, which works great without a
parent cache. I would now like to use my provider's parent cache,
but it doesn't seem to support ICP.
As far as I have understood it, when using a parent cache, squid first
checks whether the parent cache has the request cached and, if not, forwards
the request to the origin site and caches the returned objects. Correct?
This makes sense when using ICP.
But since I would like to limit the requests going through my internet
uplink, I think that, without ICP, it would make sense to first look an
object up in the local cache. If it doesn't exist locally, forward the
request to the parent cache, which then checks whether it has cached the
object and maybe forwards the request to the origin site. I would then
need squid to cache the returned objects locally, no matter whether they
come from the origin site or the parent cache.
Would that be possible? Is this perhaps the behaviour of a sibling cache?
In some FAQ I read that sibling caches should not be located on the route
towards the internet, which is why I am asking.
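What is being described is simply a plain HTTP parent with no peering protocol at all, which Squid supports; a sketch (hostname hypothetical; ICP disabled via peer ICP port 0 plus no-query):

```
cache_peer proxy.example-isp.net parent 8080 0 no-query default
prefer_direct off
```

With such a parent, a local TCP_MISS is forwarded to the parent, and the response is cached locally regardless of whether the parent itself had a hit.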

Thanks,
 Philipp



Re: [squid-users] Parent proxy without ICP

2007-02-19 Thread Philipp Leusmann
Matus UHLAR - fantomas wrote:
> On 18.02.07 14:38, Philipp Leusmann wrote:
>   
>> I have set up a transparent squid for my LAN which works great without a
>> parent cache. I would now like to use the parent cache of my provider
>> but it doesn't seem to support ICP.
>> 
>
> try if it supports HTCP. Ask your ISP. Also, ask if it supports cache
> digests.
>   
I will try to ask, but since it is a very large provider and I didn't
find any information regarding that topic with Google, I don't have much
hope.
>
>   
>> But since I would like to limit requests going through my internet
>> uplink I think, without ICP, it would make sense to first look up an
>> object in the local cache.
>> 
>
> How do you mean this? Do you have more than one cache? Configure them as
> siblings, then.
>   
I have one local cache running on my router to the internet (c1), and my
ISP has one cache running (as I said, most probably without ICP or HTCP)
(c2).
I desired the following behaviour:
Client request -> TCP_MISS on c1 -> forward to c2
    -> TCP_MISS on c2 -> c2 goes direct  ==> c1 caches the response
    -> TCP_HIT  on c2                    ==> c1 caches the response
-> TCP_HIT on c1                         ==> desired behaviour

I hope the above is readable in the post :) And understandable...
currently the parent is configured with:
cache_peer proxy.arcor-ip.net parent 8080 3130 no-query
As proposed by Henrik I have

prefer_direct off
nonhierarchical_direct on

Thanks for your help,
 Philipp






Re: [squid-users] Parent proxy without ICP

2007-02-19 Thread Philipp Leusmann
Philipp Leusmann wrote:
> [...]
Since the ASCII representation didn't seem to work out, you can find a
graphical scheme here: http://img226.imageshack.us/img226/5991/squidaj7.png

Regards,
 Philipp



[squid-users] squid authentication problem

2003-12-10 Thread Dreimann, Philipp
hello,

always after I enable authentication, I'm unable to browse any webpage, and
I'm not even getting an error page from squid.

so here's my buggy squid.conf:


http_port 8080
icp_port 0
htcp_port 0
hierarchy_stoplist cgi ? 
acl QUERY urlpath_regex cgi \?
no_cache deny QUERY
cache_dir diskd /var/spool/squid 100 16 256
ftp_user [EMAIL PROTECTED]
authenticate_program /my/auth/program
authenticate_children 5
proxy_auth_realm Meine Proxy-Anmeldung
range_offset_limit 0 KB
ident_timeout 1 seconds
acl all src 0.0.0.0/0.0.0.0
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl FTP proto FTP
always_direct allow FTP
acl VERTRAUT src 192.168.0.0/255.255.0.0
acl VERTRAUT src 81.89.229.64/255.255.255.240
acl ANMELDUNG proxy_auth REQUIRED
http_access allow ANMELDUNG
cache_mgr [EMAIL PROTECTED]
visible_hostname proxy.go-gate.de
forwarded_for off
cachemgr_passwd disable all
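For comparison, Squid 2.5 replaced the 2.4 authenticate_* directives with auth_param; the equivalent setup would look roughly like this sketch (helper path hypothetical, with an explicit final deny):

```
auth_param basic program /usr/local/squid/libexec/ncsa_auth /usr/local/squid/etc/passwd
auth_param basic children 5
auth_param basic realm Meine Proxy-Anmeldung
acl ANMELDUNG proxy_auth REQUIRED
http_access allow ANMELDUNG
http_access deny all
```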



I know the VERTRAUT acl isn't currently used.

My authentication program can also be replaced with the ncsa_auth tool; it's
the same problem.
I tried it with squid 2.4 and 2.5 (and changed the authentication params so
that it works with both versions).

I hope someone has a hint for me.

thanks,
Philipp




[squid-users] Problem excluding single client from redirector program

2010-10-07 Thread Philipp Herz - Profihost AG

Hello everybody,

Currently I'm trying to migrate a Squid/SquidGuard setup from Squid
(3.0.STABLE19) to Squid (3.1.3).


The problem is that I am not able to exclude a single client, identified
by its IP or MAC address, from being processed by SquidGuard as the
redirector.


acl my_net  src 192.168.0.0/16
acl c_by_IP src 192.168.0.99
acl c_by_MAC arp aa:bb:cc:dd:ee:ff

http_access allow my_net
http_access deny  all

redirector_access  deny c_by_IP
redirector_access  deny c_by_MAC

# url_rewrite_access deny c_by_IP
# url_rewrite_access deny c_by_MAC

url_rewrite_program  /usr/bin/squidGuard
url_rewrite_children 5

None of the attempts above works with Squid (3.1.3). Using the
directive "redirector_access deny" with Squid (3.0.STABLE19) works as
expected.


So, could you please give me some hints on how to get this working,
or is there a known bug or limitation that explains why it fails with 3.1.3?


Thanks - philipp


Re: [squid-users] Problem excluding single client from redirector program

2010-10-08 Thread Philipp Herz - Profihost AG

On 08/10/10 03:26, Philipp Herz - Profihost AG wrote:

[...]


Nothing comes to mind. IP should be working even if ARP fails.

NP: "url_rewrite_access" is the correct config out of those attempts and
is identical in meaning for all Squid since redirector_access was
deprecated by 2.5.

Firstly check the order of url_rewrite_access lines (*all of them*).
First match wins.

Then try tracing the access control tests in cache.log with:
debug_options 28,3 61,5

If that does not show the problem up try with the latest Squid-3.1 code.


Amos


Hi Amos,

thanks for your information. I have tested it again with "debug_options"
set. From the output it seems to me that there must be something
absolutely wrong with the IP/MAC-based ACLs.


As I understand it, cache.log shows that the client is identified by its
IP address, then checked against "http_access" and granted by the "my_NET"
match. When it comes to "checking url_rewrite_access", aclIpMatchIp does
not know the IP anymore; therefore the comparison fails: no match.


And yes, I have double-checked the IP address of my client and the ACL.

So if you have any ideas/suggestions what to check, I would appreciate it.

Thanks again - philipp

here the complete snippet from cache.log:

2010/10/08 08:47:39.260| ACLChecklist::preCheck: 0x8fede08 checking 'http_access allow my_NET'
2010/10/08 08:47:39.260| ACLList::matches: checking my_NET
2010/10/08 08:47:39.260| ACL::checklistMatches: checking 'my_NET'
2010/10/08 08:47:39.260| aclIpMatchIp: '192.168.1.193:4587' found
2010/10/08 08:47:39.260| ACL::ChecklistMatches: result for 'my_NET' is 1
2010/10/08 08:47:39.260| aclmatchAclList: 0x8fede08 returning true (AND list satisfied)
2010/10/08 08:47:39.260| ACLChecklist::markFinished: 0x8fede08 checklist processing finished
2010/10/08 08:47:39.260| ACLChecklist::check: 0x8fede08 match found, calling back with 1
2010/10/08 08:47:39.261| ACLChecklist::checkCallback: 0x8fede08 answer=1
2010/10/08 08:47:39.261| ACLChecklist::preCheck: 0x8fede08 checking 'adaptation_access service_req allow all'
2010/10/08 08:47:39.261| ACLList::matches: checking all
2010/10/08 08:47:39.261| ACL::checklistMatches: checking 'all'
2010/10/08 08:47:39.261| aclIpMatchIp: '192.168.1.193:4587' found
2010/10/08 08:47:39.261| ACL::ChecklistMatches: result for 'all' is 1
2010/10/08 08:47:39.261| aclmatchAclList: 0x8fede08 returning true (AND list satisfied)
2010/10/08 08:47:39.261| ACLChecklist::markFinished: 0x8fede08 checklist processing finished
2010/10/08 08:47:39.261| ACLChecklist::check: 0x8fede08 match found, calling back with 1
2010/10/08 08:47:39.261| ACLChecklist::checkCallback: 0x8fede08 answer=1
2010/10/08 08:47:39.261| ACLChecklist::preCheck: 0x8fede08 checking 'url_rewrite_access deny c_by_IP'
2010/10/08 08:47:39.261| ACLList::matches: checking c_by_IP
2010/10/08 08:47:39.261| ACL::checklistMatches: checking 'c_by_IP'
2010/10/08 08:47:39.261| aclIpMatchIp: '[::]' NOT found
2010/10/08 08:47:39.261| ACL::ChecklistMatches: result for 'c_by_IP' is 0
2010/10/08 08:47:39.261| aclmatchAclList: 0x8fede08 returning false (AND list entry failed to match)
2010/10/08 08:47:39.261| aclmatchAclList: async=0 nodeMatched=0 async_in_progress=0 lastACLResult() = 0 finished() = 0
2010/10/08 08:47:39.261| ACLChecklist::preCheck: 0x8fede08 checking 'url_rewrite_access deny c_by_MAC'
2010/10/08 08:47:39.261| ACLList::matches: checking c_by_MAC
2010/10/08 08:47:39.261| ACL::checklistMatches: checking 'c_by_MAC'
2010/10/08 08:47:39.261| aclMatchArp: [::] NOT found
2010/10/08 08:47:39.262| ACL::ChecklistMatches: result for 'c_by_MAC' is 0
2010/10/08 08:47:39.262| aclmatchAclList: 0x8fede08 returning false (AND list entry failed to match)
2010/10/08 08:47:39.262| aclmatchAclList: async=0 nodeMatched=0 async_in_progress=0 lastACLResult() = 0 finished() = 0
2010/10/08 08:4

Re: [squid-users] Problem excluding single client from redirector program

2010-10-08 Thread Philipp Herz - Profihost AG

On Thu, Oct 07, 2010 at 04:26:29PM +0200, Philipp Herz - Profihost AG wrote:

[...]

I tested it on squid 3.1.8 and it works.

My configuration is:
url_rewrite_program /opt/sg/bin/squidGuard
url_rewrite_children 30
...
acl ip1 src 10.0.0.1
...
url_rewrite_access deny ip1
url_rewrite_access allow all
# also the following lines work as expected
# url_rewrite_access allow ip1
# url_rewrite_access deny all



Hi Peter,

now I have compiled and tested squid (v3.1.8), and yes, it works as
expected :-)


So, thanks everybody!

Regards - philipp


Re: [squid-users] Squid requirements

2008-07-16 Thread Philipp Rusch - New Vision

Adrian Chadd wrote:

What we're really missing is a bunch of "hardware x, config y, testing
z, results a, b, c" data points. TMF used to have some stuff up for older
hardware, but there's just nothing recent to use as a measuring stick.



Adrian


2008/7/16 Chris Robertson <[EMAIL PROTECTED]>:
  

Luis Daniel Lucio Quiroz wrote:


HI folks

I already know that there is not a recipe for squid.  But I wonder if
anyone knows an official document that lists squid requirements.

Regards,

LD

  

That's a bit like asking "What kind of a car should I get?".  You need to
give some details of the expected workload.

In general, get a higher clocked CPU, as much RAM and as many drives as you
can afford, and use regex based ACLs sparingly.

Chris

OK - then let's start collecting some numbers with more recent hardware:

We have a Squid 3.0.STABLE5 on openSUSE 10.3 running on the following
system for about 100 users, with adequate response times:

IBM xSeries 3250 M2
1x Intel Core 2 Duo E4600 2.4 GHz/800 MHz (2 MB L2 cache)
3 GB PC2-5300 CL5 ECC DDR2 SDRAM DIMM
2x 250 GB SATA hard drives in a mirror configuration

This system is doing virus scanning with ICAP-enabled Squid through KAV 5.5
(Kaspersky AntiVirus for Internet Gateways),
AND web-content filtering with SquidGuard 1.3,
AND NTLM auth against the internal W2k3 ADS domain.

Best regards,
Philipp Rusch




Re: [squid-users] Blocking by ip

2008-09-30 Thread Philipp Rusch - NewVision-IT

John Doe wrote:

Hello, I have in a MySQL database a list of IPs/networks that I want
to allow to connect to the proxy. Is it possible, when a new connection
comes in, to check whether its IP is in the database and, if it is,
allow it to connect?



You could write an external_acl program that checks whether the IP is in
your MySQL table and returns OK or ERR.
Check the "helpers" directory in the squid sources for inspiration.

JD
  
Or have a look at www.squidguard.org; the new version 1.4 supports a
MySQL DB.


HTH, Philipp
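A minimal stand-in for such an external_acl helper (Python sketch; the hard-coded set replaces the real MySQL lookup, which is left out). The protocol is one lookup key per line on stdin, answered by OK or ERR on stdout:

```python
#!/usr/bin/env python3
# Sketch of a Squid external_acl helper: reads one IP per line on stdin
# and answers OK (allowed) or ERR (denied). The ALLOWED set is a
# placeholder for the real MySQL query.
import sys

ALLOWED = {"192.168.0.10", "10.0.0.5"}  # hypothetical contents of the DB table

def check(ip: str) -> str:
    return "OK" if ip.strip() in ALLOWED else "ERR"

if __name__ == "__main__":
    for line in sys.stdin:              # Squid sends one request per line
        print(check(line), flush=True)  # each answer must be flushed immediately
```

Wired into squid.conf roughly as (helper path hypothetical):
external_acl_type ipdb %SRC /usr/local/bin/check_ip.py
acl in_db external ipdb
http_access allow in_db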



Re: [squid-users] Blocked Domains help :(

2009-05-25 Thread Philipp Rusch - New Vision-IT

IBT wrote:

Hi,

I am still working on this strange error with my groups and permissions. I
think I found something; now I just have to work out how to resolve it...

2009/05/25 18:08:02| logfileClose: closing log c:/squid/var/logs/store.log
2009/05/25 18:08:02| logfileClose: closing log c:/squid/var/logs/access.log
2009/05/25 18:08:02| Squid Cache (Version 2.7.STABLE6): Exiting normally.
2009/05/25 18:08:02| Starting Squid Cache version 2.7.STABLE6 for
i686-pc-winnt...
2009/05/25 18:08:02| Running as Squid Windows System Service on Windows
Server 2003
2009/05/25 18:08:02| Service command line is: 
2009/05/25 18:08:02| Process ID 3228

2009/05/25 18:08:02| With 2048 file descriptors available
2009/05/25 18:08:02| With 2048 CRT stdio descriptors available
2009/05/25 18:08:02| Windows sockets initialized
2009/05/25 18:08:02| Using select for the IO loop
2009/05/25 18:08:02| Performing DNS Tests...
2009/05/25 18:08:02| Successful DNS name lookup tests...
2009/05/25 18:08:02| DNS Socket created at 0.0.0.0, port 2544, FD 5
2009/05/25 18:08:02| Adding nameserver 192.168.2.3 from squid.conf
2009/05/25 18:08:02| Adding nameserver 192.168.2.1 from squid.conf
2009/05/25 18:08:02| helperStatefulOpenServers: Starting 5
'mswin_negotiate_auth.exe' processes
2009/05/25 18:08:02| helperOpenServers: Starting 5
'mswin_check_lm_group.exe' processes
2009/05/25 18:08:02| User-Agent logging is disabled.
2009/05/25 18:08:02| Referer logging is disabled.
2009/05/25 18:08:02| logfileOpen: opening log c:/squid/var/logs/access.log
2009/05/25 18:08:02| Unlinkd pipe opened on FD 48
2009/05/25 18:08:02| Swap maxSize 1024000 + 32768 KB, estimated 0 objects
2009/05/25 18:08:02| Target number of buckets: 4064
2009/05/25 18:08:02| Using 8192 Store buckets
2009/05/25 18:08:02| Max Mem  size: 32768 KB
2009/05/25 18:08:02| Max Swap size: 1024000 KB
2009/05/25 18:08:02| Local cache digest enabled; rebuild/rewrite every 3600/3600 sec
2009/05/25 18:08:02| logfileOpen: opening log c:/squid/var/logs/store.log
2009/05/25 18:08:02| Rebuilding storage in c:/squid/var/cache (CLEAN)
2009/05/25 18:08:02| Using Least Load store dir selection
2009/05/25 18:08:02| Current Directory is C:\squid\sbin
2009/05/25 18:08:02| Loaded Icons.
2009/05/25 18:08:02| Accepting proxy HTTP connections at 0.0.0.0, port 8085,
FD 54.
2009/05/25 18:08:02| Accepting ICP messages at 0.0.0.0, port 3130, FD 55.
2009/05/25 18:08:02| Accepting HTCP messages on port 4827, FD 56.
2009/05/25 18:08:02| Accepting SNMP messages on port 3401, FD 57.
2009/05/25 18:08:02| Ready to serve requests.
2009/05/25 18:08:02| Store rebuilding is  5.5% complete
2009/05/25 18:08:03| Done reading c:/squid/var/cache swaplog (74365 entries)
2009/05/25 18:08:03| Finished rebuilding storage from disk.
2009/05/25 18:08:03| 74365 Entries scanned
2009/05/25 18:08:03| 0 Invalid entries.
2009/05/25 18:08:03| 0 With invalid flags.
2009/05/25 18:08:03| 74365 Objects loaded.
2009/05/25 18:08:03| 0 Objects expired.
2009/05/25 18:08:03| 0 Objects cancelled.
2009/05/25 18:08:03| 0 Duplicate URLs purged.
2009/05/25 18:08:03| 0 Swapfile clashes avoided.
2009/05/25 18:08:03|   Took 0.5 seconds (140047.1 objects/sec).
2009/05/25 18:08:03| Beginning Validation Procedure
2009/05/25 18:08:03|   Completed Validation Procedure
2009/05/25 18:08:03|   Validated 74365 Entries
2009/05/25 18:08:03|   store_swap_size = 921584k
2009/05/25 18:08:03| storeLateRelease: released 0 objects
/mswin_check_lm_group.exe NetUserGetGroups() failed.'
/mswin_check_lm_group.exe NetUserGetGroups() failed.'
/mswin_check_lm_group.exe NetUserGetGroups() failed.'

How do I fix this mswin_check_lm_group error? I guess that is where all the
problems lie.


:)
  

Did you specify the command line like the ones here:

Squid [for Windows] doesn't know how to run external helpers based on 
scripts, like .bat, .cmd, .vbs, .pl, etc.
So in squid.conf the interpreter path must always be specified, for 
example:


redirect_program c:/perl/bin/perl.exe c:/squid/libexec/redir.pl
redirect_program c:/winnt/system32/cmd.exe /C c:/squid/libexec/redir.cmd
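
For completeness: the mswin_* helpers from the log above are native .exe binaries, so they need no interpreter, only their full path. A hedged squid.conf sketch (the paths, helper flags, and group name here are assumptions for illustration, not taken from the original poster's config):

```
# Negotiate authentication via the bundled Windows helper (path assumed)
auth_param negotiate program c:/squid/libexec/mswin_negotiate_auth.exe
auth_param negotiate children 5

# Group lookup helper; -G checks global (domain) groups.
# "InternetUsers" is a hypothetical group name.
external_acl_type nt_group %LOGIN c:/squid/libexec/mswin_check_lm_group.exe -G
acl InetUsers external nt_group InternetUsers
http_access allow InetUsers
```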


Have a look here:
http://squid.acmeconsulting.it/

HTH,
Philipp Rusch
www.newvision-it.de




Re: [squid-users] users bypassing rules.. Help!?

2009-07-11 Thread Philipp Rusch - New Vision IT

Roland Roland wrote:

Hello,

For a while now, almost 3 weeks, I've been using an ACL that matches the
contents of a specific file with url_regex; in this file there's facebook
and a few other sites that I don't want users to access.

Users have found a way to bypass these restrictions by using online
services that support such a thing, like using the Google Translate
service to translate sites which by default would be blocked, or simply
using other websites that mask such usage...

Does anyone have a better way for me to block such sites?

thanks in advance,

Roland


Hi Roland,

use squidGuard for this purpose: http://www.squidguard.org/
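
For reference, squidGuard hooks into Squid as a URL rewriter, and its blacklists match destination domains rather than regexes, so translation and anonymizer proxies can be blocked as a whole category. A minimal squid.conf sketch (the paths are common defaults and may differ on your distribution):

```
# Hand every request to squidGuard for filtering (paths assumed)
url_rewrite_program /usr/bin/squidGuard -c /etc/squidguard/squidGuard.conf
url_rewrite_children 5
```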

Regards from Germany,
Philipp





Re: [squid-users] Insert Header or Footer into retrieved pages?

2008-09-24 Thread Philipp Rusch - New Vision-IT

Alex Rousskov wrote:

On Wed, 2008-09-24 at 10:26 -0700, Rodre Ghorashi-Zadeh wrote:

  

Does anyone know where I can get the "reference" icap server mentioned
here: http://wiki.squid-cache.org/Features/ICAP with a 404 URL of
http://www.icap-forum.org/spec/icap-server10.tar.gz ?



Tried the Internet Archive? If you cannot find it anywhere, please let
me know and I will try to dig up a copy. I do not know whether I have
one though.

  

Can someone offer up a different solution to just inject a simple html
header into the pages returned via the squid proxy?



You can also wait for eCAP work to be completed. I am supposed to commit
the missing bits by September 29.

HTH,

Alex.



  

Hello Rodre,

http://www.icap-forum.org/documents/other/icap-server10.zip

they changed the URL ...

to Alex: that is great news about eCAP, we appreciate your work !

Regards from Germany,
Philipp Rusch



Re: [squid-users] Re: cached MS updates !

2008-12-21 Thread Philipp Rusch - New Vision IT

Richard Neville wrote:

Henrik Nordstrom  henriknordstrom.net> writes:

  

On Mon, 2008-06-16 at 08:16 -0700, pokeman wrote:

thanks Henrik for your reply. Is there any other way to save bandwidth?
Windows updates use almost 30% of my entire bandwidth.
  

Microsoft has an update server you can run locally. But you need to have
some control over the clients to make them use this instead of Windows
Update...

Or you could look into sponsoring some Squid developer to add caching of
partial objects, with the goal of allowing HTTP access to Windows Update
to be cached. (Not much can be done about the versions using HTTPS...)

Regards
Henrik




Hi, just thought I'd let you know: I'm currently using an IPCop firewall,
and one of its plugins (the reason I went with IPCop) is an update
accelerator plugin that stores Windows, Apple, Symantec, Avast and Linux
updates on the firewall's drive.

I actually found this site because I was trying to get help, and the developer
of the plugin seems cranky at the best of times.

Basically the system works: updates that a PC doesn't have get loaded from the
firewall rather than the internet. But it seems that MS uses multiple servers
to store each update. When I update an SP2 XP Pro system, it sees SP3 and
downloads an 850 MB file; that's fine, it must be the multi-language versions
it's downloading.

The problem is that when I update another SP2 system, it starts downloading
the 850 MB again: it's the same file name, but coming from a different server.

Would anyone here know how to rectify this?

I'm a 100% noob at Linux, but I have managed to get it up and running without
too much issue.

here's the plugin website for those interested.

http://update-accelerator.advproxy.net/

any help would be appreciated :)
planetx...@gmail.com

Why not use the way Henrik already recommended?
I'd use Microsoft WSUS; it's free and easy to set up.
And it will manage all these issues you have automagically.

HTH, Philipp



Re: [squid-users] cached MS updates !

2008-12-21 Thread Philipp Rusch - New Vision IT

Richard Neville wrote:
Hi Philipp, the issue is: I run a computer repair business, and the PCs
that come in needing updates have various network configurations. As far
as I'm aware, WSUS is good if you have an existing, fixed list of PCs that
you configure to look at your server for updates. As I'm always getting
different systems, I thought a fully transparent system would be best.


Thanks for the email!

Happy christmas!  


Sent from my iPhone

On 21/12/2008, at 10:42 PM, Philipp Rusch - New Vision IT
<philipp.ru...@newvision-it.de> wrote:


[earlier messages quoted in full; trimmed]

Richard, ok - I see and I understand your point of view.
But still, I would suggest something like the c't offline updater then:
http://www.heise.de/software/download/ct_offline_update/38170
(there is also an English version of this around ...)

This is far less complicated than Oleg's solution and saves a lot of
bandwidth while being perfectly suited to your various systems' needs.
You just start the script and it does the rest from a local cache.

Happy Christmas to you, too!

HTH, Philipp from Germany



Re: [squid-users] SquidGuard Replacement

2009-01-06 Thread Philipp Rusch - New Vision-IT

Joseph L. Casale wrote:

When logging in to MS Technet, I get this:

ERROR
The requested URL could not be retrieved

The following error was encountered while trying to retrieve the URL: http:443
Unable to determine IP address from host name 
The DNS server returned:

Name Error: The domain name does not exist. This means that the cache was not 
able to resolve the hostname presented in the URL. Check if the address is 
correct.
Your cache administrator is root.

Generated Tue, 06 Jan 2009 19:12:01 GMT by dev.activenetwerx.int 
(squid/3.0.STABLE9)

What does http:443 mean? This is only a problem when squidGuard is enabled?

The url that it tanked on is:
https://login.live.com/login.srf?wa=wsignin1.0&rpsnv=10&ct=1231267843&rver=5.5.4177.0&wp=MCMBI&wlcxt=technet%24technet%24technet&wreply=https%3a%2f%2ftechnet.microsoft.com%2fen-ca%2fsubscriptions%2fmanage%2fbb980931.aspx&lc=1033&id=254354&cru=http%3a%2f%2ftechnet.microsoft.com%2fen-ca%2fsubscriptions%2fdefault.aspx

Why would it work without squidGuard? I seem to be having a lot of problems
with squidGuard; does anyone have a recommendation for a replacement?

Thanks!
jlc

Hello Joseph,

I'm using Squid 3.0.STABLE9 and squidGuard 1.3 on three openSUSE 10.3 boxes
and tested the URL you gave us above without having any problems accessing
the TechNet site. So this must be something with your specific setup.
Which version of SG are you using? Maybe you can post your problem
to http://www.squidguard.org/mailinglist.html


Regards,
- Philipp





Re: [squid-users] SquidGuard Replacement

2009-01-07 Thread Philipp Rusch - New Vision IT

Joseph L. Casale wrote:

I switched to ufdbGuard and have been really pleased with its performance
and support.



Thomas,
Do I understand this right, the software is free but the db is not? Can one
use shalla lists with this software?

Thanks!
jlc

  

Joseph,
I wasn't able to access the systems with the SG config today,
so let's solve your problem with SG tomorrow instead of hunting for
a "suboptimal" solution.
Did you try to post your problem to Shalla / Christine Kronberg?
She is usually a great help.

CU, Philipp



Re: [squid-users] SquidGuard Replacement

2009-01-08 Thread Philipp Rusch - New Vision-IT

Thomas Raef wrote:

How do you figure that ufdbGuard is "sub-optimal"?
 
Yes you can use shalla lists with this.
 
I suggest you contact the owner and discuss your needs with him. He 
reads this list so I think he'll be available.
 
Thomas J. Raef

www.ebasedsecurity.com
"You're either hardened, or you're hacked!"

[earlier messages quoted in full; trimmed]


Thomas,
I did not say that ufdbGuard is a "suboptimal" solution.
All I wanted to express with my mail was that Joseph's search for a
solution was leading to a somewhat suboptimal setup. He already had
everything in place and encountered some problems, so I advised him
to search for the reasons of that problem and solve them instead of
replacing components on a trial-and-error basis.
And despite the possible second meaning of my original posting,
I really wasn't trying to offend anybody.
And, by the way, please keep in mind that English is not my mother tongue.

Regards from Germany,
Philipp





[squid-users] Squid, firewall in Suse 9.1

2009-01-11 Thread Philipp Rusch - New Vision IT

vaisakh wrote:

Hi all,
I'm working as a system/network admin. We are using SuSE 9.1 for fetchmail
and it's working fine. Now the management wants to make the Linux box our
firewall and proxy. I am not familiar with Linux; basically I'm an MCSE.
Can anybody please help me to do this? Right now the ADSL is connected
directly to the switch, and the Linux box is on the same switch. How do I
change the setup? Please help me, it's urgent. How do I configure this?
Please give me details.

thanks and regards
Vaisakh
vaisakhm...@yahoo.com
  

Hi Vaisakh,
I will try to help you. I assume your "ADSL" device has an Ethernet
interface. We need to know your IP configuration; if you like, you can
give details by private mail.

The first thing you need is a second Ethernet interface for your Linux
box. Otherwise it would be pointless to set up a firewall on the box,
since the "ADSL" would stay connected to all the rest of your LAN.

Second, you will have to assign another IP network to that second Ethernet
interface; let's say this is your external connection from now on.
SuSE Linux assigns names like "eth0", "eth1" and so on to its physical
Ethernet interfaces. So from now on you have a two-interface firewall box
with "eth0" as your internal and "eth1" as your external interface.

The external zone, which comprises your "ADSL" device and the "eth1"
interface, can be connected by an Ethernet crossover cable. If you don't
have one, try an ordinary LAN patch cable; sometimes the "ADSL" boxes are
smart enough to recognize the correct pinout by themselves.

The internal "eth0" interface is now the only connection to and from the
outside of the LAN; this cable goes to your switch. The ADSL no longer has
any connection to the switch.

So, enough for these first things to do; it's up to you now.
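
On SuSE these settings can be made persistent with sysconfig files. The sketch below uses example addresses and the classic ifcfg file names (your actual networks and interface names will differ, so treat all values as placeholders):

```
# /etc/sysconfig/network/ifcfg-eth0  -- internal interface, toward the switch
STARTMODE='onboot'
BOOTPROTO='static'
IPADDR='192.168.1.254'
NETMASK='255.255.255.0'

# /etc/sysconfig/network/ifcfg-eth1  -- external interface, toward the ADSL device
STARTMODE='onboot'
BOOTPROTO='static'
IPADDR='10.0.0.2'
NETMASK='255.255.255.0'
```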

BTW, where are you from ?

Regards from Germany
- Philipp