10.0.2.110 is the machine running squid and dansguardian, thank you for your reply.
- Original Message -
From: "Henrik Nordstrom" <[EMAIL PROTECTED]>
To: "zhang yikai" <[EMAIL PROTECTED]>
Cc: "Amos Jeffries" <[EMAIL PROTECTED]>; "Kinkie" <[EMAIL PROTECTED]>;
Sent: Tuesday, November 11, 2008
On Tue, 2008-11-11 at 11:36 +0800, zhang yikai wrote:
> - DIRECT/10.0.2.110 text/html
> 1226418445.662    137 127.0.0.1 TCP_MISS/503 1883 GET http://www.google.com/ - DIRECT/10.0.2.110 text/html
Why does your Squid server resolve www.google.com to 10.0.2.110?
Regards
Henrik
On Tue, 2008-11-11 at 15:24 +1300, Amos Jeffries wrote:
> Not fully 1.1, but from (0.9 + 1.0) to fully 1.0 + partial 1.1. Which is
> weird because 2.6 went almost fully 1.0 as well quite a while back.
From this discussion it seems Squid-3 no longer accepts the obsolete
HTTP/0.9 style requests.
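If that is the case, the difference is just the version token on the request line; a purely illustrative pair of probe requests:

GET /                  (HTTP/0.9 style: no version token, apparently rejected by Squid-3)
GET / HTTP/1.0         (minimal request line carrying a version, which Squid-3 accepts)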
Hi,
I am a squid newbie. I am trying to set up daily download quotas for NCSA
authenticated users. I have a daemon running which checks the log files, and
whenever the download limit is reached (for a particular user), it blocks
that user in the config and reconfigures squid (squid -k reconfigure) for
the change to take effect.
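A minimal sketch of what such a daemon plus the matching squid.conf ACL can look like, run from cron, assuming the default native access.log format (reply size in field 5, authenticated user in field 8) and a log that is rotated daily so the totals reset; the 50 MB limit, the paths, the ACL name and the blocked_users file are only placeholders:

#!/bin/sh
# Sum today's per-user bytes from access.log and list users over the limit.
LIMIT=52428800                        # 50 MB per user per day (placeholder)
LOG=/var/log/squid/access.log
BLOCKED=/etc/squid/blocked_users      # read by the ACL below

awk -v limit="$LIMIT" '$8 != "-" { bytes[$8] += $5 }
    END { for (u in bytes) if (bytes[u] > limit) print u }' "$LOG" > "$BLOCKED"

squid -k reconfigure                  # pick up the updated list

and in squid.conf, before the http_access line that allows authenticated users:

acl over_quota proxy_auth "/etc/squid/blocked_users"
http_access deny over_quota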
Now I have found that when I run squid and dansguardian on different machines it
works, but it does not work when they are on one machine. What is the problem?
Thanks for your help
- Original Message -
From: "Kinkie" <[EMAIL PROTECTED]>
To: "zhang yikai" <[EMAIL PROTECTED]>
Cc:
Sent: Monday, November 10, 2008 9:13
- Original Message -
From: "Amos Jeffries" <[EMAIL PROTECTED]>
To: "zhang yikai" <[EMAIL PROTECTED]>
Cc: "Kinkie" <[EMAIL PROTECTED]>;
Sent: Tuesday, November 11, 2008 10:38 AM
Subject: Re: [squid-users] Run squid2.5.6 and dansguardian got error message:
(111) Connection refused
On Tue, Nov 11, 2008 at 9:31 AM, Amos Jeffries <[EMAIL PROTECTED]> wrote:
>
>
> Ahh okay. "cache_peer 202.169.51.118" should be the web server IP as seen
> from Squid (internal IP if squid is internal, external IP if squid is
> external, localhost maybe if squid is on same machine).
>
> Amos
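For reference, the usual complete accelerator recipe looks roughly like the lines below, with 192.168.1.10 standing in as a placeholder for whatever address the backend web server has as seen from Squid:

http_port 80 accel defaultsite=monitor.gpi-g.com
cache_peer 192.168.1.10 parent 80 0 no-query originserver name=myAccel
acl our_sites dstdomain monitor.gpi-g.com
http_access allow our_sites
cache_peer_access myAccel allow our_sites
cache_peer_access myAccel deny all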
>
> thanks for your help, I run wget
>
> [EMAIL PROTECTED] logs]# wget www.google.com
> --09:19:40-- http://www.google.com/
> => `index.html'
> Connecting to 10.0.2.110:9090... connected.
> Proxy request sent, awaiting response... 403 Forbidden
> 09:19:41 ERROR 403: Forbidden.
>
>
> this
> all :
> it works now, but only from Internal;
> from External (internet) it looks like a Domain without any hosting space
>
> i remove :
>> http_port 80 accel defaultsite=monitor.gpi-g.com
>> cache_peer 202.169.51.118 parent 80 0 no-query originserver name=myAccel
>> acl our_sites dstdomain monitor.gpi-g.com
> Thanks for your response
>
>> That message means there was no HTTP/1.0 tag on the request line.
>> Squid begins assuming HTTP/0.9 traffic.
>>
>>
>>> Squid 2.6 handled these fine, and my configuration hasn't changed, so was
>>> there something introduced in Squid3 that demands a hostname?
>>
Thanks for your response
> That message means there was no HTTP/1.0 tag on the request line.
> Squid begins assuming HTTP/0.9 traffic.
>
>
>> Squid 2.6 handled these fine, and my configuration hasn't changed, so was
>> there something introduced in Squid3 that demands a hostname?
>
> no.
thanks for your help, I run wget
[EMAIL PROTECTED] logs]# wget www.google.com
--09:19:40-- http://www.google.com/
=> `index.html'
Connecting to 10.0.2.110:9090... connected.
Proxy request sent, awaiting response... 403 Forbidden
09:19:41 ERROR 403: Forbidden.
this is the info from ac
all :
it works now, but only from Internal;
from External (internet) it looks like a Domain without any hosting space
i remove :
> http_port 80 accel defaultsite=monitor.gpi-g.com
> cache_peer 202.169.51.118 parent 80 0 no-query originserver name=myAccel
> acl our_sites dstdomain monitor.gpi-g.com
> I've just rolled back a failed Squid migration from 2.6 to 3.0, and I'm
> looking for reasons why it failed. I have been successfully using the
> latest Squid 2.6 to http-accel a pool of backend web servers, with a
> load-balancer in front to direct traffic.
>
> The load-balancer hits the squid
I've just rolled back a failed Squid migration from 2.6 to 3.0, and I'm
looking for reasons why it failed. I have been successfully using the
latest Squid 2.6 to http-accel a pool of backend web servers, with a
load-balancer in front to direct traffic.
The load-balancer hits the squid server with
> 3.1 is certainly ready for testing. That's why we started making beta
> releases (3.1.0.X).
>
> Please give it a try and report back your findings. I don't think this
> is a setup that is commonly tested so it's very good if you can test
> this now while the release is actively being tested.
>
>
On Sun, 2008-11-09 at 16:11 +0900, Mikio Kishi wrote:
> Hi, would you tell me the ICAP implementation on squid.
>
> - Question.1
> If there is no "icap_access" setting,
> is the default icap access control "allow" or "deny"?
> It looks like "allow"...
Should be deny.. icap_access selects which
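So with nothing configured, nothing is sent to the ICAP server. A minimal Squid 3.0-style example that explicitly allows everything through one REQMOD service; the service name, class name and icap:// URL are placeholders, and in Squid 3.1 icap_class/icap_access are superseded by the adaptation_* directives:

icap_enable on
icap_service svc_req reqmod_precache 0 icap://127.0.0.1:1344/reqmod
icap_class class_req svc_req
icap_access class_req allow all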
3.1 is certainly ready for testing. That's why we started making beta
releases (3.1.0.X).
Please give it a try and report back your findings. I don't think this
is a setup that is commonly tested so it's very good if you can test
this now while the release is actively being tested.
Regards
Henrik
Hi,
I would like to set up a squid proxy server for NTLM proxying (i.e.
connection pinning) + ICAP (clamav). I hope someone could advise if
there is any catch I need to pay attention to.
Thanks a lot.
John Mok
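Not a full answer, but the configuration pieces involved are roughly the lines below; the helper path, child count and ICAP URL are assumptions, and the connection-auth option and its default should be checked against the 3.1 release notes since connection pinning is new there:

auth_param ntlm program /usr/bin/ntlm_auth --helper-protocol=squid-2.5-ntlmssp
auth_param ntlm children 10
auth_param ntlm keep_alive on
http_port 3128 connection-auth=on
icap_enable on
icap_service svc_clamav reqmod_precache 0 icap://127.0.0.1:1344/srv_clamav

One known catch: NTLM authenticates the TCP connection rather than individual requests, which is exactly why the pinning support matters when a parent or origin server issues the NTLM challenge.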
On Tue, 2008-11-11 at 03:14 +1300, Amos Jeffries wrote:
> Henrik Nordstrom wrote:
> > From the error it sounds like it has declared the peer down.
>
> But why? I'm thinking forwarding loops.
Forwarding loops are logged very aggressively in cache.log as such, and
don't result in an error to the user
Ubuntu - apt-get install ufdbGuard worked here, may have to locate the
correct repository for it though.
Not sure on any other formats
Alex
-Original Message-
From: a bv [mailto:[EMAIL PROTECTED]
Sent: 10 November 2008 14:27
To: Alex Huxham
Subject: Re: [squid-users] URL Filtering for Squid
Martin Mulder wrote:
Hi,
I have a (maybe stupid) question.
I have an apache server as reverse proxy, squid as caching server and
Zope/Plone as backend servers.
Scenario:
1) Apache gets a request for my.domain.com
2) Apache does a ProxyPass to my balancer
3) I have 2 "sticky" vhosts in apache w
Hello Gregory,
While setting up a squid+wccp solution i found this information really
helpful:
http://www.reub.net/node/3
Best Regards!
Egi
Gregory Machin wrote:
Hi
I'm looking for a howto or some docs to show how to do load balancing.
I have a single cisco router and would like to have t
Henrik Nordstrom wrote:
From the error it sounds like it has declared the peer down.
But why? I'm thinking forwarding loops.
On Mon, 2008-11-10 at 11:35 +0700, ░▒▓ ɹɐzǝupɐɥʞ ɐzɹıɯ ▓▒░ wrote:
here is my squid .conf
===
http_port 2210 transparent
icp_port 3130
snmp_port 3401
cache_mgr
Thanks Chuck.
Unfortunately, this little 1U server does not have room for more than
1 hard drive. :-(
Do you have any specific "config options" in mind about how to best
use the memory?
Thank You,
Ed
On Sun, Nov 9, 2008 at 6:12 PM, Chuck Kollars <[EMAIL PROTECTED]> wrote:
>> ... It's my unders
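Chuck's reply is cut off above, but the memory-related directives usually looked at first in this situation are along these lines; the values are only placeholders to illustrate the knobs:

cache_mem 256 MB
maximum_object_size_in_memory 64 KB
memory_replacement_policy heap GDSF
cache_replacement_policy heap LFUDA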
Hi
I'm looking for a howto or some docs to show how to do load balancing.
I have a single cisco router and would like to have two or more
squid caches in a load-balanced configuration. Any suggestions?
Thanks
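On the Squid side, the WCCPv2 registration is only a few squid.conf lines; the router address below is a placeholder, the methods are the GRE defaults, and the matching "ip wccp" configuration still has to be done on the Cisco router:

wccp2_router 192.168.1.1
wccp2_forwarding_method 1        # 1 = GRE encapsulation
wccp2_return_method 1
wccp2_service standard 0         # standard service group 0 = HTTP

With two or more caches registered against the same service group the router spreads the intercepted traffic across them, which is the load-balancing part.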
On Mon, Nov 10, 2008 at 11:11 AM, zhang yikai <[EMAIL PROTECTED]> wrote:
>
>
> hi all,
>
> I installed squid and it works properly, then I ran dansguardian; connecting to
> squid port 3128 is ok, but when I use dansguardian port 8080 as a proxy, I get
> the error message (111) Connection refused, I d
Hi,
I have a (maybe stupid) question.
I have an apache server as reverse proxy, squid as caching server and
Zope/Plone as backend servers.
Scenario:
1) Apache gets a request for my.domain.com
2) Apache does a ProxyPass to my balancer
3) I have 2 "sticky" vhosts in apache which are the balancer m
On Mon, 2008-11-10 at 09:50 +0100, yagh mur wrote:
> http_access allow to_mynetwork users1
> http_access allow password users1
> http_access allow mynetwork
> http_access deny all
I think the above should be
http_access allow mynetwork to_mynetwork
http_access allow mynetwork users1
http_access
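The last line of the correction is cut off. Reading it together with the rule that ACLs on one http_access line are ANDed (all must match) while successive lines are tried in order, the intended set was presumably something like:

http_access allow mynetwork to_mynetwork
http_access allow mynetwork users1
http_access deny all

i.e. internal destinations are allowed without a password, anything else from mynetwork requires a match on users1 (presumably a proxy_auth ACL), and everything else is denied.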
From the error it sounds like it has declared the peer down.
On Mon, 2008-11-10 at 11:35 +0700, ░▒▓ ɹɐzǝupɐɥʞ ɐzɹıɯ ▓▒░ wrote:
> here is my squid .conf
> ===
> http_port 2210 transparent
> icp_port 3130
> snmp_port 3401
> cache_mgr admin
> emulate_httpd_log off
> cache_replacement_policy h
Henrik,
I read the FAQ and implemented most of the suggestions to reduce
memory usage. I am not much concerned about memory usage, as there is plenty
of available memory, but the issue is that CPU usage goes up to 100%
and slows down squid's responses once squid grows beyond the allocated
cache_mem size. Doe
hi all,
I installed squid and it works properly, then I ran dansguardian; connecting to
squid port 3128 is ok, but when I use dansguardian port 8080 as a proxy, I get
the error message (111) Connection refused, I don't know what the problem is?
thank you.
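When both run on the same box the usual layout is: browsers talk to DansGuardian on port 8080, DansGuardian forwards to Squid on 127.0.0.1:3128, and Squid has to accept requests coming from localhost. A sketch under those assumptions (the ports are only the conventional defaults), first in dansguardian.conf:

filterport = 8080
proxyip = 127.0.0.1
proxyport = 3128

and in squid.conf:

http_port 3128
http_access allow localhost

A 403 Forbidden from Squid usually points at the http_access rules (for example a missing "http_access allow localhost") rather than at the port setup.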
-Original Message-
From: Alex Huxham
Sent: 10 November 2008 09:52
To: 'a bv'
Subject: RE: [squid-users] URL Filtering for Squid
Yes and no, there are free ones, however http://urlblacklist.com allows
you a FREE copy, only ONCE though, your first download of the blacklist
is free, its wo
Yet another yes from me, used within our school, and works perfectly for
900+ students and 200+ staff. Easy to configure, documented well and
there are plenty of resources on a google search to get you going
perfectly.
Alex
-Original Message-
From: Marcus Kool [mailto:[EMAIL PROTECTED]
I am the author of ufdbGuard which is based on squidGuard.
ufdbGuard is free and can be used with both free and commercial databases.
-Marcus
a bv wrote:
Hi,
What is/are the popular/commonly used open source (and maybe also
the other free) URL/content filtering solutions/software? And
ufdbGuard is the best. You can get it from www.urlfilterdb.com
I like it because it's fast, updated frequently, easy to use and
customize. And the guy running it is extremely helpful.
No, I am not related. I've used others in the past and when I finally
came upon this one, I felt such a relief. The guy
Hi,
What is/are the popular/commonly used open source (and maybe also
the other free) URL/content filtering solutions/software? And who
maintains the URL databases?
Regards
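Whichever filter is chosen, the hookup in squid.conf is the same redirector interface; a typical squidGuard-style example, where the program path and the number of children are placeholders:

url_rewrite_program /usr/bin/squidGuard -c /etc/squid/squidGuard.conf
url_rewrite_children 5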
Hi all,
I have to configure Squid2 so that it asks for a password if a user goes
to an external address,
and no password has to be asked for if the destination address is internal.
I have these rules:
acl all src 0.0.0.0/0.0.0.0
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl t
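The ACL list above is cut off, but the usual pattern for "no password for internal destinations, password for everything else" puts the destination ACL and the proxy_auth ACL on separate allow lines; a rough sketch, in which the network, the internal domain and the ACL names are only placeholders:

acl mynet src 192.168.0.0/24
acl internal_dst dstdomain .mycompany.local
acl password proxy_auth REQUIRED
http_access allow mynet internal_dst
http_access allow mynet password
http_access deny all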