wangzicai wrote:
Hello everyone!
I am using squid-2.5.STABLE14 on Linux WS3. When the system reboots,
Squid does not start automatically.
How can I solve this?
Hello Wangzicai,
Starting squid at bootup can be done by configuring the rc scripts of
your OS environment or configuring
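For an rc-based Linux such as Red Hat, a minimal sketch of the usual approach (the init script path /etc/init.d/squid is an assumption; most distribution packages install one):

```shell
# Enable Squid at boot using the SysV init tools (Red Hat style).
# Assumes an init script exists at /etc/init.d/squid.
chkconfig --add squid   # register the script with the rc runlevels
chkconfig squid on      # enable it for the default runlevels
service squid start     # start it now without rebooting
```

On distributions without chkconfig, an equivalent line in /etc/rc.d/rc.local or a symlink in the appropriate rcN.d directory does the same job.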
* On 07/09/06 21:54 +0200, Christoph Haas wrote:
| On Thursday 07 September 2006 21:22, Dan Thomson wrote:
| I'm sure this has been discussed before, but I'm curious about what
| people think are the best file systems to use for your cache dirs.
|
| I've read that ReiserFS and XFS are good
On Fri, Sep 08, 2006, Odhiambo WASHINGTON wrote:
Where does this discussion leave OSes other than Linux? For example,
we don't use ext3/reiserfs/xfs on FreeBSD.
When the discussion goes in the direction of the OS, I sometimes feel
that we need to split the list for sanity's sake:
On Friday 08 September 2006 04:12, Henrik Nordstrom wrote:
Thu 2006-09-07 at 21:45 +0200, Christoph Haas wrote:
obviously a cache - what WebWasher isn't. You could as well try to use
both through an ICAP connection since WebWasher works both as a
HTTP/HTTPS/FTP proxy and as an ICAP
I forgot to mention my Linux and Squid versions:
Linux IPCop 1.4.10, kernel 2.4.31, Squid 2.5.STABLE12
-Original Message-
From: paulo braga [mailto:[EMAIL PROTECTED]
Sent: Thursday, September 7, 2006 23:00
To: squid-users@squid-cache.org
Subject: [squid-users] no access to Intranet sites
Hi
On Fri, Sep 08, 2006, Christoph Haas wrote:
On Friday 08 September 2006 04:12, Henrik Nordstrom wrote:
Thu 2006-09-07 at 21:45 +0200, Christoph Haas wrote:
obviously a cache - what WebWasher isn't. You could as well try to use
both through an ICAP connection since WebWasher works both
I don't know of any free SSL scanner. We are using WebWasher for much more
than just SSL scanning anyway. Squid isn't sufficient at all for enforcing
a corporate security policy. This may change once large companies stop
using crap like Windows and especially Internet Explorer.
Note: You won't be able to scan CONNECT requests (https) via ICAP. ICAP
only deals with HTTP data, and CONNECT switches to tunnel mode outside
of HTTP.
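As an illustration of why ICAP never sees the tunneled traffic: the only HTTP-level part of an https request through a proxy is the CONNECT line itself (example.com is just a placeholder):

```shell
# Print the entire HTTP exchange a proxy sees for an https request:
# one CONNECT line plus headers. Everything after the proxy's
# "200 Connection established" reply is opaque TLS bytes, so there
# is no HTTP body for ICAP to inspect.
printf 'CONNECT example.com:443 HTTP/1.1\r\nHost: example.com:443\r\n\r\n'
```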
The only thing I wanted to know is if I can chain webwasher with squid;
i.e. if it works like a proxy. The documentation on the website is not
really helpful.
This seems to be possible, so I will investigate further.
Yours,
Jakob Curdes
On Friday 08 September 2006 10:46, Jakob Curdes wrote:
Note: You won't be able to scan CONNECT requests (https) via ICAP.
ICAP only deals with HTTP data, and CONNECT switches to tunnel mode
outside of HTTP.
The only thing I wanted to know is if I can chain webwasher with squid;
You can. No
Fri 2006-09-08 at 10:08 +0200, paulo braga wrote:
The only problem I am having is that I can't access any of my Intranet web
servers on the other side of the VPN, even if I use the IP address. I can
ping them by name and access the shared folders, but I can't open the
web pages of the
To ALL;
I have several clients that use Squid as their only proxy server. All of
them are using Squid-2.5.STABLE10 with RH ES 4.0, RH ES 3.0, RH 7.2
or SLES 9.0 versions of Linux, and they are all experiencing some weird
intermittent delays when the workstations (XP with IE 6.0) try to open
Hi all. I'm researching Squid and its ability to work with routers that
support WCCP. In my research it looks like folks have two options when
using WCCP as far as kernel modules go. One is to use the ip_wccp module
and ip_gre. The other is that ip_gre in the 2.6.17 kernel includes WCCP
support.
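For the ip_gre route, a minimal sketch of the tunnel setup on the cache box (the router address 192.0.2.1, local address 192.0.2.10, and interface name are placeholders; the exact incantation varies by kernel and is covered in the Squid FAQ's WCCP section):

```shell
# Load the GRE module and create a tunnel endpoint so traffic the
# router redirects via WCCP can be decapsulated on the cache box.
# All addresses below are placeholders for this sketch.
modprobe ip_gre
iptunnel add wccp0 mode gre remote 192.0.2.1 local 192.0.2.10 dev eth0
ifconfig wccp0 127.0.1.1 netmask 255.255.255.252 up
```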
Fri 2006-09-08 at 09:33 -0400, Carmelo A. Zizza wrote:
So, you are now asking what you can do. I would like some people, not all
the Squid users, but maybe the US users, if not the NYC users, to try hitting
the site and let me know if you see any delays with the page fully loading
during
Nothing has changed.
I don't know how, but around 25 minutes after the error Squid ran again
successfully.
Just to learn: what is cache_effective_user used for, and how can I change it?
Thanks!
- Original Message -
From: Adrian Chadd [EMAIL PROTECTED]
To: Jaime Solorzano B [EMAIL PROTECTED]
Cc:
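On the cache_effective_user question: it names the unprivileged user Squid switches to after binding its ports. A minimal squid.conf sketch (the user and group name squid are assumptions; whatever you pick must exist and own the cache and log directories):

```
# squid.conf: drop privileges to this user/group after startup
cache_effective_user squid
cache_effective_group squid
```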
Hi -
I'm a noob and working through the same thing. It's definitely GRE now, from
what I have read and the advice I have received from this forum.
Check the WCCP section of the Squid FAQ, and here is another relevant link.
http://www.reub.net/node/3
Rich
From: Errol Neal [EMAIL PROTECTED]
To:
Greetings squid-users!
I'm trying to get a basic Squid/squidGuard config running on Debian
Linux (sarge). The Squid version is 2.5.STABLE9; squidGuard is 1.2.0
with Berkeley DB 4.1.25. I've spent a few days trying to find a
basic how-to for Squid with squidGuard, but no luck.
I've tried
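The Squid side of a basic squidGuard hookup is just a redirector line in squid.conf; a minimal sketch (the paths are assumptions, though Debian typically installs /usr/bin/squidGuard and expects its config in /etc/squid/squidGuard.conf):

```
# squid.conf: hand every request URL to squidGuard for
# rewriting or blocking; run 5 helper processes
redirect_program /usr/bin/squidGuard -c /etc/squid/squidGuard.conf
redirect_children 5
```

The blocklists and ACLs themselves then live entirely on the squidGuard side.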
Hello,
I have a load balancer in front of two Squids in a reverse proxy setup.
The load balancer decides whether a Squid is healthy by opening a TCP
connection to it periodically. During the shutdown interval governed by
configuration parameter 'shutdown_lifetime', Squid 2.5 STABLE12
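For reference, that directive controls how long a shutting-down Squid keeps servicing existing connections; a squid.conf sketch shortening the window during which a dying instance can still answer the balancer's probes (the 5-second value is just an example):

```
# squid.conf: wait at most 5 seconds for active clients on shutdown
# (default is 30 seconds)
shutdown_lifetime 5 seconds
```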
Thanks for trying. Your attempts were at 10 AM NYC time, which is right in
line with what I asked for; further, you have seen the same problems I have
seen in the logs and via Ethereal.
So I am off the hook; now to explain it to the client.
Regards,
Carmelo
-Original Message-
From: Henrik
Fri 2006-09-08 at 10:03 -0400, Errol Neal wrote:
In the above link, it's mentioned that the ip_wccp module does not support
transmitting, so you need a GRE tunnel to the router to get data back to
the client. Is this still the case?
It has always been the case, but is never needed in any
Fri 2006-09-08 at 11:29 -0600, Jaime Solorzano B wrote:
The problem was resolved yesterday without any action, but we have the same
problem again.
How can we resolve this definitively?
The most probable case is that Squid isn't running and the pid file is
stale, mentioning an old pid which
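A quick way to check for the stale-pid case, sketched as a shell fragment (the path /var/run/squid.pid is an assumption; check the pid_filename setting in your squid.conf):

```shell
# If the pid file names a process that no longer exists, the file is
# stale and Squid may refuse to start until it is removed.
PIDFILE=/var/run/squid.pid
if [ -f "$PIDFILE" ] && kill -0 "$(cat "$PIDFILE")" 2>/dev/null; then
    echo "a process with that pid is alive"
else
    echo "pid file missing or stale"
fi
```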
Fri 2006-09-08 at 11:48 -0700, Ben Drees wrote:
Hello,
I have a load balancer in front of two Squids in a reverse proxy setup.
The load balancer decides whether a Squid is healthy by opening a TCP
connection to it periodically. During the shutdown interval governed by
configuration
Hi,
When we request a URL with a fragment id (anchor:
http://foo.com/page#bar) through Squid, we get a 404 back immediately.
Is my client in violation of the HTTP spec or is this a Squid
limitation? My HTTP client is Java JDK 5.
Thanks.
--
-Raj
On 9/7/06, Pranav Desai [EMAIL PROTECTED] wrote:
On 9/6/06, Henrik Nordstrom [EMAIL PROTECTED] wrote:
Wed 2006-09-06 at 18:54 -0700, Pranav Desai wrote:
64-bit.
Then I have no idea. I have seen 64-bit Squids being many GB in size.
Ok thanks. Let me try to find some more information.
I
Fri 2006-09-08 at 15:55 -0700, Pranav Desai wrote:
The other thing I found is in the strace: the last brk() before it failed.
It seems like some 24-bit limit. Any ideas?
I have only seen such limits on 32-bit systems, never on a 64-bit
system running 64-bit software.
So triple-check that
Fri 2006-09-08 at 14:39 -0700, Ritu Raj Tiwari wrote:
Hi,
When we request a URL with a fragment id (anchor:
http://foo.com/page#bar) through Squid, we get a 404 back immediately.
Works here..
What does access.log say?
Do you use any redirectors?
Does the web server you query handle
Hello,
I have recently started reading about SQUID as a
transparent interception proxy.
From the FAQs, I gathered that SQUID needs to be
configured with DNS server addresses, and it will
not fetch web pages if it cannot perform DNS name
resolution for the URL. Is my understanding correct?
If
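On the DNS point: yes, Squid resolves every origin host itself. By default it reads the resolvers from /etc/resolv.conf; a squid.conf sketch overriding that (the addresses are placeholders):

```
# squid.conf: query these resolvers instead of /etc/resolv.conf
dns_nameservers 192.0.2.53 192.0.2.54
```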