From: Muthukumar [EMAIL PROTECTED]
FATAL: getpwnam failed to find userid for effective user 'squid'
Squid Cache (Version 2.4.STABLE7): Terminated abnormally.
CPU Usage: 0.010 seconds = 0.010 user + 0.000 sys
Maximum Resident Size: 0 KB
Page faults with physical i/o: 317
Aborted
Problem is because of your cache_effective_user setting: squid.conf tells
Squid to run as user 'squid', but no such account exists on the system, so
getpwnam() fails at startup.
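A minimal sketch of the fix (the names and useradd syntax are common Linux
defaults, not taken from the original mail): either create the account Squid
expects, or point Squid at one that already exists:

    # create a system account named 'squid' (syntax varies by platform)
    useradd -r squid

    # squid.conf: the user/group Squid drops privileges to at startup;
    # both must exist on the system
    cache_effective_user squid
    cache_effective_group squid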
If a client requests an ftp://ftp.rfc-editor.org/some_RFC_#.txt
the browser displays it without trouble (the address may not be
exact, as I am not at that machine right now), but with all other
ftp requests (that is, ftp://ftp.gnu.org, ftp://ftp.microsoft.com,
ftp://ftp.cdrom.com) the browser client's
From the squid server, with an ftp client, I can connect
to those sites without trouble.
You can connect to the ftp sites from the squid server, but not from the client side.
Check this, from the squid FAQ:
12.17 Can I make my regular FTP clients use a Squid cache?
Nope, it's not possible. Squid only accepts HTTP requests; it speaks FTP
on the server side when fetching ftp:// URLs, but not on the client side.
For some reason, a Linux client works fine with konqueror, mozilla-firefox and lynx...
however, IE 6.x just hangs.
Ideas?
What are the hanging messages?
If you mail some clarification, that will make it easier to offer ideas.
Regards,
Muthukumar.
Sorry for the lack of info.
As stated, the Linux web browsers work fine.
Using an XP box with IE 6.x, the client just comes back with "Page cannot be found",
and while tailing the squid access.log I never see the client's activity.
Config: dialup users into a Cisco AS5300. Purpose: redirect.
The goal is to apply a special deny error page to source traffic that matches a
certain acl (roughly the setup sketched below), which is working fine for Linux.
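As a rough sketch, with the acl name, source network and error page name all
placeholders rather than the real config (deny_info and http_access are the
standard squid.conf directives for this):

    # match the dialup pool by source address (placeholder network)
    acl dialup src 192.168.100.0/255.255.255.0
    # serve a custom error page to matching traffic instead of the stock one
    deny_info ERR_DIALUP_DENIED dialup
    http_access deny dialup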
Provide config info and some tcpdump captures of the sessions.
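For example (the interface, client address and default proxy port 3128 are
assumptions; adjust them to the actual setup):

    # capture the IE client's conversation with the proxy to a file
    tcpdump -n -i eth0 -s 1500 -w ie-session.pcap host 10.0.0.50 and port 3128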
Regards,
Muthukumar.
On Fri, 13 Feb 2004, Jay Turner wrote:
Is there something that analyzes the various *_HIT statuses in the log
and produces a "what might have been" report? Does anyone know of any
tools not listed on the Squid Cache web site that would provide
this type of report?
Your
Kemi,
I increased my hit ratio by running pages and script output
through a cacheability tool and taking corrective action as
required. The main thing was to add mod_expires and mod_headers to
my servers; a sketch of that Apache config follows the link below.
http://www.cacheflow.com/technology/tools/friendly/cacheability/index.cfm
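For instance (standard mod_expires/mod_headers directives; the lifetimes are
illustrative choices, not the values from my servers):

    # httpd.conf: send explicit expiry headers so caches can act confidently
    ExpiresActive On
    ExpiresDefault "access plus 1 hour"
    ExpiresByType image/gif "access plus 1 week"
    # and an explicit Cache-Control header via mod_headers
    Header set Cache-Control "max-age=3600, public"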
John Kent
On Fri, 13 Feb 2004, Kent, Mr. John (Contractor) wrote:
Duane and Henrik,
Thank you both for responding. I'm thinking that a glance at my
config file will reveal the problem to you, so here it is:
What I'm trying to do is run Squid on port for testing, and
have it accelerate servers
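For reference, a minimal Squid 2.5 accelerator setup usually reduces to these
directives (the test port and backend host below are placeholders, not values
from my config):

    # listen on a test port and relay requests to the real web server
    http_port 8080
    httpd_accel_host backend.example.com
    httpd_accel_port 80
    httpd_accel_single_host on
    httpd_accel_uses_host_header off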
On Fri, 13 Feb 2004, Scott Phalen wrote:
Hello, I am very new to the world of squid.
First question: will WCCP V2 be incorporated into Squid 3.0?
Most likely not. No WCCPv2 patches have been contributed for
Squid-3.0 yet, and in addition Squid-3.0 has been in feature freeze for a long time.
On Sat, 14 Feb 2004, Scott Phalen wrote:
Is there a point where squid will stop consuming more RAM?
When your disk cache is full.
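Squid keeps an in-core index entry for every object in the disk cache, so
memory use grows with the cache until the cache fills. The FAQ's rule of thumb
is on the order of 10 MB of RAM per 1 GB of cache_dir, on top of cache_mem.
The two directives that bound this, with purely illustrative sizes:

    cache_mem 64 MB
    cache_dir ufs /var/spool/squid 2000 16 256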
Regards
Henrik
On Sat, 14 Feb 2004, Elsen Marc wrote:
I have a beautiful squid 2.5.4 server running with gentoo.
What is 'gentoo' ?
http://www.gentoo.org/
Regards
Henrik
On Sun, 15 Feb 2004, [EMAIL PROTECTED] wrote:
The results are that on the squid side I saw nothing, and on
the PC side I saw
- an ARP broadcast request where the PC
asked for the MAC of 192.168.1.250 (it's the Cisco router
that is the default gateway both for all the PCs and for
On Sat, 14 Feb 2004, Mike Stuber wrote:
The problem is that when the local hub goes down, I lose the cache in all the
offices relying on that hub. I'd like to set them up using the 'cache_peer'
option to fail over to the next hub and then the home office as a last
resort, but I can't seem to
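A sketch of that kind of ordered fallback (hostnames are placeholders; the
'default' option marks the last-resort parent, and ICP on port 3130 lets
Squid notice dead peers):

    cache_peer hub-local.example.com parent 3128 3130
    cache_peer hub-next.example.com parent 3128 3130
    cache_peer homeoffice.example.com parent 3128 3130 default
    # always go through a parent rather than direct
    never_direct allow all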
I want to use an httpd accelerator that does a background check.
Basically I want the following:
when a client request comes in, serve the cached copy if there is one, regardless, then
check whether the content has changed and refresh the cache in the background. If a further
request comes in during this refresh period then just serve the
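For comparison, Squid's built-in freshness control is refresh_pattern, e.g.
(the pattern and times are purely illustrative):

    # min 60 minutes, 50% last-modified heuristic, max 7 days (in minutes)
    refresh_pattern -i \.html$ 60 50% 10080

but that revalidates in-line with the request that finds an object stale,
rather than serving the stale copy and refreshing in the background as
described above.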
I am thinking of solutions for minimizing apache's
memory use on a small memory server in the presence of
several slow clients or long-running http requests
(large downloads) and with about 50% of the requested
pages being dynamic.
If you go for the reverse proxy method, dynamic content such
On Sun, 15 Feb 2004, Seun Osewa wrote:
I need a reverse proxy server that can buffer output
from apache so that I won't need many active apache
processes to be able to serve slow clients, and I'm
considering squid with caching disabled.
Squid is quite commonly used for this purpose in front of Apache.
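A minimal non-caching accelerator along those lines might look like this in
Squid 2.5 (the backend address and ports are placeholders):

    # accept and buffer client connections, relay them to Apache
    http_port 80
    httpd_accel_host 127.0.0.1
    httpd_accel_port 8080
    httpd_accel_single_host on
    httpd_accel_uses_host_header on
    # disable caching: Squid acts purely as a buffering front end
    acl all src 0.0.0.0/0.0.0.0
    no_cache deny all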