[squid-users] fatal error!!!!

2004-02-14 Thread Giuseppe Dota

After running

#squid -z

I get this error:

FATAL: getpwnam failed to find userid for effective user 'squid'
Squid Cache (Version 2.4.STABLE7): Terminated abnormally.
CPU Usage: 0.010 seconds = 0.010 user + 0.000 sys
Maximum Resident Size: 0 KB
Page faults with physical i/o: 317
Aborted

What is the problem?
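For reference (generic Unix advice, not specific to this poster's system): the message means Squid called getpwnam() for the account named by cache_effective_user in squid.conf, usually 'squid', and the OS has no such account. A quick check, with the usual fix sketched as a comment:

```shell
# Does the account Squid wants to switch to actually exist?
if id squid >/dev/null 2>&1; then
    echo "user squid: present"
else
    echo "user squid: missing"
fi

# Typical fix (run as root; home dir and shell are illustrative):
#   useradd -d /var/spool/squid -s /bin/false squid
```

Pointing cache_effective_user at an existing unprivileged account also clears the error.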

hello
Giuseppe
[EMAIL PROTECTED]




RE: [squid-users] SQUID SETTINGS

2004-02-14 Thread Scott Phalen
Duane,

I have been watching my processes and cache all day today.  Squid is the
only thing running on this server and seems to be running at 99% CPU
utilization.  I currently have 600MB free memory out of 2GB (I restarted
Squid 2 hours ago).  I changed my cache_mem to 128MB and set the cache_dir
to 5 16 256.  What I am concerned about is running out of RAM.  From the
output of my "top" screen, there is no swap being used.

Is there a point where squid will stop consuming more RAM?
Does it hurt the caching process with frequent reboots?
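On the first question: Squid's resident size does level off once the disk cache fills, because most of the growth is index metadata proportional to cache_dir size. A back-of-envelope estimate using the classic Squid FAQ rule of thumb (roughly 10 MB of RAM per GB of disk cache, on top of cache_mem; exact figures vary with average object size):

```shell
# Rule-of-thumb RAM estimate for a Squid 2.x box.  Values are
# illustrative, not taken from this poster's config.
cache_dir_gb=5
cache_mem_mb=128
index_mb=$((10 * cache_dir_gb))          # ~10 MB of index per GB of disk
echo "steady-state RSS: at least $((index_mb + cache_mem_mb)) MB"
```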

Thanks in advance for your advice.

Scott Phalen



RE: [squid-users] is it a DOS attack ??

2004-02-14 Thread Danish Khan
Yes, I can see the forwarding loop in cache.log, but please tell me in
detail how I can overcome it.

Regards

Danish Khan

-Original Message-
From: Duane Wessels [mailto:[EMAIL PROTECTED] 
Sent: Sunday, February 15, 2004 5:51 AM
To: Danish Khan
Cc: [EMAIL PROTECTED]
Subject: RE: [squid-users] is it a DOS attack ??




On Sat, 14 Feb 2004, Danish Khan wrote:

> I have configured my box with 8192 FDs, but I still get file descriptor
> warnings and comm(23) port errors.  Why?  Please advise. :(
>
> Danish
>
> -Original Message-
> From: Mahmood Ahmed [mailto:[EMAIL PROTECTED]
> Sent: Saturday, February 14, 2004 10:24 PM
> To: [EMAIL PROTECTED]
> Subject: [squid-users] is it a DOS attack ??
>
> Hello List!
>
> I have been facing this strange problem for the last 3 days. I hope someone
> here will be able to shed light on it. I don't know whether it's a bug, a
> virus, or a DOS attack, but it is hitting my squid box very hard. In my
> access log I am seeing a lot of these:
>
> 1076806934.151    451 202.133.44.214 TCP_MISS/000 0 GET http://www.microsoft.com/ - NONE/- -
> 1076806934.163    461 202.133.44.214 TCP_MISS/000 0 GET http://www.microsoft.com/ - NONE/- -
> [...]

This looks to me like a forwarding loop.

Are you using HTTP interception?

Duane W.



[squid-users] Transparent proxy issue

2004-02-14 Thread Mark Seamans
For some reason, a Linux client works fine with Konqueror, Mozilla Firefox,
and Lynx; however, IE 6.x just hangs.
Any ideas?

Thanks,

Mark




[squid-users] Squid As A Non-Caching Reverse Proxy/Web Accelerator?

2004-02-14 Thread Seun Osewa
Hi,
 
I am thinking of solutions for minimizing apache's
memory use on a small memory server in the presence of
several slow clients or long-running http requests
(large downloads) and with about 50% of the requested
pages being dynamic.
 
I need a reverse proxy server that can buffer output
from apache so that I won't need many active apache
processes to be able to serve slow clients, and I'm
considering squid with caching disabled.
 
I'd like to know how squid, in reverse proxy
mode, handles a situation where the origin server is
very fast but the client it's serving is slow.  Does
it buffer the server's response and allow it to close
the connection quickly and serve other processes?  Is
there an architecture document somewhere that _fully_
answers my question?
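For what it's worth, the non-caching accelerator part is straightforward in Squid 2.5; the sketch below (backend address and ports are placeholders, untested) disables the cache entirely so Squid acts purely as a buffering relay:

```
http_port 80
httpd_accel_host 127.0.0.1        # backend apache (placeholder)
httpd_accel_port 8080
httpd_accel_single_host on
httpd_accel_uses_host_header off

# disable caching entirely
acl all src 0/0
no_cache deny all
```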
 
Regards,
Seun Osewa








[squid-users] Fail-over of cache_peer parent...

2004-02-14 Thread Mike Stuber
I work in a small company with even smaller sales offices all over the
country with very limited network connectivity. 56k legacy leased lines in
some cases.

We have a hub and spoke network connecting all our offices.

In order to maximize the responsiveness of our intranet in the field
offices, I've implemented squid in a caching hierarchy distribution.

The problem is that when the local hub goes down, I lose the cache in all the
offices relying on that hub.  I'd like to set them up using the 'cache_peer'
option to fail-over to the next hub and then the home office as a last
resort, but I can't seem to figure out exactly how to do this from the FAQ's
and the configuration guide.

This is a representation of the hierarchy:

192.168.1.10 (HQ)
 10.0.1.10 (Hub1)
  - 10.0.1.11 (Field1)
  - 10.0.1.15 (Field2)
 10.0.1.20 (Hub2)
  - 10.0.1.21 (Field3)
  - 10.0.1.25 (Field4)
 10.0.2.10 (Hub3)
  - 10.0.2.11 (Field5)
  - 10.0.2.15 (Field6)
 10.0.2.20 (Hub4)
  - 10.0.2.21 (Field7)
  - 10.0.2.25 (Field8)
 10.0.2.30 (Hub5)
  - 10.0.2.31 (Field9)
  - 10.0.2.35 (Field10)

Each field office has the nearest hub as its httpd_accel_host:

httpd_accel_host 10.0.1.10
httpd_accel_port 80
httpd_accel_single_host on
httpd_accel_with_proxy off
httpd_accel_uses_host_header off

Each hub has the home office apache server as its httpd_accel_host:

httpd_accel_host 192.168.1.10
httpd_accel_port 80
httpd_accel_single_host on
httpd_accel_with_proxy off
httpd_accel_uses_host_header off

Here's what I'm thinking:

cache_peer 10.0.1.10 parent 80 3130 weight=1
cache_peer 10.0.1.20 parent 80 3130 weight=2
cache_peer 10.0.2.10 parent 80 0 weight=3
cache_peer 10.0.2.20 parent 80 0 weight=3
cache_peer 10.0.2.30 parent 80 0 weight=3
cache_peer 192.168.1.10 parent 80 0 weight=4 default
prefer_direct off
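A hedged, untested sketch of that idea: in Squid 2.x a parent with a larger weight= value is preferred, so the nearest hub should carry the highest weight, with the home office marked default as the last resort. For a field office under Hub1, something like:

```
acl all src 0/0

# nearest hub first (highest weight), then the next hub, then HQ
cache_peer 10.0.1.10    parent 80 3130 weight=4
cache_peer 10.0.1.20    parent 80 3130 weight=3
cache_peer 192.168.1.10 parent 80 0    no-query default

never_direct allow all   # never bypass the hierarchy
```

The weight semantics here are my reading of the cache_peer documentation; verify against your Squid version before deploying.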

Any guidance would be greatly appreciated.

Thanks,
Mike


Re: [squid-users] cache_dir L1 L2 question

2004-02-14 Thread OTR Comm
> > I have an 80GB drive on a system that I would like to dedicate to a squid
> > server.  The notes in squid.conf say that I should subtract 20% and use
> > that number for the Mbytes field of cache_dir.  So I would have 64000.
> 
> You should do that to start with.  After Squid has been running with
> a full cache you can think about increasing the cache size.

You mean set the cache size to 64GB to start with, right?

> Also, please read: http://www.oreilly.com/catalog/squid/chapter/index.html

I bought your book a few weeks ago; now I am waiting for Amazon to ship
it.


Thanks,

Murrah Boswell


Re: [squid-users] ftp clients trouble: helpme please

2004-02-14 Thread Duane Wessels



On Sat, 14 Feb 2004, [EMAIL PROTECTED] wrote:

> I have a beautiful Squid 2.5.4 server running on Gentoo.
> It works really well with HTTP and HTTPS client requests.
> If a client requests ftp://ftp.rfc-editor.org/some_RFC_#.txt
> the browser displays it without trouble (the address may not be
> exact, because I'm not at the machine right now), but with all other
> FTP requests (e.g. ftp://ftp.gnu.org, ftp://ftp.microsoft.com,
> ftp://ftp.cdrom.com) the browser clients can't display the
> page, and after a long time (about 30-60 seconds) they
> give up. From the squid server itself I can connect to
> those sites with an FTP client without trouble. The FTP
> requests that don't work never appear in the squid access
> log.
> I can't understand what's happening; please help me.

You can probably get some additional info by enabling debugging
for the FTP code:

debug_options All,1 9,9

Then make some FTP requests to the servers that don't
work and look at cache.log.  Feel free to send us some cache.log
excerpts if they don't make any sense.

Duane W.


Re: [squid-users] Proxy-Chaining

2004-02-14 Thread Duane Wessels



On Sat, 14 Feb 2004, Andreas Neumeier wrote:

> Hello there,
>
> I tried to build a proxy chain with
>
> cache_peer
> and
> cache_peer_access
> as well using:
> always_direct deny
> never_direct allow
>
> Now, normal operation seems to work like this:
>
> client <-> squid1 <-> squid2 <-> target-net

You probably shouldn't mix always_direct and never_direct.
On squid1 you should probably put only:

never_direct allow all

> Only thing that doesn't seem to work: Any POST seems to be ignored (by
> proxy1, probably).

You need to explain what you mean by ignored.  Be as specific as possible.

>
> Also, I'm not sure how to handle SSL, (CONNECT). This must return DIRECT,
> which actually must bypass both squids. Am I right here?

You need to either configure your clients to forward SSL requests to
squid1, or configure your firewall to allow SSL traffic to pass
through directly.

Duane W.


Re: [squid-users] cache_dir L1 L2 question

2004-02-14 Thread Duane Wessels



On Sat, 14 Feb 2004, OTR Comm wrote:

> Hello,
>
> I have an 80GB drive on a system that I would like to dedicate to a squid
> server.  The notes in squid.conf say that I should subtract 20% and use
> that number for the Mbytes field of cache_dir.  So I would have 64000.

You should do that to start with.  After Squid has been running with
a full cache you can think about increasing the cache size.

> The question is, what is a reasonable L1 and L2 to put for this setting?

I'm not sure it matters much.  You can use 16/256 (the default) or
try 32/512 if you want.

> Also, I don't understand the different storage types (ufs, aufs, diskd,
> etc.), but for the system I want to set up, would any one be
> preferred?

Depends on your operating system and expected load.  I advise that
you stick with UFS (the default) for now and try one of the
others if you suspect that performance is suffering due to disk
I/O bottlenecks.

Also, please read: http://www.oreilly.com/catalog/squid/chapter/index.html

Duane W.


RE: [squid-users] is it a DOS attack ??

2004-02-14 Thread Duane Wessels



On Sat, 14 Feb 2004, Danish Khan wrote:

> I have configured my box with 8192 FDs, but I still get file descriptor
> warnings and comm(23) port errors.  Why?  Please advise. :(
>
> Danish
>
> -Original Message-
> From: Mahmood Ahmed [mailto:[EMAIL PROTECTED]
> Sent: Saturday, February 14, 2004 10:24 PM
> To: [EMAIL PROTECTED]
> Subject: [squid-users] is it a DOS attack ??
>
> Hello List!
>
> I have been facing this strange problem for the last 3 days. I hope someone
> here will be able to shed light on it. I don't know whether it's a bug, a
> virus, or a DOS attack, but it is hitting my squid box very hard. In my
> access log I am seeing a lot of these:
>
> 1076806934.151    451 202.133.44.214 TCP_MISS/000 0 GET http://www.microsoft.com/ - NONE/- -
> 1076806934.163    461 202.133.44.214 TCP_MISS/000 0 GET http://www.microsoft.com/ - NONE/- -
> [...]

This looks to me like a forwarding loop.

Are you using HTTP interception?

Duane W.


Re: [squid-users] Low hit rate

2004-02-14 Thread Duane Wessels



On Sat, 14 Feb 2004, Kemi Salam-Alada wrote:

> Hi all,
>
> How can I tune my squid so that I can achieve a higher hit rate?  Presently,
> I am running Squid 2.5.STABLE2 on FreeBSD 4.3.
> The file system used for the cache disk is aufs.

See the 'refresh_pattern' directive in squid.conf.
You can probably increase your hit ratio by increasing
the values of the refresh_pattern line(s).
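For example (an illustrative refresh_pattern, not a recommendation for every site; the fields are regex, min, percent, max, with times in minutes):

```
refresh_pattern ^ftp:   1440  20%  10080
refresh_pattern .          0  50%  10080
```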

Duane W.


Re: [squid-users] child squid proxy not saving data on disk, but anyway operating correctly.

2004-02-14 Thread Duane Wessels



On Sun, 15 Feb 2004, Éliás Tamás wrote:

> Hi all! I have an ordinary transparent squid proxy on my gateway, which
> operates correctly. Another transparent proxy has been set up on the other
> part of the network, which has the proxy on the gateway defined as a parent:
> cache_peer parentproxy.net 3113 3030
> No other options (like proxy-only) are defined.
> The OS is Debian woody; squid is beta3 from the 17th of January.


> Interestingly, the proxy which has the parent defined does not save anything
> in its cache, although it seems to operate correctly: there are
> FIRST_PARENT_MISSes and other lines in the log, indicating the data has
> been sent through. My

Could it be that your system clock is set incorrectly?

Also, what do you get from:

% squidclient mgr:store_check_cachable_stats


> firewall is operating correctly, redirecting packets from port 80 to my
> proxy port 3113. When the child proxy goes direct (on queries containing ?
> or cgi-bin) it accesses port 80 of the parent proxy, which is
> transparent, so it redirects the query again.

You can probably avoid this forwarding loop by adding this
to the child:

   acl all src 0/0
   never_direct allow all


> Does anyone have an idea why the child proxy
> is not saving anything to disk?

See my suggestions above.

Duane W.


[squid-users] child squid proxy not saving data on disk, but anyway operating correctly.

2004-02-14 Thread Éliás Tamás
Hi all! I have an ordinary transparent squid proxy on my gateway, which
operates correctly. Another transparent proxy has been set up on the other
part of the network, which has the proxy on the gateway defined as a parent:
cache_peer parentproxy.net 3113 3030
No other options (like proxy-only) are defined.
The OS is Debian woody; squid is beta3 from the 17th of January.
Interestingly, the proxy which has the parent defined does not save anything
in its cache, although it seems to operate correctly: there are
FIRST_PARENT_MISSes and other lines in the log, indicating the data has been
sent through. My firewall is operating correctly, redirecting packets from
port 80 to my proxy port 3113. When the child proxy goes direct (on queries
containing ? or cgi-bin) it accesses port 80 of the parent proxy, which is
transparent, so it redirects the query again. Does anyone have an idea why
the child proxy is not saving anything to disk?

-- 

Thomas Elias
Tel.: +3630/3299315
ICQ UIN: 206-714-459




[squid-users] Low hit rate

2004-02-14 Thread Kemi Salam-Alada
Hi all,

How can I tune my squid so that I can achieve a higher hit rate?  Presently,
I am running Squid 2.5.STABLE2 on FreeBSD 4.3.
The file system used for the cache disk is aufs.

Regards.




RE: [squid-users] is it a DOS attack ??

2004-02-14 Thread Danish Khan
I have configured my box with 8192 FDs, but I still get file descriptor
warnings and comm(23) port errors.  Why?  Please advise. :(
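Two things worth comparing (a generic check, not specific to this box): the descriptor limit the shell that starts Squid actually passes down, and the number Squid itself reports in cache.log at startup (the "With N file descriptors available" line). If the cache.log number is lower than 8192, the limit was smaller when Squid was built or launched.

```shell
# Soft file-descriptor limit inherited by anything this shell starts;
# compare against the figure Squid logs at startup.
ulimit -n
```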

Danish

-Original Message-
From: Mahmood Ahmed [mailto:[EMAIL PROTECTED] 
Sent: Saturday, February 14, 2004 10:24 PM
To: [EMAIL PROTECTED]
Subject: [squid-users] is it a DOS attack ??

Hello List!

I have been facing this strange problem for the last 3 days. I hope someone
here will be able to shed light on it. I don't know whether it's a bug, a
virus, or a DOS attack, but it is hitting my squid box very hard. In my
access log I am seeing a lot of these:

1076806934.151    451 202.133.44.214 TCP_MISS/000 0 GET http://www.microsoft.com/ - NONE/- -
1076806934.163    461 202.133.44.214 TCP_MISS/000 0 GET http://www.microsoft.com/ - NONE/- -
[...]
1076806937.923   1083 202.133.44.214 TCP_MISS/000 0 GET http://www.microsoft.com/ - DIRECT/www.microsoft.com -

[squid-users] is it a DOS attack ??

2004-02-14 Thread Mahmood Ahmed
Hello List!

I have been facing this strange problem for the last 3 days. I hope someone
here will be able to shed light on it. I don't know whether it's a bug, a
virus, or a DOS attack, but it is hitting my squid box very hard. In my
access log I am seeing a lot of these:

1076806934.151    451 202.133.44.214 TCP_MISS/000 0 GET http://www.microsoft.com/ - NONE/- -
1076806934.163    461 202.133.44.214 TCP_MISS/000 0 GET http://www.microsoft.com/ - NONE/- -
1076806934.170    419 202.133.44.214 TCP_MISS/000 0 GET http://www.microsoft.com/ - NONE/- -
1076806934.173    403 202.133.44.214 TCP_MISS/000 0 GET http://www.microsoft.com/ - NONE/- -
1076806934.182    391 202.133.44.214 TCP_MISS/000 0 GET http://www.microsoft.com/ - NONE/- -
1076806934.184    361 202.133.44.214 TCP_MISS/000 0 GET http://www.microsoft.com/ - NONE/- -
1076806934.191    314 202.133.44.214 TCP_MISS/000 0 GET http://www.microsoft.com/ - NONE/- -
1076806934.236    318 202.133.44.214 TCP_MISS/000 0 GET http://www.microsoft.com/ - NONE/- -
1076806934.282    365 202.133.44.214 TCP_MISS/000 0 GET http://www.microsoft.com/ - NONE/- -
1076806934.285    350 202.133.44.214 TCP_MISS/000 0 GET http://www.microsoft.com/ - NONE/- -
1076806934.325    372 202.133.44.214 TCP_MISS/000 0 GET http://www.microsoft.com/ - NONE/- -
1076806934.454    134 202.133.44.214 TCP_MISS/000 0 GET http://www.microsoft.com/ - NONE/- -
1076806934.784    383 202.133.44.214 TCP_MISS/000 0 GET http://www.microsoft.com/ - NONE/- -
1076806934.862    418 202.133.44.214 TCP_MISS/000 0 GET http://www.microsoft.com/ - NONE/- -
1076806934.892    334 202.133.44.214 TCP_MISS/000 0 GET http://www.microsoft.com/ - NONE/- -
1076806935.048    381 202.133.44.214 TCP_MISS/000 0 GET http://www.microsoft.com/ - NONE/- -
1076806935.048    380 202.133.44.214 TCP_MISS/000 0 GET http://www.microsoft.com/ - NONE/- -
1076806935.048    374 202.133.44.214 TCP_MISS/000 0 GET http://www.microsoft.com/ - NONE/- -
1076806935.055    337 202.133.44.214 TCP_MISS/000 0 GET http://www.microsoft.com/ - NONE/- -
1076806935.101    358 202.133.44.214 TCP_MISS/000 0 GET http://www.microsoft.com/ - NONE/- -
1076806935.178    412 202.133.44.214 TCP_MISS/000 0 GET http://www.microsoft.com/ - NONE/- -
1076806935.353    530 202.133.44.214 TCP_MISS/000 0 GET http://www.microsoft.com/ - NONE/- -
1076806935.362    539 202.133.44.214 TCP_MISS/000 0 GET http://www.microsoft.com/ - NONE/- -
1076806935.439    585 202.133.44.214 TCP_MISS/000 0 GET http://www.microsoft.com/ - NONE/- -
1076806935.563    694 202.133.44.214 TCP_MISS/000 0 GET http://www.microsoft.com/ - DIRECT/www.microsoft.com -
1076806935.641    751 202.133.44.214 TCP_MISS/000 0 GET http://www.microsoft.com/ - NONE/- -
1076806935.710    784 202.133.44.214 TCP_MISS/000 0 GET http://www.microsoft.com/ - DIRECT/www.microsoft.com -
1076806935.730    802 202.133.44.214 TCP_MISS/000 0 GET http://www.microsoft.com/ - DIRECT/www.microsoft.com -
1076806935.730    775 202.133.44.214 TCP_MISS/000 0 GET http://www.microsoft.com/ - DIRECT/www.microsoft.com -
1076806935.747    786 202.133.44.214 TCP_MISS/000 0 GET http://www.microsoft.com/ - DIRECT/www.microsoft.com -
1076806935.789    781 202.133.44.214 TCP_MISS/000 0 GET http://www.microsoft.com/ - DIRECT/www.microsoft.com -
1076806935.811    802 202.133.44.214 TCP_MISS/000 0 GET http://www.microsoft.com/ - DIRECT/www.microsoft.com -
1076806935.845    746 202.133.44.214 TCP_MISS/000 0 GET http://www.microsoft.com/ - DIRECT/www.microsoft.com -
1076806935.854    685 202.133.44.214 TCP_MISS/000 0 GET http://www.microsoft.com/ - NONE/- -
1076806935.868    698 202.133.44.214 TCP_MISS/000 0 GET http://www.microsoft.com/ - NONE/- -
1076806936.169    653 202.133.44.214 TCP_MISS/000 0 GET http://www.microsoft.com/ - NONE/- -
1076806936.169    612 202.133.44.214 TCP_MISS/000 0 GET http://www.microsoft.com/ - NONE/- -
1076806936.169    610 202.133.44.214 TCP_MISS/000 0 GET http://www.microsoft.com/ - NONE/- -
1076806936.304    707 202.133.44.214 TCP_MISS/000 0 GET http://www.microsoft.com/ - DIRECT/www.microsoft.com -
1076806936.407    775 202.133.44.214 TCP_MISS/000 0 GET http://www.microsoft.com/ - DIRECT/www.microsoft.com -
1076806937.343   1171 202.133.44.214 TCP_MISS/000 0 GET http://www.microsoft.com/ - DIRECT/www.microsoft.com -
1076806937.663   1322 202.133.44.214 TCP_MISS/000 0 GET http://www.microsoft.com/ - DIRECT/www.microsoft.com -
1076806937.815   1289 202.133.44.214 TCP_MISS/000 0 GET http://www.microsoft.com/ - DIRECT/www.microsoft.com -
1076806937.873   1266 202.133.44.214 TCP_MISS/000 0 GET http://www.microsoft.com/ - DIRECT/www.microsoft.com -
1076806937.923   1083 202.133.44.214 TCP_MISS/000 0 GET http://www.microsoft.com/ - DIRECT/www.microsoft.com -
1076806938.002   1473 202.133.44.214 TCP_MISS/000 0 GET http://www.microsoft.com/ - NONE/- -
1076806938.080   1279 202.133.44.214 TCP_MISS/000 0 GET http://www.microsoft.com/ - DIRECT/www.microsoft.com -
[...]

Re: [squid-users] NTLMv2

2004-02-14 Thread Kinkie
Henrik Nordstrom <[EMAIL PROTECTED]> writes:

> On Fri, 13 Feb 2004 [EMAIL PROTECTED] wrote:
>
>> is there any plan to support NTLMv2 authentication through winbind helpers
>> in the next releases of Squid ?
>
> Yes. It should already work with Samba-3.0.2 and the current Squid nightly 
> snapshots, but has not yet been verified. See the Squid release notes for 
> the nightly snapshots.

I'm about to run such a test, hopefully as soon as next Monday.

-- 
kinkie (kinkie-squid [at] kinkie [dot] it)
Random fortune, unrelated to the message:
Three actors, Tom, Fred, and Cec, wanted to do the jousting scene
from Don Quixote for a local TV show.  "I'll play the title role," proposed
Tom.  "Fred can portray Sancho Panza, and Cecil B. De Mille."


[squid-users] cache_dir L1 L2 question

2004-02-14 Thread OTR Comm
Hello,

I have an 80GB drive on a system that I would like to dedicate to a squid
server.  The notes in squid.conf say that I should subtract 20% and use
that number for the Mbytes field of cache_dir, so I would have 64000.

The question is: what are reasonable L1 and L2 values for this setting?

Also, I don't understand the different storage types (ufs, aufs, diskd,
etc.), but for the system I want to set up, would any one be
preferred?
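For reference, a 64000 MB cache_dir with the default 16/256 directory layout is written like this (the path is a placeholder):

```
# cache_dir <type> <directory> <Mbytes> <L1> <L2>
cache_dir ufs /var/spool/squid 64000 16 256
```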


Thank you,

Murrah Boswell


[squid-users] Proxy-Chaining

2004-02-14 Thread Andreas Neumeier
Hello there,

I tried to build a proxy chain with

cache_peer
and
cache_peer_access
as well using:
always_direct deny
never_direct allow

Now, normal operation seems to work like this:

client <-> squid1 <-> squid2 <-> target-net

Only thing that doesn't seem to work: Any POST seems to be ignored (by
proxy1, probably).

Also, I'm not sure how to handle SSL, (CONNECT). This must return DIRECT, 
which actually must bypass both squids. Am I right here?

If anyone has had a similar problem and is able to help, I could supply more
exact configurations and ACLs.

Thanks a lot!

--
Andreas Neumeier   _o o
Walter-Paetzmannstr 9 -\<,   
Germany
(+49)89-61098960 
(+49)89-38077929 
(+49)179-2431882 
http://andreas.neumeier.org
mailto:andreas(at)neumeier.org
UIN:14143331
--



RE: [squid-users] ftp clients trouble: helpme please

2004-02-14 Thread unixware

--- Elsen Marc <[EMAIL PROTECTED]> wrote:
> 
> 
>  
> > 
> > I have a beautiful Squid 2.5.4 server running on Gentoo.
>
>   What is 'gentoo' ?

It is a flavor of Linux that is compiled from source;
many find it very robust.

www.gentoo.org
> 
>   M.
> 
> [...]




RE: [squid-users] ftp clients trouble: helpme please

2004-02-14 Thread Elsen Marc


 
> 
> I have a beautiful Squid 2.5.4 server running on Gentoo.

  What is 'gentoo' ?

  M.

> [...]


[squid-users] ftp clients trouble: helpme please

2004-02-14 Thread [EMAIL PROTECTED]
I have a beautiful Squid 2.5.4 server running on Gentoo.
It works really well with HTTP and HTTPS client requests.
If a client requests ftp://ftp.rfc-editor.org/some_RFC_#.txt
the browser displays it without trouble (the address may not be
exact, because I'm not at the machine right now), but with all other
FTP requests (e.g. ftp://ftp.gnu.org, ftp://ftp.microsoft.com,
ftp://ftp.cdrom.com) the browser clients can't display the
page, and after a long time (about 30-60 seconds) they
give up. From the squid server itself I can connect to
those sites with an FTP client without trouble. The FTP
requests that don't work never appear in the squid access
log.
I can't understand what's happening; please help me.

Andrea