[squid-users] URL NOT WORKING VIA SQUID

2005-12-21 Thread Carlo Henrico
Hi All

I am a newbie to this so please bear with me.

I have 2 ways to access the Internet.  The one used for e-mail etc is 128k
and one via ADSL 512k line.  The second one (512k) is via a proxy using
Squid 2.5.STABLE3.

If I access the link below via the slower link (no Squid) it works but via
the faster link (with Squid) it does not work.  Any ideas please and where
can I start looking?

Link:

https://www2.swift.com/swift/login/login.fcc?TYPE=33554433&REALMOID=06-41b081f5-598b-00a5--2dfc2dfc&GUID=&SMAUTHREASON=0&METHOD=GET&SMAGENTNAME=$SM$DR8wJiVhQvTQbh2hYvA2X%2bpwP2kTqn6iIwuudMYA16covN3tw9fYv%2fsE8M90mnAO&TARGET=$SM$https%3a%2f%2fwww2%2eswift%2ecom%2fformz%2fmain%2findex%2ecfm%3fform_config%3dsochange%26form_title%3dReplace$%20Main$%20SWIFTNet$%20Security$%20Officers%26form_roadmap%3dhttp%3a%2f%2fwww%2eswift%2ecom%2findex%2ecfm%3fitem_id%3d57272


Thank you
__

Carlo Henrico  


Re: [squid-users] URL NOT WORKING VIA SQUID

2005-12-21 Thread Mark Elsen
> Hi All
>
> I am a newbie to this so please bear with me.
>
> I have 2 ways to access the Internet.  The one used for e-mail etc is 128k
> and one via ADSL 512k line.  The second one (512k) is via a proxy using
> Squid 2.5.STABLE3.
>
> If I access the link below via the slower link (no Squid) it works but via
> the faster link (with Squid) it does not work.  Any ideas please and where
> can I start looking?
>
>

  You may also want to tell us what happens for the failing URL:

  - exact and full error as seen in the browser
  - entry in access.log for the failing URL
  - any additional errors in cache.log ?

  This way, "does not work" may get a more productive meaning ...
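
  A quick way to watch for the missing access.log entry while reproducing
  the problem (a sketch; the paths assume a default log location):

  tail -f /var/log/squid/access.log | grep swift.com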

  M.


RE: [squid-users] URL NOT WORKING VIA SQUID

2005-12-21 Thread Carlo Henrico


Hi All and Mark

Answers to your questions:

- exact and full error as seen in the browser - After 10 minutes still just
a blank screen.
- entry in access.log for the failing URL - No entries in access.log at all
- any additional errors in cache.log ? - No entries in cache.log at all

Thank you

Carlo

-Original Message-
From: Mark Elsen [mailto:[EMAIL PROTECTED]
Sent: 21 December 2005 10:26
To: Carlo Henrico
Cc: squid-users@squid-cache.org
Subject: Re: [squid-users] URL NOT WORKING VIA SQUID


> Hi All
>
> I am a newbie to this so please bear with me.
>
> I have 2 ways to access the Internet.  The one used for e-mail etc is 128k
> and one via ADSL 512k line.  The second one (512k) is via a proxy using
> Squid 2.5.STABLE3.
>
> If I access the link below via the slower link (no Squid) it works but via
> the faster link (with Squid) it does not work.  Any ideas please and where
> can I start looking?
>
>

  You may also want to tell us what happens for the failing URL:

  - exact and full error as seen in the browser
  - entry in access.log for the failing URL
  - any additional errors in cache.log ?

  This way, "does not work" may get a more productive meaning ...

  M.


[squid-users] ESI request forwarding does not work?

2005-12-21 Thread Stefan Palme

Hello,

I have a very simple ESI configuration using squid in front of
a Zope server. Incoming client requests are rewritten using a
redirector.

When I submit a client request from a browser I get an error message
from squid:



While trying to retrieve the URL:
http://127.0.0.1:8080/VirtualHostBase/http/devel:80/plone/VirtualHostRoot/test.html

The following error was encountered:

  * Unable to forward this request at this time.

This request could not be forwarded to the origin server or to any
parent caches. The most likely cause for this error is that:

  * The cache administrator does not allow this cache to make direct
connections to origin servers, and
  * All configured parent caches are currently unreachable.



In the cache_log I see the following (after increasing the appropriate
debug level):


fwdStart: 
'http://127.0.0.1:8080/VirtualHostBase/http/devel:80/plone/VirtualHostRoot/test.html'
fwdStartComplete: 
http://127.0.0.1:8080/VirtualHostBase/http/devel:80/plone/VirtualHostRoot/test.html
fwdStartFail: 
http://127.0.0.1:8080/VirtualHostBase/http/devel:80/plone/VirtualHostRoot/test.html
fwdFail: ERR_CANNOT_FORWARD "Service Unavailable"

http://127.0.0.1:8080/VirtualHostBase/http/devel:80/plone/VirtualHostRoot/test.html
fwdStateFree: 0x853c018


The client, squid and Zope are all running on the same host. When
I enter the above URL, which causes the error in squid, directly in
the browser, it works - i.e. I don't get "Service Unavailable".

When sniffing the network traffic between squid and Zope, there is 
*NO* connection attempt from squid to Zope. So squid decides from its
own point of view, that the "Service is Unavailable".

There seems to be a problem with request forwarding in squid. 
Can anyone help? My squid configuration is attached below.

Best regards
-Stefan-





[squid-users] Re: ESI request forwarding does not work?

2005-12-21 Thread Stefan Palme

Forgot to say, that I am using the latest CVS version of squid.
And here is my squid configuration:

http_port 80 vhost 
icp_port 3130
cache_dir aufs /var/cache/squid 100 16 256
access_log /var/log/squid/access.log
cache_log /var/log/squid/cache.log
cache_store_log none
pid_filename /var/run/squid.pid
debug_options 17,7
redirect_program /usr/local/sbin/redirect.pl
redirect_rewrites_host_header off
refresh_pattern ^ftp:           1440    20%     10080
refresh_pattern ^gopher:        1440    0%      1440
refresh_pattern .               0       20%     4320
acl all src 0.0.0.0/0.0.0.0
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl to_localhost dst 127.0.0.0/8
acl SSL_ports port 443 563
acl Safe_ports port 80  # http
acl Safe_ports port 21  # ftp
acl Safe_ports port 443 563 # https, snews
acl Safe_ports port 70  # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535  # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl Safe_ports port 8080        # zope
acl CONNECT method CONNECT
http_access allow manager localhost
http_access deny manager
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access allow all
http_access deny all
http_reply_access allow all
icp_access allow all
httpd_accel_surrogate_id office
esi_parser libxml2
strip_query_terms off
coredump_dir /var/cache/squid
ie_refresh on
minimum_expiry_time 0 seconds
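
For context, a Squid redirector of the kind referenced by redirect_program
above is simply a program that reads one request per line on stdin (the
requested URL is the first field) and writes the rewritten URL back on
stdout. A minimal Perl sketch of what such a redirect.pl could look like for
the Zope virtual-host mapping seen in the error URLs (the devel/plone names
come from those URLs; everything else is illustrative, not the actual script):

#!/usr/bin/perl
# Minimal Squid redirector sketch (not the actual redirect.pl).
# Output must be unbuffered: Squid expects one reply line per request line.
$| = 1;
while (my $line = <STDIN>) {
    my ($url) = split ' ', $line;     # first whitespace-separated field is the URL
    if ($url =~ m!^http://devel(?::80)?/(.*)$!) {
        $url = "http://127.0.0.1:8080/VirtualHostBase/http/devel:80/plone/VirtualHostRoot/$1";
    }
    print "$url\n";                   # echoing the URL unchanged means "no rewrite"
}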




RE: [squid-users] parse reply headers problem

2005-12-21 Thread Laurikainen, Tuukka
Opened Bug 1465.

Regards,

Tuukka

> -Original Message-
> From: Serassio Guido [mailto:[EMAIL PROTECTED]
> Sent: Tuesday, December 20, 2005 8:30 PM
> To: Laurikainen, Tuukka; Squid Users
> Subject: RE: [squid-users] parse reply headers problem
> 
> Hi,
> 
> At 18.23 20/12/2005, Laurikainen, Tuukka wrote:
> 
> >More on this, I now get this quite often:
> >
> >2005/12/20 17:00:56| ctx: exit levels from  1 down to  0
> >2005/12/20 17:00:56| assertion failed: mem_node.cc:65:
> >"n->write_pending"
> >
> >After which Squid restarts.
> >
> >The version I'm trying out now is PRE3-20051219.
> >
> >Should I open a bug of this?
> >
> >Tuukka
> 
> Yes, please do it.
> 
> And add any useful information to reproduce the problem.
> 
> Regards
> 
> Guido
> 
> 
> 
> -
> 
> Guido Serassio
> Acme Consulting S.r.l. - Microsoft Certified Partner
> Via Lucia Savarino, 1   10098 - Rivoli (TO) - ITALY
> Tel. : +39.011.9530135  Fax. : +39.011.9781115
> Email: [EMAIL PROTECTED]
> WWW: http://www.acmeconsulting.it/



Re: [squid-users] Good/Bad string problem...

2005-12-21 Thread Christoph Haas
Palula...

On Wednesday 21 December 2005 06:17, Palula Brasil wrote:
> I created a file with some strings I don't want my clients to access.
> Very nice, it works fine, but it is blocking some sites with strings I
> don't want it to block... So I created another acl with permitted
> strings ok? So the thing goes like this...
>
> acl bad_strings url_regex "path_to_file/file"
> acl good_strings url_regex "path_to_file/file"
>
> Denial:
>
> http_access allow good_strings
> http_access deny bad_strings
>
> But the problem is that I blocked the word "anal" on the bad strings
> file and I have the word "canal" (means channel in portuguese) in the
> good_strings file. But now, the word anal can be searched/accessed etc.
> How can I overcome this...

Your syntactical solution would be:

http_access deny bad_strings !good_strings

However, blocking by keywords has proven to be very ineffective. It takes a 
user with the IQ of a three-year-old to circumvent this "security":
think of the Google cache, all the anonymizing proxies, web anonymizers etc.
You can't block "bad content" decently by using URL keywords. Rather - 
depending on how serious the blocking needs to be - try SquidGuard or consider 
throwing money at a commercial product.
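
Spelled out with the ACLs from your mail, the exception pattern looks like
this (a sketch; note that the two ACLs must point at two different files):

acl bad_strings url_regex "/etc/squid/bad_strings.acl"
acl good_strings url_regex "/etc/squid/good_strings.acl"
http_access deny bad_strings !good_strings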

 Christoph
-- 
~
~
".signature" [Modified] 2 lines --100%--2,41 All


Re: [squid-users] ESI request forwarding does not work?

2005-12-21 Thread Stefan Palme

Solved.

The missing point was "always_direct allow all" (or at least an
explicit rule to allow direct access for forwarded requests).

Is this a bug? Isn't the default setting for "always_direct" to
allow all direct requests?
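
For the archives, the missing line as a concrete snippet, plus a narrower
variant (a sketch) that only opens direct access to the local Zope backend
instead of everything:

always_direct allow all

# or, more narrowly (sketch):
acl zope dst 127.0.0.1
always_direct allow zope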

Best regards
-Stefan-


On Wed, 2005-12-21 at 10:18 +0100, Stefan Palme wrote:
> Hello,
> 
> I have a very simple ESI configuration using squid in front of
> a Zope server. Incoming client requests are rewritten using a
> redirector.
> 
> When I submit a client request from a browser I get an error message
> from squid:
> 
> 
> 
> While trying to retrieve the URL:
> http://127.0.0.1:8080/VirtualHostBase/http/devel:80/plone/VirtualHostRoot/test.html
> 
> The following error was encountered:
> 
>   * Unable to forward this request at this time.
> 
> This request could not be forwarded to the origin server or to any
> parent caches. The most likely cause for this error is that:
> 
>   * The cache administrator does not allow this cache to make direct
> connections to origin servers, and
>   * All configured parent caches are currently unreachable.
> 
> 
> 
> In the cache_log I see the following (after increasing the appropriate
> debug level):
> 
> 
> fwdStart: 
> 'http://127.0.0.1:8080/VirtualHostBase/http/devel:80/plone/VirtualHostRoot/test.html'
> fwdStartComplete: 
> http://127.0.0.1:8080/VirtualHostBase/http/devel:80/plone/VirtualHostRoot/test.html
> fwdStartFail: 
> http://127.0.0.1:8080/VirtualHostBase/http/devel:80/plone/VirtualHostRoot/test.html
> fwdFail: ERR_CANNOT_FORWARD "Service Unavailable"
> 
> http://127.0.0.1:8080/VirtualHostBase/http/devel:80/plone/VirtualHostRoot/test.html
> fwdStateFree: 0x853c018
> 
> 
> The client, squid and Zope are all running on the same host. When
> I enter the above URL, which causes the error in squid, directly in
> the browser, it works - i.e. I don't get "Service Unavailable".
> 
> When sniffing the network traffic between squid and Zope, there is 
> *NO* connection attempt from squid to Zope. So squid decides from its
> own point of view, that the "Service is Unavailable".
> 
> There seems to be a problem with request forwarding in squid. 
> Can anyone help? My squid configuration is attached below.
> 
> Best regards
> -Stefan-
> 
> 
> 
-- 
---
Dipl. Inf. (FH) Stefan Palme
 
email: [EMAIL PROTECTED]
www:   http://hbci4java.kapott.org
icq:   36376278
phon:  +49 341 3910484
fax:   +49 1212 517956219
mobil: +49 178 3227887
 
key fingerprint: 1BA7 D217 36A1 534C A5AD  F18A E2D1 488A E904 F9EC
---



[squid-users] Transparent proxying for clients on same host as proxy?

2005-12-21 Thread Erik Forsberg
Hi!

I'm trying to configure transparent proxying with Squid and Linux
2.6/netfilter.

I think I have it working for clients that use the squid box as
standard gateway (the FAQ is pretty clear on how to do that), but I'd
also like http requests originating from the same host as squid is
running on to be intercepted. The machine is both running squid and
serving user desktops, including browser, using a thin client solution
(ThinLinc). 

I can't get this to work with the standard rule in the PREROUTING
chain of the nat table. Any hints on how to do this? 

(I realize that I'll probably have to make sure squid is doing
 outbound connections from an address that is not redirected, or an
 infinite loop will appear) 
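
One way this is commonly handled (a sketch, untested here; it assumes Squid
runs as the user "squid" and listens on port 3128) is to redirect locally
generated web traffic in the nat OUTPUT chain and exempt Squid's own outbound
connections with the owner match, which also avoids the loop mentioned above:

iptables -t nat -A OUTPUT -p tcp --dport 80 -m owner --uid-owner squid -j ACCEPT
iptables -t nat -A OUTPUT -p tcp --dport 80 -j REDIRECT --to-ports 3128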

Thanks,
\EF
-- 
Erik ForsbergOpenSource-based Thin Client Technology
Cendio ABPhone: +46-13-21 46 00
 Web: http://www.cendio.com
 



[squid-users] Always-Direct

2005-12-21 Thread Rick G. Kilgore

Good Day to all

Question about always-direct.

	Our application team does not want to rewrite a program that relies 
heavily on the IP address to maintain the identity of a connection to 
the client. I hate this idea and want the application redone correctly. 
It has been suggested that I use always-direct to bypass squid IP 
masking. My understanding was that always-direct just stops the cache 
lookup and sends the request directly to the listed server(s).

Can someone confirm or deny this for me?


Thanks in advance.


--
   Merry Christmas and a Happy New Year to everyone!



phrase of the day:
-> a rose by any other name is still a rose!

This message is for the designated recipient only and may contain
privileged, proprietary, or otherwise private information.  If you have
received it in error, please notify the sender immediately and delete 
the original.

Any other use of the email by you is prohibited.





Rick G. Kilgore
State of Colorado Department of Revenue IT/ESG/CSTARS/ISO 
(DDP/CCR/RWOC/ROAD)

E-Mail: [EMAIL PROTECTED]
Phone: (303) 205-5659
Fax: (303) 205-5715




Re: [squid-users] Good/Bad string problem...

2005-12-21 Thread Palula Brasil
The syntax looks very nice to me. In fact I changed all the two-line
permissions in my squid.conf to use exceptions, but still...

When I put canal in the good_strings file, the word anal can now be accessed
all over the place...

- Original Message - 
From: "Christoph Haas" <[EMAIL PROTECTED]>
To: 
Sent: Wednesday, December 21, 2005 7:25 AM
Subject: Re: [squid-users] Good/Bad string problem...


Palula...

On Wednesday 21 December 2005 06:17, Palula Brasil wrote:
> I created a file with some strings I don't want my clients to access.
> Very nice, it works fine, but it is blocking some sites with strings I
> don't want it to block... So I created another acl with permitted
> strings ok? So the thing goes like this...
>
> acl bad_strings url_regex "path_to_file/file"
> acl good_strings url_regex "path_to_file/file"
>
> Denial:
>
> http_access allow good_strings
> http_access deny bad_strings
>
> But the problem is that I blocked the word "anal" on the bad strings
> file and I have the word "canal" (means channel in portuguese) in the
> good_strings file. But now, the word anal can be searched/accessed etc.
> How can I overcome this...

Your syntactical solution would be:

http_access deny bad_strings !good_strings

However, blocking by keywords has proven to be very ineffective. It takes a
user with the IQ of a three-year-old to circumvent this "security":
think of the Google cache, all the anonymizing proxies, web anonymizers etc.
You can't block "bad content" decently by using URL keywords. Rather -
depending on how serious the blocking needs to be - try SquidGuard or consider
throwing money at a commercial product.

 Christoph
-- 
~
~
".signature" [Modified] 2 lines --100%--2,41 All



Re: [squid-users] Good/Bad string problem...

2005-12-21 Thread Christoph Haas
On Wednesday 21 December 2005 13:25, Palula Brasil wrote:
> The syntax looks very nice to me. In fact I changed all the two-line
> permissions in my squid.conf to use exceptions, but still...
>
> When I put canal in the good_strings file, the word anal can now be
> accessed all over the place...

Can you post the configuration and the two good/bad_strings files here
unless they are extremely huge?

 Christoph
-- 
~
~
".signature" [Modified] 2 lines --100%--2,41 All


Re: [squid-users] Always-Direct

2005-12-21 Thread Christoph Haas
Rick...

your application team is stupid. ;)

On Wednesday 21 December 2005 14:12, Rick G. Kilgore wrote:
> Our application team does not want to rewrite a program that relies
> heavily on the IP address to maintain the identity of a connection to
> the client. I hate this idea and want the application redone correctly.

If they are too lazy to fix that I can offer an (untested) nifty workaround 
in case you use an Apache server. It takes the IP address from the 
X-Forwarded-For line in the HTTP header and sets the REMOTE_ADDR 
environment variable correctly (which is probably the address they are 
using in a CGI to find out the source IP address of the requester):

SetEnvIf X-Forwarded-For (.*) REMOTE_ADDR=$1

Documented at:
http://httpd.apache.org/docs/2.2/mod/mod_setenvif.html#setenvif
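
For the X-Forwarded-For header to carry anything, the Squid side also has to
keep emitting it. That is the default, shown here only for completeness (and
it must not be stripped again with header_access/header_replace rules like
the ones appearing elsewhere in this digest):

forwarded_for on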

> It has been suggested that I use always-direct to bypass squid IP
> masking. My understanding was that always-direct just stopped the search
> of the cache and sends request directly to the listed server/s.

Neither. "always_direct" is used when you use a proxy chain (all your 
requests are sent upstream to another proxy server) to tell Squid to send 
the requests *directly* to the web server instead of querying the parent 
proxy in the chain. So it's not connected to your case. What you probably 
think of is "no_cache" (which doesn't help either).
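
In other words, always_direct only matters in a setup along these lines (a
sketch with placeholder names):

cache_peer parent.example.com parent 3128 3130 default
never_direct allow all                       # normally everything goes via the parent
acl local_servers dstdomain .intranet.example.com
always_direct allow local_servers            # ...except these, which Squid fetches itself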

You can't do a thing about that on the proxy. Once the request is handled 
by Squid the web server will see the proxy's IP address. (See also: 
http://www.squid-cache.org/Doc/FAQ/FAQ-7.html#ss7.13 - but you probably 
don't want that.)

 Christoph
-- 
~
~
".signature" [Modified] 2 lines --100%--2,41 All


Re: [squid-users] Good/Bad string problem...

2005-12-21 Thread Palula Brasil
Here is the squid.conf


#   SQUID CONFIGURATION FILES  #


http_port 3128

hierarchy_stoplist cgi-bin ?
acl QUERY urlpath_regex cgi-bin \?
no_cache deny QUERY
cache_dir ufs /var/cache/squid 10 16 256
cache_access_log /var/log/squid/access.log
ftp_user [EMAIL PROTECTED]
cache_mgr [EMAIL PROTECTED]


# ACCESS LIST CONFIGURATION #


acl all src 0/0
acl minha_rede src 192.168.100.0/24
acl bad_strings url_regex "/etc/squid/bad_strings.acl"
acl bad_ips dst "/etc/squid/bad_ips.acl"
acl bad_sites dstdomain "/etc/squid/bad_sites.acl"
acl bad_files urlpath_regex "/etc/squid/bad_files.acl"
acl good_strings url_regex "/etc/squid/good_strings.acl"
acl good_sites dstdomain "/etc/squid/permitted.acl"
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl ssl_ports port 443 563

acl safe_ports port 80  # http
acl safe_ports port 21  # ftp
acl safe_ports port 443 563 # https, snews
acl safe_ports port 70  # gopher
acl safe_ports port 210  # wais
acl safe_ports port 1025-65535 # unregistered ports
acl safe_ports port 280  # http-mgmt
acl safe_ports port 488  # gss-http
acl safe_ports port 591  # filemaker
acl safe_ports port 777  # multiling http


#   HEADER SECURITY  #


header_access Via deny all
header_access X-Forwarded-For deny all
header_access Proxy-Connection deny all
header_access Accept-Encoding deny all
header_access User-Agent deny all

header_replace Via Stealthed
header_replace X-Forwarded-For Unknown
header_replace User-Agent Mozilla/5.0 (X11; U; Linux i686; en-US; rv:0.9.6+)
Gecko/20011122


#   PERMISSIONS #


acl CONNECT method CONNECT

http_access deny bad_sites !good_sites
http_access deny bad_strings !good_strings
http_access deny bad_ips
http_access deny bad_files
http_access deny CONNECT !ssl_ports
http_access allow safe_ports
http_access allow manager localhost
http_access deny manager
http_access allow minha_rede

http_access deny all

visible_hostname netradio.com.br
coredump_dir /var/cache/squid
httpd_accel_host virtual
httpd_accel_port 80
httpd_accel_with_proxy on
httpd_accel_uses_host_header on

-- And here is the content of each ACL file.

bad_files.acl

#
#   BLOCKED FILE EXTENSIONS   #
#
\.(pif)$
\.(scr)$
\.(vbs)$
#\.(mp3)$
#\.(wav)$
#\.(aif)$
#\.(wma)$
#\.(wmv)$
#\.(avi)$
#\.(mpg)$

bad_ips.acl

#
#   BLOCKED IP LIST   #
#

200.140.108.246

bad_sites.acl

#
#   BLOCKED SITES LIST   #
#
.parperfeito.com.br
.sexy.com.br
.sexo.com.br
.cracks.am
.bps.uol.com.br
.batepapo.uol.com.br
.astalavista.box.sk
.flogbrasil.terra.com.br
.bangbus.com
.blackbroswhitehoes.com
.circuspenis.com
.bangbros.com
.monstersofcock.com
.voxcards.ig.com.br
.mipagina.americaonline.com.mx
.rapidupload.com
.bogojevic.com
.emoction.webcindario.com
.forum.reset.ru
.tuscaloosa.al.us
.mulherespetacular.t35.com
.tiscali.cz
.gratisweb.com
.tufos.com.br
.sexlog.com.br
.icomcity.com
.feias.com
.garotasbrasileiras.com.ar
.macstar.com.br
.mileninha.com
.tanaonda.net

bad_strings.acl

#
#  BLOCKED WORDS LIST #
#

# Words with pornographic content
sex
porn
cum
fuck
bitch
dick
puta
putinha
rola
pau
caralho
buceta
ninfeta
gostosa
bunda
anal
safad
mulheresnuas
mulhernu
mulheresnuas
mulhernua
siririca
punheta
bordel
boquete
piroca
brasileirinhas

# Words with questionable content
warez
crack
hack
serial

good_strings.acl

##
#   PERMITTED WORDS  #
##
computador

permitted.acl

##
#   PERMITTED DOMAINS  #
##
.uol.com.br
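
One detail worth noting against these files (an observation, not a confirmed
diagnosis): with plain substring regexes, any URL containing "canal" also
matches the bad_strings pattern "anal", so such URLs only get through when
"canal" is listed in good_strings. A POSIX-safe alternative is to anchor the
pattern in bad_strings.acl itself, for example:

(^|[^a-z])anal([^a-z]|$)

so that a letter immediately before or after stops the match and "canal"
never triggers the rule in the first place.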










- Original Message - 
From: "Christoph Haas" <[EMAIL PROTECTED]>
To: 
Sent: Wednesday, December 21, 2005 12:51 PM
Subject: Re: [squid-users] Good/Bad string problem...


On Wednesday 21 December 2005 13:25, Palula Brasil wrote:
> The syntax looks very nice to me. In fact I changed all the two-line
> permissions in my squid.conf to use exceptions, but still...
>
> When I put canal in the good_strings file, the word anal can now be
> accessed all over the place...

Can you post the configuration and the two good/bad_strings files here
unless they are extremely huge?

 Christoph
-- 
~
~
".signature" [Modified] 2 lines --100%--2,41 All



[squid-users] strange squid behaviour, httpSendRequest: FD 284FATAL: Received Segment Violation...dying.

2005-12-21 Thread Agung T. Apriyanto
hi there,

Recently I noticed that my squid dies and gets restarted
automatically several times. I'm using OpenBSD 3.8 and
squid-2.5.STABLE12; my compile options:
Squid Cache: Version 2.5.STABLE12
configure options:  --prefix=/squid
--enable-storeio=diskd --enable-underscores
--disable-ident-lookups --enable-removal-policies=heap
--enable-pf-transparent --disable-hostname-checks
--enable-gnuregex --enable-delay-pools --enable-snmp
--disable-wccp --enable-cache-digests
--enable-default-err-languages=English
--enable-err-languages=English --enable-stacktraces
--enable-x-accelerator-vary

Here's a copy of my -k debug output, and I don't know where to
start fixing it. I hope someone is able to help :-)

best regards,

Agung

__
Do You Yahoo!?
Tired of spam?  Yahoo! Mail has the best spam protection around 
http://mail.yahoo.com

2005/12/21 21:46:53| comm_write: FD 283: sz 447: hndl 0x1c033860: data 
0x7cde9690.
2005/12/21 21:46:53| cbdataLock: 0x7cde9690
2005/12/21 21:46:53| commSetSelect: FD 283 type 2
2005/12/21 21:46:53| cbdataUnlock: 0x8b1d1a50
2005/12/21 21:46:53| comm_poll: FD 284 ready for writing
2005/12/21 21:46:53| comm_connect_addr: FD 284 connected to 217.146.179.200:80
2005/12/21 21:46:53| comm_remove_close_handler: FD 284, handler=0x1c01c540, 
data=0x872e8b10
2005/12/21 21:46:53| cbdataUnlock: 0x872e8b10
2005/12/21 21:46:53| commSetTimeout: FD 284 timeout -1
2005/12/21 21:46:53| commConnectFree: FD 284
2005/12/21 21:46:53| cbdataFree: 0x872e8b10
2005/12/21 21:46:53| cbdataFree: Freeing 0x872e8b10
2005/12/21 21:46:53| cbdataValid: 0x8b29fb50
2005/12/21 21:46:53| fwdConnectDone: FD 284: 
'http://bc.row.yahoo.com/b?P=axCQ.M6.J7FN0DduQ6Zp1gBF2Ox3DkOpan4ACTya&T=140nder3o
%2fX%3d1135176319%2fE%3d150520152%2fR%3daamailfold%2fK%3d5%2fV%3d1.1%2fW%3dJR%2fY%3dASIA%2fF%3d3658585817%2fS%3d1%2fJ%3dF430BE
CE&U=136o3v7ds%2fN%3dE4_Sx86.Iqg-%2fC%3d239460.5037759.6166529.4913800%2fD%3dN%2fB%3d2179869&U=138vuo4hj%2fN%3dEI_Sx86.Iqg-%2f
C%3d239460.6018435.7038474.5855569%2fD%3dMNW%2fB%3d2575095&U=138evtmoi%2fN%3dCo_Sx86.Iqg-%2fC%3d239460.6018437.7038471.5855571
%2fD%3dSW1%2fB%3d2179872&U=138ttei87%2fN%3dC4_Sx86.Iqg-%2fC%3d239460.6018438.7038472.5855572%2fD%3dSW2%2fB%3d2179873&U=1384112
ik%2fN%3dDI_Sx86.Iqg-%2fC%3d239460.6018439.7038473.5855573%2fD%3dSW3%2fB%3d2179874&U=138je7rfm%2fN%3dDY_Sx86.Iqg-%2fC%3d239460
.7520153.8403727.7300328%2fD%3dSW4%2fB%3d3122709&U=1264kdqv8%2fN%3dCI_Sx86.Iqg-%2fC%3d-1%2fD%3dRS%2fB%3d-1&U=1270n2qe9%2fN%3dC
Y_Sx86.Iqg-%2fC%3d-2%2fD%3dRS2%2fB%3d-2&U=127mtjl3j%2fN%3dDo_Sx86.Iqg-%2fC%3d-2%2fD%3dSW5%2fB%3d-2&U=127mfkrg9%2fN%3dD4_Sx86.I
qg-%2fC%3d-2%2fD%3dSW6%2fB%3d-2&U=1272lims7%2fN%3dEo_Sx86.Iqg-%2fC%3d-2%2fD%3dMIL%2fB%3d-2&Q=0&O=0.33836312357584186'
2005/12/21 21:46:53| fwdDispatch: FD 265: Fetching 'GET 
http://bc.row.yahoo.com/b?P=axCQ.M6.J7FN0DduQ6Zp1gBF2Ox3DkOpan4ACTya&T
=140nder3o%2fX%3d1135176319%2fE%3d150520152%2fR%3daamailfold%2fK%3d5%2fV%3d1.1%2fW%3dJR%2fY%3dASIA%2fF%3d3658585817%2fS%3d1%2f
J%3dF430BECE&U=136o3v7ds%2fN%3dE4_Sx86.Iqg-%2fC%3d239460.5037759.6166529.4913800%2fD%3dN%2fB%3d2179869&U=138vuo4hj%2fN%3dEI_Sx
86.Iqg-%2fC%3d239460.6018435.7038474.5855569%2fD%3dMNW%2fB%3d2575095&U=138evtmoi%2fN%3dCo_Sx86.Iqg-%2fC%3d239460.6018437.70384
71.5855571%2fD%3dSW1%2fB%3d2179872&U=138ttei87%2fN%3dC4_Sx86.Iqg-%2fC%3d239460.6018438.7038472.5855572%2fD%3dSW2%2fB%3d2179873
&U=1384112ik%2fN%3dDI_Sx86.Iqg-%2fC%3d239460.6018439.7038473.5855573%2fD%3dSW3%2fB%3d2179874&U=138je7rfm%2fN%3dDY_Sx86.Iqg-%2f
C%3d239460.7520153.8403727.7300328%2fD%3dSW4%2fB%3d3122709&U=1264kdqv8%2fN%3dCI_Sx86.Iqg-%2fC%3d-1%2fD%3dRS%2fB%3d-1&U=1270n2q
e9%2fN%3dCY_Sx86.Iqg-%2fC%3d-2%2fD%3dRS2%2fB%3d-2&U=127mtjl3j%2fN%3dDo_Sx86.Iqg-%2fC%3d-2%2fD%3dSW5%2fB%3d-2&U=127mfkrg9%2fN%3
dD4_Sx86.Iqg-%2fC%3d-2%2fD%3dSW6%2fB%3d-2&U=1272lims7%2fN%3dEo_Sx86.Iqg-%2fC%3d-2%2fD%3dMIL%2fB%3d-2&Q=0&O=0.33836312357584186
'
2005/12/21 21:46:53| httpStart: "GET 
http://bc.row.yahoo.com/b?P=axCQ.M6.J7FN0DduQ6Zp1gBF2Ox3DkOpan4ACTya&T=140nder3o%2fX%3d11
35176319%2fE%3d150520152%2fR%3daamailfold%2fK%3d5%2fV%3d1.1%2fW%3dJR%2fY%3dASIA%2fF%3d3658585817%2fS%3d1%2fJ%3dF430BECE&U=136o
3v7ds%2fN%3dE4_Sx86.Iqg-%2fC%3d239460.5037759.6166529.4913800%2fD%3dN%2fB%3d2179869&U=138vuo4hj%2fN%3dEI_Sx86.Iqg-%2fC%3d23946
0.6018435.7038474.5855569%2fD%3dMNW%2fB%3d2575095&U=138evtmoi%2fN%3dCo_Sx86.Iqg-%2fC%3d239460.6018437.7038471.5855571%2fD%3dSW
1%2fB%3d2179872&U=138ttei87%2fN%3dC4_Sx86.Iqg-%2fC%3d239460.6018438.7038472.5855572%2fD%3dSW2%2fB%3d2179873&U=1384112ik%2fN%3d
DI_Sx86.Iqg-%2fC%3d239460.6018439.7038473.5855573%2fD%3dSW3%2fB%3d2179874&U=138je7rfm%2fN%3dDY_Sx86.Iqg-%2fC%3d239460.7520153.
8403727.7300328%2fD%3dSW4%2fB%3d3122709&U=1264kdqv8%2fN%3dCI_Sx86.Iqg-%2fC%3d-1%2fD%3dRS%2fB%3d-1&U=1270n2qe9%2fN%3dCY_Sx86.Iq
g-%2fC%3d-2%2fD%3dRS2%2fB%3d-2&U=127mtjl3j%2fN%3dDo_Sx86.Iqg-%2fC%3d-2%2fD%3dSW5%2fB%3d-2&U=127mfkrg9%2fN%3dD4_Sx86.Iqg-%2fC%3
d-2%2fD%3dSW6%2fB%3d-2&U=1272lims7%2fN%3dEo_Sx86.Iqg-%2fC%3d-2%2fD%3dMIL%2fB%3d-2&Q=0&O=0.3383631

Re: [squid-users] MSN problems with squid+NTLM

2005-12-21 Thread Guilherme Oliveira
Hi again!

I've done that with no success. I think there is some problem in squid
or MSN with authentication :-/


acl all src 0.0.0.0/0.0.0.0
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl to_localhost dst 127.0.0.0/8
acl SSL_ports port 443 563 1863
acl Safe_ports port 80  # http
acl Safe_ports port 21  # ftp
acl Safe_ports port 443 563 8087    # https, snews, webmail
acl Safe_ports port 70  # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535  # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl Safe_ports port 25 110  # External Mail
acl Safe_ports port 1863    # MSN
acl update-micro-dom dstdomain .microsoft.com
acl update-micro-dom dstdomain .windowsupdate.com
acl CONNECT method CONNECT

acl localhost_acl src 127.0.0.1
redirector_access deny localhost_acl
acl SSL_ports port 443 563 1863
redirector_access deny SSL_ports
http_access allow localhost

acl msnmessenger url_regex -i gateway.dll
acl msnmessenger url_regex -i RST.srf
http_access allow msnmessenger
http_access allow update-micro-dom

acl chat_external dstdomain .msn.com
acl chat_external dstdomain .hotmail.com
acl chat_external dstdomain loginnet.passport.com
always_direct allow chat_external
#never_direct allow all

acl NTLMUsers proxy_auth REQUIRED
http_access allow NTLMUsers


1135183240.828   2165 127.0.0.1 TCP_MISS/200 355 HEAD
http://gateway.messenger.hotmail.com/gateway/gateway.dll? -
DIRECT/65.54.239.21 application/x-msn-messenger
1135183241.269    440 127.0.0.1 TCP_MISS/200 356 GET
http://gateway.messenger.hotmail.com/gateway/gateway.dll? -
DIRECT/65.54.239.21 application/x-msn-messenger
1135183242.986  45704 192.168.1.36 TCP_MISS/200 389 POST
http://gateway.messenger.hotmail.com/gateway/gateway.dll? -
DIRECT/65.54.239.21 application/x-msn-messenger
1135183244.468   1414 127.0.0.1 TCP_MISS/400 334 HEAD
http://207.46.1.3/gateway/gateway.dll? - DIRECT/207.46.1.3
application/x-msn-messenger
1135183244.971   1985 192.168.1.36 TCP_MISS/200 532 POST
http://207.46.1.3/gateway/gateway.dll? - DIRECT/207.46.1.3
application/x-msn-messenger
1135183245.306    317 127.0.0.1 TCP_MISS/400 334 HEAD
http://207.46.1.3/gateway/gateway.dll? - DIRECT/207.46.1.3
application/x-msn-messenger
1135183246.013   1042 192.168.1.36 TCP_MISS/200 524 POST
http://207.46.1.3/gateway/gateway.dll? - DIRECT/207.46.1.3
application/x-msn-messenger
1135183246.390    329 127.0.0.1 TCP_MISS/200 354 HEAD
http://207.46.1.3/gateway/gateway.dll? - DIRECT/207.46.1.3
application/x-msn-messenger
1135183246.718    677 127.0.0.1 TCP_MISS/200 384 HEAD
http://loginnet.passport.com/RST.srf - DIRECT/65.54.179.192 text/html
1135183246.746    354 127.0.0.1 TCP_MISS/400 334 GET
http://207.46.1.3/gateway/gateway.dll? - DIRECT/207.46.1.3
application/x-msn-messenger
1135183248.227   2171 192.168.1.36 TCP_MISS/400 334 POST
http://207.46.1.3/gateway/gateway.dll? - DIRECT/207.46.1.3
application/x-msn-messenger
1135183248.239   1520 127.0.0.1 TCP_MISS/200 722 GET
http://loginnet.passport.com/RST.srf - DIRECT/65.54.183.192 text/html
1135183249.400   3385 192.168.1.36 TCP_MISS/000 0 POST
http://loginnet.passport.com/RST.srf - DIRECT/65.54.179.192 -




On 12/21/05, Guilherme Oliveira <[EMAIL PROTECTED]> wrote:
> -- Forwarded message --
> From: Kashif Ali Bukhari <[EMAIL PROTECTED]>
> Date: Dec 20, 2005 6:49 PM
> Subject: Re: [squid-users] MSN problems with squid+NTLM
> To: Guilherme Oliveira <[EMAIL PROTECTED]>
> Cc: squid-users@squid-cache.org
>
>
> bypass gateway.dll from auth program
>
> On 12/20/05, Guilherme Oliveira <[EMAIL PROTECTED]> wrote:
> > I have configured Squid integrated with AD/w3k using NTLM but even
> > giving DIRECT access to .msn.com+passport.com, MSN don't authenticate
> > itself.
> >
> > It's strange because it works without NTLM auth. Lots of people are
> > having this problem.
> >
> > On 12/19/05, Kashif Ali Bukhari <[EMAIL PROTECTED]> wrote:
> > > you did not describe your problem :P
> > >
> > > On 12/19/05, Guilherme Oliveira <[EMAIL PROTECTED]> wrote:
> > > > Hi !
> > > >
> > > > I've searched the FAQ, Archives, Google, ... and found a lot of people
> > > > with this problem but none valid answer or correction to the problem.
> > > > It happens when squid uses NTLM authentication while MSN tries to log 
> > > > on.
> > > >
> > > > Any solution ?
> > > >
> > > > Thanks.


Re: [squid-users] URL NOT WORKING VIA SQUID

2005-12-21 Thread Mark Elsen
>
>
> Hi All and Mark
>
> Answers to your questions:
>
> - exact and full error as seen in the browser - After 10 minutes still just
> a blank screen.
> - entry in access.log for the failing URL - No entries in access.log at all

   Double-check whether anything appears in access.log when you use the STOP
feature in the browser.

> - any additional errors in cache.log ? - No entries in cache.log at all
>

  M.


Re: [squid-users] strange squid behaviour, httpSendRequest: FD 284FATAL: Received Segment Violation...dying.

2005-12-21 Thread Mark Elsen
> hi there,
>
> Recently I noticed that my squid dies and gets restarted
> automatically several times. I'm using OpenBSD 3.8 and
> squid-2.5.STABLE12; my compile options:
> Squid Cache: Version 2.5.STABLE12
> configure options:  --prefix=/squid
> --enable-storeio=diskd --enable-underscores
> --disable-ident-lookups --enable-removal-policies=heap
> --enable-pf-transparent --disable-hostname-checks
> --enable-gnuregex --enable-delay-pools --enable-snmp
> --disable-wccp --enable-cache-digests
> --enable-default-err-languages=English
> --enable-err-languages=English --enable-stacktraces
> --enable-x-accelerator-vary
>
> Here's a copy of my -k debug output, and I don't know where to
> start fixing it. I hope someone is able to help :-)
>
  For this last matter (getting help):

  http://www.squid-cache.org/Doc/FAQ/FAQ-11.html#ss11.19
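
  For reference, the procedure pointed to there is essentially the one
  Squid's own crash mail describes: run gdb on the binary and the core file
  and ask for a backtrace. With the --prefix=/squid build above that would
  look roughly like this (the core file usually lands in the coredump_dir or
  the first cache_dir):

  gdb /squid/sbin/squid /path/to/core
  (gdb) where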

 M.


Re: [squid-users] Anonymize http requests

2005-12-21 Thread Funieru Bogdan
Yes, I realize that, but I don't want those machines to see
it. All I want is for them to see either the final server or my
server, but not all the IPs. I mean, if I have a proxy
from a friend and he has the IP x.x.x.x, I don't want
anyone else to see it, not even when they access
www.yourip.com (I think this is the one).

Or I want them to see my friend's IP, but only that one,
so I can have a safe journey through the web...

The second solution is far more to my advantage and I
am looking forward to it. Anyway, this is what I
want, so is there a way?


--- Christoph Haas <[EMAIL PROTECTED]> wrote:

> On Sunday 18 December 2005 23:36, Funieru Bogdan wrote:
> > hello everyone, i have a little request, i tried almost everything,
> > but it still doesn't work. how can i completely anonymize the ip of
> > my squid server? i mean when i access www.whatismyip.com i want it to
> > appear unknown, even if i press the refresh button 5 or 10 times.
> > i looked at forwarded_for .. switched it to off, client_netmask tried
> > with 0.0.0.0 or 255.255.255.255 and even the header option, but it
> > still doesn't work. pls can you give me a hand?
> 
> You do know that when communicating on the internet
> you need an IP address?
> 
>  Christoph
> -- 
> ~
> ~
> ".signature" [Modified] 2 lines --100%--
>2,41 All
> 


---
Funieru Bogdan
Admin MosilorNET

Contact Info:
Mob:
0742158956
0726592752
0744301506(very rare)
---

__
Do You Yahoo!?
Tired of spam?  Yahoo! Mail has the best spam protection around 
http://mail.yahoo.com 


[squid-users] Problems compiling 2.5 STABLE 12 on Solaris 9

2005-12-21 Thread Eckles, David

I'm attempting to compile on Solaris 9 using gcc 3.4.2.  I've tried the
compile in the csh, ksh, sh and bash shells with no success.  Configure
is being run with no arguments and completes successfully in all shells.
Any thoughts?

> Here is the complete compile output:
> 
> [hpov2/var/spool/pkg/squid-2.5.STABLE12] make all
> Making all in lib
> source='Array.c' object='Array.o' libtool=no \
> depfile='.deps/Array.Po' tmpdepfile='.deps/Array.TPo' \
> depmode=gcc3 /bin/sh ../cfgaux/depcomp \
> gcc -DHAVE_CONFIG_H -I. -I. -I../include -I../include -I../include
> -g -O2 -Wall -c `test -f Array.c || echo './'`Array.c
> source='base64.c' object='base64.o' libtool=no \
> depfile='.deps/base64.Po' tmpdepfile='.deps/base64.TPo' \
> depmode=gcc3 /bin/sh ../cfgaux/depcomp \
> gcc -DHAVE_CONFIG_H -I. -I. -I../include -I../include -I../include
> -g -O2 -Wall -c `test -f base64.c || echo './'`base64.c
> source='getfullhostname.c' object='getfullhostname.o' libtool=no \
> depfile='.deps/getfullhostname.Po'
> tmpdepfile='.deps/getfullhostname.TPo' \
> depmode=gcc3 /bin/sh ../cfgaux/depcomp \
> gcc -DHAVE_CONFIG_H -I. -I. -I../include -I../include -I../include
> -g -O2 -Wall -c `test -f getfullhostname.c || echo './'`getfu
> llhostname.c
> source='hash.c' object='hash.o' libtool=no \
> depfile='.deps/hash.Po' tmpdepfile='.deps/hash.TPo' \
> depmode=gcc3 /bin/sh ../cfgaux/depcomp \
> gcc -DHAVE_CONFIG_H -I. -I. -I../include -I../include -I../include
> -g -O2 -Wall -c `test -f hash.c || echo './'`hash.c
> source='heap.c' object='heap.o' libtool=no \
> depfile='.deps/heap.Po' tmpdepfile='.deps/heap.TPo' \
> depmode=gcc3 /bin/sh ../cfgaux/depcomp \
> gcc -DHAVE_CONFIG_H -I. -I. -I../include -I../include -I../include
> -g -O2 -Wall -c `test -f heap.c || echo './'`heap.c
> source='html_quote.c' object='html_quote.o' libtool=no \
> depfile='.deps/html_quote.Po' tmpdepfile='.deps/html_quote.TPo' \
> depmode=gcc3 /bin/sh ../cfgaux/depcomp \
> gcc -DHAVE_CONFIG_H -I. -I. -I../include -I../include -I../include
> -g -O2 -Wall -c `test -f html_quote.c || echo './'`html_quote
> .c
> source='iso3307.c' object='iso3307.o' libtool=no \
> depfile='.deps/iso3307.Po' tmpdepfile='.deps/iso3307.TPo' \
> depmode=gcc3 /bin/sh ../cfgaux/depcomp \
> gcc -DHAVE_CONFIG_H -I. -I. -I../include -I../include -I../include
> -g -O2 -Wall -c `test -f iso3307.c || echo './'`iso3307.c
> source='md5.c' object='md5.o' libtool=no \
> depfile='.deps/md5.Po' tmpdepfile='.deps/md5.TPo' \
> depmode=gcc3 /bin/sh ../cfgaux/depcomp \
> gcc -DHAVE_CONFIG_H -I. -I. -I../include -I../include -I../include
> -g -O2 -Wall -c `test -f md5.c || echo './'`md5.c
> source='radix.c' object='radix.o' libtool=no \
> depfile='.deps/radix.Po' tmpdepfile='.deps/radix.TPo' \
> depmode=gcc3 /bin/sh ../cfgaux/depcomp \
> gcc -DHAVE_CONFIG_H -I. -I. -I../include -I../include -I../include
> -g -O2 -Wall -c `test -f radix.c || echo './'`radix.c
> source='rfc1035.c' object='rfc1035.o' libtool=no \
> depfile='.deps/rfc1035.Po' tmpdepfile='.deps/rfc1035.TPo' \
> depmode=gcc3 /bin/sh ../cfgaux/depcomp \
> gcc -DHAVE_CONFIG_H -I. -I. -I../include -I../include -I../include
> -g -O2 -Wall -c `test -f rfc1035.c || echo './'`rfc1035.c
> source='rfc1123.c' object='rfc1123.o' libtool=no \
> depfile='.deps/rfc1123.Po' tmpdepfile='.deps/rfc1123.TPo' \
> depmode=gcc3 /bin/sh ../cfgaux/depcomp \
> gcc -DHAVE_CONFIG_H -I. -I. -I../include -I../include -I../include
> -g -O2 -Wall -c `test -f rfc1123.c || echo './'`rfc1123.c
> source='rfc1738.c' object='rfc1738.o' libtool=no \
> depfile='.deps/rfc1738.Po' tmpdepfile='.deps/rfc1738.TPo' \
> depmode=gcc3 /bin/sh ../cfgaux/depcomp \
> gcc -DHAVE_CONFIG_H -I. -I. -I../include -I../include -I../include
> -g -O2 -Wall -c `test -f rfc1738.c || echo './'`rfc1738.c
> source='rfc2617.c' object='rfc2617.o' libtool=no \
> depfile='.deps/rfc2617.Po' tmpdepfile='.deps/rfc2617.TPo' \
> depmode=gcc3 /bin/sh ../cfgaux/depcomp \
> gcc -DHAVE_CONFIG_H -I. -I. -I../include -I../include -I../include
> -g -O2 -Wall -c `test -f rfc2617.c || echo './'`rfc2617.c
> source='safe_inet_addr.c' object='safe_inet_addr.o' libtool=no \
> depfile='.deps/safe_inet_addr.Po'
> tmpdepfile='.deps/safe_inet_addr.TPo' \
> depmode=gcc3 /bin/sh ../cfgaux/depcomp \
> gcc -DHAVE_CONFIG_H -I. -I. -I../include -I../include -I../include
> -g -O2 -Wall -c `test -f safe_inet_addr.c || echo './'`safe_i
> net_addr.c
> source='splay.c' object='splay.o' libtool=no \
> depfile='.deps/splay.Po' tmpdepfile='.deps/splay.TPo' \
> depmode=gcc3 /bin/sh ../cfgaux/depcomp \
> gcc -DHAVE_CONFIG_H -I. -I. -I../include -I../include -I../include
> -g -O2 -Wall -c `test -f splay.c || echo './'`splay.c
> source='Stack.c' object='Stack.o' libtool=no \
> depfile='.deps/Stack.Po' tmpdepfile='.deps/Stack.TPo' \
> depmode=gcc3 /bin/sh ../cfgaux/depcomp \
> gcc -DHAVE_CONFIG_H -I. -I. -I../include -I../include -I../include
> -g -O2 -Wall -c `test -f Stack.c || echo './'`Stack.c
> source='stub_memaccount.c' object='stub_memaccount.o

[squid-users] blacklists

2005-12-21 Thread Rick G. Kilgore
	I have finally got my hands around some basic ACLs. I would like to use 
some blacklists if they work OK. Do I really need to load SquidGuard to 
use the blacklists properly and avoid performance issues?
	I would also like to see some nested ACLs, if anybody has time to help a 
little on that as well.
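
For what it's worth, the building blocks are the same ones used elsewhere in
this digest: a dstdomain ACL fed from a file, and an http_access line that
combines several ACLs (which is what "nesting" amounts to in squid.conf). A
sketch with placeholder file names:

acl blacklist dstdomain "/etc/squid/blacklist.acl"
acl whitelist dstdomain "/etc/squid/whitelist.acl"
http_access deny blacklist !whitelist

Plain dstdomain lists generally scale much better than url_regex ones;
SquidGuard mainly becomes interesting for very large or expression-based lists.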


Thanks for your time in advance.


--
   Merry Christmas and a Happy New Year to everyone!



phrase of the day:
-> a rose by any other name is still a rose!

This message is for the designated recipient only and may contain
privileged, proprietary, or otherwise private information.  If you have
received it in error, please notify the sender immediately and delete 
the original.

Any other use of the email by you is prohibited.





Rick G. Kilgore
State of Colorado Department of Revenue IT/ESG/CSTARS/ISO 
(DDP/CCR/RWOC/ROAD)

E-Mail: [EMAIL PROTECTED]
Phone: (303) 205-5659
Fax: (303) 205-5715




[squid-users] Fatal Error

2005-12-21 Thread trainier
I receive the following email from squid semi-frequently:

From: squid
To: [EMAIL PROTECTED]
Subject: The Squid Cache (version 3.0-PRE3-20050510) died.

You've encountered a fatal error in the Squid Cache version 
3.0-PRE3-20050510.
If a core file was created (possibly in the swap directory),
please execute 'gdb squid core' or 'dbx squid core', then type 'where',
and report the trace back to [EMAIL PROTECTED]

There's never any core file, and squid continues to function just fine.

What's the scoop?
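
Two things commonly checked when no core file ever shows up (a sketch, not a
diagnosis): the core-size limit of the environment that starts Squid, and a
writable coredump_dir:

ulimit -c unlimited               # in the shell or init script that starts Squid
coredump_dir /var/cache/squid     # in squid.conf; must be writable by the squid user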

Tim



Re: [squid-users] Problems compiling 2.5 STABLE 12 on Solaris 9

2005-12-21 Thread Guido Serassio

Hi,

At 20.29 21/12/2005, Eckles, David wrote:



I'm attempting to compile on Solaris 9 using gcc 3.4.2.  I've tried the
compile in the csh, ksh, sh and bash shells with no success.  Configure
is being run with no arguments and completes successfully in all shells.
Any thoughts?

> Here is the complete compile output:
>


Please, post only the errors  :-)

This is a known problem: please use the daily snapshot.
http://www.squid-cache.org/Versions/v2/2.5/bugs/#STABLE12

Regards

Guido


> [hpov2/var/spool/pkg/squid-2.5.STABLE12] make all
> Making all in lib
> source='Array.c' object='Array.o' libtool=no \
> depfile='.deps/Array.Po' tmpdepfile='.deps/Array.TPo' \
> depmode=gcc3 /bin/sh ../cfgaux/depcomp \
> gcc -DHAVE_CONFIG_H -I. -I. -I../include -I../include -I../include
> -g -O2 -Wall -c `test -f Array.c || echo './'`Array.c
> source='base64.c' object='base64.o' libtool=no \
> depfile='.deps/base64.Po' tmpdepfile='.deps/base64.TPo' \
> depmode=gcc3 /bin/sh ../cfgaux/depcomp \
> gcc -DHAVE_CONFIG_H -I. -I. -I../include -I../include -I../include
> -g -O2 -Wall -c `test -f base64.c || echo './'`base64.c
> source='getfullhostname.c' object='getfullhostname.o' libtool=no \
> depfile='.deps/getfullhostname.Po'
> tmpdepfile='.deps/getfullhostname.TPo' \
> depmode=gcc3 /bin/sh ../cfgaux/depcomp \
> gcc -DHAVE_CONFIG_H -I. -I. -I../include -I../include -I../include
> -g -O2 -Wall -c `test -f getfullhostname.c || echo './'`getfu
> llhostname.c
> source='hash.c' object='hash.o' libtool=no \
> depfile='.deps/hash.Po' tmpdepfile='.deps/hash.TPo' \
> depmode=gcc3 /bin/sh ../cfgaux/depcomp \
> gcc -DHAVE_CONFIG_H -I. -I. -I../include -I../include -I../include
> -g -O2 -Wall -c `test -f hash.c || echo './'`hash.c
> source='heap.c' object='heap.o' libtool=no \
> depfile='.deps/heap.Po' tmpdepfile='.deps/heap.TPo' \
> depmode=gcc3 /bin/sh ../cfgaux/depcomp \
> gcc -DHAVE_CONFIG_H -I. -I. -I../include -I../include -I../include
> -g -O2 -Wall -c `test -f heap.c || echo './'`heap.c
> source='html_quote.c' object='html_quote.o' libtool=no \
> depfile='.deps/html_quote.Po' tmpdepfile='.deps/html_quote.TPo' \
> depmode=gcc3 /bin/sh ../cfgaux/depcomp \
> gcc -DHAVE_CONFIG_H -I. -I. -I../include -I../include -I../include
> -g -O2 -Wall -c `test -f html_quote.c || echo './'`html_quote
> .c
> source='iso3307.c' object='iso3307.o' libtool=no \
> depfile='.deps/iso3307.Po' tmpdepfile='.deps/iso3307.TPo' \
> depmode=gcc3 /bin/sh ../cfgaux/depcomp \
> gcc -DHAVE_CONFIG_H -I. -I. -I../include -I../include -I../include
> -g -O2 -Wall -c `test -f iso3307.c || echo './'`iso3307.c
> source='md5.c' object='md5.o' libtool=no \
> depfile='.deps/md5.Po' tmpdepfile='.deps/md5.TPo' \
> depmode=gcc3 /bin/sh ../cfgaux/depcomp \
> gcc -DHAVE_CONFIG_H -I. -I. -I../include -I../include -I../include
> -g -O2 -Wall -c `test -f md5.c || echo './'`md5.c
> source='radix.c' object='radix.o' libtool=no \
> depfile='.deps/radix.Po' tmpdepfile='.deps/radix.TPo' \
> depmode=gcc3 /bin/sh ../cfgaux/depcomp \
> gcc -DHAVE_CONFIG_H -I. -I. -I../include -I../include -I../include
> -g -O2 -Wall -c `test -f radix.c || echo './'`radix.c
> source='rfc1035.c' object='rfc1035.o' libtool=no \
> depfile='.deps/rfc1035.Po' tmpdepfile='.deps/rfc1035.TPo' \
> depmode=gcc3 /bin/sh ../cfgaux/depcomp \
> gcc -DHAVE_CONFIG_H -I. -I. -I../include -I../include -I../include
> -g -O2 -Wall -c `test -f rfc1035.c || echo './'`rfc1035.c
> source='rfc1123.c' object='rfc1123.o' libtool=no \
> depfile='.deps/rfc1123.Po' tmpdepfile='.deps/rfc1123.TPo' \
> depmode=gcc3 /bin/sh ../cfgaux/depcomp \
> gcc -DHAVE_CONFIG_H -I. -I. -I../include -I../include -I../include
> -g -O2 -Wall -c `test -f rfc1123.c || echo './'`rfc1123.c
> source='rfc1738.c' object='rfc1738.o' libtool=no \
> depfile='.deps/rfc1738.Po' tmpdepfile='.deps/rfc1738.TPo' \
> depmode=gcc3 /bin/sh ../cfgaux/depcomp \
> gcc -DHAVE_CONFIG_H -I. -I. -I../include -I../include -I../include
> -g -O2 -Wall -c `test -f rfc1738.c || echo './'`rfc1738.c
> source='rfc2617.c' object='rfc2617.o' libtool=no \
> depfile='.deps/rfc2617.Po' tmpdepfile='.deps/rfc2617.TPo' \
> depmode=gcc3 /bin/sh ../cfgaux/depcomp \
> gcc -DHAVE_CONFIG_H -I. -I. -I../include -I../include -I../include
> -g -O2 -Wall -c `test -f rfc2617.c || echo './'`rfc2617.c
> source='safe_inet_addr.c' object='safe_inet_addr.o' libtool=no \
> depfile='.deps/safe_inet_addr.Po'
> tmpdepfile='.deps/safe_inet_addr.TPo' \
> depmode=gcc3 /bin/sh ../cfgaux/depcomp \
> gcc -DHAVE_CONFIG_H -I. -I. -I../include -I../include -I../include
> -g -O2 -Wall -c `test -f safe_inet_addr.c || echo './'`safe_i
> net_addr.c
> source='splay.c' object='splay.o' libtool=no \
> depfile='.deps/splay.Po' tmpdepfile='.deps/splay.TPo' \
> depmode=gcc3 /bin/sh ../cfgaux/depcomp \
> gcc -DHAVE_CONFIG_H -I. -I. -I../include -I../include -I../include
> -g -O2 -Wall -c `test -f splay.c || echo './'`splay.c
> source='Stack.c' object='Stack.o' libtool=no \
> depfile='.deps/Stack.Po' tmpdepfile='.deps/Stack.TPo' \
>

Re: [squid-users] Problems compiling 2.5 STABLE 12 on Solaris 9

2005-12-21 Thread Mark Elsen
On 12/21/05, Eckles, David <[EMAIL PROTECTED]> wrote:
>
> I'm attempting to compile on Solaris 9 using gcc 3.4.2.  I've tried the
> compile in the csh, ksh, sh and bash shells with no success.  Configure
> is being run with no arguments and completes successfully in all shells.
> Any thoughts?
>
> > Here is the complete compile output:
> >
> > [hpov2/var/spool/pkg/squid-2.5.STABLE12] make all
> > Making all in lib
>...
>


http://www.squid-cache.org/Versions/v2/2.5/bugs/#squid-2.5.STABLE12-setenv

M.


Re: [squid-users] Segmentation fault on x86_64

2005-12-21 Thread Michał Margula

H wrote:


I have not read this thread from the beginning, but for 6 months or so I have 
been running several FreeBSD-amd64 (including X2) servers extremely stably, and 
with a real performance advantage compared to i386 platforms. 





It usually happens to me. There are bugs that nobody else notices, 
because of load, number of clients, or solar radiation. I have no idea 
why :-). It may also be a small detail that breaks everything.


My coworker noticed that crashes are directly connected with the number of 
viruses hitting the proxy. More viruses mean a much faster crash.


--
Michał Margula, [EMAIL PROTECTED], http://alchemyx.uznam.net.pl/
"W życiu piękne są tylko chwile" [Ryszard Riedel]