RE: [squid-users] squid logging

2006-02-10 Thread Gregori Parker
AWESOME - thanks mate!

One more question regarding this...

I'm trying to get the date format looking like "2006-02-10" but I only seem to 
have options for "10/Feb/2006:11:00:00 -" - any ideas?

Also, the doc claims that "%rq" is a valid token for the query line, but when I
have it in my config, squid won't start; it just tells me:

FATAL: Can't parse configuration token: '%rq %>a %

-Original Message-
From: [mailto:[EMAIL PROTECTED]]
Sent: Thursday, February 09, 2006 10:30 PM
To: Gregori Parker
Cc: Squid ML
Subject: Re: [squid-users] squid logging

On 2/9/06, Gregori Parker <[EMAIL PROTECTED]> wrote:
> I currently have Squid logging to access.log in httpd
> emulation...unfortunately, our origin servers log in W3C format.  We're
> working to make our parsers smart enough to handle it, but I thought
> it's worth asking: Are there any other controls over the format of
> access.log besides emulate_httpd_log?  Perhaps a patch or module?
>
> I would LOVE to have the ability to designate what fields get logged so
> I can trim the fat :)  Thanks in advance - Gregori

Yes, there is a patch which gives full control.

See the custom log patch, found from
http://devel.squid-cache.org/old_projects.html#customlog

I've been using it for many months now, no problems.

Kevin
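
For what it's worth, a sketch of what a custom format definition might look like with that patch applied (the exact token names and the strftime-style %{...}tl argument depend on the patch version, so treat this as an assumption, not a verified config):

```
# Hypothetical custom log format: ISO-style date, client IP,
# squid/HTTP status, reply size, method, URL.
logformat myformat %{%Y-%m-%d %H:%M:%S}tl %>a %Ss/%Hs %<st %rm %ru
access_log /var/log/squid/access.log myformat
```

If the %{...}tl form works in your build, it would also give Gregori the "2006-02-10" style date he asked about.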



RE: [squid-users] Authentication problem

2006-02-10 Thread Casey King
Okay thanks for the information.  Guess I will mess around with this
site from home then.

-Original Message-
From: Mark Elsen [mailto:[EMAIL PROTECTED] 
Sent: Friday, February 10, 2006 11:06 AM
To: Casey King
Cc: Squid Mailing List
Subject: Re: [squid-users] Authentication problem


> I am running CentOS 4.1 with squid-2.5.STABLE6-3.4E.5
>
> I am able to go and do as I please, except for one site.
>
> http://usarmy.skillport.com
>
> I am able to get to the site, and do my sign-in, but as the site is
> trying to log me in, I continually get a pop-up from my proxy server
> wanting me to authenticate and I cannot get beyond the authentication.
> I put my information in, and it will come back up after about 15-30
> seconds.  From what I can see, it does not recognize the information I
> am putting in.  Normally I would see *Domain\*username, but I don't,
> and I am sure this is why I cannot get beyond authentication, but
> again, this is the only site I am having an issue with. Here is what
> my log looks like:
>
>
> 1139498358.467  1 172.16.12.219 TCP_DENIED/407 1741 CONNECT
> usarmy.skillport.com:443 - NONE/- text/html
> 1139498358.490  1 172.16.12.219 TCP_DENIED/407 1740 CONNECT
> usarmy.skillport.com:443 - NONE/- text/html
> 1139498358.499  1 172.16.12.219 TCP_DENIED/407 1740 CONNECT
> usarmy.skillport.com:443 - NONE/- text/html
> 1139498358.505  0 172.16.12.219 TCP_DENIED/407 413 HEAD
> http://usarmy.skillport.com:443/rkusarmy/APPLET/snifferSimple/class.class - NONE/- text/html
> 1139498358.508  0 172.16.12.219 TCP_DENIED/407 417 HEAD
> http://usarmy.skillport.com:443/rkusarmy/APPLET/snifferSimple/class.class - NONE/- text/html
>
>


  The site probably uses the NTLM auth scheme, which is not
proxyable.
  Even MS advises against using NTLM on internet-facing webservers.

  M.



Re: [squid-users] Always Direct

2006-02-10 Thread Kinkie
On Fri, 2006-02-10 at 12:09 -0500, donovan wrote:
> question for you.
> 
> cache dir size.
> 
> i have 3 x 120 gig ata drives.
> 
> is that too large? should i be caching that much?
> capacity = 116.69 GB
> Available = 57.13 GB
> Used = 59.56 GB <--- all cache
> 
> cache_dir ufs /Volumes/cache1/cache 65535 16 256
> cache_dir ufs /Volumes/cache_2a/cache 65535 16 256
> cache_dir ufs /Volumes/cache_2b/cache 65535 16 256

Yuck! Move to a more efficient cache_dir scheme, such as aufs or diskd.

You want to keep your used disk space below 75-80% of disk capacity for
better performance. You're doing this - good.

> ***  <-- don't ask me where i got these numbers, i set it up about  
> two years ago.
> 
> how can i adjust these numbers to match my drives and maximize their  
> efficiency?

You may want to check http://squidwiki.kinkie.it/SquidFaq/SquidMemory ,
especially the last chapter, to understand whether your RAM availability
matches your cache_dir settings.

Kinkie


Re: [squid-users] searching for a special blacklist for squidGuard

2006-02-10 Thread donovan

http://urlblacklist.com/


On Feb 10, 2006, at 11:50 AM, Stefan Vogel wrote:



Hello,

is there a (free) blacklist for squidGuard available anywhere, listing
sites like

ebay.com
amazon.com

I have checked the ones from www.squidguard.org but they don't have a
section like ecommerce.

Where can I find something like this?

CU
Stefan






Re: [squid-users] Authentication problem

2006-02-10 Thread Mark Elsen
> I am running CentOS 4.1 with squid-2.5.STABLE6-3.4E.5
>
> I am able to go and do as I please, except for one site.
>
> http://usarmy.skillport.com
>
> I am able to get to the site, and do my sign-in, but as the site is
> trying to log me in, I continually get a pop-up from my proxy server
> wanting me to authenticate and I cannot get beyond the authentication. I
> put my information in, and it will come back up after about 15-30
> seconds.  From what I can see, it does not recognize the information I
> am putting in.  Normally I would see *Domain\*username, but I don't, and
> I am sure this is why I cannot get beyond authentication, but again,
> this is the only site I am having an issue with. Here is what my log
> looks like:
>
>
> 1139498358.467  1 172.16.12.219 TCP_DENIED/407 1741 CONNECT
> usarmy.skillport.com:443 - NONE/- text/html
> 1139498358.490  1 172.16.12.219 TCP_DENIED/407 1740 CONNECT
> usarmy.skillport.com:443 - NONE/- text/html
> 1139498358.499  1 172.16.12.219 TCP_DENIED/407 1740 CONNECT
> usarmy.skillport.com:443 - NONE/- text/html
> 1139498358.505  0 172.16.12.219 TCP_DENIED/407 413 HEAD
> http://usarmy.skillport.com:443/rkusarmy/APPLET/snifferSimple/class.class - NONE/- text/html
> 1139498358.508  0 172.16.12.219 TCP_DENIED/407 417 HEAD
> http://usarmy.skillport.com:443/rkusarmy/APPLET/snifferSimple/class.class - NONE/- text/html
>
>


  The site probably uses the NTLM auth scheme, which is not proxyable.
  Even MS advises against using NTLM on internet-facing webservers.

  M.


Re: [squid-users] xstrdup: tried to dup a NULL pointer! on squid 2.5.STABLE10

2006-02-10 Thread Mark Elsen
> Hi Squid users,
>
> I'm running Squid Cache version 2.5.STABLE10 for
> i686-pc-linux-gnu... and today I received this log message:
>
> FATAL: xstrdup: tried to dup a NULL pointer!
>
> Squid Cache (Version 2.5.STABLE10): Terminated abnormally.
> CPU Usage: 1314455.440 seconds = 818608.480 user + 495846.960 sys
> Maximum Resident Size: 0 KB
> Page faults with physical i/o: 411
> Memory usage for squid via mallinfo():
> total space in arena:  1107701 KB
> Ordinary blocks:   1060599 KB 659240 blks
> Small blocks:   0 KB  0 blks
> Holding blocks: 32852 KB 20 blks
> Free Small blocks:  0 KB
> Free Ordinary blocks:   47101 KB
> Total in use:  1093451 KB 99%
> Total free: 47101 KB 4%
>
> I checked the link: http://www.squid-cache.org/bugs/show_bug.cgi?id=1020
>
> and it should be fixed in 2.5.STABLE6.
>
> Any info on why my squid 2.5.STABLE10 still gets this message?
>
>


You may have triggered a new bug. Either file a bug report (see the
FAQ) and/or upgrade to the latest STABLE release to check whether
this issue still haunts you in that version.

M.


Re: [squid-users] Always Direct

2006-02-10 Thread Kinkie
> is it faster to list the Ip address instead of domain name?

No, they should have just about the same level of efficiency.

Kinkie


[squid-users] searching for a special blacklist for squidGuard

2006-02-10 Thread Stefan Vogel

Hello,

is there a (free) blacklist for squidGuard available anywhere, listing sites
like

ebay.com
amazon.com

I have checked the ones from www.squidguard.org but they don't have a
section like ecommerce.

Where can I find something like this?

CU
Stefan



Re: [squid-users] Problem with .swf

2006-02-10 Thread sasa

"Roger" wrote:

Where in squid.conf is the http_access listed?  Is it prior to the deny
for the 'movie' acl?
I think the 'allow' should be before the deny.


..the sequence is:

http_access allow windowsupdate
http_access allow manager sasab
http_access allow manager localhost
http_access deny manager
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access deny to_localhost
http_access allow local_net goodsite
http_access deny mp3
http_access allow local_net
http_access allow localhost
http_access deny all

..but also with:

http_access allow local_net goodsite
http_access allow windowsupdate
http_access allow manager sasab
http_access allow manager localhost
http_access deny manager
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access deny to_localhost
http_access deny mp3
http_access allow local_net
http_access allow localhost
http_access deny all

the result doesn't change.
Thanks anyway.

--
Salvatore.



Re: [squid-users] authentication

2006-02-10 Thread Kinkie
On Fri, 2006-02-10 at 07:58 -0500, boricua wrote:
> if i want to be able to use my proxy from outside the network, forcing a user 
> to login, how would i do that?
> TIA

http://squidwiki.kinkie.it/SquidFaq/ProxyAuthentication

Kinkie
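
As a rough sketch, a Basic-auth setup with ncsa_auth might look like this (helper path and password file location are assumptions that vary by install; see the FAQ page above for the authoritative version):

```
# Sketch: require proxy authentication for all access.
# Paths below are examples, adjust to your build.
auth_param basic program /usr/lib/squid/ncsa_auth /etc/squid/passwd
auth_param basic children 5
auth_param basic realm Squid proxy
acl authed proxy_auth REQUIRED
http_access allow authed
http_access deny all
```

Note that Basic auth sends passwords essentially in cleartext, which matters if the proxy is reachable from outside the network.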


Re: [squid-users] Always Direct

2006-02-10 Thread donovan


On Feb 10, 2006, at 11:02 AM, Kinkie wrote:


On Fri, 2006-02-10 at 10:29 -0500, donovan wrote:

greetings,

squid running like a champ. Anyhoo. i need to make squid go direct
for some sites. ( reasons: some edu sites don't cache well ).
did some reading

acl directlist dstdomain .some.com
always_direct allow directlist

two things. Where in my configuration should this line go.


Easy: anywhere :)


awesome.




 and
instead of listing each domain on a single line, can I point to a
file. If yes, what would the file format look like.


One domain per line. E.g:
.some.com
.someother.com
.some.edu

etc



acl directlist dstdomain /local/file ??


acl directlist dstdomain "/local/file"
(note the quotes)



okay thats easy enough

local file will have multiple entries one per line.
is it faster to list the Ip address instead of domain name?

--jeff


Re: [squid-users] Problem with .swf

2006-02-10 Thread Roger
Around Fri, Feb 10, 2006 at 03:28:43PM +0100,  sasa, wrote:
> Hi, I have a problem with .swf files that don't display in web pages, but
> only when the proxy is active.
> I use:
> 
> acl goodsite dstdomain .mywebsite.com
> .
> http_access allow local_net goodsite
> 
> ..in '/var/lib/squidguard/blacklists/porn/expressions' I have the word
> 'movie'; if I remove 'movie' I can view .swf files correctly. But why,
> when I use the 'goodsite' acl, do I still not see .swf files (the other
> web sites without swf specified in acl 'goodsite' are viewed correctly)?
> thanks.
> 
Where in squid.conf is the http_access listed?  Is it prior to the deny
for the 'movie' acl?
I think the 'allow' should be before the deny.


-- 
Roger Morris
687-3579
[EMAIL PROTECTED]


Re: [squid-users] Always Direct

2006-02-10 Thread Kinkie
On Fri, 2006-02-10 at 10:29 -0500, donovan wrote:
> greetings,
> 
> squid running like a champ. Anyhoo. i need to make squid go direct
> for some sites. ( reasons: some edu sites don't cache well ).
> did some reading
> 
> acl directlist dstdomain .some.com
> always_direct allow directlist
> 
> two things. Where in my configuration should this line go.

Easy: anywhere :)

>  and
> instead of listing each domain on a single line, can I point to a
> file. If yes, what would the file format look like.

One domain per line. E.g:
.some.com
.someother.com
.some.edu

etc


> acl directlist dstdomain /local/file ??

acl directlist dstdomain "/local/file"
(note the quotes)
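
Putting the two pieces together (the file path here is just an example), the file holds one domain per line and the config references it in quotes:

```
# /etc/squid/direct.txt might contain:
#   .some.com
#   .someother.com
#   .some.edu

acl directlist dstdomain "/etc/squid/direct.txt"
always_direct allow directlist
```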

> 
> 
> (snip)
> acl manager proto cache_object
> acl localhost src 127.0.0.1/255.255.255.255
> 
> acl all src 0.0.0.0/0.0.0.0
> acl s3.1 src 192.199.0.0/16 192.200.0.0/16
> acl s4.1 src 192.208.0.0/16 192.209.0.0/16
> acl s5.1 src 192.217.0.0/16 192.218.0.0/16
> acl core src 10.3.1.0/24
> # access to http traffic
> 
> http_access allow manager localhost
> #http_access allow manager apache
> http_access allow s3.1
> http_access allow s4.1
> http_access allow s5.1
> http_access allow core
> http_access deny all
> (snip)
> 
> not sure where i would place that line.

Anywhere is fine.



Re: [squid-users] Always Direct

2006-02-10 Thread Roger

Around Fri, Feb 10, 2006 at 10:29:01AM -0500,  donovan, wrote:
> greetings,
> two things. Where in my configuration should this line go. and
> instead of listing each domain on a single line, can I point to a
> file. If yes, what would the file format look like.
> 
> acl directlist dstdomain /local/file ??
> 
> 
Looking in squid.conf


#
#   acl aclname acltype string1 ...
#   acl aclname acltype "file" ...
#
#   when using "file", the file should contain one item per line
#
#   acltype is one of src dst srcdomain dstdomain url_pattern
#   urlpath_pattern time port proto method browser user


It is okay to have multiple lines with the same name.  Look a little
further down the file where it defines 'Safe_ports' acl

I'm not sure whether it matters where you put the line; just before you
use it, I think.


--

Roger


Re: [squid-users] Can this be done?

2006-02-10 Thread Kinkie
On Fri, 2006-02-10 at 09:14 -0600, Jim Mainock wrote:
> We are currently running squid 2.5 using ncsa_auth to authenticate users. We 
> would like to be able to allow internet access to specific urls by users in 
> a department. My thought was to create a file containing userids for each 
> department, then another file containing urls for each department they can 
> access. How can I create an acl that will test if the user is in a 
> particular department? I've seen examples using ident, but can this be done 
> using ncsa_auth? Any help would be appreciated.
> 

Yes:

acl dept1_sites dstdomain .foo.com .bar.com
acl dept1_users proxy_auth user1 user2 user3

acl dept2_sites dstdomain .gazonk.com .baz.com
acl dept2_users proxy_auth user4 user5 user6

http_access allow dept1_sites dept1_users
http_access allow dept2_sites dept2_users

etc.
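
Since the original question mentioned keeping userids and urls in files: acl lines can also read their values from a quoted file path, one entry per line (the file names below are just examples):

```
# One userid per line in the users file,
# one domain per line in the sites file.
acl dept1_users proxy_auth "/etc/squid/dept1_users.txt"
acl dept1_sites dstdomain "/etc/squid/dept1_sites.txt"
http_access allow dept1_sites dept1_users
```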

Also see http://squidwiki.kinkie.it/SquidFaq/SquidAcl

Kinkie


[squid-users] xstrdup: tried to dup a NULL pointer! on squid 2.5.STABLE10

2006-02-10 Thread Yoseph Basri
Hi Squid users,

I'm running Squid Cache version 2.5.STABLE10 for
i686-pc-linux-gnu... and today I received this log message:

FATAL: xstrdup: tried to dup a NULL pointer!

Squid Cache (Version 2.5.STABLE10): Terminated abnormally.
CPU Usage: 1314455.440 seconds = 818608.480 user + 495846.960 sys
Maximum Resident Size: 0 KB
Page faults with physical i/o: 411
Memory usage for squid via mallinfo():
total space in arena:  1107701 KB
Ordinary blocks:   1060599 KB 659240 blks
Small blocks:   0 KB  0 blks
Holding blocks: 32852 KB 20 blks
Free Small blocks:  0 KB
Free Ordinary blocks:   47101 KB
Total in use:  1093451 KB 99%
Total free: 47101 KB 4%

I checked the link: http://www.squid-cache.org/bugs/show_bug.cgi?id=1020

and it should be fixed in 2.5.STABLE6.

Any info on why my squid 2.5.STABLE10 still gets this message?

Thanks

YB


[squid-users] Always Direct

2006-02-10 Thread donovan

greetings,

squid running like a champ. Anyhoo. i need to make squid go direct
for some sites. ( reasons: some edu sites don't cache well ).
did some reading

acl directlist dstdomain .some.com
always_direct allow directlist

two things. Where in my configuration should this line go? And
instead of listing each domain on a single line, can I point to a
file? If yes, what would the file format look like?

acl directlist dstdomain /local/file ??


(snip)
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255

acl all src 0.0.0.0/0.0.0.0
acl s3.1 src 192.199.0.0/16 192.200.0.0/16
acl s4.1 src 192.208.0.0/16 192.209.0.0/16
acl s5.1 src 192.217.0.0/16 192.218.0.0/16
acl core src 10.3.1.0/24
# access to http traffic

http_access allow manager localhost
#http_access allow manager apache
http_access allow s3.1
http_access allow s4.1
http_access allow s5.1
http_access allow core
http_access deny all
(snip)

not sure where i would place that line.

--j


[squid-users] Authentication problem

2006-02-10 Thread Casey King
I am running CentOS 4.1 with squid-2.5.STABLE6-3.4E.5

I am able to go and do as I please, except for one site.

http://usarmy.skillport.com

I am able to get to the site, and do my sign-in, but as the site is
trying to log me in, I continually get a pop-up from my proxy server
wanting me to authenticate and I cannot get beyond the authentication. I
put my information in, and it will come back up after about 15-30
seconds.  From what I can see, it does not recognize the information I
am putting in.  Normally I would see *Domain\*username, but I don't, and
I am sure this is why I cannot get beyond authentication, but again,
this is the only site I am having an issue with. Here is what my log
looks like:


1139498358.467  1 172.16.12.219 TCP_DENIED/407 1741 CONNECT
usarmy.skillport.com:443 - NONE/- text/html
1139498358.490  1 172.16.12.219 TCP_DENIED/407 1740 CONNECT
usarmy.skillport.com:443 - NONE/- text/html
1139498358.499  1 172.16.12.219 TCP_DENIED/407 1740 CONNECT
usarmy.skillport.com:443 - NONE/- text/html
1139498358.505  0 172.16.12.219 TCP_DENIED/407 413 HEAD
http://usarmy.skillport.com:443/rkusarmy/APPLET/snifferSimple/class.class - NONE/- text/html
1139498358.508  0 172.16.12.219 TCP_DENIED/407 417 HEAD
http://usarmy.skillport.com:443/rkusarmy/APPLET/snifferSimple/class.class - NONE/- text/html



[squid-users] Can this be done?

2006-02-10 Thread Jim Mainock
We are currently running squid 2.5 using ncsa_auth to authenticate users. We 
would like to be able to allow internet access to specific urls by users in 
a department. My thought was to create a file containing userids for each 
department, then another file containing urls for each department they can 
access. How can I create an acl that will test if the user is in a 
particular department? I've seen examples using ident, but can this be done 
using ncsa_auth? Any help would be appreciated.





[squid-users] Problem with .swf

2006-02-10 Thread sasa
Hi, I have a problem with .swf files that don't display in web pages, but only
when the proxy is active.

I use:

squid-2.5.STABLE3-2
squidguard-1.2.0-2

.. in log file I have:

10.0.0.11 - - [10/Feb/2006:14:25:25 +0100] "GET 
http://www.mywebsite.com/Movie/example.swf HTTP/1.0" 403 4135 
TCP_NEGATIVE_HIT:NONE


..in squid.conf I have:

acl goodsite dstdomain .mywebsite.com
.
http_access allow local_net goodsite

..in '/var/lib/squidguard/blacklists/porn/expressions' I have the word
'movie'; if I remove 'movie' I can view .swf files correctly. But why, when I
use the 'goodsite' acl, do I still not see .swf files (the other web sites
without swf specified in acl 'goodsite' are viewed correctly)?

thanks.

--
Salvatore. 



[squid-users] authentication

2006-02-10 Thread boricua
if i want to be able to use my proxy from outside the network, forcing a user to 
login, how would i do that?
TIA


Re: [squid-users] multi casting to multiple user

2006-02-10 Thread Mark Elsen
> I am trying to build a virtual training environment, where results
> obtained from one user can be multicast to several users in
> the system. Can I achieve this using squid?
> Any help in this matter will be greatly appreciated.

http://www.squid-cache.org/Doc/FAQ/FAQ-1.html#ss1.1

 M.


Re: [squid-users] Yahoo mail and squid

2006-02-10 Thread Mark Elsen
>
> I have seen this kind of problem in the list before
> but I couldn't find a solution! Hope I get some help.
>
> Our departmental server (192.168.0.3 running Redhat
> 8.0) runs squid (squid-2.4.STABLE7-4) to provide
> Internet access to the local lan (192.168.0.0) with
> the parent cache as the university server
> (10.10.127.2). Everything is fine except when accessing
> Yahoo mail or similar websites, for example searching a
> word on the squid archive. The system returns a timeout when
> I try to send a mail.
>
>...

 http://www.squid-cache.org/Doc/FAQ/FAQ-4.html#ss4.8

  M.


[squid-users] Yahoo mail and squid

2006-02-10 Thread Madhurjya P. Bora
Hi All!

I have seen this kind of problem in the list before
but I couldn't find a solution! Hope I get some help.

Our departmental server (192.168.0.3 running Redhat
8.0) runs squid (squid-2.4.STABLE7-4) to provide
Internet access to the local lan (192.168.0.0) with
the parent cache as the university server
(10.10.127.2). Everything is fine except when accessing
Yahoo mail or similar websites, for example searching a
word on the squid archive. The system returns a timeout when
I try to send a mail.

However, if I connect my local box directly to the
university parent proxy (10.10.127.2) bypassing the
local proxy (192.168.0.3), I get no error. How do I
solve this problem? I want to enforce that users go through our
local proxy.

This morning I have configured the local squid
configuration and added the lines

cache_peer 10.10.127.2 parent 3128 3130 no-query
default
acl all src 0.0.0.0/0.0.0.0
never_direct allow all


which means that I am directing every request to the
parent proxy. This seems to be working!

I don't know what I have gained by doing this! Is the
local proxy not caching anything? Is the efficiency
being compromised? Any help is appreciated.

Madhurjya



__
Do You Yahoo!?
Tired of spam?  Yahoo! Mail has the best spam protection around 
http://mail.yahoo.com 

