[squid-users] How to make squid IM/IRC friendly?

2007-11-28 Thread Joel Bryan Juliano
I use GNOME, and when I configure my desktop to use a squid proxy
address in "Network Proxy",
Pidgin automatically uses "GNOME Proxy Settings" and cannot connect.

Is there a squid configuration that will allow communication from MSN,
Yahoo, Gmail and IRC?
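
A squid.conf fragment along these lines usually does it, provided the client is set to tunnel via the proxy's CONNECT method (the port numbers below are assumptions for the common MSN/Yahoo/AIM/IRC defaults, not taken from this thread; adjust to your services):

```
# Hypothetical example: permit CONNECT tunnels to common IM/IRC ports
acl IM_ports port 1863 5050 5190 6667   # MSN, Yahoo, AIM/ICQ, IRC (assumed defaults)
acl CONNECT method CONNECT
http_access allow CONNECT IM_ports
```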

-- 
♐ Rules!!


Re: [squid-users] looking for testers: google maps/earth/youtubecaching

2007-11-28 Thread Adrian Chadd
On Thu, Nov 29, 2007, Dave wrote:
> Hi,
> 
> I have just setup a testing environment too.
> 
> I now get SWAPOUTS for google earth via the webpage (maps.google.com) 
> but not so for the client software for google earth.

You won't get that. I haven't yet sat down and looked into how the client
shuffles data around.

> I think this is something to do with the rewrite program though.
> 
> I am not sure what the relevance of the space at the end is, but see the 
> output below.

The space is because the helper receives a number of arguments on the command
line, with the URL being the first one. The space delimits the end of the
first argument (ie, the URL) and the beginning of the second. Squid will
(should!) encode spaces in the URL so they're all %20s - spaces will
only appear as argument delimiters.
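
As a concrete illustration, a minimal store-URL rewrite helper can just split on the first space. This is a sketch in Python, not the actual helper from this thread; the kh.google.com rule is modelled on the rewrite output quoted below:

```python
#!/usr/bin/env python
# Sketch of a storeurl-rewrite-style helper: Squid sends one request per
# line, with the URL as the first space-delimited field; spaces inside the
# URL itself arrive encoded as %20.
import re
import sys

def rewrite(url):
    # Collapse the kh.google.com tile servers onto one canonical store URL
    # (rule modelled on the rewrite output shown in this thread).
    return re.sub(r'^http://kh\.google\.com/',
                  'http://keyhole-srv.google.com.SQUIDINTERNAL/', url)

def main():
    for line in sys.stdin:
        url = line.rstrip('\n').split(' ', 1)[0]  # first field only
        sys.stdout.write(rewrite(url) + '\n')
        sys.stdout.flush()  # Squid expects an unbuffered reply per request

if __name__ == '__main__':
    main()
```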

> # echo "http://kh.google.com/flatfile?q2-0122013223232230-q.153" | ./rewrite
> http://kh.google.com/flatfile?q2-0122013223232230-q.153
> 
> # echo "http://kh.google.com/flatfile?q2-0122013223232230-q.153 " | ./rewrite
> http://keyhole-srv.google.com.SQUIDINTERNAL/flatfile?q2-0122013223232230-q.153

Yup, that's right. Just don't expect the flatfile contents to be cached
for now.

> Secondly,
> 
> When I get a swapout and then use wget to change the host name I get a 
> miss the next time round and the following lines in my cache.log.
> 
> 2007/11/29 10:45:48| storeClientReadHeader: URL mismatch
> 2007/11/29 10:45:48|
> {http://mt3.google.com/mt?n=404&v=w2.63&x=15071&y=9828&zoom=3} != 
> {http://mt2.google.com/mt?n=404&v=w2.63&x=15071&y=9828&zoom=3}

Hm, I thought I fixed these. How annoying. It might be that you've got some
of those tiles in your cache already under their original URL with no store
URL, and it's confusing things. I'll eyeball the code some more and try to
make it more resilient (and fix the crash another poster is seeing - oops!)



Adrian

-- 
- Xenion - http://www.xenion.com.au/ - VPS Hosting - Commercial Squid Support -


Re: [squid-users] looking for testers: google maps/earth/youtubecaching

2007-11-28 Thread Dave

Hi,

I have just setup a testing environment too.

I now get SWAPOUTS for google earth via the webpage (maps.google.com) 
but not so for the client software for google earth.


I think this is something to do with the rewrite program though.

I am not sure what the relevance of the space at the end is, but see the 
output below.


# echo "http://kh.google.com/flatfile?q2-0122013223232230-q.153" | ./rewrite

http://kh.google.com/flatfile?q2-0122013223232230-q.153

# echo "http://kh.google.com/flatfile?q2-0122013223232230-q.153 " | ./rewrite

http://keyhole-srv.google.com.SQUIDINTERNAL/flatfile?q2-0122013223232230-q.153

Secondly,

When I get a swapout and then use wget to change the host name I get a 
miss the next time round and the following lines in my cache.log.


2007/11/29 10:45:48| storeClientReadHeader: URL mismatch
2007/11/29 10:45:48|
{http://mt3.google.com/mt?n=404&v=w2.63&x=15071&y=9828&zoom=3} != 
{http://mt2.google.com/mt?n=404&v=w2.63&x=15071&y=9828&zoom=3}

2007/11/29 10:45:54| storeClientReadHeader: URL mismatch
2007/11/29 10:45:54|
{http://mt2.google.com/mt?n=404&v=w2.63&x=15071&y=9828&zoom=3} != 
{http://mt1.google.com/mt?n=404&v=w2.63&x=15071&y=9828&zoom=3}

2007/11/29 10:45:58| storeClientReadHeader: URL mismatch
2007/11/29 10:45:58|
{http://mt1.google.com/mt?n=404&v=w2.63&x=15071&y=9828&zoom=3} != 
{http://mt2.google.com/mt?n=404&v=w2.63&x=15071&y=9828&zoom=3}

2007/11/29 10:46:04| storeClientReadHeader: URL mismatch
2007/11/29 10:46:04|
{http://mt2.google.com/mt?n=404&v=w2.63&x=15071&y=9828&zoom=3} != 
{http://mt1.google.com/mt?n=404&v=w2.63&x=15071&y=9828&zoom=3}

2007/11/29 10:46:08| storeClientReadHeader: URL mismatch
2007/11/29 10:46:08|
{http://mt1.google.com/mt?n=404&v=w2.63&x=15071&y=9828&zoom=3} != 
{http://mt2.google.com/mt?n=404&v=w2.63&x=15071&y=9828&zoom=3}

2007/11/29 10:46:16| storeClientReadHeader: URL mismatch
2007/11/29 10:46:16|
{http://mt2.google.com/mt?n=404&v=w2.63&x=15071&y=9828&zoom=3} != 
{http://mt3.google.com/mt?n=404&v=w2.63&x=15071&y=9828&zoom=3}


eg,

wget -S "http://mt2.google.com/mt?n=404&v=w2.63&x=15071&y=9828&zoom=3"   HIT
wget -S "http://mt1.google.com/mt?n=404&v=w2.63&x=15071&y=9828&zoom=3"   MISS
wget -S "http://mt1.google.com/mt?n=404&v=w2.63&x=15071&y=9828&zoom=3"   NOW A HIT
wget -S "http://mt2.google.com/mt?n=404&v=w2.63&x=15071&y=9828&zoom=3"   NOW A MISS
wget -S "http://mt2.google.com/mt?n=404&v=w2.63&x=15071&y=9828&zoom=3"   NOW A HIT



Dave

Adrian Chadd wrote:

On Wed, Nov 28, 2007, Gleidson Antonio Henriques wrote:
  

Hey Adrian,

   For me google maps and google earth are working like a charm ! So many 
HITS and SWAPOUTS !



That's great news! That's two users who have reported success with
this work. Fantastic.

Who else is game to give it a go?

  
   I know that the Youtube part is incomplete for now, but for me it works 
sometimes.
   Sometimes my squid screen shows this message, and when this message 
occurs the content
   is served from the youtube source even if I have the video in cache. For 
your knowledge I put the line below.


   2007/11/28 11:18:45| storeClientReadHeader: URL mismatch
2007/11/28 11:18:45| 
{http://74.125.1.37/get_video?video_id=ZQkHAytCmFY&origin=dal-v26.dal.youtube.com} 
!= 
{http://74.125.1.101/get_video?video_id=ZQkHAytCmFY&origin=dal-v26.dal.youtube.com}



Try upgrading to the latest Squid-2.HEAD snapshot; I think I've fixed this.
Let me know if not and I'll take another look at it.



Adrian

  


Re: [squid-users] new squid setup, optimizing cache

2007-11-28 Thread Amos Jeffries
> Hello,
> I'm setting up squid on a new machine, this one an OpenBSD 4.2 router.
> The install went fine and all, but now I come to configuring squid. I gave
> the squid cache its own partition, in this case /dev/wd0h on the first
> drive; it's got soft updates enabled and is 17.6 GB in size. I was
> wondering about the best squid optimizations?
> I read that the cache size shouldn't exceed 80% of the total space, so
> would that be 80% of the 17.6 GB partition? And when entering this on the
> cache_dir line, should the size be approx 15000, or should I append
> something, 15g for instance? Or the entire disk? For the memory and cache
> replacement policies, heap LFUDA, or is there a better choice?

The cache_dir option is still a little messy; there are no units on the
sizes there. So your initial '15000' is correct to create a 15GB
cache_dir.

Optimisations would be to use aufs to grab the asynchronous capabilities of
the store, and possibly a small COSS cache_dir to speed up small-object
caching.
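
For example (a sketch only; the paths and the 16/256 L1/L2 directory counts are assumptions to adapt to your disk):

```
# ~15 GB aufs cache on the dedicated partition
cache_dir aufs /cache/squid 15000 16 256
# optional small COSS area for small objects (Squid-2.6+ syntax)
cache_dir coss /cache/coss 1000 max-size=65536
cache_replacement_policy heap LFUDA
```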

> Unrelated to cache, but my logformat i've got the squid logformat
> uncommented, i'd like to display it's time formats in local time for
> interpreting at a glance. Is this doable?

Yes. The easiest way is to set "emulate_httpd_log on"; that creates access.log
in common Apache format.
Otherwise you can define your own logformat.
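
For instance, Squid 2.6's logformat supports a local-time code; a sketch (the %tl code and field list are per the 2.6 cfgman, so verify against your build):

```
# %tl logs local time in common-log format instead of epoch seconds
logformat localtime %tl %6tr %>a %Ss/%03Hs %<st %rm %ru %un %Sh/%<A %mt
access_log /var/log/squid/access.log localtime
```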

This should help you out:
http://www.squid-cache.org/Versions/v2/2.6/cfgman/

Amos




Re: [squid-users] Making a Squid Start Page

2007-11-28 Thread Chris Robertson

Reid wrote:
I am trying to create a start page so that Squid will display a disclaimer when clients log in. 


I found the following script on a previous posting, and put it on my server:

#!/usr/bin/perl
$| = 1;
my %logged_in;

while (<>) {
  if (!defined($logged_in{$_})) {
    $logged_in{$_} = 1;
    print "ERR\n";
  } else {
    print "OK\n";
  }
}

  


This never "expires" sessions from the hash "logged_in".   The following 
hack would at least help there:


#!/usr/bin/perl
$| = 1;
my %logged_in;
my $session_ttl = 3600;

while (<>) {
  my $time = time();
  if (!defined($logged_in{$_})
      || $logged_in{$_} + $session_ttl < $time) {
    $logged_in{$_} = $time;
    print "ERR\n";
  } else {
    print "OK\n";
  }
}


No promises that it won't set your hair on fire, or scratch your CDs, etc.
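
The expiry logic itself, sketched here in Python for clarity (the one-hour TTL mirrors the Perl hack above; `check` is a hypothetical name, not part of any Squid API):

```python
import time

SESSION_TTL = 3600          # seconds, as in the Perl hack above
sessions = {}               # user -> time we last forced the splash page

def check(user, now=None):
    """Return 'ERR' (trigger the deny_info redirect) on first sight or
    after the TTL expires, 'OK' otherwise -- the reply convention an
    external_acl_type helper uses."""
    if now is None:
        now = time.time()
    last = sessions.get(user)
    if last is None or last + SESSION_TTL < now:
        sessions[user] = now
        return "ERR"
    return "OK"
```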


I gave it proper permissions, and then put the following in my squid.conf:


external_acl_type session negative_ttl=0 %LOGIN 
/usr/lib/squid/my_squid_session.pl
acl session external session
http_access deny !session
deny_info http://##.##.##.##/startpage.php session


Using this configuration, the script redirects the client the first time they log
in, but not again. Even 24 hours later, after deleting cookies, if I log in again I am
not redirected. I have to restart squid in order to get another redirect to take place.

So I changed the first line to: 


external_acl_type session ttl=300 negative_ttl=0 %LOGIN 
/usr/lib/squid/my_squid_session.pl

That doesn't seem to make a difference.  


Any ideas? I am trying to avoid using the "squid_session" helper because I get 
all sorts of errors
when using it together with digest authentication.
  


That's very odd.  I have no insight here, I just find it odd.


Thank you!
  


Chris


FW: [squid-users] wbinfo_group.pl - This ever happen to anyone?

2007-11-28 Thread Terry Dobbs
Ok, for anyone with a similar issue: the problem was in the smb.conf file. 

I had:

idmap uid 1 - 2

There cannot be any spaces... ugh, the time I spent looking at
completely different things!
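
In smb.conf terms, the range has to be written without spaces (the numeric range below is a placeholder, not the real one from this box):

```
# wrong:  idmap uid = 10000 - 20000
idmap uid = 10000-20000
idmap gid = 10000-20000
```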

Also, when adding a user to a group used in a wbinfo_group.pl access list,
does squid need to be reloaded?

-Original Message-
From: Terry Dobbs [mailto:[EMAIL PROTECTED] 
Sent: Wednesday, November 28, 2007 1:38 PM
To: squid-users@squid-cache.org
Subject: [squid-users] wbinfo_group.pl - This ever happen to anyone?

Ok, so I have wbinfo_group.pl working nicely on our local squid box; it
blocks users belonging to a particular group.

However, I have done the exact same thing on a remote box in the USA and
it doesn't want to work. When I run wbinfo -r username I get no results.
I used to get "could not get groups for user"; now I get nothing
returned. I am pretty sure this is what is causing wbinfo_group.pl
not to work. The logs don't give me much useful information.

On the local box where wbinfo_group.pl is working I get the error "Could
not convert SID='S-1-5-21-1122444-424242525-5353622-42124- User(1) to
gid" when I do it manually, but I thought nothing of it as it works. The
same thing happens on the box that isn't working as well, but that box
will not even do a wbinfo -r.

I have verified that I do indeed have idmap uids and gids mapped in
smb.conf. I have used the local Domain Controller at the remote site to
authenticate, thinking it may be timing out or something (is there a
timeout?). I have googled like crazy; I know this is an SMB issue, but I'm
wondering if anyone has ever had a similar issue in their squid setup. 

I would be VERY grateful if anyone has any kind of insight.


Re: [squid-users] looking for testers: google maps/earth/youtubecaching

2007-11-28 Thread Gleidson Antonio Henriques

Ok,

(gdb) bt
#0  0x004ebd48 in strcmp () from /lib/libc.so.6
#1  0x0806af62 in clientCacheHit (data=0x929a2e0, rep=0x929b2a0) at 
client_side.c:2192
#2  0x080657e3 in storeClientCopyHeadersCB (data=0x929a2e0, buf=0x929b3a8 
"", size=312) at client_side.c:184
#3  0x080c6dc6 in storeClientCallback (sc=0x929b368, sz=312) at 
store_client.c:146
#4  0x080c7e78 in storeClientReadHeader (data=0x929b368, buf=0x929b3a8 "", 
len=439) at store_client.c:489
#5  0x080dd393 in storeAufsReadDone (fd=24, my_data=0x92431d0, buf=0x929c650 
"\003\177", len=439, errflag=0) at aufs/store_io_aufs.c:400

#6  0x080dfb3d in aioCheckCallbacks (SD=0x90249f8) at aufs/async_io.c:319
#7  0x080c982e in storeDirCallback () at store_dir.c:511
#8  0x0807618f in comm_select (msec=300) at comm_generic.c:377
#9  0x080a81e5 in main (argc=2, argv=0xbfc0eec4) at main.c:856
(gdb) frame 1
#1  0x0806af62 in clientCacheHit (data=0x929a2e0, rep=0x929b2a0) at 
client_side.c:2192
2192} else if (r->store_url && strcmp(mem->store_url, r->store_url) 
!= 0) {

(gdb) print mem->url
$1 = 0x929b318 "http://www.youtube.com/css/base_all_yts1195074272.css";
(gdb) print mem->store_url
$2 = 0x0
(gdb) r->store_url
Undefined command: "r->store_url".  Try "help".
(gdb) r
The program being debugged has been started already.
Start it from the beginning? (y or n) n
Program not restarted.
(gdb) print r->store_url
$3 = 0x929b1c0 "http://www.youtube.com/css/base_all_yts1195074272.css";

Is this it?

Thanks,

Gleidson Antonio Henriques

- Original Message - 
From: "Adrian Chadd" <[EMAIL PROTECTED]>

To: "Gleidson Antonio Henriques" <[EMAIL PROTECTED]>
Cc: "Adrian Chadd" <[EMAIL PROTECTED]>; 
Sent: Wednesday, November 28, 2007 5:23 PM
Subject: Re: [squid-users] looking for testers: google 
maps/earth/youtubecaching




oh no

frame 1
print mem->url
print mem->store_url
r->store_url



Adrian

On Wed, Nov 28, 2007, Gleidson Antonio Henriques wrote:

Ok,

But I didn't understand what you mean about printing the strcmp arguments...
I put below the steps that I had done.

--->
(gdb) frame 1
#1  0x0806aa6d in clientCacheHit (data=0x96453c8, rep=0x9646388) at
client_side.c:2192
2192} else if (r->store_url && strcmp(mem->store_url, 
r->store_url)

!= 0) {
(gdb) print strcmp
$3 = {} 0x4ebd40 
<--

Is that all right, or am I wrong about the print command?

Thanks in Advance,

Gleidson Antonio Henriques

- Original Message - 
From: "Adrian Chadd" <[EMAIL PROTECTED]>

To: "Gleidson Antonio Henriques" <[EMAIL PROTECTED]>
Cc: "Adrian Chadd" <[EMAIL PROTECTED]>; 


Sent: Wednesday, November 28, 2007 3:57 PM
Subject: Re: [squid-users] looking for testers: google
maps/earth/youtubecaching


>Ok cool. Can you go frame 1, and then print the arguments to strcmp?
>
>Thanks,
>
>
>Adrian
>
>On Wed, Nov 28, 2007, Gleidson Antonio Henriques wrote:
>>Here we go,
>>
>>->
>>
>>(gdb) bt
>>#0  0x004ebd48 in strcmp () from /lib/libc.so.6
>>#1  0x0806aa6d in clientCacheHit (data=0x8840760, rep=0x8841720) at
>>client_side.c:2192
>>#2  0x080b3ec8 in storeClientCallback (sc=0x88417e8, sz=312) at
>>store_client.c:146
>>#3  0x080b482e in storeClientReadHeader (data=0x88417e8, buf=0x8841828
>>"",
>>len=439) at store_client.c:489
>>#4  0x080c70f6 in storeAufsReadDone (fd=25, my_data=0x87e8608,
>>buf=0x8842ae0 "\003\177", len=439, errflag=0) at 
>>aufs/store_io_aufs.c:400
>>#5  0x080c94a1 in aioCheckCallbacks (SD=0x85ca9f8) at 
>>aufs/async_io.c:319

>>#6  0x080b56bf in storeDirCallback () at store_dir.c:511
>>#7  0x0806f357 in comm_select (msec=379) at comm_generic.c:377
>>#8  0x0809a618 in main (argc=2, argv=0xbfae3da4) at main.c:856
>>
>><-
>>
>>Best Regards,
>>
>>Gleidson Antonio Henriques
>>
>>- Original Message - 
>>From: "Adrian Chadd" <[EMAIL PROTECTED]>

>>To: "Gleidson Antonio Henriques" <[EMAIL PROTECTED]>
>>Cc: "Adrian Chadd" <[EMAIL PROTECTED]>;
>>
>>Sent: Wednesday, November 28, 2007 3:03 PM
>>Subject: Re: [squid-users] looking for testers: google
>>maps/earth/youtubecaching
>>
>>
>>>Hm, can you get a stack trace at all?
>>>
>>>It's not hard in gdb; just put this in ~/.gdbinit
>>>
>>>handle all nostop noprint
>>>handle SIGSEGV stop
>>>break xassert
>>>break fatal
>>>
>>>then gdb squid
>>>run -ND
>>>.. wait.
>>>
>>>
>>>
>>>
>>>Adrian
>>>
>>>On Wed, Nov 28, 2007, Gleidson Antonio Henriques wrote:
Adrian,

 With the latest squid-2.HEAD snapshot, I've got a Segmentation fault in
Squid
when I try to access youtube.
 If I try another URL that isn't in store_rewrite_list, it works well.
 Do you need any more information about it? I tried gdb, but squid
running
in threads is so difficult to trace.
 I used the same ./configure options that I had used in
 squid-2.HEAD-20071126.

 Best regards,

Gleidson Antonio Henriques

- Original Message - 
From: "Adrian Chadd" <[EMAIL PROTECTED]>

To: "Gleidson 

Re: [squid-users] looking for testers: google maps/earth/youtubecaching

2007-11-28 Thread Adrian Chadd
oh no

frame 1
print mem->url
print mem->store_url
r->store_url



Adrian

On Wed, Nov 28, 2007, Gleidson Antonio Henriques wrote:
> Ok,
> 
> But I didn't understand what you mean about printing the strcmp arguments...
> I put below the steps that I had done.
> 
> --->
> (gdb) frame 1
> #1  0x0806aa6d in clientCacheHit (data=0x96453c8, rep=0x9646388) at 
> client_side.c:2192
> 2192} else if (r->store_url && strcmp(mem->store_url, r->store_url) 
> != 0) {
> (gdb) print strcmp
> $3 = {} 0x4ebd40 
> <--
> 
> Is that all right, or am I wrong about the print command?
> 
> Thanks in Advance,
> 
> Gleidson Antonio Henriques
> 
> - Original Message - 
> From: "Adrian Chadd" <[EMAIL PROTECTED]>
> To: "Gleidson Antonio Henriques" <[EMAIL PROTECTED]>
> Cc: "Adrian Chadd" <[EMAIL PROTECTED]>; 
> Sent: Wednesday, November 28, 2007 3:57 PM
> Subject: Re: [squid-users] looking for testers: google 
> maps/earth/youtubecaching
> 
> 
> >Ok cool. Can you go frame 1, and then print the arguments to strcmp?
> >
> >Thanks,
> >
> >
> >Adrian
> >
> >On Wed, Nov 28, 2007, Gleidson Antonio Henriques wrote:
> >>Here we go,
> >>
> >>->
> >>
> >>(gdb) bt
> >>#0  0x004ebd48 in strcmp () from /lib/libc.so.6
> >>#1  0x0806aa6d in clientCacheHit (data=0x8840760, rep=0x8841720) at
> >>client_side.c:2192
> >>#2  0x080b3ec8 in storeClientCallback (sc=0x88417e8, sz=312) at
> >>store_client.c:146
> >>#3  0x080b482e in storeClientReadHeader (data=0x88417e8, buf=0x8841828 
> >>"",
> >>len=439) at store_client.c:489
> >>#4  0x080c70f6 in storeAufsReadDone (fd=25, my_data=0x87e8608,
> >>buf=0x8842ae0 "\003\177", len=439, errflag=0) at aufs/store_io_aufs.c:400
> >>#5  0x080c94a1 in aioCheckCallbacks (SD=0x85ca9f8) at aufs/async_io.c:319
> >>#6  0x080b56bf in storeDirCallback () at store_dir.c:511
> >>#7  0x0806f357 in comm_select (msec=379) at comm_generic.c:377
> >>#8  0x0809a618 in main (argc=2, argv=0xbfae3da4) at main.c:856
> >>
> >><-
> >>
> >>Best Regards,
> >>
> >>Gleidson Antonio Henriques
> >>
> >>- Original Message - 
> >>From: "Adrian Chadd" <[EMAIL PROTECTED]>
> >>To: "Gleidson Antonio Henriques" <[EMAIL PROTECTED]>
> >>Cc: "Adrian Chadd" <[EMAIL PROTECTED]>; 
> >>
> >>Sent: Wednesday, November 28, 2007 3:03 PM
> >>Subject: Re: [squid-users] looking for testers: google
> >>maps/earth/youtubecaching
> >>
> >>
> >>>Hm, can you get a stack trace at all?
> >>>
> >>>It's not hard in gdb; just put this in ~/.gdbinit
> >>>
> >>>handle all nostop noprint
> >>>handle SIGSEGV stop
> >>>break xassert
> >>>break fatal
> >>>
> >>>then gdb squid
> >>>run -ND
> >>>.. wait.
> >>>
> >>>
> >>>
> >>>
> >>>Adrian
> >>>
> >>>On Wed, Nov 28, 2007, Gleidson Antonio Henriques wrote:
> Adrian,
> 
>  With the latest squid-2.HEAD snapshot, I've got a Segmentation fault in
> Squid
> when I try to access youtube.
>  If I try another URL that isn't in store_rewrite_list, it works well.
>  Do you need any more information about it? I tried gdb, but squid
> running
> in threads is so difficult to trace.
>  I used the same ./configure options that I had used in 
>  squid-2.HEAD-20071126.
> 
>  Best regards,
> 
> Gleidson Antonio Henriques
> 
> - Original Message - 
> From: "Adrian Chadd" <[EMAIL PROTECTED]>
> To: "Gleidson Antonio Henriques" <[EMAIL PROTECTED]>
> Cc: 
> Sent: Wednesday, November 28, 2007 1:47 PM
> Subject: Re: [squid-users] looking for testers: google
> maps/earth/youtubecaching
> 
> 
> >On Wed, Nov 28, 2007, Gleidson Antonio Henriques wrote:
> >>Hey Adrian,
> >>
> >>   For me google maps and google earth are working like a charm ! So
> >> many
> >>HITS and SWAPOUTS !
> >
> >That's great news! That's two users who have reported success with
> >this work. Fantastic.
> >
> >Who else is game to give it a go?
> >
> >>   I know that the Youtube part is incomplete for now, but for me it 
> >> works
> >>sometimes.
> >>   Sometimes my squid screen shows this message, and when this message
> >>occurs the content
> >>   is served from the youtube source even if I have the video in cache. 
> >> For
> >>your knowledge I put the line below.
> >>
> >>   2007/11/28 11:18:45| storeClientReadHeader: URL mismatch
> >>2007/11/28 11:18:45|
> >>{http://74.125.1.37/get_video?video_id=ZQkHAytCmFY&origin=dal-v26.dal.youtube.com}
> >>!=
> >>{http://74.125.1.101/get_video?video_id=ZQkHAytCmFY&origin=dal-v26.dal.youtube.com}
> >
> >Try upgrading to the latest Squid-2.HEAD snapshot; I think I've fixed
> >this.
> >Let me know if not and I'll take another look at it.
> >
> >
> >
> >Adrian
> >
> >-- 
> >- Xenion - http://www.xenion.com.au/ - VPS Hosting - Commercial Squid
> >Support -
> >>>
> >>>-- 
> >>>- Xenion - http://www

Re: [squid-users] looking for testers: google maps/earth/youtubecaching

2007-11-28 Thread Gleidson Antonio Henriques

I will compile my Squid with CFLAGS="-g" for more debug information.
Give me a few minutes!

Regards,

Gleidson Antonio Henriques
- Original Message - 
From: "Adrian Chadd" <[EMAIL PROTECTED]>

To: "Gleidson Antonio Henriques" <[EMAIL PROTECTED]>
Cc: "Adrian Chadd" <[EMAIL PROTECTED]>; 
Sent: Wednesday, November 28, 2007 3:57 PM
Subject: Re: [squid-users] looking for testers: google 
maps/earth/youtubecaching




Ok cool. Can you go frame 1, and then print the arguments to strcmp?

Thanks,


Adrian

On Wed, Nov 28, 2007, Gleidson Antonio Henriques wrote:

Here we go,

->

(gdb) bt
#0  0x004ebd48 in strcmp () from /lib/libc.so.6
#1  0x0806aa6d in clientCacheHit (data=0x8840760, rep=0x8841720) at
client_side.c:2192
#2  0x080b3ec8 in storeClientCallback (sc=0x88417e8, sz=312) at
store_client.c:146
#3  0x080b482e in storeClientReadHeader (data=0x88417e8, buf=0x8841828 
"",

len=439) at store_client.c:489
#4  0x080c70f6 in storeAufsReadDone (fd=25, my_data=0x87e8608,
buf=0x8842ae0 "\003\177", len=439, errflag=0) at aufs/store_io_aufs.c:400
#5  0x080c94a1 in aioCheckCallbacks (SD=0x85ca9f8) at aufs/async_io.c:319
#6  0x080b56bf in storeDirCallback () at store_dir.c:511
#7  0x0806f357 in comm_select (msec=379) at comm_generic.c:377
#8  0x0809a618 in main (argc=2, argv=0xbfae3da4) at main.c:856

<-

Best Regards,

Gleidson Antonio Henriques

- Original Message - 
From: "Adrian Chadd" <[EMAIL PROTECTED]>

To: "Gleidson Antonio Henriques" <[EMAIL PROTECTED]>
Cc: "Adrian Chadd" <[EMAIL PROTECTED]>; 


Sent: Wednesday, November 28, 2007 3:03 PM
Subject: Re: [squid-users] looking for testers: google
maps/earth/youtubecaching


>Hm, can you get a stack trace at all?
>
>It's not hard in gdb; just put this in ~/.gdbinit
>
>handle all nostop noprint
>handle SIGSEGV stop
>break xassert
>break fatal
>
>then gdb squid
>run -ND
>.. wait.
>
>
>
>
>Adrian
>
>On Wed, Nov 28, 2007, Gleidson Antonio Henriques wrote:
>>Adrian,
>>
>> With the latest squid-2.HEAD snapshot, I've got a Segmentation fault in
>>Squid
>>when I try to access youtube.
>> If I try another URL that isn't in store_rewrite_list, it works well.
>> Do you need any more information about it? I tried gdb, but squid
>>running
>>in threads is so difficult to trace.
>> I used the same ./configure options that I had used in 
>> squid-2.HEAD-20071126.

>>
>> Best regards,
>>
>>Gleidson Antonio Henriques
>>
>>- Original Message - 
>>From: "Adrian Chadd" <[EMAIL PROTECTED]>

>>To: "Gleidson Antonio Henriques" <[EMAIL PROTECTED]>
>>Cc: 
>>Sent: Wednesday, November 28, 2007 1:47 PM
>>Subject: Re: [squid-users] looking for testers: google
>>maps/earth/youtubecaching
>>
>>
>>>On Wed, Nov 28, 2007, Gleidson Antonio Henriques wrote:
Hey Adrian,

   For me google maps and google earth are working like a charm ! So
 many
HITS and SWAPOUTS !
>>>
>>>That's great news! That's two users who have reported success with
>>>this work. Fantastic.
>>>
>>>Who else is game to give it a go?
>>>
   I know that the Youtube part is incomplete for now, but for me it 
 works

sometimes.
   Sometimes my squid screen shows this message, and when this message
occurs the content
   is served from the youtube source even if I have the video in cache. 
 For

your knowledge I put the line below.

   2007/11/28 11:18:45| storeClientReadHeader: URL mismatch
2007/11/28 11:18:45|
{http://74.125.1.37/get_video?video_id=ZQkHAytCmFY&origin=dal-v26.dal.youtube.com}
!=
{http://74.125.1.101/get_video?video_id=ZQkHAytCmFY&origin=dal-v26.dal.youtube.com}
>>>
>>>Try upgrading to the latest Squid-2.HEAD snapshot; I think I've fixed
>>>this.
>>>Let me know if not and I'll take another look at it.
>>>
>>>
>>>
>>>Adrian
>>>
>>>-- 
>>>- Xenion - http://www.xenion.com.au/ - VPS Hosting - Commercial Squid

>>>Support -
>
>-- 
>- Xenion - http://www.xenion.com.au/ - VPS Hosting - Commercial Squid

>Support -
>- $25/pm entry-level VPSes w/ capped bandwidth charges available in WA -


--
- Xenion - http://www.xenion.com.au/ - VPS Hosting - Commercial Squid 
Support -
- $25/pm entry-level VPSes w/ capped bandwidth charges available in WA - 




Re: [squid-users] looking for testers: google maps/earth/youtubecaching

2007-11-28 Thread Gleidson Antonio Henriques

Ok,

But I didn't understand what you mean about printing the strcmp arguments...
I put below the steps that I had done.

--->
(gdb) frame 1
#1  0x0806aa6d in clientCacheHit (data=0x96453c8, rep=0x9646388) at 
client_side.c:2192
2192} else if (r->store_url && strcmp(mem->store_url, r->store_url) 
!= 0) {

(gdb) print strcmp
$3 = {} 0x4ebd40 
<--

Is that all right, or am I wrong about the print command?

Thanks in Advance,

Gleidson Antonio Henriques

- Original Message - 
From: "Adrian Chadd" <[EMAIL PROTECTED]>

To: "Gleidson Antonio Henriques" <[EMAIL PROTECTED]>
Cc: "Adrian Chadd" <[EMAIL PROTECTED]>; 
Sent: Wednesday, November 28, 2007 3:57 PM
Subject: Re: [squid-users] looking for testers: google 
maps/earth/youtubecaching




Ok cool. Can you go frame 1, and then print the arguments to strcmp?

Thanks,


Adrian

On Wed, Nov 28, 2007, Gleidson Antonio Henriques wrote:

Here we go,

->

(gdb) bt
#0  0x004ebd48 in strcmp () from /lib/libc.so.6
#1  0x0806aa6d in clientCacheHit (data=0x8840760, rep=0x8841720) at
client_side.c:2192
#2  0x080b3ec8 in storeClientCallback (sc=0x88417e8, sz=312) at
store_client.c:146
#3  0x080b482e in storeClientReadHeader (data=0x88417e8, buf=0x8841828 
"",

len=439) at store_client.c:489
#4  0x080c70f6 in storeAufsReadDone (fd=25, my_data=0x87e8608,
buf=0x8842ae0 "\003\177", len=439, errflag=0) at aufs/store_io_aufs.c:400
#5  0x080c94a1 in aioCheckCallbacks (SD=0x85ca9f8) at aufs/async_io.c:319
#6  0x080b56bf in storeDirCallback () at store_dir.c:511
#7  0x0806f357 in comm_select (msec=379) at comm_generic.c:377
#8  0x0809a618 in main (argc=2, argv=0xbfae3da4) at main.c:856

<-

Best Regards,

Gleidson Antonio Henriques

- Original Message - 
From: "Adrian Chadd" <[EMAIL PROTECTED]>

To: "Gleidson Antonio Henriques" <[EMAIL PROTECTED]>
Cc: "Adrian Chadd" <[EMAIL PROTECTED]>; 


Sent: Wednesday, November 28, 2007 3:03 PM
Subject: Re: [squid-users] looking for testers: google
maps/earth/youtubecaching


>Hm, can you get a stack trace at all?
>
>It's not hard in gdb; just put this in ~/.gdbinit
>
>handle all nostop noprint
>handle SIGSEGV stop
>break xassert
>break fatal
>
>then gdb squid
>run -ND
>.. wait.
>
>
>
>
>Adrian
>
>On Wed, Nov 28, 2007, Gleidson Antonio Henriques wrote:
>>Adrian,
>>
>> With the latest squid-2.HEAD snapshot, I've got a Segmentation fault in
>>Squid
>>when I try to access youtube.
>> If I try another URL that isn't in store_rewrite_list, it works well.
>> Do you need any more information about it? I tried gdb, but squid
>>running
>>in threads is so difficult to trace.
>> I used the same ./configure options that I had used in 
>> squid-2.HEAD-20071126.

>>
>> Best regards,
>>
>>Gleidson Antonio Henriques
>>
>>- Original Message - 
>>From: "Adrian Chadd" <[EMAIL PROTECTED]>

>>To: "Gleidson Antonio Henriques" <[EMAIL PROTECTED]>
>>Cc: 
>>Sent: Wednesday, November 28, 2007 1:47 PM
>>Subject: Re: [squid-users] looking for testers: google
>>maps/earth/youtubecaching
>>
>>
>>>On Wed, Nov 28, 2007, Gleidson Antonio Henriques wrote:
Hey Adrian,

   For me google maps and google earth are working like a charm ! So
 many
HITS and SWAPOUTS !
>>>
>>>That's great news! That's two users who have reported success with
>>>this work. Fantastic.
>>>
>>>Who else is game to give it a go?
>>>
   I know that the Youtube part is incomplete for now, but for me it 
 works

sometimes.
   Sometimes my squid screen shows this message, and when this message
occurs the content
   is served from the youtube source even if I have the video in cache. 
 For

your knowledge I put the line below.

   2007/11/28 11:18:45| storeClientReadHeader: URL mismatch
2007/11/28 11:18:45|
{http://74.125.1.37/get_video?video_id=ZQkHAytCmFY&origin=dal-v26.dal.youtube.com}
!=
{http://74.125.1.101/get_video?video_id=ZQkHAytCmFY&origin=dal-v26.dal.youtube.com}
>>>
>>>Try upgrading to the latest Squid-2.HEAD snapshot; I think I've fixed
>>>this.
>>>Let me know if not and I'll take another look at it.
>>>
>>>
>>>
>>>Adrian
>>>
>>>-- 
>>>- Xenion - http://www.xenion.com.au/ - VPS Hosting - Commercial Squid

>>>Support -
>
>-- 
>- Xenion - http://www.xenion.com.au/ - VPS Hosting - Commercial Squid

>Support -
>- $25/pm entry-level VPSes w/ capped bandwidth charges available in WA -


--
- Xenion - http://www.xenion.com.au/ - VPS Hosting - Commercial Squid 
Support -
- $25/pm entry-level VPSes w/ capped bandwidth charges available in WA - 




[squid-users] ufdbGuard v1.14 is released

2007-11-28 Thread Marcus Kool


ufdbGuard v1.14 is the URL filter for Squid that can do 50,000 URL 
verifications/sec.

Version 1.14 includes 2 major and 6 minor fixes, and resolves the performance
problem introduced in version 1.13.

New features include:
- SafeSearch enforcement is extended with Excite, dogpile.co.uk and Yahoo video
- Squid log file analyzer
- lightweight http daemon to serve the URL redirection messages (a.k.a. the
  "no access" messages)

ufdbGuard can be downloaded from http://sourceforge.net or 
http://www.urlfilterdb.com



Re: [squid-users] looking for testers: google maps/earth/youtubecaching

2007-11-28 Thread Adrian Chadd
Ok cool. Can you go frame 1, and then print the arguments to strcmp?

Thanks,


Adrian

On Wed, Nov 28, 2007, Gleidson Antonio Henriques wrote:
> Here we go,
> 
> ->
> 
> (gdb) bt
> #0  0x004ebd48 in strcmp () from /lib/libc.so.6
> #1  0x0806aa6d in clientCacheHit (data=0x8840760, rep=0x8841720) at 
> client_side.c:2192
> #2  0x080b3ec8 in storeClientCallback (sc=0x88417e8, sz=312) at 
> store_client.c:146
> #3  0x080b482e in storeClientReadHeader (data=0x88417e8, buf=0x8841828 "", 
> len=439) at store_client.c:489
> #4  0x080c70f6 in storeAufsReadDone (fd=25, my_data=0x87e8608, 
> buf=0x8842ae0 "\003\177", len=439, errflag=0) at aufs/store_io_aufs.c:400
> #5  0x080c94a1 in aioCheckCallbacks (SD=0x85ca9f8) at aufs/async_io.c:319
> #6  0x080b56bf in storeDirCallback () at store_dir.c:511
> #7  0x0806f357 in comm_select (msec=379) at comm_generic.c:377
> #8  0x0809a618 in main (argc=2, argv=0xbfae3da4) at main.c:856
> 
> <-
> 
> Best Regards,
> 
> Gleidson Antonio Henriques
> 
> - Original Message - 
> From: "Adrian Chadd" <[EMAIL PROTECTED]>
> To: "Gleidson Antonio Henriques" <[EMAIL PROTECTED]>
> Cc: "Adrian Chadd" <[EMAIL PROTECTED]>; 
> Sent: Wednesday, November 28, 2007 3:03 PM
> Subject: Re: [squid-users] looking for testers: google 
> maps/earth/youtubecaching
> 
> 
> >Hm, can you get a stack trace at all?
> >
> >It's not hard in gdb; just put this in ~/.gdbinit
> >
> >handle all nostop noprint
> >handle SIGSEGV stop
> >break xassert
> >break fatal
> >
> >then gdb squid
> >run -ND
> >.. wait.
> >
> >
> >
> >
> >Adrian
> >
> >On Wed, Nov 28, 2007, Gleidson Antonio Henriques wrote:
> >>Adrian,
> >>
> >> With the latest squid-2.HEAD snapshot, I've got a Segmentation fault in 
> >>Squid
> >>when I try to access youtube.
> >> If I try another URL that isn't in store_rewrite_list, it works well.
> >> Do you need any more information about it? I tried gdb, but squid 
> >>running
> >>in threads is so difficult to trace.
> >> I used the same ./configure options that I had used in squid-2.HEAD-20071126.
> >>
> >> Best regards,
> >>
> >>Gleidson Antonio Henriques
> >>
> >>- Original Message - 
> >>From: "Adrian Chadd" <[EMAIL PROTECTED]>
> >>To: "Gleidson Antonio Henriques" <[EMAIL PROTECTED]>
> >>Cc: 
> >>Sent: Wednesday, November 28, 2007 1:47 PM
> >>Subject: Re: [squid-users] looking for testers: google
> >>maps/earth/youtubecaching
> >>
> >>
> >>>On Wed, Nov 28, 2007, Gleidson Antonio Henriques wrote:
> Hey Adrian,
> 
>    For me google maps and google earth are working like a charm ! So 
>  many
> HITS and SWAPOUTS !
> >>>
> >>>That's great news! That's two users who have reported success with
> >>>this work. Fantastic.
> >>>
> >>>Who else is game to give it a go?
> >>>
>    I know that Youtube part is incomplete for now, but for me it works
> sometimes.
>    Sometimes my squid screen shows this message, and when it occurs the
>    content is served from the youtube source even if I have the video in
> cache. For your knowledge I put the line below.
> 
>    2007/11/28 11:18:45| storeClientReadHeader: URL mismatch
> 2007/11/28 11:18:45|
> {http://74.125.1.37/get_video?video_id=ZQkHAytCmFY&origin=dal-v26.dal.youtube.com}
> !=
> {http://74.125.1.101/get_video?video_id=ZQkHAytCmFY&origin=dal-v26.dal.youtube.com}
> >>>
> >>>Try upgrading to the latest Squid-2.HEAD snapshot; I think I've fixed
> >>>this.
> >>>Let me know if not and I'll take another look at it.
> >>>
> >>>
> >>>
> >>>Adrian
> >>>
> >>>-- 
> >>>- Xenion - http://www.xenion.com.au/ - VPS Hosting - Commercial Squid
> >>>Support -
> >
> >-- 
> >- Xenion - http://www.xenion.com.au/ - VPS Hosting - Commercial Squid 
> >Support -
> >- $25/pm entry-level VPSes w/ capped bandwidth charges available in WA - 

-- 
- Xenion - http://www.xenion.com.au/ - VPS Hosting - Commercial Squid Support -
- $25/pm entry-level VPSes w/ capped bandwidth charges available in WA -


[squid-users] wbinfo_group.pl - This ever happen to anyone?

2007-11-28 Thread Terry Dobbs
Ok, so I have wbinfo_group.pl working nicely on our local squid box, it
blocks users belonging to a particular group.

However, I have done the exact same thing on a remote box in the USA and
it doesn't want to work. When I run wbinfo -r username I get no results;
I used to get "could not get groups for user", but now nothing is
returned at all. I am pretty sure this is what is causing wbinfo_group.pl
not to work. The logs don't give me much useful information.

On the local box where wbinfo_group.pl is working, I get the error "Could
Not convert SID='S-1-5-21-1122444-424242525-5353622-42124- User(1) to
gid" when I run it manually, but I thought nothing of it since it works
anyway. The same thing happens on the box that isn't working, but that
box will not even do a wbinfo -r.

I have verified that I do indeed have idmap uid and gid ranges mapped in
smb.conf. I have used the local Domain Controller at the remote site to
authenticate, thinking it may be timing out or something (is there a
timeout?). I have googled like crazy; I know this is an SMB issue, but I'm
wondering if anyone has ever had a similar issue in their squid setup.

I would be VERY grateful if anyone has any kind of insight.
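
In case it helps anyone comparing two boxes like this: a few winbindd
sanity checks run by hand usually narrow down where the chain breaks
(the group name and SID below are placeholders, not values from this
thread):

```
wbinfo -t            # verify the machine trust account secret
wbinfo -p            # ping winbindd itself
wbinfo -u | head     # can winbindd enumerate domain users?
wbinfo -g | head     # ...and groups?
wbinfo -n "NoInternet"       # group name -> SID
wbinfo -Y "S-1-5-21-..."     # SID -> unix gid (needs idmap ranges)
wbinfo -r 'DOMAIN\username'  # the call wbinfo_group.pl relies on
```

If -t or -p already fails, the problem is below squid entirely (domain
join / winbindd), not the helper script.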


Re: [squid-users] looking for testers: google maps/earth/youtubecaching

2007-11-28 Thread Gleidson Antonio Henriques

Here we go,

->

(gdb) bt
#0  0x004ebd48 in strcmp () from /lib/libc.so.6
#1  0x0806aa6d in clientCacheHit (data=0x8840760, rep=0x8841720) at 
client_side.c:2192
#2  0x080b3ec8 in storeClientCallback (sc=0x88417e8, sz=312) at 
store_client.c:146
#3  0x080b482e in storeClientReadHeader (data=0x88417e8, buf=0x8841828 "", 
len=439) at store_client.c:489
#4  0x080c70f6 in storeAufsReadDone (fd=25, my_data=0x87e8608, buf=0x8842ae0 
"\003\177", len=439, errflag=0) at aufs/store_io_aufs.c:400

#5  0x080c94a1 in aioCheckCallbacks (SD=0x85ca9f8) at aufs/async_io.c:319
#6  0x080b56bf in storeDirCallback () at store_dir.c:511
#7  0x0806f357 in comm_select (msec=379) at comm_generic.c:377
#8  0x0809a618 in main (argc=2, argv=0xbfae3da4) at main.c:856

<-

Best Regards,

Gleidson Antonio Henriques

- Original Message - 
From: "Adrian Chadd" <[EMAIL PROTECTED]>

To: "Gleidson Antonio Henriques" <[EMAIL PROTECTED]>
Cc: "Adrian Chadd" <[EMAIL PROTECTED]>; 
Sent: Wednesday, November 28, 2007 3:03 PM
Subject: Re: [squid-users] looking for testers: google 
maps/earth/youtubecaching




Hm, can you get a stack trace at all?

It's not hard in gdb; just put this in ~/.gdbinit:

handle all nostop noprint
handle SIGSEGV stop
break xassert
break fatal

then gdb squid
run -ND
.. wait.




Adrian

On Wed, Nov 28, 2007, Gleidson Antonio Henriques wrote:

Adrian,

 With the latest Squid-2.HEAD snapshot, I've got a segmentation fault in 
Squid

when trying to access YouTube.
 If I try another URL that isn't in store_rewrite_list, it works well.
 Do you need any more information about it? I tried gdb, but Squid 
running

in threads is difficult to trace.
 I used the same ./configure options that I had used in squid-2.HEAD-20071126.

 Best regards,

Gleidson Antonio Henriques

- Original Message - 
From: "Adrian Chadd" <[EMAIL PROTECTED]>

To: "Gleidson Antonio Henriques" <[EMAIL PROTECTED]>
Cc: 
Sent: Wednesday, November 28, 2007 1:47 PM
Subject: Re: [squid-users] looking for testers: google
maps/earth/youtubecaching


>On Wed, Nov 28, 2007, Gleidson Antonio Henriques wrote:
>>Hey Adrian,
>>
>>   For me google maps and google earth are working like a charm ! So 
>> many

>>HITS and SWAPOUTS !
>
>That's great news! That's two users who have reported success with
>this work. Fantastic.
>
>Who else is game to give it a go?
>
>>   I know that Youtube part is incomplete for now, but for me it works
>>sometimes.
>>   Sometimes my squid screen shows this message, and when it occurs the
>>   content is served from the youtube source even if I have the video in
>>cache. For your knowledge I put the line below.
>>
>>   2007/11/28 11:18:45| storeClientReadHeader: URL mismatch
>>2007/11/28 11:18:45|
>>{http://74.125.1.37/get_video?video_id=ZQkHAytCmFY&origin=dal-v26.dal.youtube.com}
>>!=
>>{http://74.125.1.101/get_video?video_id=ZQkHAytCmFY&origin=dal-v26.dal.youtube.com}
>
>Try upgrading to the latest Squid-2.HEAD snapshot; I think I've fixed
>this.
>Let me know if not and I'll take another look at it.
>
>
>
>Adrian
>
>-- 
>- Xenion - http://www.xenion.com.au/ - VPS Hosting - Commercial Squid

>Support -


--
- Xenion - http://www.xenion.com.au/ - VPS Hosting - Commercial Squid 
Support -
- $25/pm entry-level VPSes w/ capped bandwidth charges available in WA - 




Re: [squid-users] looking for testers: google maps/earth/youtubecaching

2007-11-28 Thread Adrian Chadd
Hm, can you get a stack trace at all?

It's not hard in gdb; just put this in ~/.gdbinit:

handle all nostop noprint
handle SIGSEGV stop
break xassert
break fatal

then gdb squid
run -ND
.. wait.




Adrian

On Wed, Nov 28, 2007, Gleidson Antonio Henriques wrote:
> Adrian,
> 
>  With the latest Squid-2.HEAD snapshot, I've got a segmentation fault in Squid 
> when trying to access YouTube.
>  If I try another URL that isn't in store_rewrite_list, it works well.
>  Do you need any more information about it? I tried gdb, but Squid running 
> in threads is difficult to trace.
>  I used the same ./configure options that I had used in squid-2.HEAD-20071126.
> 
>  Best regards,
> 
> Gleidson Antonio Henriques
> 
> - Original Message - 
> From: "Adrian Chadd" <[EMAIL PROTECTED]>
> To: "Gleidson Antonio Henriques" <[EMAIL PROTECTED]>
> Cc: 
> Sent: Wednesday, November 28, 2007 1:47 PM
> Subject: Re: [squid-users] looking for testers: google 
> maps/earth/youtubecaching
> 
> 
> >On Wed, Nov 28, 2007, Gleidson Antonio Henriques wrote:
> >>Hey Adrian,
> >>
> >>   For me google maps and google earth are working like a charm ! So many
> >>HITS and SWAPOUTS !
> >
> >That's great news! That's two users who have reported success with
> >this work. Fantastic.
> >
> >Who else is game to give it a go?
> >
> >>   I know that Youtube part is incomplete for now, but for me it works
> >>sometimes.
> >>   Sometimes my squid screen shows this message, and when it occurs the
> >>   content is served from the youtube source even if I have the video in
> >>cache. For your knowledge I put the line below.
> >>
> >>   2007/11/28 11:18:45| storeClientReadHeader: URL mismatch
> >>2007/11/28 11:18:45|
> >>{http://74.125.1.37/get_video?video_id=ZQkHAytCmFY&origin=dal-v26.dal.youtube.com}
> >>!=
> >>{http://74.125.1.101/get_video?video_id=ZQkHAytCmFY&origin=dal-v26.dal.youtube.com}
> >
> >Try upgrading to the latest Squid-2.HEAD snapshot; I think I've fixed 
> >this.
> >Let me know if not and I'll take another look at it.
> >
> >
> >
> >Adrian
> >
> >-- 
> >- Xenion - http://www.xenion.com.au/ - VPS Hosting - Commercial Squid 
> >Support - 

-- 
- Xenion - http://www.xenion.com.au/ - VPS Hosting - Commercial Squid Support -
- $25/pm entry-level VPSes w/ capped bandwidth charges available in WA -


Re: [squid-users] looking for testers: google maps/earth/youtubecaching

2007-11-28 Thread Gleidson Antonio Henriques

Adrian,

 With the latest Squid-2.HEAD snapshot, I've got a segmentation fault in Squid 
when trying to access YouTube.

 If I try another URL that isn't in store_rewrite_list, it works well.
 Do you need any more information about it? I tried gdb, but Squid running 
in threads is difficult to trace.

 I used the same ./configure options that I had used in squid-2.HEAD-20071126.

 Best regards,

Gleidson Antonio Henriques

- Original Message - 
From: "Adrian Chadd" <[EMAIL PROTECTED]>

To: "Gleidson Antonio Henriques" <[EMAIL PROTECTED]>
Cc: 
Sent: Wednesday, November 28, 2007 1:47 PM
Subject: Re: [squid-users] looking for testers: google 
maps/earth/youtubecaching




On Wed, Nov 28, 2007, Gleidson Antonio Henriques wrote:

Hey Adrian,

   For me google maps and google earth are working like a charm ! So many
HITS and SWAPOUTS !


That's great news! That's two users who have reported success with
this work. Fantastic.

Who else is game to give it a go?


   I know that Youtube part is incomplete for now, but for me it works
sometimes.
   Sometimes my squid screen shows this message, and when it occurs the
content is served from the youtube source even if I have the video in
cache. For your knowledge I put the line below.

   2007/11/28 11:18:45| storeClientReadHeader: URL mismatch
2007/11/28 11:18:45|
{http://74.125.1.37/get_video?video_id=ZQkHAytCmFY&origin=dal-v26.dal.youtube.com}
!=
{http://74.125.1.101/get_video?video_id=ZQkHAytCmFY&origin=dal-v26.dal.youtube.com}


Try upgrading to the latest Squid-2.HEAD snapshot; I think I've fixed 
this.

Let me know if not and I'll take another look at it.



Adrian

--
- Xenion - http://www.xenion.com.au/ - VPS Hosting - Commercial Squid 
Support - 




Re: [squid-users] looking for testers: google maps/earth/youtubecaching

2007-11-28 Thread Andreas Pettersson

Gleidson Antonio Henriques wrote:
   For me google maps and google earth are working like a charm ! So 
many HITS and SWAPOUTS !


When you write "Google Earth", do you mean the full-blown client or some 
kind of web page or browser plugin? When I run GE, all I see in 
access.log is MISSes all the way, like the one below. Adrian pointed out 
in a chat that GE isn't yet cacheable at all.


1196172074.516136 192.168.1.20 TCP_MISS/200 797 GET 
http://kh.google.com/flatfile?f1c-0203020030103-t.133 - 
DIRECT/64.233.161.91 application/octet-stream



--
Andreas




Re: [squid-users] looking for testers: google maps/earth/youtubecaching

2007-11-28 Thread Adrian Chadd
On Wed, Nov 28, 2007, Gleidson Antonio Henriques wrote:
> Hey Adrian,
> 
>For me google maps and google earth are working like a charm ! So many 
> HITS and SWAPOUTS !

That's great news! That's two users who have reported success with
this work. Fantastic.

Who else is game to give it a go?

>I know that Youtube part is incomplete for now, but for me it works 
> sometimes.
>Sometimes my squid screen shows this message, and when it occurs the
>content is served from the youtube source even if I have the video in
>cache. For your knowledge I put the line below.
> 
>2007/11/28 11:18:45| storeClientReadHeader: URL mismatch
> 2007/11/28 11:18:45| 
> {http://74.125.1.37/get_video?video_id=ZQkHAytCmFY&origin=dal-v26.dal.youtube.com}
>  
> != 
> {http://74.125.1.101/get_video?video_id=ZQkHAytCmFY&origin=dal-v26.dal.youtube.com}

Try upgrading to the latest Squid-2.HEAD snapshot; I think I've fixed this.
Let me know if not and I'll take another look at it.



Adrian

-- 
- Xenion - http://www.xenion.com.au/ - VPS Hosting - Commercial Squid Support -


Re: [squid-users] Concurrent question

2007-11-28 Thread Tek Bahadur Limbu

Hi Monah,

Monah Baki wrote:

Hi all,

I'm running squid 2.6 stable 16 on a Pentium III 500Mhz with 512MB RAM, 
IDE HDD, installed FreeBSD 6.3 with the following:


--enable-storeio=ufs,diskd,null --enable-underscores --with-large-files 
--enable-large-cache-files --enable-delay-pools --disable-ident-lookups 
--enable-snmp --enable-removal-policies --enable-async-io --enable-kqueue


I would add the following compilation parameters to --enable-storeio:

 '--enable-storeio=ufs,coss,diskd,aufs,null'

Just in case, you may want to try the aufs or coss storage schemes.

As far as I know, if you include aufs in --enable-storeio, then you 
don't need the "--enable-async-io" parameter.





Added into the /boot/loader.conf:

kern.ipc.nmbclusters: 32768
kern.maxfiles=65536
kern.maxfilesperproc=32768
net.inet.ip.portrange.last: 65535


I suggest increasing kern.ipc.nmbclusters to at least 65536. I have 
faced a shortage of mbufs in FreeBSD too often!
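
For completeness, a sketch of what those tunables might look like in
loader.conf form (values illustrative; note that loader.conf entries
take the name="value" form, and net.inet.ip.portrange.last is a runtime
sysctl for /etc/sysctl.conf rather than a loader tunable):

```
# /boot/loader.conf -- illustrative values, size to your RAM
kern.ipc.nmbclusters="65536"
kern.maxfiles="65536"
kern.maxfilesperproc="32768"
```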






Compiled kernel with these options:
options SHMSEG=16
options SHMMNI=32
options SHMMAX=2097152
options SHMALL=4096
options MAXFILES=8192


I'm also running Dans Guardian on it too.




My question is approximately how many users can I proxy for?


From my experience, if you don't have too many or overly complicated 
filtering rules in either DansGuardian or Squid, it should scale to 
about 200 - 500 users.


A lot will also depend on your internet connection link and your users' 
browsing habits. The size of the bandwidth pipe and its medium will also 
determine how many users your proxy can handle.


And of course as Adrian mentioned, active monitoring and collecting 
statistics from Squid and your FreeBSD machine via SNMP and MRTG/RRD 
will help you out.



Thanking you...




Thanks


BSD Networking, Microsoft Notworking


Cool phrase!!!











--

With best regards and good wishes,

Yours sincerely,

Tek Bahadur Limbu

System Administrator

(TAG/TDG Group)
Jwl Systems Department

Worldlink Communications Pvt. Ltd.

Jawalakhel, Nepal

http://www.wlink.com.np

http://teklimbu.wordpress.com


RE: [squid-users] Anyone Use wbinfo_group.pl?

2007-11-28 Thread Terry Dobbs
What exactly do you mean?

Should I set it up like this?
external_acl_type ntgroup %LOGIN /usr/lib/squid/wbinfo_group.pl
acl NoInternet external ntgroup NoInternet

http_access deny NoInternet ALL

So by default the last thing on the line is the auth check? What exactly
does the ALL do to make the prompt not pop up (it appears to work, btw)?

Also, when changing group membership in AD, do I have to reload squid,
samba, and winbind for the changes to take effect? Is there any way
(other than editing the default squid error page) to redirect users to a
page if they are blocked? I do this with squidGuard; I'm not sure whether
it's possible with this script/squid.

Thanks


-Original Message-
From: Amos Jeffries [mailto:[EMAIL PROTECTED] 
Sent: Wednesday, November 28, 2007 3:15 AM
To: Terry Dobbs
Cc: squid-users@squid-cache.org
Subject: Re: [squid-users] Anyone Use wbinfo_group.pl?

Terry Dobbs wrote:
> Hey
> 
> I have a transparent proxy setup using squid, winbind, samba, etc... I
> got sick of manually blocking IP addresses from accessing the internet
> and stumbled across an article (thank god for google!) that allows
> access based on AD Group.
> 
> It pretty much looks like...
> 
> external_acl_type ntgroup %LOGIN /usr/lib/squid/wbinfo_group.pl
> acl NoInternet external ntgroup NoInternet
> 
> Then there is the http_access deny line that denies the NoInternet
> group.
> 
> This seems to work fine, if a user belongs to the NoInternet group
they
> are prompted for Username/Password and even if they put in the correct
> credentials they aren't allowed to go anywhere. 
> 
> My question is, instead of prompting for username/password if a user
> belongs to the group, how do I just redirect them to a page? No other
> time is my users prompted for authentication as it uses the NT "pass
> through" credentials, so not sure why it wants to prompt now.
> 
> Hoping someone out there is doing something similar? 

The credentials are asked for again because auth is the last option to 
complete the http_access rule.

There is a hack/workaround of adding 'all' as the last item on the line 
which apparently prevents the credentials being sought if they fail the 
first time.

I suspect your other rules go something like
   http_access !noauth localnet
which has the same effect of not requesting again on failure.

Amos
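
A minimal sketch of how the pieces above could fit together in
squid.conf (the redirect URL and group name here are placeholders):

```
external_acl_type ntgroup %LOGIN /usr/lib/squid/wbinfo_group.pl
acl NoInternet external ntgroup NoInternet

# 'all' as the last item stops squid from re-challenging for credentials
http_access deny NoInternet all

# send denied users to a friendly page instead of the stock error page
deny_info http://intranet.example/blocked.html NoInternet
```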


[squid-users] new squid setup, optimizing cache

2007-11-28 Thread Dave

Hello,
   I'm setting up squid on a new machine, this one an OpenBSD 4.2 router. 
The install went fine, but now I come to configuring squid. I gave the 
squid cache its own partition, in this case /dev/wd0h on the first drive; 
it has soft updates enabled and is 17.6 GB in size. I was wondering what 
the best squid optimizations are.
   I read that the cache size shouldn't exceed 80% of the total space, so 
would that be 80% of the 17.6 GB partition? And when entering this on the 
cache_dir line, should the size be approximately 15000, or should I append 
a suffix, 15g for instance, or use the entire disk? For the memory and 
cache replacement policies, heap LFUDA, or is there a better choice?
   Unrelated to the cache: for my logformat I have the squid logformat 
uncommented; I'd like it to display its timestamps in local time for 
interpreting at a glance. Is this doable?

Thanks.
Dave.
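
For what it's worth, a hedged sketch of the relevant squid.conf lines
(paths and sizes are illustrative; cache_dir takes a plain megabyte
figure with no "g" suffix, so ~80% of 17.6 GB is roughly 14000):

```
cache_dir ufs /cache 14000 16 256

cache_replacement_policy heap LFUDA
memory_replacement_policy heap GDSF

# a custom logformat using %tl logs local time instead of epoch seconds
logformat localtime %tl %6tr %>a %Ss/%03Hs %<st %rm %ru %un %Sh/%<A %mt
access_log /var/log/squid/access.log localtime
```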



Re: [squid-users] looking for testers: google maps/earth/youtubecaching

2007-11-28 Thread Gleidson Antonio Henriques

Hey Adrian,

   For me google maps and google earth are working like a charm! So many 
HITS and SWAPOUTS!
   I know that the YouTube part is incomplete for now, but for me it works 
sometimes.
   Sometimes my squid screen shows this message, and when it occurs the 
content is served from the youtube source even if I have the video in 
cache. For your knowledge I put the line below.


   2007/11/28 11:18:45| storeClientReadHeader: URL mismatch
2007/11/28 11:18:45| 
{http://74.125.1.37/get_video?video_id=ZQkHAytCmFY&origin=dal-v26.dal.youtube.com} 
!= 
{http://74.125.1.101/get_video?video_id=ZQkHAytCmFY&origin=dal-v26.dal.youtube.com}


   Best regards,

Gleidson Antonio Henriques

- Original Message - 
From: "Henrik Nordstrom" <[EMAIL PROTECTED]>

To: "Andreas Pettersson" <[EMAIL PROTECTED]>
Cc: "Adrian Chadd" <[EMAIL PROTECTED]>; 
Sent: Tuesday, November 27, 2007 12:54 PM
Subject: Re: [squid-users] looking for testers: google 
maps/earth/youtubecaching





AW: [squid-users] Making a Squid Start Page

2007-11-28 Thread Markus.Rietzler
A simple solution would be to add a time flag in the Perl script and check
against it. I don't know whether you can achieve this via Squid itself...

in this case 2 hours (7200):

  if (!defined($logged_in{$_})) {
    $logged_in{$_} = time();
    print "ERR\n";
  } elsif ($logged_in{$_} < time() - 7200) {
    $logged_in{$_} = time();  # restart the window, or ERR repeats forever
    print "ERR\n";
  } else {
    print "OK\n";
  }
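
The same idea as a self-contained sketch in Python (a hypothetical
stand-in for the Perl helper, not squid code; squid writes one key per
line, e.g. %LOGIN, and expects an OK/ERR reply per line):

```python
import sys
import time

SESSION_TTL = 7200  # seconds; the 2-hour window from the example above


def check(key, seen, now, ttl=SESSION_TTL):
    """Return 'ERR' (show the splash page) for a first-time key or one
    whose window has expired, 'OK' otherwise.  `seen` maps each key to
    the time the splash page was last shown for it."""
    last = seen.get(key)
    if last is None or now - last > ttl:
        seen[key] = now  # (re)start the window, or the splash repeats forever
        return "ERR"
    return "OK"


def main():
    # Helper protocol: one key per line on stdin, one OK/ERR per line out.
    seen = {}
    for line in sys.stdin:
        print(check(line.strip(), seen, time.time()))
        sys.stdout.flush()  # squid blocks waiting for the reply

# To use as an external_acl_type helper, call main() when run as a script.
```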

>-Ursprüngliche Nachricht-
>Von: Reid [mailto:[EMAIL PROTECTED] 
>Gesendet: Dienstag, 27. November 2007 18:15
>An: squid-users@squid-cache.org
>Betreff: [squid-users] Making a Squid Start Page
>
>I am trying to create a start page so that Squid will display 
>a disclaimer when clients login. 
>
>I found the following script on a previous posting, and put it 
>on my server:
>
>#!/usr/bin/perl
>$|=1;
>my %logged_in;
>
>while(<>) {
>  if (!defined($logged_in{$_})) {
>$logged_in{$_} = 1;
>print "ERR\n";
>  } else {
>print "OK\n";
>  }
>}
>
>
>I gave it proper permissions, and then put the following in my 
>squid.conf:
>
>
>external_acl_type session negative_ttl=0 %LOGIN 
>/usr/lib/squid/my_squid_session.pl
>acl session external session
>http_access deny !session
>deny_info http://##.##.##.##/startpage.php session
>
>
>Using this configuration, the script will redirect the client 
>the first time logging in, but not
>again. Even 24 hours later and after deleting cookies, if I 
>login again I am not redirected. I
>have to restart squid in order to get another redirect to take place.
>
>So I changed the first line to: 
>
>external_acl_type session ttl=300 negative_ttl=0 %LOGIN 
>/usr/lib/squid/my_squid_session.pl
>
>That doesn't seem to make a difference.  
>
>Any ideas? I am trying to avoid using the "squid_session" 
>helper because I get all sorts of errors
>when using it together with digest_authentication.
>
>Thank you!
>
>
>  
>___
>_
>Never miss a thing.  Make Yahoo your home page. 
>http://www.yahoo.com/r/hs
>


[squid-users] error: Could not get groups for user

2007-11-28 Thread Francisco Martinez Espadas

Hello,

For a few days now I have been having problems with users trying to
access the Internet through the proxy.

I have squid (2.5 stable 14) with user validation against Active
Directory. I have a single domain and a sub-domain. Both have an
attribute that marks users who are granted Internet access: "Internet Default"
or "SUBDOMAIN\Internet Default".

User access control is defined in "squid.conf" this way:

auth_param ntlm program /usr/bin/ntlm_auth 
--helper-protocol=squid-2.5-ntlmssp
auth_param ntlm children 30
auth_param basic children 5
auth_param basic realm DOMAIN
auth_param basic credentialsttl 2 hour

external_acl_type wb_group ttl=900 %LOGIN /usr/lib/squid/wbinfo_group.pl

acl ACCES_INTERNET external wb_group "/etc/squid/grupo-AD"

This is the content of my "/etc/squid/grupo-AD" file:
"Internet Default" 
"SUBDOMAIN\Internet Default"

The problem is that users on the main domain have access to the Internet,
but some users on the subdomain don't. They get an ERR_ACCESS_DENIED
error:

This is what is logged in "cache.log":
Could not get groups for user SUBDOMAIN\\user1


If I run "/usr/lib/squid/wbinfo_group.pl" from the command line (assuming
user2 is working and user1 is not)

with these parameters:
SUBDOMAIN\user1 "\" SUBDOMAIN\\internet default\""
Sending OK to squid

SUBDOMAIN\user2 "\" SUBDOMAIN\\internet default\""
Sending OK to squid

And the same but with two backslashes:
SUBDOMAIN\\user1 "\" SUBDOMAIN\\internet default\""
Could not get groups for user SUBDOMAIN\\user1
Sending ERR to squid

SUBDOMAIN\\user2 "\" SUBDOMAIN\\internet default\""
Sending OK to squid


This is the content of the file "/usr/lib/squid/wbinfo_group.pl":

> sub check {
> local($user, $group) = @_;
> $groupSID = `wbinfo -n "$group" | cut -d" " -f1`;
> chop $groupSID;
> $groupGID = `wbinfo -Y "$groupSID"`;
> chop $groupGID;
> &debug( "User: -$user-\nGroup: -$group-\nSID:
> -$groupSID-\nGID: -$groupGID-");
> return 'OK' if(`wbinfo -r \Q$user\E` =~ /^$groupGID$/m);
> return 'ERR';
> }
> 
> # 
> # Main loop
> #
> while (<STDIN>) {
> chop;
> &debug ("Got $_ from squid");
> if( $_ =~ /^"?([^"]+)"? / ) {
> $user = $1;
> }
> if( $_ =~ /(( "?\\"[^"]+\\""?)+)/i ) {
> $groups = $1;
> }
> s/"\\/\\/g for $groups;
> s/""/"/g for $groups;
> s/\\ / /g for $groups;
> $groups = substr($groups, 3, length($groups)-5);
> @groups = split /\\" \\"/, $groups;
> foreach $group(@groups) {
> $ans = &check($user, $group);
> last if($ans eq 'OK');
> }
> &debug ("Sending $ans to squid");
> print "$ans\n";
> } 
> 

Any help please?

Thank you so much

OS: Ubuntu 6.06
Squid 2.5 STABLE 14
LDAP: Active Directory (Windows 2003)



AW: AW: AW: [squid-users] Authentication on Active Directory

2007-11-28 Thread Ralf.Lutz

Isnard, I think I have a problem with samba/winbind. I tried squid using the 
squid_unix_group helper on the machine I've configured with Kerberos, and it 
worked. Then I configured samba on a test machine that was unconfigured 
before, tried wbinfo -g, and I get error messages. So I think squid and 
squid_unix_group had no problem yesterday, but Samba didn't work well. I am 
now trying to get Samba working without Kerberos (I think it worked 
yesterday after I configured Kerberos).

Regards, Ralf


Re: [squid-users] Incoming plain connection, outgoing ssl connection

2007-11-28 Thread Davide Vernizzi

On Wed, 2007-11-28 at 21:26 +1300, Amos Jeffries wrote:
> Davide Vernizzi wrote:
> > On Wed, 2007-11-28 at 11:35 +1300, Amos Jeffries wrote:
> >>> Hi,
> >>>
> >>> can I have an incoming connection on the port 80 and forward it to the
> >>> destination on another port within a ssl connection?
> >>>
> >>> I try to explain with a diagram:
> >>>
> >>> Web
> >>> Browser > Squid => Destination
> >>>   httphttps
> >>>   portport
> >>>80  443
> >>>
> >>> Many thanks.
> >>>
> >> Yes. It happens on two occasions:
> >>  - ssl options set on the cache_peer config.
> >>  - client requests a URI from cache as https://...
> >>
> >>
> >> Amos
> > 
> > Thanks Amos,
> > 
> > what I wanted to know is if it is possible to force this behavior so
> > that the web browser requests a specified web page and squid redirects
> > this request to another server using a SSL connection.
> > 
> > Original
> >   destination
> >   (x.x.x.x:P)
> > 
> > 
> >  Web
> >  Browser > Squid =>   Squid-modified
> > httphttps   destination
> >   request to(y.y.y.y:Q)
> > for   y.y.y.y:Q
> >  x.x.x.x:P
> > 
> > Thanks.
> > 
> 
> If it's for a small set of pre-known domains or URIs, it's easy.
> 
> That's exactly how the cache_peer way works. The alternative, under the 
> same conditions, is a custom redirector that can change the URI to pretty 
> much anything before squid opens its outbound link.

OK, that's what I wanted to do. I'll have a look at cache_peer and
give it a try.

Thanks.

-- 
Davide


smime.p7s
Description: S/MIME cryptographic signature


Re: [squid-users] Incoming plain connection, outgoing ssl connection

2007-11-28 Thread Amos Jeffries

Davide Vernizzi wrote:

On Wed, 2007-11-28 at 11:35 +1300, Amos Jeffries wrote:

Hi,

can I have an incoming connection on the port 80 and forward it to the
destination on another port within a ssl connection?

I try to explain with a diagram:

Web
Browser > Squid => Destination
httphttps
portport
 80  443

Many thanks.


Yes. It happens on two occasions:
 - ssl options set on the cache_peer config.
 - client requests a URI from cache as https://...


Amos


Thanks Amos,

what I wanted to know is if it is possible to force this behavior so
that the web browser requests a specified web page and squid redirects
this request to another server using a SSL connection.

Original
  destination
  (x.x.x.x:P)


 Web
 Browser > Squid =>   Squid-modified
httphttps   destination
  request to(y.y.y.y:Q)
for   y.y.y.y:Q
 x.x.x.x:P

Thanks.



If it's for a small set of pre-known domains or URIs, it's easy.

That's exactly how the cache_peer way works. The alternative, under the 
same conditions, is a custom redirector that can change the URI to pretty 
much anything before squid opens its outbound link.



Amos
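
A minimal sketch of the cache_peer variant, reusing the placeholder
address from the diagram (the hostname is illustrative, and the ssl
option requires squid built with --enable-ssl):

```
# accept plain HTTP from the browser
http_port 80 accel defaultsite=www.example.com

# hand every request to the real server over TLS on port 443
cache_peer y.y.y.y parent 443 0 no-query originserver ssl name=secure_origin
cache_peer_access secure_origin allow all
never_direct allow all
```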


Re: [squid-users] Incoming plain connection, outgoing ssl connection

2007-11-28 Thread Davide Vernizzi

On Wed, 2007-11-28 at 11:35 +1300, Amos Jeffries wrote:
> > Hi,
> >
> > can I have an incoming connection on the port 80 and forward it to the
> > destination on another port within a ssl connection?
> >
> > I try to explain with a diagram:
> >
> > Web
> > Browser > Squid => Destination
> > httphttps
> > portport
> >  80  443
> >
> > Many thanks.
> >
> 
> Yes. It happens on two occasions:
>  - ssl options set on the cache_peer config.
>  - client requests a URI from cache as https://...
> 
> 
> Amos

Thanks Amos,

what I wanted to know is if it is possible to force this behavior so
that the web browser requests a specified web page and squid redirects
this request to another server using a SSL connection.

Original
  destination
  (x.x.x.x:P)


 Web
 Browser > Squid =>   Squid-modified
httphttps   destination
  request to(y.y.y.y:Q)
for   y.y.y.y:Q
 x.x.x.x:P

Thanks.

-- 
Davide


smime.p7s
Description: S/MIME cryptographic signature


Re: [squid-users] Anyone Use wbinfo_group.pl?

2007-11-28 Thread Amos Jeffries

Terry Dobbs wrote:

Hey

I have a transparent proxy setup using squid, winbind, samba, etc... I
got sick of manually blocking IP addresses from accessing the internet
and stumbled across an article (thank god for google!) that allows
access based on AD Group.

It pretty much looks like...

external_acl_type ntgroup %LOGIN /usr/lib/squid/wbinfo_group.pl
acl NoInternet external ntgroup NoInternet

Then there is the http_access deny line that denies the NoInternet
group.

This seems to work fine, if a user belongs to the NoInternet group they
are prompted for Username/Password and even if they put in the correct
credentials they aren't allowed to go anywhere. 


My question is, instead of prompting for username/password if a user
belongs to the group, how do I just redirect them to a page? No other
time is my users prompted for authentication as it uses the NT "pass
through" credentials, so not sure why it wants to prompt now.

Hoping someone out there is doing something similar? 


The credentials are asked for again because auth is the last option to 
complete the http_access rule.


There is a hack/workaround of adding 'all' as the last item on the line 
which apparently prevents the credentials being sought if they fail the 
first time.


I suspect your other rules go something like
  http_access !noauth localnet
which has the same effect of not requesting again on failure.

Amos