[squid-users] blogger.com

2009-02-02 Thread dhottinger
I'm having to sift through my access logs to see who may have posted  
some content on a blogger site during working hours.  My question is: I  
see HTTP GET and HTTP POST in the access logs.  I haven't been able to  
determine an exact URL for the site.  Would it be safe to say that a POST  
is an entry?  My sarg logs don't go into that much depth when I run sarg from  
the command line.


thanks,

ddh
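
For what it's worth, one quick way to pull likely form submissions out of a native-format access.log is a grep/awk pass like the following (a rough sketch; the log path and the blogger/blogspot host patterns are assumptions):

  grep " POST " /var/log/squid/access.log | egrep -i "blogger\.com|blogspot\.com" | awk '{print $3, $7}'   # prints client IP and URL of each POST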


--
Dwayne Hottinger
Network Administrator
Harrisonburg City Public Schools

Everything should be made as simple as possible, but not simpler.
-- Albert Einstein

The hottest places in Hell are reserved for those who, in times of moral
crisis, preserved their neutrality.
-- Dante



Re: [squid-users] About PHP proxy

2008-04-12 Thread dhottinger

Quoting Marcus Kool [EMAIL PROTECTED]:


Dwayne,

If you do not redirect+filter HTTPS you can never block
HTTPS-based proxies.  To be able to filter HTTPS the
browsers must be configured to use Squid for HTTP and HTTPS.
Once Squid also proxies the HTTPS traffic, you may use
ufdbGuard.

ufdbGuard is a free redirector which can block HTTPS traffic by
- optionally blocking URLs with an IP address
- optionally blocking sites without a properly signed SSL certificate
- optionally blocking SSH and other tunnels that use HTTPS
- optionally use a URL database

ufdbGuard supports free URL databases and a commercial URL database.

-Marcus

Yes sir, I am aware of this.  For the present I'm running squid as a  
transparent proxy.  I do have plans to change this in the future  
(meaning next summer).  Thanks for the heads-up.  I've looked very  
longingly at ufdbGuard.


ddh
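
For reference, hooking a redirector such as ufdbGuard into Squid is only a couple of squid.conf lines; a minimal sketch, where the helper path and child count are assumptions, and the directive is named redirect_program in 2.5 and url_rewrite_program in 2.6+:

  redirect_program /usr/local/ufdbguard/bin/ufdbgclient
  redirect_children 8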




Re: [squid-users] About PHP proxy

2008-04-11 Thread dhottinger

Quoting Amos Jeffries [EMAIL PROTECTED]:


Tarak Ranjan wrote:

Hi List;
It's really surprising to me that my proxy has been
bypassed by one of the users using a proxybuilder
proxy. What it's doing is that that particular PHP-based
proxy rewrites the MIME type. The request still goes
through my actual proxy server, but because the
script is rewriting the MIME type, everything comes through
as text/html.

As a result, whatever MIME-type-based ACLs I have in my
server are ignored, and that person has
access to the blocked URLs.

Has anyone faced this kind of situation?



Yes, many have. It's an old and never-ending battle for those who are  
involved.


You could try enumerating all the badness, as most beginners do. You
could throw in the towel early and cease to care about your users'
wellbeing. Or you could play a bit with the serious avoiders.

Just imagine redirecting all porn-site downloads seamlessly to
tubgirl dot com, for one gross example.

Or if you have families to think of, building a kitten-net can be fun
http://ex-parrot.com/~pete/upside-down-ternet.html

Amos
--
Please use Squid 2.6.STABLE19 or 3.0.STABLE4


Just a quick question.  How would you redirect those requests if the  
proxy server doesn't recognize them?  Most of my users doing this are  
using HTTPS sites that don't go through my proxy server.  My firewall  
only redirects port 80 traffic to my proxy server.  If I could  
redirect these people that would be great.


ddh


--
Dwayne Hottinger
Network Administrator
Harrisonburg City Public Schools

Everything should be made as simple as possible, but not simpler.
-- Albert Einstein



Re: [squid-users] squid transparent proxy

2008-04-03 Thread dhottinger

Quoting Wennie V. Lagmay [EMAIL PROTECTED]:


Hi,

You are right, I am using port 8080. As I mentioned, I have 2 machines;  
the 1st machine is my Firewall/NAT server, where the iptables  
configuration already states that it should redirect port 80 to 8080:


iptables -t nat -A PREROUTING -s 192.168.10.0/255.255.255.0 -p tcp -m tcp --dport 80 -j REDIRECT --to-ports 8080
iptables -t nat -A PREROUTING -s 192.168.11.0/255.255.255.0 -p tcp -m tcp --dport 80 -j REDIRECT --to-ports 8080
iptables -t nat -A PREROUTING -s 192.168.12.0/255.255.255.0 -p tcp -m tcp --dport 80 -j REDIRECT --to-ports 8080
iptables -t nat -A PREROUTING -s 192.168.14.0/255.255.255.0 -p tcp -m tcp --dport 80 -j REDIRECT --to-ports 8080
iptables -t nat -A PREROUTING -s 192.168.15.0/255.255.255.0 -p tcp -m tcp --dport 80 -j REDIRECT --to-ports 8080
iptables -t nat -A PREROUTING -s 192.168.16.0/255.255.255.0 -p tcp -m tcp --dport 80 -j REDIRECT --to-ports 8080
iptables -t nat -A PREROUTING -s 192.168.24.0/255.255.248.0 -p tcp -m tcp --dport 80 -j REDIRECT --to-ports 8080
iptables -t nat -A PREROUTING -s 192.168.64.0/255.255.224.0 -p tcp -m tcp --dport 80 -j REDIRECT --to-ports 8080
iptables -t nat -A PREROUTING -s 192.168.96.0/255.255.224.0 -p tcp -m tcp --dport 80 -j REDIRECT --to-ports 8080


Are you sure that you have your squid.conf set to listen on port 8080  
and not the default of port 3128?  I run transparent on port 8080 with  
redirect from 80 to 8080 with no issues.


ddh
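
For comparison, the listening directive for an interception setup differs by version; a sketch, not anyone's actual config from this thread:

  # Squid 2.6 and later:
  http_port 8080 transparent
  # Squid 2.5 instead needs the accelerator-style options:
  #   http_port 8080
  #   httpd_accel_host virtual
  #   httpd_accel_port 80
  #   httpd_accel_with_proxy on
  #   httpd_accel_uses_host_header on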


--
Dwayne Hottinger
Network Administrator
Harrisonburg City Public Schools

Everything should be made as simple as possible, but not simpler.
-- Albert Einstein



Re: [squid-users] problem accessing http://www.sytadin.fr through squid

2008-02-14 Thread dhottinger



Quoting Frank Bonnet [EMAIL PROTECTED]:


Hello

I have a problem with this site: http://www.sytadin.fr
when going through Squid; if I access it directly I have no problem.

My Squid config is really a basic one.

Does anyone have the same problem?

see below the error message:

---

While trying to retrieve the URL: http://www.sytadin.fr/

The following error was encountered:

* Read Error

The system returned:

(54) Connection reset by peer

An error condition occurred while reading data from the network. Please
retry your request.



Thank you

Frank


Works fine here.  What version of Squid?


--
Dwayne Hottinger
Network Administrator
Harrisonburg City Public Schools

Everything should be made as simple as possible, but not simpler.
-- Albert Einstein



RE: [squid-users] tcp_miss/502

2008-01-09 Thread dhottinger

Quoting Scholten R. [EMAIL PROTECTED]:


wget 145.21.152.26
--16:21:59--  http://145.21.152.26/
Connecting to 145.21.152.26:80... connected.

But.. while digging further, it seems that squid still thinks that
http://adresgids.rijksweb.nl/servlet/DirXweb/Main.htm is located at the
old IP.
But even when I put adresgids.rijksweb.nl in /etc/hosts it still comes
back with the 502 :(
__
R.Scholten
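
One way to check whether Squid itself is still holding the old address is the cache manager's IP cache report; a sketch, assuming squidclient is installed and manager access is allowed from localhost:

  squidclient mgr:ipcache | grep -i rijksweb
  # a full restart of Squid will clear a stale entry; /etc/hosts is only
  # consulted if squid.conf points at it (the hosts_file directive in newer releases)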

-Original message-
From: J Beris [mailto:[EMAIL PROTECTED]
Sent: Wednesday, 9 January 2008 16:14
To: Scholten R.; squid-users@squid-cache.org
Subject: RE: [squid-users] tcp_miss/502



145.21.152.26



$ wget 145.21.152.26
--16:05:06--  http://145.21.152.26/
   = `index.html'
Connecting to 145.21.152.26:80... failed: Connection timed out.
Retrying.

--16:08:16--  http://145.21.152.26/
  (try: 2) = `index.html'
Connecting to 145.21.152.26:80... failed: Connection timed out.


Etc., etc.
Seems like the site is down or at least not reachable.

This seems to be a site only reachable by people working for the national
government. That could explain why I can't reach it (I'm only local
government) and neither can others.
Could this be a routing issue or something? Only available from certain
networks, maybe?

Regards,

Joop


This message has been scanned for viruses and other dangerous
content by MailScanner and appears to be clean.
MailScanner by http://www.prosolit.nl
Professional Solutions for IT




I'm not able to connect here either, regardless of whether I go  
through my proxy or use the IP address.  I'm thinking that perhaps the  
server is offline.


--
Dwayne Hottinger
Network Administrator
Harrisonburg City Public Schools

rarely do people communicate, they just take turns talking



Re: [squid-users] Issues with Base 10 Decimal Bypassing Squidguard

2007-12-11 Thread dhottinger

Quoting [EMAIL PROTECTED]:


Hi there,

Here's an interesting one for you guys, I work P/T at a Local Authority
ISP service based upon open source code.

The kids have recently realised that if you take

www.playboy.com

convert it to its IP, 216.163.137.3,

convert that to binary,

11011000 10100011 10001001 00000011

then back into a base 10 decimal,

3634596099, you can enter that into your browser as http://3634596099

At first I was unsure if this was an April Fools' joke,

but sure enough it works and bypasses the filtering completely. Not many
sites work but I did find one or two more.

Neither URL blocking in squidGuard nor IP filtering affects the base 10 form.

Has anyone any idea how we can get squid to ignore base 10 and hex web
requests? Kids will be bypassing filtering platforms up and down the UK
(or more probably have been for some time).

credit to them, clever little blighters



Crud.  Yeah, they are clever.  We train them from kindergarten till  
they graduate.  Twelve years of computer instruction, then we wonder how they are able  
to defeat every filter we put in place.  Plus there are 4000+ of them and  
only one of us in each school division.  Do you know offhand which  
sites work?
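
A minimal squid.conf sketch of one way to refuse bare-number URLs like the ones described above (my own guess at an approach, not something given in the thread; normal dotted-quad IPs are not matched because of the dots):

  acl numeric_host url_regex -i ^http://(0x[0-9a-f]+|[0-9]+)(:[0-9]+)?(/|$)
  http_access deny numeric_host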


--
Dwayne Hottinger
Network Administrator
Harrisonburg City Public Schools

rarely do people communicate, they just take turns talking



Re: [squid-users] auto blacklist users

2007-12-07 Thread dhottinger

Quoting ian j hart [EMAIL PROTECTED]:


On Friday 07 December 2007 23:49:35 Amos Jeffries wrote:

[Apologies in advance if I've miss-understood anything, it's late (early) and
I'm somewhat brain dead. This time zone thing's a killer]


ian j hart wrote:
 On Friday 07 December 2007 00:58:31 Adrian Chadd wrote:
 So if I get this right, you'd like to log the acl list that passed or
 failed the user?



 Adrian

 Near enough.

 I want to log the aclname (or custom error page name) and the username.
 I'll probably want the url in short order, followed by anything else that
 proves useful.

 I want to do this for users who are denied access.

 [The more general solution you state above would probably be okay too. I
 might need to add DENY/ACCEPT so I can include that in the regexp.]

 tangent
 Here's an example of how this might be generally useful. I have three
 different proxy ACLs.

 A url_regexp
 A dstdomain list harvested from a popular list site
 A daily list gleaned from yesterday's access summary

Problem:
If a student can get through all day today, what's to stop them?


Nothing. But here's what I hope will happen. (I probably shouldn't reveal
this, but what the hey).




I've missed most of this discussion, but it sounds like you may have  
gotten this to work.  Is there a recap?  I'd really like to see your  
squid.conf (at least the snippets that pertain to this). Are you running a  
transparent proxy?  Do you run any kind of commercial filter?  I've  
been struggling with this same thing.  Right now I catch this through my  
snort logs and by looking at access logs for denied hits.  I also block  
quite a few sites at my firewall, but it is impossible to stop.  I do  
seem to have more support from administration than you.
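
As an aside, a quick way to summarise denied hits per client from a native-format access.log (a sketch; field positions assume the default log format):

  awk '$4 ~ /DENIED/ {print $3, $7}' /var/log/squid/access.log | sort | uniq -c | sort -rn | head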

--
Dwayne Hottinger
Network Administrator
Harrisonburg City Public Schools

rarely do people communicate, they just take turns talking



Re: [squid-users] video.nationalgeographic.com

2007-11-27 Thread dhottinger

Quoting Henrik Nordstrom [EMAIL PROTECTED]:


Things to try:

* Try the exact same thing with a current supported Squid version built
without Smartfilter. If that fails file a bug report.

* Try the exact same thing with 2.5.STABLE14 built without smartfilter.
If that fails then upgrade to 2.6, or try to figure out which of all the
several hundreds of changes is the important one and backport that to
your required Squid version.

* Talk to the Smartfilter support about the problem.

Regards
Henrik
I figured this out.  It wasn't a squid/proxy issue at all.  Before the  
video loads, a connection to burstnet.com is made, and I have burstnet  
blocked.  National Geographic makes the connection to a burstnet  
site and then the video loads.  I determined this by connecting to the  
video site on a Linux machine from my DMZ.  I'm not quite sure why  
burstnet was classified as spyware, though.  Does anyone have any info  
on them?  I know they are a huge advertising company and do web ads.




--
Dwayne Hottinger
Network Administrator
Harrisonburg City Public Schools

rarely do people communicate, they just take turns talking



[squid-users] video.nationalgeographic.com

2007-11-26 Thread dhottinger
I seem to be unable to access any videos on  
video.nationalgeographic.com when behind my transparent proxy.  I'm  
running squid version 2.5.STABLE14, and yes I know it is outdated, but  
I also use SmartFilter from Secure Computing, and the version I am  
using isn't compatible with any newer versions of squid.  When  
accessing the site, I get a tcp_miss in my access log:
1196093274.044     38 10.40.20.20 TCP_REFRESH_HIT/200 40922 GET 
http://video.nationalgeographic.com/video/player/media/us-astronomy-apvin/us-astronomy-apvin_150x100.jpg - DIRECT/207.24.89.108 image/jpeg ALLOW Global Allow List [Accept: */*\r\nReferer: http://video.nationalgeographic.com/video/player/flash/society1_0.swf\r\nx-flash-version: 9,0,47,0\r\nCache-Control: no-transform\r\nUA-CPU: x86\r\nAccept-Encoding: gzip, deflate\r\nUser-Agent: Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; .NET CLR 2.0.50727; .NET CLR 1.1.4322; .NET CLR 3.0.04506.30)\r\nHost: video.nationalgeographic.com\r\nConnection: Keep-Alive\r\nCookie: s_cc=true; s_sq=natgeonews%253D%252526pid%25253Dhttp%2525253A//news.nationalgeographic.com/news%252526pidt%25253D1%252526oid%25253Djavascript%2525253AvideoPlayer%25252528%25252527http%2525253A//video.nationalgeographic.com/video/player/news/%25252527%25252529%252526ot%25253DA%252526oi%25253D448; s_nr=1196093247485\r\n] [HTTP/1.1 200 OK\r\nDate: Tue, 20 Nov 2007 12:54:48 GMT\r\nServer: Apache/2.0.52 (Red Hat)\r\nLast-Modified: Mon, 29 Oct 2007 18:40:30 GMT\r\nETag: 944ae-9e65-43da608e60380\r\nAccept-Ranges: bytes\r\nCache-Control: max-age=900\r\nExpires: Tue, 20 Nov 2007 13:09:48 GMT\r\nContent-Type: image/jpeg\r\nContent-length: 40549\r\nConnection: close\r\nAge:  
650\r\n\r]


The error message on National Geographic's webpage just says: "We're  
sorry, but the video player is taking a long time to load.  Please come  
back later or wait for it to load."  Then nothing happens.  Is anyone  
else experiencing any issues or have any ideas?


thanks,

ddh

--
Dwayne Hottinger
Network Administrator
Harrisonburg City Public Schools

rarely do people communicate, they just take turns talking



Re: [squid-users] Block all Web Proxies with squid.

2007-09-05 Thread dhottinger

Quoting Tim Bates [EMAIL PROTECTED]:


[EMAIL PROTECTED] wrote:
I'm sort of curious how you route your traffic.  I'm using iptables   
and reroute all port 80 traffic to my proxy on port 8080.  Port 443  
traffic goes straight to the website, because you can't cache encrypted  
traffic.  Or am I totally wrong about this?

You can't cache it, but you can apply rules to it, thus restricting
its use for avoiding your proxy rules.

I'm fairly sure that you can't do a transparent redirection though.
Open to correction, but I think redirection breaks HTTPS.

TB

**
This message is intended for the addressee named and may contain
privileged information or confidential information or both. If you
are not the intended recipient please delete it and notify the sender.
**


That is what I was thinking.  I am running a transparent proxy.


--
Dwayne Hottinger
Network Administrator
Harrisonburg City Public Schools



Re: [squid-users] Block all Web Proxies with squid.

2007-09-05 Thread dhottinger

Quoting [EMAIL PROTECTED] [EMAIL PROTECTED]:


Hi,


 Well, if you want to block proxies you can get the list from

 www.proxy.org.

But this list is paid. Is there any free list, or can someone send an
attached text file of the list? Even I face the same issue. Maybe we
can make it work with SquidGuard.


 I visited the site. English is not my native language, so I may have
missed something, but I didn't understand the list to be paid except for end
users searching for proxy access. I tried to get
http://proxy.org/cgi_proxies.shtml using wget and I got a 403 error,
so I tried -U Mozilla and it worked.

 I don't know if they will, at some point, block accesses coming from
the same IP that do nothing but load the main page. I did some egrep
and awk on the file ( gotten by wget ) and I got a list of domains (
more than 4000 ), ready to use in a dstdomain Squid ACL. I think it could
be considered a misuse of their service, because they use banners
on the sites. So I think the ethics of it need to be discussed.
( Maybe I am paranoid :-) ).

 Surfing the site, I found a list of TOR servers, in text
format ( wget needs -U ), meant for use in a .htaccess file. Again, some
egrep and awk generated a list ready to use in a dst Squid ACL.
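
Roughly the kind of pipeline being described, for anyone curious (the URL, file names and squid.conf lines are assumptions, and the raw output needs manual pruning before use):

  wget -q -U "Mozilla/5.0" -O cgi_proxies.html http://proxy.org/cgi_proxies.shtml
  grep -Eo '[A-Za-z0-9.-]+\.[A-Za-z]{2,}' cgi_proxies.html | sort -u > /etc/squid/proxy_domains.txt
  # squid.conf:
  #   acl webproxies dstdomain "/etc/squid/proxy_domains.txt"
  #   http_access deny webproxies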

 Well, it is a little boring, but we can always enter the site,
save the source page code, process it and use it with Squid; but, again,
how about the ethics?

 I am really interested in blocking anonymous proxies, but I
have already seen that it is a hard job. :-(

 Thank you for your attention.

Regards,

Freitas

There are some people doing work on blacklists at bleeding-edge.  They  
write sig files for Snort.  You might check out their site.  I've used  
their blacklists before.  They stay pretty up-to-date.  Or they did.



--
Dwayne Hottinger
Network Administrator
Harrisonburg City Public Schools



Re: [squid-users] Block all Web Proxies with squid.

2007-09-04 Thread dhottinger

Quoting Preetish [EMAIL PROTECTED]:


On 9/5/07, Norman Noah [EMAIL PROTECTED] wrote:

Well, if you want to block proxies you can get the list from

www.proxy.org.


But this list is paid. Is there any free list, or can someone send an
attached text file of the list? Even I face the same issue. Maybe we
can make it work with SquidGuard.


They have an updated list of all running proxies.

Why must you allow HTTPS not to go through squid?

In my environment all internet access must go through squid.





I'm sort of curious how you route your traffic.  I'm using iptables and  
reroute all port 80 traffic to my proxy on port 8080.  Port 443  
traffic goes straight to the website, because you can't cache encrypted  
traffic.  Or am I totally wrong about this?
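
For context, the interception rule being described usually looks something like this (a sketch; the interface name and listening port are assumptions). Port 443 is simply left alone, so HTTPS goes out directly:

  iptables -t nat -A PREROUTING -i eth1 -p tcp --dport 80 -j REDIRECT --to-ports 8080
  # no matching rule for --dport 443, so HTTPS bypasses the proxy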



--
Dwayne Hottinger
Network Administrator
Harrisonburg City Public Schools



Re: [squid-users] google

2007-05-02 Thread dhottinger

Quoting Chris Robertson [EMAIL PROTECTED]:



I didn't remove any of the defaults; I am using 2.5 and the acl QUERY  
 statements are there.  Not sure what you are trying to tell me.


thanks,
ddh





1) By default, Squid won't cache the response to any request with a
question mark (usually GET requests with arguments signifying a dynamic
page).

2) Even if this behavior had been changed in your install, the links
you listed are not cacheable, due to the information not given by the
server.

Chris

Cool.  So nothing wrong with my proxy server.


--
Dwayne Hottinger
Network Administrator
Harrisonburg City Public Schools



[squid-users] google

2007-05-01 Thread dhottinger
I suddenly (last Friday) started having issues when accessing google.com.  
 My access.log file shows all TCP_MISS for Google.  Is anyone else  
experiencing slow Google access?  I did get an email from Google saying  
they were updating their applications (we use Google calendars).


thanks,

ddh


--
Dwayne Hottinger
Network Administrator
Harrisonburg City Public Schools



Re: [squid-users] google

2007-05-01 Thread dhottinger

Quoting Adrian Chadd [EMAIL PROTECTED]:


On Tue, May 01, 2007, [EMAIL PROTECTED] wrote:

I suddenly (last Friday) started having issues when accessing google.com.
 My access.log file shows all TCP_MISS for Google.  Is anyone else
experiencing slow Google access?  I did get an email from Google saying
they were updating their applications (we use Google calendars).


I've not heard about it. Do you have mime logging turned on so we can
see the headers w/ the request/reply?



Adrian



1178030214.203    538 10.40.15.123 TCP_MISS/200 5959 GET http://tbn0.google.com/images? - DIRECT/72.14.211.104 image/jpeg ALLOW Visual Search Engine, Search Engines [Accept: */*\r\nAccept-Language: en\r\nAccept-Encoding: gzip, deflate\r\nCookie: PREF=ID=2377a880502b45e8:TM=1158762666:LM=1158762666:S=4J3pIHAN5lUxv6nf\r\nReferer: http://images.google.com/images?q=newspaper+comics+political+cartoons+on+the+war+in+Iraq&gbv=2&svnum=10&hl=en&start=40&sa=N&ndsp=20\r\nUser-Agent: Mozilla/5.0 (Macintosh; U; Intel Mac OS X; en) AppleWebKit/418.9 (KHTML, like Gecko) Safari/419.3\r\nConnection: keep-alive\r\nHost: tbn0.google.com\r\n] [HTTP/1.0 200 OK\r\nContent-Type: image/jpeg\r\nServer: btfe\r\nContent-Length: 5775\r\nDate: Tue, 01 May 2007 14:36:53 GMT\r\nConnection: Keep-Alive\r\n\r]
1178030214.205    538 10.40.15.123 TCP_MISS/200 5535 GET http://tbn0.google.com/images? - DIRECT/72.14.211.99 image/jpeg ALLOW Visual Search Engine, Search Engines [Accept: */*\r\nAccept-Language: en\r\nAccept-Encoding: gzip, deflate\r\nCookie: PREF=ID=2377a880502b45e8:TM=1158762666:LM=1158762666:S=4J3pIHAN5lUxv6nf\r\nReferer: http://images.google.com/images?q=newspaper+comics+political+cartoons+on+the+war+in+Iraq&gbv=2&svnum=10&hl=en&start=40&sa=N&ndsp=20\r\nUser-Agent: Mozilla/5.0 (Macintosh; U; Intel Mac OS X; en) AppleWebKit/418.9 (KHTML, like Gecko) Safari/419.3\r\nConnection: keep-alive\r\nHost: tbn0.google.com\r\n] [HTTP/1.0 200 OK\r\nContent-Type: image/jpeg\r\nServer: btfe\r\nContent-Length: 5351\r\nDate: Tue, 01 May 2007 14:36:53 GMT\r\nConnection: Keep-Alive\r\n\r]
1178030214.361    352 10.40.15.123 TCP_MISS/200 3822 GET http://tbn0.google.com/images? - DIRECT/72.14.211.104 image/jpeg ALLOW Visual Search Engine, Search Engines [Accept: */*\r\nAccept-Language: en\r\nAccept-Encoding: gzip, deflate\r\nCookie: PREF=ID=2377a880502b45e8:TM=1158762666:LM=1158762666:S=4J3pIHAN5lUxv6nf\r\nReferer: http://images.google.com/images?q=newspaper+comics+political+cartoons+on+the+war+in+Iraq&gbv=2&svnum=10&hl=en&start=40&sa=N&ndsp=20\r\nUser-Agent: Mozilla/5.0 (Macintosh; U; Intel Mac OS X; en) AppleWebKit/418.9 (KHTML, like Gecko) Safari/419.3\r\nConnection: keep-alive\r\nHost: tbn0.google.com\r\n] [HTTP/1.0 200 OK\r\nContent-Type: image/jpeg\r\nServer: btfe\r\nContent-Length: 3638\r\nDate: Tue, 01 May 2007 14:36:54 GMT\r\nConnection: Keep-Alive\r\n\r]
1178030214.393    366 10.40.15.123 TCP_MISS/200 4142 GET http://tbn0.google.com/images? - DIRECT/72.14.211.99 image/jpeg ALLOW Visual Search Engine, Search Engines [Accept: */*\r\nAccept-Language: en\r\nAccept-Encoding: gzip, deflate\r\nCookie: PREF=ID=2377a880502b45e8:TM=1158762666:LM=1158762666:S=4J3pIHAN5lUxv6nf\r\nReferer: http://images.google.com/images?q=newspaper+comics+political+cartoons+on+the+war+in+Iraq&gbv=2&svnum=10&hl=en&start=40&sa=N&ndsp=20\r\nUser-Agent: Mozilla/5.0 (Macintosh; U; Intel Mac OS X; en) AppleWebKit/418.9 (KHTML, like Gecko) Safari/419.3\r\nIf-Modified-Since: Tue, 01 May 2007 14:21:53 GMT\r\nConnection: keep-alive\r\nHost: tbn0.google.com\r\n] [HTTP/1.0 200 OK\r\nContent-Type: image/jpeg\r\nServer: btfe\r\nContent-Length: 3958\r\nDate: Tue, 01 May 2007 14:36:54 GMT\r\nConnection: Keep-Alive\r\n\r]



--
Dwayne Hottinger
Network Administrator
Harrisonburg City Public Schools



Re: [squid-users] google

2007-05-01 Thread dhottinger

Quoting Chris Robertson [EMAIL PROTECTED]:


[EMAIL PROTECTED] wrote:

Quoting Adrian Chadd [EMAIL PROTECTED]:


On Tue, May 01, 2007, [EMAIL PROTECTED] wrote:

I suddenly (last Friday) started having issues when accessing google.com.
My access.log file shows all TCP_MISS for Google.  Is anyone else
experiencing slow Google access?  I did get an email from Google saying
they were updating their applications (we use Google calendars).


I've not heard about it. Do you have mime logging turned on so we can
see the headers w/ the request/reply?



Adrian



1178030214.203    538 10.40.15.123 TCP_MISS/200 5959 GET http://tbn0.google.com/images? - DIRECT/72.14.211.104 image/jpeg ALLOW Visual Search Engine, Search Engines [Accept: */*\r\nAccept-Language: en\r\nAccept-Encoding: gzip, deflate\r\nCookie: PREF=ID=2377a880502b45e8:TM=1158762666:LM=1158762666:S=4J3pIHAN5lUxv6nf\r\nReferer: http://images.google.com/images?q=newspaper+comics+political+cartoons+on+the+war+in+Iraq&gbv=2&svnum=10&hl=en&start=40&sa=N&ndsp=20\r\nUser-Agent: Mozilla/5.0 (Macintosh; U; Intel Mac OS X; en) AppleWebKit/418.9 (KHTML, like Gecko) Safari/419.3\r\nConnection: keep-alive\r\nHost: tbn0.google.com\r\n] [HTTP/1.0 200 OK\r\nContent-Type: image/jpeg\r\nServer: btfe\r\nContent-Length: 5775\r\nDate: Tue, 01 May 2007 14:36:53 GMT\r\nConnection: Keep-Alive\r\n\r]


Assuming you have not removed the default cache/no_cache line from your
squid.conf, anything with a question mark in the URL will not be cached.

From 2.6STABLE12's squid.conf.default:

#We recommend you to use the following two lines.
acl QUERY urlpath_regex cgi-bin \?
cache deny QUERY

2.5 is the same, except the directive is called no_cache.

In addition, the objects have no freshness information (Expires, or
Content-Length), so even without the explicit requirement within Squid
to not cache GET queries the listed objects are not cacheable.

Chris
I didn't remove any of the defaults; I am using 2.5 and the acl QUERY  
statements are there.  Not sure what you are trying to tell me.


thanks,
ddh


--
Dwayne Hottinger
Network Administrator
Harrisonburg City Public Schools



[squid-users] no_cache deny for swscan.apple.com

2007-04-09 Thread dhottinger
I am trying to get my squid 2.5.STABLE14 install to not cache traffic  
for Apple software updates.  I put the following acl in my squid.conf:


acl apple1 url_regex swscan.apple.com \?
no_cache deny apple1

I still see traffic to swscan.apple.com in my access.log:

1176124249.216 110065 10.40.0.80 TCP_MISS/200 15636 GET http://swscan.apple.com/content/catalogs/index-1.sucatalog - DIRECT/17.250.248.95 text/plain ALLOW Shareware/Freeware

I'm pretty sure that means it is still being cached.  I've googled and  
searched through squid's website, but don't seem to be able to find the  
syntax I need to make this happen.  What am I doing wrong?
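
For comparison, a dstdomain ACL avoids the regex-escaping questions entirely; a sketch of an alternative, not a confirmed fix:

  acl apple_sw dstdomain swscan.apple.com swdc.apple.com
  no_cache deny apple_sw

Note that TCP_MISS only means the reply was not served from the cache; no_cache controls whether replies get stored, so requests to the site will keep appearing in access.log either way.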

--
Dwayne Hottinger
Network Administrator
Harrisonburg City Public Schools



Re: [squid-users] Apple software updates

2007-04-09 Thread dhottinger

Quoting Henrik Nordstrom [EMAIL PROTECTED]:


mån 2007-04-09 klockan 09:41 -0400 skrev
[EMAIL PROTECTED]:

   
http://wiki.squid-cache.org/SquidFaq/SystemWeirdnesses#head-4920199b311ce7d20b9a0d85723fd5d0dfc9bc84



No problem.  So there is no way I am going to get apple software
updates to work through my proxy server?


Sure, you just have to work around their firewall...

The simplest workaround is
  echo 0 > /proc/sys/net/ipv4/tcp_window_scaling

the better workaround is to set up a separate route to their network
with a small window (at most 65535).

See the above for a more in-depth discussion (or actually the article it
links to).

Regards
Henrik



I did echo 0 > /proc/sys/net/ipv4/tcp_window_scaling on my proxy  
server and added ACLs for swscan.apple.com, akamaitechnologies.com,  
and swdc.apple.com to no_cache, and software updates work now.  I had  
previously done the echo 0 > /proc/sys/net/ipv4/tcp_window_scaling on  
my firewall with no joy for software updates.  Thanks for taking the  
time to help with this, Henrik.  Hopefully the solution to this will  
help the half dozen or so others who are having issues with Apple  
Software updates.


ddh
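
To make that stick across reboots, the sysctl can go in /etc/sysctl.conf alongside the squid.conf ACLs described above; a sketch, where the leading dot on the Akamai domain (to match its subdomains) is my assumption:

  # /etc/sysctl.conf
  net.ipv4.tcp_window_scaling = 0

  # squid.conf
  acl apple_updates dstdomain swscan.apple.com swdc.apple.com .akamaitechnologies.com
  no_cache deny apple_updates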


--
Dwayne Hottinger
Network Administrator
Harrisonburg City Public Schools



Re: [squid-users] Apple software updates

2007-04-08 Thread dhottinger

Quoting Matus UHLAR - fantomas [EMAIL PROTECTED]:


On 05.04.07 18:46, [EMAIL PROTECTED] wrote:

A couple of weeks ago I sent in some traffic captures showing an Apple
OS x 10.4 trying to software udpates through squid and failing.  I
look at the captures and they mean nothing to me.  Has anyone else had
a chance to turn an expert eye to them?  Also, since software updates
dont work, is there a way I can make squid not try to cache them?
Perhaps Always_direct?


always_direct does not talk about caching, it controls the proxy hierarchy.
use no_cache (renamed to cache in 2.6) to (dis)allow caching.
--
Matus UHLAR - fantomas, [EMAIL PROTECTED] ; http://www.fantomas.sk/
Warning: I wish NOT to receive e-mail advertising to this address.
Varovanie: na tuto adresu chcem NEDOSTAVAT akukolvek reklamnu postu.
Linux IS user friendly, it's just selective who its friends are...



Yeah,
I knew that, sorry.  How about:
acl apple1 url_regex ^apple.com.html
no_cache deny apple1

Should that work to not cache traffic destined for the domain apple.com?

thanks,

ddh


--
Dwayne Hottinger
Network Administrator
Harrisonburg City Public Schools



[squid-users] Apple software updates

2007-04-05 Thread dhottinger
A couple of weeks ago I sent in some traffic captures showing an Apple  
OS X 10.4 machine trying to do software updates through squid and failing.  I  
looked at the captures and they mean nothing to me.  Has anyone else had  
a chance to turn an expert eye to them?  Also, since software updates  
don't work, is there a way I can make squid not try to cache them?   
Perhaps always_direct?


--
Dwayne Hottinger
Network Administrator
Harrisonburg City Public Schools



Re: [squid-users] logging auth users statistics for the same

2007-03-27 Thread dhottinger

Quoting [EMAIL PROTECTED]:


Quoting Garry Glendown [EMAIL PROTECTED]:


Garry Glendown wrote:

one of our customers has asked about getting some information about the
amount of surfing users are doing. Problem is that most users are using
terminal servers to do their surfing, so all I get at the moment is the
IP addresses of the multiple terminal servers. Users are authenticated
for web access. I've already located a patch to Squid to add the
authenticated user name to logg (http://devel.squid-cache.org/customlog
), question is what can I use to generate the stats? I've tried
Calamaris by creating Apache log style, and manually put a user name in
the Apache style log lines, but Calamaris doesn't seem to include user
statistics. The current stable version 2.59 doesn't have it, though at
least it's able to parse the emulated logg file, 2.99 doesn't seem to
have that style format at all anymore ...


Nobody?

Tnx, -garry
The Perl script that puts info into a MySQL database is squidalyser; you  
can find it on Freshmeat.  Sarg does the same thing and is a  
little prettier.


ddh


--
Dwayne Hottinger
Network Administrator
Harrisonburg City Public Schools



Re: [squid-users] logging auth users statistics for the same

2007-03-26 Thread dhottinger

Quoting Garry Glendown [EMAIL PROTECTED]:


Garry Glendown wrote:

one of our customers has asked about getting some information about the
amount of surfing users are doing. Problem is that most users are using
terminal servers to do their surfing, so all I get at the moment is the
IP addresses of the multiple terminal servers. Users are authenticated
for web access. I've already located a patch to Squid to add the
authenticated user name to logg (http://devel.squid-cache.org/customlog
), question is what can I use to generate the stats? I've tried
Calamaris by creating Apache log style, and manually put a user name in
the Apache style log lines, but Calamaris doesn't seem to include user
statistics. The current stable version 2.59 doesn't have it, though at
least it's able to parse the emulated logg file, 2.99 doesn't seem to
have that style format at all anymore ...


Nobody?

Tnx, -garry


I seem to recall seeing something on Freshmeat or SourceForge that did  
this.  It was a Perl script that parsed Squid logs and put them into  
a MySQL database.  It had a web interface and you could search by  
username.  I'll check today and see if I can find the name.


--
Dwayne Hottinger
Network Administrator
Harrisonburg City Public Schools



Re: [squid-users] apple software updates

2007-03-26 Thread dhottinger

Quoting Henrik Nordstrom [EMAIL PROTECTED]:


tor 2007-03-22 klockan 20:51 -0400 skrev
[EMAIL PROTECTED]:


Works better on Windows than apple though.  I can see the initial
connections being made to swscan.apple.com also, then nothing.  I have
apple.com set to always direct in my squid.conf.


What you mean by initial connection?

First request logged in access.log, or that the SYN/ACK works then
nothing?

Regards
Henrik



First request logged in access.log.  I can send you my access.log file  
with pertinent info showing an Apple computer connecting to  
swscan.apple.com.  I can also send an iptraf capture from the firewall (or a  
tcpdump if necessary).  Glad to see someone is following this.  There seem  
to be quite a few people having the same problem.


thanks,
ddh

--
Dwayne Hottinger
Network Administrator
Harrisonburg City Public Schools



Re: [squid-users] apple software updates

2007-03-26 Thread dhottinger

Quoting Amos Jeffries [EMAIL PROTECTED]:




This sounds remarkably similar to the MSN (or was it Windows XP update,
or both?) problem seen earlier. That turned out to be an out-of-band
SSL link being made directly to the remote server; bypassing the proxy
killed it, and the core proxied link was denied thereafter by the remote
server.

Amos


I don't remember that thread.  Apple tech support says their  
updates all run on port 80, though.  I've never had any problems with Windows  
updates.  Plus, my OS X 10.3 computers update through my proxy just  
fine.  Apple says that OS X 10.3 and 10.4 connect to the same server,  
just use different methods of downloading the updates.  Evidently 10.4  
downloads a catalog and then lists the updates.  Not sure what 10.3 does.   
Apple says that the catalog download isn't finishing.


--
Dwayne Hottinger
Network Administrator
Harrisonburg City Public Schools



Re: [squid-users] apple software updates

2007-03-26 Thread dhottinger

Quoting Henrik Nordstrom [EMAIL PROTECTED]:


mån 2007-03-26 klockan 06:49 -0400 skrev
[EMAIL PROTECTED]:


First request logged in access.log.  I can send you my access.log file
with pertinent info showing apple computer connecting to
swscan.apple.com.  I can also send an iptraf from firewall (or tcpdump
if nec.).  Glad to see someone is following this.  Seems to be quite a
few people having the same problem.


What needs to get done is comparing the traffic of a mac connecting via
the proxy to when going direct.

1. Find a mac having problems.

2. Run the update procedure while collecting full network traces of the
traffic using ethereal/wireshark or tcpdump -s 1600 -w traffic.pcap

3. Reconnect the same mac so it doesn't use the proxy and run the update
procedure again, still while collecting a full traffic trace.

Note: In step 3 it is very important you don't run the update service until
you start collecting the traces. If you do, you must go back to step 2 and
verify that the problem still exists, or the trace will likely be
invalid and not useful.

Collecting the network traces is perhaps most easily done using
wireshark or tcpdump on the Mac while running the update procedure.

  tcpdump -i any -s 1600 -w traffic.pcap

Did anyone get the tcpdump that I sent?  It got kicked back from the  
list.  I included Mr. Nordstrom in the email.


thanks,

ddh


--
Dwayne Hottinger
Network Administrator
Harrisonburg City Public Schools



Re: [squid-users] apple software updates

2007-03-26 Thread dhottinger

Quoting Henrik Nordstrom [EMAIL PROTECTED]:


mån 2007-03-26 klockan 20:02 -0400 skrev
[EMAIL PROTECTED]:


Did anyone get the tcpdump that I sent?  It got kicked back from the
list.  I included Mr. Nordstrom in the email.


I got the dumps, but has not looked into them yet. Might take a few days
for me to get there.

If you want to look into the dumps yourself then start
ethereal/wireshark and load the traffic dump files there and start
navigating the packets comparing traffic between the two dumps. To aid
in this there is a very useful tool Follow TCP stream in the menu..

Regards
Henrik



Thanks,
No hurry.  I started going through them today.  Just wanted to make  
sure they came through.  I appreciate any insight.  Always good to  
have an expert opinion.


ddh


--
Dwayne Hottinger
Network Administrator
Harrisonburg City Public Schools



Re: [squid-users] apple software updates

2007-03-22 Thread dhottinger

Quoting [EMAIL PROTECTED]:


Hello,

I have been following this thread but never saw a resolution.  Has  
anybody found
a fix for this error?  Is it an update to the latest squid, or was it a problem
on Apple's side?  I have been searching for an answer but have not found one.

I am using: Squid Cache: Version 2.5.STABLE11

Here is the error:

007-03-19 11:17:33.011 Software Update[186]
loader:didFailWithError:NSError XML parser error:
Encountered unexpected EOF
Old-style plist parser error:
Malformed data byte group at line 1; invalid hex
 Domain=SUCatalogLoader Code=0 UserInfo={
NSLocalizedDescription = XML parser error:\n\tEncountered
unexpected EOF\nOld-style plist parser error:\n\tMalformed data byte
group at line 1; invalid hex\n;
NSURL = http://swscan.apple.com/content/catalogs/index-1.sucatalog;
}
2007-03-19 13:49:29.953 Software Update[201]
loader:didFailWithError:NSError XML parser error:
Encountered unexpected EOF
Old-style plist parser error:
Malformed data byte group at line 1; invalid hex
 Domain=SUCatalogLoader Code=0 UserInfo={
NSLocalizedDescription = XML parser error:\n\tEncountered
unexpected EOF\nOld-style plist parser error:\n\tMalformed data byte
group at line 1; invalid hex\n;
NSURL = http://swscan.apple.com/content/catalogs/index-1.sucatalog;
}
### MRJPlugin:  getPluginBundle() here. ###
### MRJPlugin:  CFBundleGetBundleWithIdentifier() succeeded. ###
### MRJPlugin:  CFURLGetFSRef() succeeded. ###
2007-03-19 13:58:11.977 DiskImages UI Agent[209] Could not find image
named 'background'.
2007-03-19 14:24:28.542 Software Update[230]
loader:didFailWithError:NSError XML parser error:
Encountered unexpected EOF
Old-style plist parser error:
Malformed data byte group at line 1; invalid hex
 Domain=SUCatalogLoader Code=0 UserInfo={
NSLocalizedDescription = XML parser error:\n\tEncountered
unexpected EOF\nOld-style plist parser error:\n\tMalformed data byte
group at line 1; invalid hex\n;
NSURL = http://swscan.apple.com/content/catalogs/index-1.sucatalog;
}

Any help or direction would be greatly appreciated,

Jon

I've been in contact with Apple tech support and have heard from quite  
a few others who are having the same issue.  Apple has been no help.   
I'm sending them my access.log file from my proxy tomorrow showing an  
iMac trying to do software updates.  This is a problem that just  
started around the first part of the year.  I'm sure Apple made some  
change on their end, but I'll never get out of them what.  I sent  
quite a bit of info to the list, but never received a reply.  Apple  
says the reason the software updates are not working is that the  
catalog isn't finishing downloading.  Duh; why isn't it finishing?  Part  
of the catalog loads fine.  I keep hoping someone more knowledgeable  
than me will come up with a solution.



--
Dwayne Hottinger
Network Administrator
Harrisonburg City Public Schools



[squid-users] WARNING! Your cache is running out of filedescriptors

2007-03-22 Thread dhottinger
While watching my cache.log today I came across this: "WARNING! Your  
cache is running out of filedescriptors."  What do I do about this?


thanks,

ddh


--
Dwayne Hottinger
Network Administrator
Harrisonburg City Public Schools



Re: [squid-users] apple software updates

2007-03-22 Thread dhottinger

Quoting Juan C. Crespo R. [EMAIL PROTECTED]:


Hmm...

   If you are caching your DNS requests, or your squid server is
also your DNS server, clear your DNS cache and try again. It happened to me
a while ago, though not with Apple updates. Just try it.

Regards :)

[EMAIL PROTECTED] escribió:

Quoting [EMAIL PROTECTED]:


Hello,

I have been following this thread but never saw a resolution, has   
 anybody found
a fix for this error? Is it an update to the latest squid or was   
it a problem

on Apples side? I have been searching for an answer but have not found one.

I am using: Squid Cache: Version 2.5.STABLE11

Here is the error:

007-03-19 11:17:33.011 Software Update[186]
loader:didFailWithError:NSError XML parser error:
Encountered unexpected EOF
Old-style plist parser error:
Malformed data byte group at line 1; invalid hex
 Domain=SUCatalogLoader Code=0 UserInfo={
   NSLocalizedDescription = XML parser error:\n\tEncountered
unexpected EOF\nOld-style plist parser error:\n\tMalformed data byte
group at line 1; invalid hex\n;
   NSURL = http://swscan.apple.com/content/catalogs/index-1.sucatalog;
}
2007-03-19 13:49:29.953 Software Update[201]
loader:didFailWithError:NSError XML parser error:
Encountered unexpected EOF
Old-style plist parser error:
Malformed data byte group at line 1; invalid hex
 Domain=SUCatalogLoader Code=0 UserInfo={
   NSLocalizedDescription = XML parser error:\n\tEncountered
unexpected EOF\nOld-style plist parser error:\n\tMalformed data byte
group at line 1; invalid hex\n;
   NSURL = http://swscan.apple.com/content/catalogs/index-1.sucatalog;
}
### MRJPlugin:  getPluginBundle() here. ###
### MRJPlugin:  CFBundleGetBundleWithIdentifier() succeeded. ###
### MRJPlugin:  CFURLGetFSRef() succeeded. ###
2007-03-19 13:58:11.977 DiskImages UI Agent[209] Could not find image
named 'background'.
2007-03-19 14:24:28.542 Software Update[230]
loader:didFailWithError:NSError XML parser error:
Encountered unexpected EOF
Old-style plist parser error:
Malformed data byte group at line 1; invalid hex
 Domain=SUCatalogLoader Code=0 UserInfo={
   NSLocalizedDescription = XML parser error:\n\tEncountered
unexpected EOF\nOld-style plist parser error:\n\tMalformed data byte
group at line 1; invalid hex\n;
   NSURL = http://swscan.apple.com/content/catalogs/index-1.sucatalog;
}

Any help or direction would be greatly appreciated,

Jon

Ive been in contact with apple tech support and have heard from   
quite a few others that are having the same issue.  Apple has been   
no help.  Im sending them my access.log file from my proxy   
tommorrow showing an imac trying to do software updates.   This is   
a problem that just started around the first part of the year.  Im   
sure apple made some change on their end, but I'll never get out of  
 them what.  I sent quite a bit of info to the list, but never   
recieved a reply.  Apple says the reason the software updates are   
not working is because the catalog isnt finishing downloading.   
Duh, Why isnt it finishing.  Part of the catalog loads fine.  I   
keep hoping someone more knowledgeable then me will come up with a   
solution.



--Dwayne Hottinger
Network Administrator
Harrisonburg City Public Schools





Not caching DNS.

--
Dwayne Hottinger
Network Administrator
Harrisonburg City Public Schools



Re: [squid-users] WARNING! Your cache is running out of filedescriptors

2007-03-22 Thread dhottinger

Quoting Juan C. Crespo R. [EMAIL PROTECTED]:


you must tweak your OS, there is a lot of info on the Net, just google it :)


[EMAIL PROTECTED] escribió:
While watching my cache.log today I came across this WARNING! Your   
cache is running out of filedescriptors.  What do I do about this?


thanks,

ddh


--Dwayne Hottinger
Network Administrator
Harrisonburg City Public Schools




Thanks.
Yes, I found the info.
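
For anyone hitting the same warning, the usual OS tweak looks roughly like this (a sketch; the limit and the configure option vary by OS and Squid version):

  ulimit -HSn 8192        # raise the per-process descriptor limit in the init script before starting Squid
  # Squid can also be rebuilt with a higher compiled-in ceiling, e.g. ./configure --with-maxfd=8192 on 2.6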


--
Dwayne Hottinger
Network Administrator
Harrisonburg City Public Schools



Re: [squid-users] apple software updates

2007-03-22 Thread dhottinger

Quoting [EMAIL PROTECTED]:


Quoting [EMAIL PROTECTED]:


Quoting [EMAIL PROTECTED]:

 Hello,

 I have been following this thread but never saw a resolution, has
 anybody found
 a fix for this error? Is it an update to the latest squid or was it a
problem
 on Apples side? I have been searching for an answer but have not   
found one.


 I am using: Squid Cache: Version 2.5.STABLE11

 Here is the error:

 007-03-19 11:17:33.011 Software Update[186]
 loader:didFailWithError:NSError XML parser error:
 Encountered unexpected EOF
 Old-style plist parser error:
 Malformed data byte group at line 1; invalid hex
  Domain=SUCatalogLoader Code=0 UserInfo={
 NSLocalizedDescription = XML parser error:\n\tEncountered
 unexpected EOF\nOld-style plist parser error:\n\tMalformed data byte
 group at line 1; invalid hex\n;
 NSURL = http://swscan.apple.com/content/catalogs/index-1.sucatalog;
 }

 Any help or direction would be greatly appreciated,

 Jon

Ive been in contact with apple tech support and have heard from quite
a few others that are having the same issue.  Apple has been no help.
Im sending them my access.log file from my proxy tommorrow showing an
imac trying to do software updates.   This is a problem that just
started around the first part of the year.  Im sure apple made some
change on their end, but I'll never get out of them what.  I sent
quite a bit of info to the list, but never recieved a reply.  Apple
says the reason the software updates are not working is because the
catalog isnt finishing downloading. Duh, Why isnt it finishing.  Part
of the catalog loads fine.  I keep hoping someone more knowledgeable
then me will come up with a solution.


--
Dwayne Hottinger
Network Administrator
Harrisonburg City Public Schools


Thanks for the update, Dwayne. Can you keep us posted as to what you find out
from sending Apple your logs? The only way I have been able to get
around this is to go around my squid box.

Is everyone having this problem, or are some people able to
successfully update? If so, what version of squid and Mac OS are you using?

Thanks again for any help,

Jon



Apple OS 10.3.x Macs update fine.  Windows users running QuickTime 7  
with the Apple Software Update work fine.  It's only 10.4 Macs that can't  
update.  My error message is identical to yours.  I've tried several  
things on my firewall to preroute traffic destined for apple.com but  
haven't had any success.  Users can still go to Apple's website and  
download updates manually.  If I take a computer and physically bypass  
my proxy by putting it on my DMZ with a public IP, it updates.  Then if  
I take the same computer and put it back on my 10.x network, updates work,  
at least for a while.  I'm running squid 2.5.STABLE14.  Another list  
poster updated to the latest version of squid and had the same  
problem.  I'm positive it is something the proxy is doing with the  
catalog files, but not sure what.  They are gzip files, so it shouldn't  
be that big a deal.  You can take the URL that is in that error  
message, copy/paste it into a browser, and watch the info come down.   
It works better on Windows than Apple, though.  I can see the initial  
connections being made to swscan.apple.com also, then nothing.  I have  
apple.com set to always_direct in my squid.conf.


ddh


--
Dwayne Hottinger
Network Administrator
Harrisonburg City Public Schools



Re: [squid-users] sarg reports

2007-03-16 Thread dhottinger

Quoting Munawar Zeeshan [EMAIL PROTECTED]:


Deny the content through squid.conf; Sarg's DENIED report will work then.

On 3/15/07, [EMAIL PROTECTED] 
[EMAIL PROTECTED] wrote:


I know this may not be totally related to squid, but I cant seem to
get my sarg installation to generate denied reports.  Everything else
works fine.  I have denied in my sarg.conf.  My accesslog shows
TCP_DENIED for denied requests.  Is this a bug in the last couple of
versions of Sarg?

--
Dwayne Hottinger
Network Administrator
Harrisonburg City Public Schools





--
Munawar Zeeshan
Islamabad,Pakistan
+92-300-514-6886
+92-321-535-4851


Huh?  I don't understand.

ddh


--
Dwayne Hottinger
Network Administrator
Harrisonburg City Public Schools



[squid-users] sarg reports

2007-03-15 Thread dhottinger
I know this may not be totally related to squid, but I can't seem to  
get my sarg installation to generate denied reports.  Everything else  
works fine.  I have denied enabled in my sarg.conf.  My access log shows  
TCP_DENIED for denied requests.  Is this a bug in the last couple of  
versions of Sarg?


--
Dwayne Hottinger
Network Administrator
Harrisonburg City Public Schools



Re: [squid-users] Another HTTP 1.1 Question

2007-03-08 Thread dhottinger

Quoting Chris Nighswonger [EMAIL PROTECTED]:


On 3/8/07, Adrian Chadd [EMAIL PROTECTED] wrote:

You can try the Squid-2 snapshots which include the below patch.
http://www.squid-cache.org/Versions/v2/HEAD/


Here is what I have done:

1. My current install is via yum (rpm).

2. I have configured with the same options returned from 'squid -v'
and done a 'make'.

3. I have backed up my current squid.conf.

Here is the question:

Do I do a 'make install,' then replace the new 'squid.conf' with my
original, and start squid back up? (This is a production box and I
really don't want to bust it.)

Thanks,
Chris


If it were me, I would do a cp of my current squid directory, then when  
installing do a ./configure with some other prefix directory.  For example, if  
squid is installed in /var/squid you could install the new version in  
/usr/local/squid.  Doing a ./configure --help will give you the exact  
options.  Then after installing you can edit the new squid.conf  
to suit, stop your old squid, and start the new squid with a ./squid -z to  
build the cache directories and then a ./squid.  If everything goes  
south, you can go back to your old version and figure out why the  
new one didn't work.


ddh

--
Dwayne Hottinger
Network Administrator
Harrisonburg City Public Schools



Re: [squid-users] Another HTTP 1.1 Question

2007-03-08 Thread dhottinger

Quoting Adrian Chadd [EMAIL PROTECTED]:


On Thu, Mar 08, 2007, [EMAIL PROTECTED] wrote:


If it was me I would do a cp on my current squid directory, then when
installing do a ./configure --someother directory.  For example: if
squid is installed in /var/squid you could install the new version in
/usr/local/squid.  Doing a ./configure --help will give you the exact
options.  Then after installing you can either edit the new squid.conf
to suit, stop your old squid and start new squid with a ./squid -z to
build the cache directorys and then do a ./squid.  If everything goes
south, then you can go back to your old version and figure out why the
new one didnt work.


I normally do this:


./configure --prefix=/usr/local/squid-VERSION
make
make install
cd /usr/local
rm squid (its a symlink!)
ln -s squid-VERSION squid
cp /path/to/normal/squid.conf /usr/local/squid-VERSION/etc/squid.conf

That way I can have multiple squids installed to try but have my init
script only start /usr/local/squid/sbin/squid .




Adrian


Cool.  Yep.  Works very well.  Sort of the same thing only more elegant.


--
Dwayne Hottinger
Network Administrator
Harrisonburg City Public Schools



[squid-users] iptraf dumps

2007-03-01 Thread dhottinger
Did anyone get a chance to look at the iptraf files that I sent  
displaying connections to Apple's software update website?


thanks,

ddh


--
Dwayne Hottinger
Network Administrator
Harrisonburg City Public Schools



Re: [squid-users] apple software updates

2007-02-28 Thread dhottinger

Quoting Adrian Chadd [EMAIL PROTECTED]:


On Tue, Feb 27, 2007, [EMAIL PROTECTED] wrote:


This is really starting to bother me.  I have been running the same
version of Squid since December with no problems with any of my Apples
updating.  The problems started just a few weeks ago.  I called Apple
tech support; the tech said if OS 10.3 works then 10.4 should.  He
elevated me to the next level of support.  That was 2 weeks ago.  No
calls back.  I get the feeling that Apple changed something on their
end that has caused this not to work with Squid.  I'm going to look at
my firewall and see if I can do something to let apple.com traffic
go straight through and not get redirected to my proxy server.  Yes, I
too can take squid out of the loop, run updates, and then software updates
work, for a couple of days.  Sometimes there are updates to download
also.  Is there any other testing we could do?


How familiar are you with tcpdump/ethereal? Could you get a full
packet capture of the request/reply with and without Squid in the way?




Adrian



I can run tcpdump on my firewall.  I may have that with the proxy  
already.  Let me look this morning.


ddh


--
Dwayne Hottinger
Network Administrator
Harrisonburg City Public Schools



[squid-users] tcpdump appleupdates no proxy

2007-02-28 Thread dhottinger

Computer IP is 204.111.40.59.  Updates are successful.

Wed Feb 28 13:43:48 2007; TCP; eth3; 48 bytes; from ha96s687.d.shentel.net:2323 to 204.111.40.59:netbios-ss (source MAC addr 000f352e81a2); first packet (SYN)
Wed Feb 28 13:59:48 2007; TCP; Connection ha96s687.d.shentel.net:2323 to 204.111.40.59:netbios-ss timed out, 1 packets, 48 bytes, avg flow rate 0.00 kbits/s; opposite direction 0 packets, 0 bytes, avg flow rate 0.00 kbits/s
Wed Feb 28 14:01:33 2007; IGMP; eth1; 46 bytes; source MAC address 0003939a29b0; from 204.111.40.59 to 224.0.0.251
Wed Feb 28 14:01:56 2007; UDP; eth1; 505 bytes; source MAC address 0003939a29b0; from 204.111.40.59:5353 to 224.0.0.251:5353
Wed Feb 28 14:04:00 2007; TCP; eth2; 52 bytes; from 10.40.0.11:ldap to 204.111.40.59:49166 (source MAC addr 0009b7135180); first packet
Wed Feb 28 14:04:00 2007; TCP; eth1; 52 bytes; from 10.40.0.11:ldap to 204.111.40.59:49166 (source MAC addr 0080c8ca9fb9); first packet
Wed Feb 28 14:04:00 2007; TCP; eth2; 52 bytes; from pear.hhs.harrisonburg.k12.va.us:ldap to 204.111.40.59:49166 (source MAC addr 0009b7135180); FIN sent; 2 packets, 104 bytes, avg flow rate 0.00 kbits/s
Wed Feb 28 14:04:00 2007; TCP; eth1; 52 bytes; from pear.hhs.harrisonburg.k12.va.us:ldap to 204.111.40.59:49166 (source MAC addr 0080c8ca9fb9); FIN sent; 2 packets, 104 bytes, avg flow rate 0.00 kbits/s
Wed Feb 28 14:04:00 2007; TCP; eth1; 52 bytes; from 204.111.40.59:49166 to pear.hhs.harrisonburg.k12.va.us:ldap (source MAC addr 0003939a29b0); first packet
Wed Feb 28 14:04:00 2007; TCP; eth2; 52 bytes; from 204.111.40.59:49166 to pear.hhs.harrisonburg.k12.va.us:ldap (source MAC addr 0080c8ca9fba); first packet
Wed Feb 28 14:04:00 2007; TCP; eth1; 64 bytes; from 204.111.40.59:49167 to pear.hhs.harrisonburg.k12.va.us:ldap (source MAC addr 0003939a29b0); first packet (SYN)
Wed Feb 28 14:04:00 2007; TCP; eth2; 64 bytes; from 204.111.40.59:49167 to pear.hhs.harrisonburg.k12.va.us:ldap (source MAC addr 0080c8ca9fba); first packet (SYN)
Wed Feb 28 14:04:00 2007; TCP; eth1; 64 bytes; from 204.111.40.59:49168 to pear.hhs.harrisonburg.k12.va.us:ldap (source MAC addr 0003939a29b0); first packet (SYN)
Wed Feb 28 14:04:00 2007; TCP; eth2; 64 bytes; from 204.111.40.59:49168 to pear.hhs.harrisonburg.k12.va.us:ldap (source MAC addr 0080c8ca9fba); first packet (SYN)
Wed Feb 28 14:04:00 2007; TCP; eth2; 60 bytes; from pear.hhs.harrisonburg.k12.va.us:ldap to 204.111.40.59:49168 (source MAC addr 0009b7135180); first packet (SYN)
Wed Feb 28 14:04:00 2007; TCP; eth1; 60 bytes; from pear.hhs.harrisonburg.k12.va.us:ldap to 204.111.40.59:49168 (source MAC addr 0080c8ca9fb9); first packet (SYN)
Wed Feb 28 14:04:00 2007; TCP; eth3; 60 bytes; from 213.200.109.24:http to 204.111.40.59:49173 (source MAC addr 000f352e81a2); first packet (SYN)
Wed Feb 28 14:04:01 2007; TCP; eth1; 60 bytes; from 213.200.109.24:http to 204.111.40.59:49173 (source MAC addr 0080c8ca9fb9); first packet (SYN)
Wed Feb 28 14:04:01 2007; TCP; eth1; 52 bytes; from 204.111.40.59:49173 to 213.200.109.24:http (source MAC addr 0003939a29b0); first packet
Wed Feb 28 14:04:01 2007; TCP; eth3; 52 bytes; from 204.111.40.59:49173 to 213.200.109.24:http (source MAC addr 0080c8ca9fbb); first packet
Wed Feb 28 14:04:02 2007; TCP; eth1; 46 bytes; from 204.111.40.59:49178 to www.apple.com:http (source MAC addr 0003939a29b0); first packet
Wed Feb 28 14:04:02 2007; TCP; eth3; 40 bytes; from 204.111.40.59:49178 to www.apple.com:http (source MAC addr 0080c8ca9fbb); first packet
Wed Feb 28 14:04:02 2007; TCP; eth1; 64 bytes; from 204.111.40.59:49179 to www.apple.com:http (source MAC addr 0003939a29b0); first packet (SYN)
Wed Feb 28 14:04:02 2007; TCP; eth3; 64 bytes; from 204.111.40.59:49179 to www.apple.com:http (source MAC addr 0080c8ca9fbb); first packet (SYN)
Wed Feb 28 14:04:02 2007; TCP; eth3; 1420 bytes; from www.apple.com:http to 204.111.40.59:49178 (source MAC addr 000f352e81a2); first packet
Wed Feb 28 14:04:02 2007; TCP; eth1; 1420 bytes; from www.apple.com:http to 204.111.40.59:49178 (source MAC addr 0080c8ca9fb9); first packet
Wed Feb 28 14:04:02 2007; TCP; eth3; 46 bytes; from www.apple.com:http to 204.111.40.59:49179 (source MAC addr 000f352e81a2); first packet (SYN)
Wed Feb 28 14:04:02 2007; TCP; eth1; 44 bytes; from www.apple.com:http to 204.111.40.59:49179 (source MAC addr 0080c8ca9fb9); first packet (SYN)
Wed Feb 28 14:04:02 2007; TCP; eth1; 152 bytes; from 204.111.40.59:49171 to pear.hhs.harrisonburg.k12.va.us:ldap (source MAC addr 0003939a29b0); first packet
Wed Feb 28 14:04:02 2007; TCP; eth2; 152 bytes; from 204.111.40.59:49171 to pear.hhs.harrisonburg.k12.va.us:ldap (source MAC addr 0080c8ca9fba); first packet
Wed Feb 28 14:04:02 2007; TCP; eth2; 52 bytes; from pear.hhs.harrisonburg.k12.va.us:ldap to 204.111.40.59:49171 (source MAC addr 0009b7135180); first packet
Wed Feb 

[squid-users] iptraffic software update nogood

2007-02-28 Thread dhottinger

Iptraf dump from the firewall showing a computer that doesn't connect.



Wed Feb 28 15:14:33 2007; TCP; eth1; 328 bytes; from  
newproxy.harrisonburg.k12.va.us:webcache to 10.40.7.184:49402 (source  
MAC addr 0013726661fe); first packet
Wed Feb 28 15:14:33 2007; TCP; eth2; 328 bytes; from  
swscan.apple.com:http to 10.40.7.184:49402 (source MAC addr  
0080c8ca9fba); first packet
Wed Feb 28 15:14:33 2007; TCP; eth2; 52 bytes; from 10.40.7.184:49402  
to swscan.apple.com:http (source MAC addr 0009b7135180); first packet
Wed Feb 28 15:14:33 2007; TCP; eth1; 52 bytes; from 10.40.7.184:49402  
to newproxy.harrisonburg.k12.va.us:webcache (source MAC addr  
0080c8ca9fb9); first packet



--
Dwayne Hottinger
Network Administrator
Harrisonburg City Public Schools



Re: [squid-users] apple software updates

2007-02-27 Thread dhottinger

Quoting Henrik Nordstrom [EMAIL PROTECTED]:


Mon 2007-02-26 at 13:11 -0500, [EMAIL PROTECTED] wrote:


 NSLocalizedDescription = XML parser error:\n\tEncountered  unexpected
EOF\nOld-style plist parser error:\n\tMalformed data byte  group at
line 1; invalid
hex\n;
 NSURL = http://swscan.apple.com/content/catalogs/index-1.sucatalog;
}
XML parser error:
 Encountered unexpected EOF
Old-style plist parser error:
 Malformed data byte group at line 1; invalid hex
Everything worked ok till a month or so ago.  No changes were made   
on my end.


A long shot, but perhaps something caused a gzipped version of the
update catalog to get cached somehow. The update agent doesn't like
that.

Squid-2.6 is sensitive to servers applying gzip compression wrongly. See
the broken_vary_encoding directive for a possible workaround if this is
the problem.

access.log data with log_mime_hdrs on can easily tell if this has
happened.  Regards
Henrik



I added log_mime_hdrs on and did a ./squid -k reconfigure.  What
should I be looking for in the access.log?  I'll look into
log_mime_hdrs.
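
For reference, a minimal squid.conf sketch of the two pieces Henrik mentions (broken_vary_encoding is a 2.6 directive); the ACL name and the Server header pattern are only placeholders, to be adjusted to whatever server is actually misbehaving:

# log full request/reply headers so Accept-Encoding vs. Content-Encoding
# mismatches show up in access.log
log_mime_hdrs on

# workaround for servers that gzip replies without a correct Vary header
# ("buggyserver" and the ^Apache pattern are placeholders, not from this setup)
acl buggyserver rep_header Server ^Apache
broken_vary_encoding allow buggyserver

A squid -k reconfigure should pick both up without a restart.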


thanks guys,

--
Dwayne Hottinger
Network Administrator
Harrisonburg City Public Schools



Re: [squid-users] apple software updates

2007-02-27 Thread dhottinger

Quoting Al Cripps [EMAIL PROTECTED]:


I'll give the 2.6 STABLE9 a try.
thanks,
al


Adrian Chadd wrote:


On Mon, Feb 26, 2007, Al Cripps wrote:

I have the same problem -- No solution, however.--  I have two   
different installations of squid.  For squid  2.5.STABLE6, I CAN   
do Apple updates with squid in place (transparent proxy)..
However for squid 2.6.STABLE1, I can NOT do Apple updates with   
squid being transparent proxy.  Again for OS 10.4.x   If I just   
take the squid 2.6 out of the loop via a change to iptables rule,   
then everything works fine (without proxy).  With 2.6 in place, it
fails every time.

al



And could you please re-try it with Squid-2.6.STABLE9? I've heard at least
one report of transparent interception working fine with MacOS/X updates
(and I run 10.4 at home and didn't notice any trouble whilst doing
transparent interception and software updates.)



Adrian



Al,
Let me know what you find out.  I don't think I can upgrade.  I'll have
to check and see what versions Smartfilter supports.



--
Dwayne Hottinger
Network Administrator
Harrisonburg City Public Schools



Re: [squid-users] apple software updates

2007-02-27 Thread dhottinger

Quoting Henrik Nordstrom [EMAIL PROTECTED]:


Mon 2007-02-26 at 13:11 -0500, [EMAIL PROTECTED] wrote:


 NSLocalizedDescription = XML parser error:\n\tEncountered  unexpected
EOF\nOld-style plist parser error:\n\tMalformed data byte  group at
line 1; invalid
hex\n;
 NSURL = http://swscan.apple.com/content/catalogs/index-1.sucatalog;
}
XML parser error:
 Encountered unexpected EOF
Old-style plist parser error:
 Malformed data byte group at line 1; invalid hex
Everything worked ok till a month or so ago.  No changes were made   
on my end.


A long shot, but perhaps something caused a gzipped version of the
update catalog to get cached somehow. The update agent doesn't like
that.

Squid-2.6 is sensitive to servers applying gzip compression wrongly. See
the broken_vary_encoding directive for a possible workaround if this is
the problem.

access.log data with log_mime_hdrs on can easily tell if this has
happened.

Regards
Henrik



Ok.
Here is the snippet from my logfile with log_mime_hdrs on.

1172582734.353 955773 10.40.7.184 TCP_MISS/200 91768 GET  
http://swscan.apple.com/content/catalogs/index-1.sucatalog -  
DIRECT/17.250.248.95 text/plain ALLOW Shareware/Freeware  
[User-Agent: CFNetwork/129.16\r\nAccept: */*\r\nAccept-Language:  
en\r\nAccept-Encoding: gzip, deflate\r\nX-Software-Update-Session-Id:  
U1jRJMoop0HfigV+\r\nConnection: keep-alive\r\nHost:  
swscan.apple.com\r\n] [HTTP/1.0 200 OK\r\nAccept-Ranges:  
bytes\r\nDate: Tue, 27 Feb 2007 13:09:38 GMT\r\nContent-Length:  
388177\r\nContent-Type: text/plain\r\nConnection:  
keep-alive\r\nServer: Apache/1.3.33 (Darwin)\r\nLast-Modified: Thu, 22  
Feb 2007 23:37:02 GMT\r\nETag: 1052102-5ec51-45de291e\r\nVia: 1.1  
netcache04 (NetCache NetApp/5.5R6)\r\n\r]
1172583132.390211 10.40.7.184 TCP_MISS/304 246 GET  
http://swscan.apple.com/content/catalogs/index-1.sucatalog -  
DIRECT/17.250.248.95 - ALLOW Shareware/Freeware [User-Agent:  
CFNetwork/129.16\r\nAccept: */*\r\nAccept-Language:  
en\r\nAccept-Encoding: gzip, deflate\r\nX-Software-Update-Session-Id:  
ai8CuNQop0FFIW5Z\r\nIf-Modified-Since: Thu, 22 Feb 2007 23:37:02  
GMT\r\nConnection: keep-alive\r\nHost: swscan.apple.com\r\n] [HTTP/1.0  
304 Not Modified\r\nDate: Tue, 27 Feb 2007 13:32:12 GMT\r\nConnection:  
keep-alive\r\nServer: Apache/1.3.33 (Darwin)\r\nETag:  
1052102-5ec51-45de291e\r\nVia: 1.1 netcache02 (NetCache  
NetApp/5.5R6)\r\n\r]


Not sure what I am looking at.  I see that the connection is allowed,  
and the initial connection is made to swscan.apple.com.  Anyone know  
how to decode all this?
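
For what it's worth, the native access.log fields in those two lines break down roughly like this:

1172582734.353        timestamp (seconds.milliseconds since the epoch)
955773                elapsed time for the request, in ms
10.40.7.184           client address
TCP_MISS/200          Squid result code / HTTP status
91768                 bytes delivered to the client
GET http://...        method and URL
-                     ident/user (unused here)
DIRECT/17.250.248.95  how and where the object was fetched
text/plain            content type

The ALLOW Shareware/Freeware part looks like the content-filter category tag, and the two bracketed blocks added by log_mime_hdrs are the raw request headers and reply headers.  In the first line the client offers Accept-Encoding: gzip, deflate and the reply comes back as text/plain with no Content-Encoding, which is the kind of detail the broken_vary_encoding hint is aimed at.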


--
Dwayne Hottinger
Network Administrator
Harrisonburg City Public Schools



Re: [squid-users] apple software updates

2007-02-27 Thread dhottinger

Quoting [EMAIL PROTECTED]:


Quoting Henrik Nordstrom [EMAIL PROTECTED]:


Mon 2007-02-26 at 13:11 -0500, [EMAIL PROTECTED] wrote:


NSLocalizedDescription = XML parser error:\n\tEncountered  unexpected
EOF\nOld-style plist parser error:\n\tMalformed data byte  group at
line 1; invalid
hex\n;
NSURL = http://swscan.apple.com/content/catalogs/index-1.sucatalog;
}
XML parser error:
Encountered unexpected EOF
Old-style plist parser error:
Malformed data byte group at line 1; invalid hex
Everything worked ok till a month or so ago.  No changes were made  
  on my end.


A long shot, but perhaps something caused a gzipped version of the
update catalog to get cached somehow. The update agent doesn't like
that.

Squid-2.6 is sensitive to servers applying gzip compression wrongly. See
the broken_vary_encoding directive for a possible workaround if this is
the problem.

access.log data with log_mime_hdrs on can easily tell if this has
happened.  Regards
Henrik



I added log_mime_hdrs on and did a ./squid -k reconfigure.  What
should I be looking for in the access.log?  I'll look into
log_mime_hdrs.

thanks guys,

--
Dwayne Hottinger
Network Administrator
Harrisonburg City Public Schools


Terribly sorry.  I am running Squid 2.5.STABLE13.  I was running 2.6,
but Smartfilter isn't 2.6 compliant, so I had to install 2.5.


thanks,

ddh


--
Dwayne Hottinger
Network Administrator
Harrisonburg City Public Schools



Re: [squid-users] apple software updates

2007-02-27 Thread dhottinger

Quoting Henrik Nordstrom [EMAIL PROTECTED]:


Tue 2007-02-27 at 09:00 -0500, [EMAIL PROTECTED] wrote:


Ok.
Here is the snippet from my logfile with log_mime_hdrs on.

1172582734.353 955773 10.40.7.184 TCP_MISS/200 91768 GET
http://swscan.apple.com/content/catalogs/index-1.sucatalog -
DIRECT/17.250.248.95 text/plain ALLOW Shareware/Freeware
[User-Agent: CFNetwork/129.16\r\nAccept: */*\r\nAccept-Language:
en\r\nAccept-Encoding: gzip, deflate\r\nX-Software-Update-Session-Id:
U1jRJMoop0HfigV+\r\nConnection: keep-alive\r\nHost:
swscan.apple.com\r\n] [HTTP/1.0 200 OK\r\nAccept-Ranges:
bytes\r\nDate: Tue, 27 Feb 2007 13:09:38 GMT\r\nContent-Length:
388177\r\nContent-Type: text/plain\r\nConnection:
keep-alive\r\nServer: Apache/1.3.33 (Darwin)\r\nLast-Modified: Thu, 22
Feb 2007 23:37:02 GMT\r\nETag: 1052102-5ec51-45de291e\r\nVia: 1.1
netcache04 (NetCache NetApp/5.5R6)\r\n\r]


Looks fine to me.


1172583132.390211 10.40.7.184 TCP_MISS/304 246 GET
http://swscan.apple.com/content/catalogs/index-1.sucatalog -
DIRECT/17.250.248.95 - ALLOW Shareware/Freeware [User-Agent:
CFNetwork/129.16\r\nAccept: */*\r\nAccept-Language:
en\r\nAccept-Encoding: gzip, deflate\r\nX-Software-Update-Session-Id:
ai8CuNQop0FFIW5Z\r\nIf-Modified-Since: Thu, 22 Feb 2007 23:37:02
GMT\r\nConnection: keep-alive\r\nHost: swscan.apple.com\r\n] [HTTP/1.0
304 Not Modified\r\nDate: Tue, 27 Feb 2007 13:32:12 GMT\r\nConnection:
keep-alive\r\nServer: Apache/1.3.33 (Darwin)\r\nETag:
1052102-5ec51-45de291e\r\nVia: 1.1 netcache02 (NetCache
NetApp/5.5R6)\r\n\r]


This too. Here the data was already cached by the client (10.40.7.184).

Somewhere in the updates catalog on the client you should find this
index-1.sucatalog file. Please verify that the checksums match what is
expected

file size:  388177
md5:421aa448ea6b5bb3b5c79dea468b3223
sha1:   7eba515ea094634d6e67164a73e35c5e8d91ae10


Note: the above is only valid until Apple updates the catalog.

Regards
Henrik



I can't find an index-1.sucatalog file anywhere on any of these Apples.
I've looked in all the directories I know about.  Google doesn't seem
to help.  Does anyone have any idea where OS X keeps these files?
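
In case it helps, one way to do the comparison Henrik suggests without hunting for a cached copy is to pull the catalog straight from Apple through the proxy and checksum it; the find command below is just a broad guess at where a local copy might live, not a known location:

# fetch the catalog and checksum it (run from a client behind the proxy)
curl -O http://swscan.apple.com/content/catalogs/index-1.sucatalog
md5 index-1.sucatalog
openssl sha1 index-1.sucatalog

# look for any locally cached copy (slow; searches the whole disk)
sudo find / -name '*.sucatalog' -print 2>/dev/null

If the checksums only differ when the file is fetched through Squid, that points at the cache rather than the client.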


--
Dwayne Hottinger
Network Administrator
Harrisonburg City Public Schools



Re: [squid-users] apple software updates

2007-02-27 Thread dhottinger

Quoting Henrik Nordstrom [EMAIL PROTECTED]:


Tue 2007-02-27 at 09:00 -0500, [EMAIL PROTECTED] wrote:


Ok.
Here is the snippet from my logfile with log_mime_hdrs on.

1172582734.353 955773 10.40.7.184 TCP_MISS/200 91768 GET
http://swscan.apple.com/content/catalogs/index-1.sucatalog -
DIRECT/17.250.248.95 text/plain ALLOW Shareware/Freeware
[User-Agent: CFNetwork/129.16\r\nAccept: */*\r\nAccept-Language:
en\r\nAccept-Encoding: gzip, deflate\r\nX-Software-Update-Session-Id:
U1jRJMoop0HfigV+\r\nConnection: keep-alive\r\nHost:
swscan.apple.com\r\n] [HTTP/1.0 200 OK\r\nAccept-Ranges:
bytes\r\nDate: Tue, 27 Feb 2007 13:09:38 GMT\r\nContent-Length:
388177\r\nContent-Type: text/plain\r\nConnection:
keep-alive\r\nServer: Apache/1.3.33 (Darwin)\r\nLast-Modified: Thu, 22
Feb 2007 23:37:02 GMT\r\nETag: 1052102-5ec51-45de291e\r\nVia: 1.1
netcache04 (NetCache NetApp/5.5R6)\r\n\r]


Looks fine to me.


1172583132.390211 10.40.7.184 TCP_MISS/304 246 GET
http://swscan.apple.com/content/catalogs/index-1.sucatalog -
DIRECT/17.250.248.95 - ALLOW Shareware/Freeware [User-Agent:
CFNetwork/129.16\r\nAccept: */*\r\nAccept-Language:
en\r\nAccept-Encoding: gzip, deflate\r\nX-Software-Update-Session-Id:
ai8CuNQop0FFIW5Z\r\nIf-Modified-Since: Thu, 22 Feb 2007 23:37:02
GMT\r\nConnection: keep-alive\r\nHost: swscan.apple.com\r\n] [HTTP/1.0
304 Not Modified\r\nDate: Tue, 27 Feb 2007 13:32:12 GMT\r\nConnection:
keep-alive\r\nServer: Apache/1.3.33 (Darwin)\r\nETag:
1052102-5ec51-45de291e\r\nVia: 1.1 netcache02 (NetCache
NetApp/5.5R6)\r\n\r]


This too. Here the data was already cached by the client (10.40.7.184).

Somewhere in the updates catalog on the client you should find this
index-1.sucatalog file. Please verify that the checksums match what is
expected

file size:  388177
md5:421aa448ea6b5bb3b5c79dea468b3223
sha1:   7eba515ea094634d6e67164a73e35c5e8d91ae10


Note: the above is only valid until Apple updates the catalog.

Regards
Henrik



In case it muddies things up further, here is a softwareupdate run on a 10.3.9
server (software updates work on the 10.3.9 Apples):
1172596300.775467 10.40.0.3 TCP_REFRESH_HIT/200 41683 GET  
http://swscan.apple.com/scanningpoints/scanningpointX.xml -  
DIRECT/17.250.248.95 text/xml ALLOW Shareware/Freeware [Host:  
swscan.apple.com\r\nConnection: keep-alive\r\nAccept:  
*/*\r\nAccept-Encoding: gzip, deflate\r\nAccept-Language: en-us\r\n]  
[HTTP/1.0 200 OK\r\nAccept-Ranges: bytes\r\nDate: Mon, 26 Feb 2007  
16:03:50 GMT\r\nContent-Length: 41335\r\nContent-Type:  
text/xml\r\nConnection: keep-alive\r\nServer: Apache/1.3.33  
(Darwin)\r\nLast-Modified: Wed, 21 Feb 2007 23:01:27 GMT\r\nETag:  
55bbca-a177-45dccf47\r\nVia: 1.1 netcache04 (NetCache  
NetApp/5.5R6)\r\n\r]
1172596305.205   1362 10.40.0.3 TCP_MISS/200 110648 POST  
http://swquery.apple.com/WebObjects/SoftwareUpdatesServer -  
DIRECT/17.250.248.93 text/xml ALLOW Business, Computing/Internet  
[Host: swquery.apple.com\r\nConnection: close\r\nAccept:  
*/*\r\nAccept-Encoding: gzip, deflate\r\nAccept-Language:  
en-us\r\nContent-Type:  
application/x-www-form-urlencoded\r\nContent-Length: 12165\r\n]  
[HTTP/1.0 200 Apple\r\nDate: Tue, 27 Feb 2007 17:11:43  
GMT\r\nContent-Length: 110257\r\nContent-Type:  
text/xml;charset=UTF-8\r\nExpires: Tue, 27 Feb 2007 17:11:43  
GMT\r\nCache-Control: private, no-cache, no-store, must-revalidate,  
max-age=0\r\nConnection: keep-alive\r\nServer: Apache/1.3.33  
(Darwin)\r\npragma: no-cache\r\nVia: 1.1 netcache05 (NetCache  
NetApp/5.5R6)\r\n\r]


Looks like they go to the same server, but different directories?  I
thought maybe the software update servers were different, which would
lead me to believe one was doing something funny with packets or
something.  But that isn't the case.


--
Dwayne Hottinger
Network Administrator
Harrisonburg City Public Schools



Re: [squid-users] apple software updates

2007-02-27 Thread dhottinger

Quoting Al Cripps [EMAIL PROTECTED]:


Adrian,
I installed squid-2.6.STABLE9 (build as of 1/24/07) and tested tonight.
 I found NO difference in the Apple update process for this version of
squid as opposed to squid 2.6.STABLE1, i.e., it failed by hanging in
retrieving the Apple update information.   As previously reported, on a
MAC running 10.4.x that has not had a recent update, the update fails.
If  you do an Apple update while squid is not in the loop, then put
squid in the loop, the update process works fine...but obviously there
are no files to download in this case.  This tends to imply, as Henrik
suggested in another post, that the uncompressing of some file causes
the failure.  Any suggestions?
thanks,
al cripps


Adrian Chadd wrote:


On Mon, Feb 26, 2007, Al Cripps wrote:

I have the same problem -- No solution, however.--  I have two   
different installations of squid.  For squid  2.5.STABLE6, I CAN   
do Apple updates with squid in place (transparent proxy)..
However for squid 2.6.STABLE1, I can NOT do Apple updates with   
squid being transparent proxy.  Again for OS 10.4.x   If I just   
take the squid 2.6 out of the loop via a change to iptables rule,   
then everything works fine (without proxy).  With 2.6 in place, it
fails every time.

al



And could you please re-try it with Squid-2.6.STABLE9? I've heard at least
one report of transparent interception working fine with MacOS/X updates
(and I run 10.4 at home and didn't notice any trouble whilst doing
transparent interception and software updates.)



Adrian


This is really starting to bother me.  I have been running the same
version of Squid since December with no problems with any of my Apples
updating.  The problems started just a few weeks ago.  I called Apple
tech support; the tech said if OS 10.3 works then 10.4 should, and
elevated me to the next level of support.  That was 2 weeks ago, with no
calls back.  I get the feeling that Apple changed something on their end
that has caused this not to work with Squid.  I'm going to look at my
firewall and see if I can let apple.com traffic go straight through
without getting redirected to my proxy server (a sketch of that follows
below).  Yes, I too can take Squid out of the loop and run updates; then
software updates work for a couple of days.  Sometimes there are updates
to download as well.  Is there any other testing we could do?
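
If the firewall doing the port-80 redirect is a Linux/iptables box, the bypass could look something like this; the interface name and the assumption that Apple's update servers all sit in 17.0.0.0/8 are mine, so check them against the real rule set first:

# let web traffic bound for Apple's network skip interception;
# eth0 and 17.0.0.0/8 are assumptions -- adjust to the actual LAN interface
# and insert this ABOVE the existing REDIRECT-to-Squid rule
iptables -t nat -I PREROUTING -i eth0 -p tcp --dport 80 -d 17.0.0.0/8 -j ACCEPT

# the usual interception rule then stays as-is, e.g.:
# iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 80 -j REDIRECT --to-port 3128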



--
Dwayne Hottinger
Network Administrator
Harrisonburg City Public Schools



[squid-users] apple software updates

2007-02-26 Thread dhottinger
I'm having issues connecting to Apple software updates when going
through my Squid proxy.  When the computer is plugged into the DMZ and
doesn't go through the proxy, it seems to work OK.  I get the following
message when running softwareupdate from the terminal on an Apple
running OS 10.4.x:


2007-02-26 09:02:11.127 softwareupdate[234]   
loader:didFailWithError:NSError XML parser

error:
Encountered unexpected EOF
Old-style plist parser error:
Malformed data byte group at line 1; invalid hex
 Domain=SUCatalogLoader Code=0 UserInfo={
NSLocalizedDescription = XML parser error:\n\tEncountered  unexpected
EOF\nOld-style plist parser error:\n\tMalformed data byte  group at  
line 1; invalid

hex\n;
NSURL = http://swscan.apple.com/content/catalogs/index-1.sucatalog;
}
XML parser error:
Encountered unexpected EOF
Old-style plist parser error:
Malformed data byte group at line 1; invalid hex
Everything worked OK till a month or so ago.  No changes were made on my end.
I can see the initial connection made to swscan.apple.com.  When the
computer does not go through the proxy, things work fine.  I'm not quite
sure what the issue could be.  I'm running Squid 2.6.STABLE3.  Anyone
have any ideas as to why this doesn't work?


thanks,

ddh

--
Dwayne Hottinger
Network Administrator
Harrisonburg City Public Schools



Re: [squid-users] squid stuck on old site

2007-02-01 Thread dhottinger

Local cache on your Workstation?

Quoting John Oliver [EMAIL PROTECTED]:


On Fri, Feb 02, 2007 at 07:53:27AM +0800, Adrian Chadd wrote:

On Thu, Feb 01, 2007, John Oliver wrote:
 We're rolling out a new web site.  Externally, everyone can see it just
 fine.  Internally, we see the old site.  I tried squidclient -m PURGE
 with every possible variation I could think of, but got mostly 404s.  I
 did get a couple of 200s, but keep seeing the old site.  I went into
 squid.conf and added an ACL to not cache our site at all, but... that's
 right, we still see the old site.

 I'm at a loss here.  What do i need to do to make squid completely
 forget about the old site?

Whats your Squid configuration look like?


I could paste it, I suppose, but it's pretty much what came out of the
box.  I certainly haven't added anything tricky to tell it to refuse to
refresh any given item. :-)


Which version of Squid?


squid-2.5.STABLE6.  Yes, I know it's old.  That's what Red Hat supports
for RHEL4.  But then, I figure that the ability to flush something out
of the cache has existed for a whole lot longer than the difference
between now and when this version was released.

I cleaned out the cache directory and used squid -z to rebuild it, and
was still seeing the same old site.  I have a hard time picturing how
that's even possible ;-)

--
***
* John Oliver http://www.john-oliver.net/ *
* *
***
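
For anyone else chasing a stale object on 2.5: squidclient -m PURGE only does anything once PURGE is allowed in squid.conf, and the URL has to match exactly as clients request it.  A minimal sketch, where the ACL name and the URL are just examples:

# squid.conf: allow PURGE from the cache box itself only
acl Purge method PURGE
http_access allow Purge localhost
http_access deny Purge

# then purge the exact URL (trailing slash and query string matter)
squidclient -m PURGE http://www.example.com/index.html

Anything that doesn't match exactly usually comes back as the 404s described above.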





--
Dwayne Hottinger
Network Administrator
Harrisonburg City Public Schools



Re: [squid-users] ditch squid or not?

2007-01-03 Thread dhottinger



Quoting Nick Duda [EMAIL PROTECTED]:



I've been fighting this fight for far too long without resolution. I've
emailed the list at times with no resolution to my problem. I'm now
faced with ditching Squid and SquidGuard as our corporate content
filtering product because it cannot do what we need. I'll offer the
problem one more time in hopes of getting an answer, or at least being
pointed in the right direction.

Things to note: SquidGuard is no longer in development (at least until
someone picks it up), so getting any support whatsoever isn't happening.

The setup:
I run Squid with SquidGuard in a branch office of about 400 employees.
This branch office only has 2 dedicated private line 1.5mb (bonded for
3mb total) to the corporate office, no internet access directly. All
internet traffic is routed over these private lines to the corporate
office then routed to the internet from there. In this branch office is
the Squid server. Only this server has the rights to go out to the
internet over the private lines, nothing else. If something in this
branch office isn't configured to use the Squid proxy server (which uses
NT authentication with the AD domain) its not going anywhere. Pretty
straight forward.

On the Squid server I run SquidGuard, and subscribe to use the
Blacklists from urlblacklist.com (which puts the files in a format
natively that squidguard likes but not what squid likes). I use pretty
much all the blacklist files in some way or another.

My Problem:

I want to block certain people/groups from using certain blacklists (the
ones from urlblacklist.com) while allowing others access to them. Based
on previous emails to the squid group and the fact that nobody answers
or knows anything about squidguard on the squidguard mailing list
(ironic), SquidGuard can't do what I want.

In Active Directory, I set up security groups with the people I want for
a SquidGuard rule. For instance, I have Active Directory groups called
"Can access webmail" and "Can access IM". In the first group I add all
the people that I want to access online webmail like Gmail, Yahoo Mail,
etc., and in the other, the people that can access instant messaging URLs.

On the proxy I run a script:

# Start Script #

#!/bin/sh

# dump AD group members into flat userlist files for squidGuard
DC='x.x.x.x'
EMAIL=/usr/local/squidGuard/db/users/EmailUsers
IM=/usr/local/squidGuard/db/users/IMUsers

EMAILemployees=`net rpc group members "Can access webmail" -S $DC -U username%password | awk '{print substr($0,14,10)}' > $EMAIL`
IMemployees=`net rpc group members "Can access IM" -S $DC -U username%password | awk '{print substr($0,14,10)}' > $IM`

chown squid.squid /usr/local/squidGuard/db/users/* -Rf

/usr/local/squid/sbin/squid -k reconfigure

# End Script #

This script runs every x minutes, and the output is a file with a list of
users in the format of first initial + last name (e.g. jdoh).

In the squidguard.conf file I setup something like this:

source EmailUsers {
   userlist users/EmailUsers
}

source IMUsers {
   userlist users/IMUsers
}

EmailUsers {
pass webmail mail !ads !adult !aggressive !antispyware !artnudes
!banking !beerliquorinfo !beerliquorsale !cellphones !chat !childcare
!clothing !culinary !customblocked !dating !dialers !drugs !ecommerce
!frencheducation !gambling !government !hacking !homerepair !jewelry
!jobsearch !kidstimewasting !naturism !onlineauctions !onlinegames
!onlinepayment !personalfinance !phishing !porn !proxy !radio !religion
!ringtones !sexuality !spyware !vacation !violence !virusinfected !warez
!weapons all
redirect
http://localhost/errors/aclerror.php?clientaddr=%a&clientname=%n&clientuser=%i&clientgroup=%s&url=%u&targetgroup=%t
}

IMUsers {
pass instantmessaging !ads !adult !aggressive !antispyware
!artnudes !banking !beerliquorinfo !beerliquorsale !cellphones !chat
!childcare !clothing !culinary !customblocked !dating !dialers !drugs
!ecommerce !frencheducation !gambling !government !hacking !homerepair
!jewelry !jobsearch !kidstimewasting !mail !naturism !onlineauctions
!onlinegames !onlinepayment !personalfinance !phishing !porn !proxy
!radio !religion !ringtones !sexuality !spyware !vacation !violence
!virusinfected !warez !weapons !webmail all
redirect
http://localhost/errors/aclerror.php?clientaddr=%a&clientname=%n&clientuser=%i&clientgroup=%s&url=%u&targetgroup=%t
}

default {
pass !ads !adult !aggressive !antispyware !artnudes !banking
!beerliquorinfo !beerliquorsale !cellphones !chat !childcare !clothing
!culinary !customblocked !dating !dialers !drugs !ecommerce
!frencheducation !gambling !government !hacking !homerepair
!instantmessaging !jewelry !jobsearch !kidstimewasting !mail !naturism
!onlineauctions !onlinegames !onlinepayment !personalfinance !phishing
!porn !proxy !radio !religion !ringtones !sexuality !spyware !vacation
!violence !virusinfected !warez !weapons !webmail all
redirect
http://localhost/errors/aclerror.php?clientaddr=%a&clientname=%n&clientu

Re: [squid-users] Squid cannot start because it can't open 'on' for writing...

2006-12-07 Thread dhottinger

Quoting Marcello Romani [EMAIL PROTECTED]:


Hi,
I'm having a strange problem with squid.
When I stop and restart the program, it fails to start; cache.log says
it can't open 'on' for writing, and suggests checking that the parent
directory is writable by the user squid.

I don't think this is a file permission issue, because even changing
777 permission to the entire /var/cache dir doesn't solve the problem.
I also tried lsof to check for locks on that file, but didn't find anything.

Yesterday I solved the problem by deleting the entire cache directory
and upgrading squid (now I'm running 2.6.STABLE4).

Googling around with strings from the message in the cache.log didn't
give much help, which makes me think I'm nearly the only one having
this issue.

The relevant lines from cache.log are:

- 8 
Squid Cache (Version 2.6.STABLE3): Terminated abnormally.
CPU Usage: 0.008 seconds = 0.008 user + 0.000 sys
Maximum Resident Size: 0 KB
Page faults with physical i/o: 0
2006/12/06 13:43:13| Starting Squid Cache version 2.6.STABLE3 for
i686-pc-linux-gnu...
2006/12/06 13:43:13| Process ID 15637
2006/12/06 13:43:13| With 1024 file descriptors available
2006/12/06 13:43:13| Using epoll for the IO loop
2006/12/06 13:43:13| Performing DNS Tests...
2006/12/06 13:43:13| Successful DNS name lookup tests...
2006/12/06 13:43:13| DNS Socket created at 0.0.0.0, port 33030, FD 5
2006/12/06 13:43:13| Adding nameserver 192.9.200.200 from /etc/resolv.conf
2006/12/06 13:43:13| Adding nameserver 151.99.125.1 from /etc/resolv.conf
2006/12/06 13:43:13| Adding nameserver 151.99.125.2 from /etc/resolv.conf
2006/12/06 13:43:13| Adding nameserver 151.99.125.3 from /etc/resolv.conf
2006/12/06 13:43:13| Adding nameserver 212.216.172.222 from /etc/resolv.conf
2006/12/06 13:43:13| Adding nameserver 212.216.112.112 from /etc/resolv.conf
FATAL: Cannot open 'on' for writing.
The parent directory must be writeable by the
user 'squid', which is the cache_effective_user
set in squid.conf.
Squid Cache (Version 2.6.STABLE3): Terminated abnormally.
CPU Usage: 0.008 seconds = 0.008 user + 0.000 sys
Maximum Resident Size: 0 KB
Page faults with physical i/o: 0
- 8 ---


Thanks in advance.


--
Marcello Romani
Responsabile IT
Ottotecnica s.r.l.
http://www.ottotecnica.com
Does your squid user own the cache directory?  Or it sounds like you
may have an error in your .conf file.  "FATAL: Cannot open 'on' for
writing" sounds like Squid is trying to open a file literally named
'on'.  Don't suppose you made any changes to squid prior to this?
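
One way that message can come up (just a guess on my part, not something confirmed here) is a directive that expects a filename being handed the literal word 'on', for example:

# hypothetical mistake: a path-taking directive given an on/off value
cache_store_log on

# what 2.6 expects: a real path, or "none" to disable the store log
cache_store_log /var/log/squid/store.log

Grepping squid.conf for a stray ' on' after a log or path directive would confirm or rule that out quickly.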




--
Dwayne Hottinger
Network Administrator
Harrisonburg City Public Schools



Re: [squid-users] LDAP authentication?

2006-12-02 Thread dhottinger
Sarg shows access by IP.  Is there no distribution for OS X 10.4?  If you
have Xcode you may be able to coax it into working.


Quoting Jaime [EMAIL PROTECTED]:


I apologize for what might be a FAQ, but I can't find the details that
I need in order to know what I'm doing.  :)

I have a perfectly functioning squid-2.5/DansGuardian/FreeBSD based
transparent proxy running at the public school district that I work in.
 What I'd like to do is know who is accessing each URL.  Users come
from different IPs all the time.  Therefore, I think that this requires
authentication, which in turn requires a switch from transparent to
manual proxying.  What I can't figure out is how to set up this
authentication with the LDAP server that I'm already running.  It's a
Mac OS X Server 10.4 system, if that helps at all.

I found this article in the past:

http://www.afp548.com/article.php?story=20041207040115940query=squid

The problem is that I could never get pam_auth to work when invoked
from the command line.  So I'm pretty sure that something wasn't done
right, but I don't know what.  I realize that the article is written as
if I was compiling squid on OSX and I'm using FreeBSD, so I tried to
make some course corrections as I went.  I may have screwed those up.

Any ideas?  Is this article even harder than I need?

Thanks in advance,
Jaime
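
For the LDAP piece specifically, the squid_ldap_auth helper that ships with Squid usually covers this without pam_auth; a minimal sketch, where the helper path, base DN, filter attribute and server name are placeholders to be replaced with whatever the OS X Server directory actually uses:

# squid.conf
auth_param basic program /usr/local/squid/libexec/squid_ldap_auth -b "cn=users,dc=example,dc=org" -f "uid=%s" ldap.example.org
auth_param basic children 5
auth_param basic realm Squid proxy
auth_param basic credentialsttl 2 hours

acl ldap_users proxy_auth REQUIRED
http_access allow ldap_users
http_access deny all

As noted, proxy_auth only works when browsers are configured to use the proxy, so it goes hand in hand with the switch away from transparent mode.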




--
Dwayne Hottinger
Network Administrator
Harrisonburg City Public Schools



Re: [squid-users] Squid 2.6 shutdown by itself

2006-11-14 Thread dhottinger
Squid has a 2 GB limit on logfiles.  If a log grows beyond that, it will
shut down.  I rotate my logfiles every night with squid -k rotate.
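
A minimal sketch of that nightly rotation, assuming a stock /usr/local/squid install path; logfile_rotate just controls how many old copies Squid keeps:

# squid.conf
logfile_rotate 10

# root crontab: rotate access.log/store.log/cache.log every night at midnight
0 0 * * * /usr/local/squid/sbin/squid -k rotate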


Quoting Jason Gauthier [EMAIL PROTECTED]:


I believe.  I got a report that the internet was down, so I checked
squid, and it was not responding to connections.  I started it, and
checked cache.log.  I see the results below.  I would appreciate some
insight as to what may have caused this to occur.

Thanks!

Jason
-=-=-=-=-=-=-
Log file:

2006/11/14 06:33:25| storeDirWriteCleanLogs: Starting...
2006/11/14 06:33:25| WARNING: Closing open FD   39
2006/11/14 06:33:25| commSetEvents: epoll_ctl(EPOLL_CTL_DEL): failed on
fd=39: (1) Operatio
n not permitted
2006/11/14 06:33:25| WARNING: Closing open FD   40
2006/11/14 06:33:25| commSetEvents: epoll_ctl(EPOLL_CTL_DEL): failed on
fd=40: (1) Operatio
n not permitted
2006/11/14 06:33:26| 65536 entries written so far.
2006/11/14 06:33:26|131072 entries written so far.
2006/11/14 06:33:26|196608 entries written so far.
2006/11/14 06:33:26|262144 entries written so far.
2006/11/14 06:33:26|327680 entries written so far.
2006/11/14 06:33:26|393216 entries written so far.
2006/11/14 06:33:26|458752 entries written so far.
2006/11/14 06:33:26|524288 entries written so far.
2006/11/14 06:33:26|589824 entries written so far.
2006/11/14 06:33:26|655360 entries written so far.
2006/11/14 06:33:26|720896 entries written so far.
2006/11/14 06:33:26|786432 entries written so far.
2006/11/14 06:33:26|851968 entries written so far.
2006/11/14 06:33:26|917504 entries written so far.
2006/11/14 06:33:26|983040 entries written so far.
2006/11/14 06:33:26|   1048576 entries written so far.
2006/11/14 06:33:26|   1114112 entries written so far.
2006/11/14 06:33:26|   Finished.  Wrote 1117052 entries.
2006/11/14 06:33:26|   Took 0.9 seconds (1309073.4 entries/sec).
FATAL: logfileWrite: /var/log/squid/store.log: (11) Resource temporarily
unavailable

Squid Cache (Version 2.6.STABLE4): Terminated abnormally.
CPU Usage: 699.184 seconds = 418.406 user + 280.778 sys
Maximum Resident Size: 0 KB
Page faults with physical i/o: 7
Memory usage for squid via mallinfo():
total space in arena:  132024 KB
Ordinary blocks:   130609 KB   1169 blks
Small blocks:   0 KB  0 blks
Holding blocks:   748 KB  2 blks
Free Small blocks:  0 KB
Free Ordinary blocks:1414 KB
Total in use:  131357 KB 99%
Total free:  1414 KB 1%
2006/11/14 06:33:29| Starting Squid Cache version 2.6.STABLE4 for
i686-pc-linux-gnu...
2006/11/14 06:33:29| Process ID 10023
2006/11/14 06:33:29| With 1024 file descriptors available
2006/11/14 06:33:29| Using epoll for the IO loop
2006/11/14 06:33:29| Performing DNS Tests...
2006/11/14 06:33:29| Successful DNS name lookup tests...
2006/11/14 06:33:29| DNS Socket created at 0.0.0.0, port 54203, FD 6
2006/11/14 06:33:29| Adding domain ctg.com from /etc/resolv.conf
2006/11/14 06:33:29| Adding nameserver 192.168.74.8 from
/etc/resolv.conf
2006/11/14 06:33:29| Adding nameserver 192.168.74.9 from
/etc/resolv.conf
2006/11/14 06:33:29| helperStatefulOpenServers: Starting 15 'ntlm_auth'
processes
2006/11/14 06:33:30| helperOpenServers: Starting 5 'ntlm_auth' processes
2006/11/14 06:33:30| Unlinkd pipe opened on FD 31
2006/11/14 06:33:30| Swap maxSize 2048 KB, estimated 1575384 objects
2006/11/14 06:33:30| Target number of buckets: 78769
2006/11/14 06:33:30| Using 131072 Store buckets
2006/11/14 06:33:30| Max Mem  size: 16384 KB
2006/11/14 06:33:30| Max Swap size: 2048 KB
2006/11/14 06:33:30| Rebuilding storage in /var/squid1/cache1 (CLEAN)
2006/11/14 06:33:30| Rebuilding storage in /var/squid2/cache1 (CLEAN)
2006/11/14 06:33:30| Rebuilding storage in /var/squid1/cache2 (CLEAN)
2006/11/14 06:33:30| Rebuilding storage in /var/squid2/cache2 (CLEAN)
2006/11/14 06:33:30| Using Least Load store dir selection
2006/11/14 06:33:30| chdir: /opt/squid/var/cache: (2) No such file or
directory
2006/11/14 06:33:30| Current Directory is /opt/squid/etc
2006/11/14 06:33:30| Loaded Icons.
2006/11/14 06:33:30| Accepting proxy HTTP connections at 0.0.0.0, port
3128, FD 39.
2006/11/14 06:33:30| Accepting ICP messages at 0.0.0.0, port 3130, FD
40.
2006/11/14 06:33:30| Accepting SNMP messages on port 3401, FD 41.
2006/11/14 06:33:30| WCCP Disabled.
2006/11/14 06:33:30| R2006/11/14 07:56:12| Preparing for shutdown after
95958 requests





--
Dwayne Hottinger
Network Administrator
Harrisonburg City Public Schools