RE: [squid-users] Prevent squid from caching a page

2009-11-13 Thread Mike Marchywka










> Date: Fri, 13 Nov 2009 11:17:54 -0500
> From: mra...@lsd.k12.mi.us
> To: squid-users@squid-cache.org
> Subject: Re: [squid-users] Prevent squid from caching a page
>
> Landy Landy wrote:
>> Is there a way to tell squid not to cache a page or have it ignore it? 
>> Theres a page I cannot view from our lan: www.dgii.gob.do. I don't know why 
>> but, I'm suspecting there must be something with squid since, I noticed I 
>> receive the page's title and it disappears.
>>
>> Please help with this thanks.
>>
>>
>>
>
> Check out the no_cache directive.

I downloaded and installed the ad zapper. How would you rate
that as a learning tool for stuff like this?
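
For reference, a minimal sketch of the no_cache suggestion above (newer squid
spells the directive just "cache"), assuming www.dgii.gob.do is the site you
want to exempt from caching:

  acl dgii dstdomain .dgii.gob.do
  no_cache deny dgii
  # on squid 2.6/3.x the equivalent is: cache deny dgii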

>
>
> --
> Mike Rambo





Note: hotmail is now unusable for TEXT, I am moving to marchy...@gmail.com or 
also use
marchy...@yahoo.com. Thanks.

Mike Marchywka
586 Saint James Walk
Marietta GA 30067-7165
415-264-8477 (w)<- use this
404-788-1216 (C)<- leave message
989-348-4796 (P)<- emergency only
marchy...@hotmail.com
Note: If I am asking for free stuff, I normally use for hobby/non-profit
information but may use in investment forums, public and private.
Please indicate any concerns if applicable.





  

RE: [squid-users] Looking for web usage reporting solution

2009-11-13 Thread Mike Marchywka







> From:
> To: squid-users@squid-cache.org
> Date: Fri, 13 Nov 2009 09:31:35 -0700
> Subject: [squid-users] Looking for web usage reporting solution
>
> I am looking for a web usage reporting solution that can run via sniffing or 
> from a mirror port on a switch. I envision this solution would simply log 
> each URL request it sees and allow reports to be generated on web sites that 
> internal users have gone to. I've searched high and low, but cannot find a 
> "ready-made" solution, so I'm looking to put it together myself.
>
> Most people/posts suggest using squid/squidgard/dan's guardian, but it 
> appears to me that is only an inline solution, and I would prefer a sniffing 
> solution for safety (if machine crashes, it doesn't take down Internet). In 
> that sense, it would work a lot like websense, but without the blocking, only 
> reporting.
>
> From a high-level pseudo-code standpoint, it would simply sniff all traffic, 
> and when it sees a packet requesting a webpage, it parses it and dumps these 
> results into a database:
>
> -Date
> -Time
> -Source IP
> -Dest IP
> -URL requested
> -FQDN portion of web request - IE: if request was for
> http://www.microsoft.com/windows/server/2003, it records only
> www.microsoft.com here
> -domain portion of web request - only microsoft.com in above example
>
> Using this data, I can then produce reports for the client on who went where 
> when Personally, I thought this would be a great program for open source, 
> but I can't find anything like this already out there!!! It seems like kind 
> of a mix between Squid, NTOP and Snort...
>

What's wrong with running a bash script on the squid logs?
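
A rough sketch, assuming the default native access.log format and location;
even a one-liner gives per-client, per-URL counts:

  #!/bin/bash
  # field 3 is the client IP, field 7 the URL in squid's native log format
  awk '{print $3, $7}' /var/log/squid/access.log | sort | uniq -c | sort -rn | head -50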





> Thanks for any thoughts on this project!
  


RE: [squid-users] Prevent squid from caching a page

2009-11-13 Thread Mike Marchywka



>> I downloaded and installed the ad zapper.How would you rate
>> that as a learning tool for stuff like this?
>>
>
> I would rate adzapper as irrelevant to issues receiving the full body of
> a reply object.

Sure, but in terms of understanding the related squid configuration
parameters? Are there any particularly good deployments of squid with open source
support code that illustrate squid features (let's say you had a way to redirect
based on user agent or something, and you had to regenerate the corrected
version for each request and change the cacheability headers etc.)?
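
The kind of thing I have in mind, as a rough sketch (the acl name and rewrite
helper path are made up; older squid calls the directive redirect_program):

  acl oldPhones browser -i blackberry
  url_rewrite_program /usr/local/bin/ua_rewrite.pl
  url_rewrite_access allow oldPhones
  url_rewrite_access deny all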



>
> Amos
> --
> Please be using
> Current Stable Squid 2.7.STABLE7 or 3.0.STABLE20
> Current Beta Squid 3.1.0.14
  

RE: [squid-users] Configuration problems attempting to cache Google Earth/dynamic content

2009-11-18 Thread Mike Marchywka





> Date: Wed, 18 Nov 2009 12:02:40 -0600
> From:
> To: squid-users@squid-cache.org
> Subject: [squid-users] Configuration problems attempting to cache Google 
> Earth/dynamic content
>
> I am trying to set up a server that is running SUSE SLES 11 as a Squid
> Proxy to help cache Google Earth content in a low-bandwidth
> environment. I have tried following the steps in this article:
> http://wiki.squid-cache.org/Features/StoreUrlRewrite?action=recall&rev=7
> but I am not having any luck with getting it to work. In fact, when I
> try those steps, Squid will automatically stop about 15 seconds after
> start. The system is running version 2.7 Stable, as installed by
> YAST.

Why does it stop? There should be some logs to check, and if you invoke
it from the command line in the foreground you can get quick feedback.
Do you want it to cache in contradiction to the server's response headers?
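
Something like this usually tells you quickly why it dies (a sketch; adjust
the conf path for your install):

  squid -k parse -f /etc/squid/squid.conf   # check the config for syntax errors
  squid -N -d1 -f /etc/squid/squid.conf     # run in the foreground, debug output to stderr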

>
> Anyone who could offer some help or a configuration file that would
> work with this?
  

RE: [squid-users] Squid3 reverse proxy & Failed to select source strange errors

2009-11-23 Thread Mike Marchywka










> Date: Tue, 24 Nov 2009 02:08:01 +1300
> From: squ...@treenet.co.nz
> To: squid-users@squid-cache.org
> Subject: Re: [squid-users] Squid3 reverse proxy & Failed to select source 
> strange errors
>
> David B. wrote:
>> Hi Squid users,
>>
>> We're using squid3 as a reverse proxy on several boxes and he's working
>> quite well.
>>
>> Squid configuration is quite simple :
>> cache_peer X.X.X.X parent 80 0 no-query originserver no-digest
>> cache_peer Y.Y.Y.Y parent 80 0 no-query originserver no-digest
>>
>> cache_peer_domain X.X.X.X static.myhost1.com
>> cache_peer_domain Y.Y.Y.Y static.myhost2.com
>>
>> So squid deliver static content and ONLY get missing files from backend
>> with cache_peer.
>>
>> But sometimes (several times a day), i got some stange errors from
>> cache.log.
>> It seems that squid is trying to contact servers that are not in
>> cache_peer list with domain name that I should not handle any request !.
>>
>> Exemple :
>> 2009/11/23 08:36:28| Failed to select source for
>> 'http://img43.imageshack.us/img43/416/greysanatomypromotional.jpg'
>> 2009/11/23 08:36:28| always_direct = 0
>> 2009/11/23 08:36:28| never_direct = 0
>> 2009/11/23 08:36:28| timedout = 0
>> [snip]
>> 2009/11/23 11:02:26| Failed to select source for
>> 'http://pagead2.googlesyndication.com/pagead/show_ads.js'
>> 2009/11/23 11:02:26| always_direct = 0
>> 2009/11/23 11:02:26| never_direct = 0
>> 2009/11/23 11:02:26| timedout = 0
>>
>> I'm not imageshack or google. :)
>>
>
> Normal website attacks.
>
> One of the benefits of using Squid is to prevent these resource wasters
> getting near the backend processors. "Failed to select source" is good
> news.
>
> You might also want to occasionally scan the access.log to see if any
> foreign requests do get through (2xx or 3xx status). If any do you have
> a problem, otherwise everything is fine.

I think we had ours up for maybe 1 day before it was discovered.
We just added our own headers for authentication. Not sure this
is always an option, but if you can restrict by IP or UA or something
that may be the easiest thing to do.


>
>
> NP: If you see many of these attacks (or a few regularly) and can log
> the sources there are services around for back-tracking and killing off
> the attack sources. I administrate one such and am always seeking
> reliable data sources.
>
> Amos
> --
> Please be using
> Current Stable Squid 2.7.STABLE7 or 3.0.STABLE20
> Current Beta Squid 3.1.0.14
  

RE: [squid-users] Squid3 reverse proxy & Failed to select source strange errors

2009-11-23 Thread Mike Marchywka










> Date: Mon, 23 Nov 2009 16:32:29 +0100
> From: haazel...@gmail.com
> To: marchy...@hotmail.com
> CC: squid-users@squid-cache.org
> Subject: Re: [squid-users] Squid3 reverse proxy & Failed to select source 
> strange errors
>
> Hi mike,
>
> Mike Marchywka a écrit :
>> [snip]
>>> Normal website attacks.
>>>
>>> One of the benefits of using Squid is to prevent these resource wasters
>>> getting near the backend processors. "Failed to select source" is good
>>> news.
>>>
>>> You might also want to occasionally scan the access.log to see if any
>>> foreign requests do get through (2xx or 3xx status). If any do you have
>>> a problem, otherwise everything is fine.
>>>
>>
>> I think we had our's up for maybe 1 day before it was discovered.
>> We just added our own headers for authentication. Not sure this
>> is always an option but if you can restrict by IP or UA or something
>> that may be the easiest thing to do.
>>
> Sure, this could be great, but this will not help us I think.
> We're using squid as a reverse proxy, so anyone can tell squid : "please
> give me this static content or this image". I can't see how can i
> restrict this. :)
>

I haven't given this much thought, but if you are just storing things
that go with other content from your server, what about cookies? If you
only want to serve resources needed for your own pages, then set
some kind of cookie or other header like Referer and use that for a squid
validation. If the request doesn't have the page-specific header, don't return
anything.
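
A rough sketch of what I mean, with made-up names (Referer can be absent or
spoofed, so treat this as a filter rather than real security):

  acl ourPages req_header Referer -i myhost1\.com
  http_access allow ourPages
  http_access deny all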





> Regards
> David.
  

RE: [squid-users] is it bad to constantly reload squid.conf

2009-11-25 Thread Mike Marchywka










> Date: Thu, 26 Nov 2009 01:36:14 +1300
> From: squid
> CC: squid-users@squid-cache.org
> Subject: Re: [squid-users] is it bad to constantly reload squid.conf
>
> Jeff Peng wrote:
>> You will 'squid -k reconfig' for reloading the new config file.
>> This is safe enough form what I checked in the sources years ago.
>> But if you reconfig it too frequently, I don't know the result.
>>
>> Regards.
>
> Squid will not accept new requests for the period it takes to reload the
> config, restart the helpers, and write the cache index to disk and read
> it back in again.
>
> Amos
>


I guess the question might be, "are there known or possible memory leaks
or other ways this could generate garbage?"

I use squid under cygwin and usually just stop/start with cygrunsrv.




Note: hotmail is now unusable for TEXT, I am moving to marchy...@gmail.com or 
also use
marchy...@yahoo.com. Thanks. NEVER USE HOTMAIL, EVER

Mike Marchywka
586 Saint James Walk
Marietta GA 30067-7165
415-264-8477 (w)<- use this
404-788-1216 (C)<- leave message
989-348-4796 (P)<- emergency only
marchy...@hotmail.com
Note: If I am asking for free stuff, I normally use for hobby/non-profit
information but may use in investment forums, public and private.
Please indicate any concerns if applicable.


  

RE: [squid-users] coredumps on 2.7

2009-11-27 Thread Mike Marchywka











> Date: Fri, 27 Nov 2009 02:03:06 -0800
> From: jd...@yahoo.com
> To: squid-users@squid-cache.org
> Subject: Re: [squid-users] coredumps on 2.7
>
> From: Quin Guin 
>> I am running 2.7-STABALE6 on many squid servers and just recently in the last
>> few days I am seeing a lot of coredumps. I have most of the coredumps still 
>> and
>> I would like to understand what happened?
>> I did search the mailing list and I used gdb to generate a stack trace but it
>> didn't give ME a lot of useful information.
>> [xxx...@cach2 cache]# gdb squid core.31033
>> GNU gdb Red Hat Linux (6.3.0.0-1.132.EL4rh)
>> Copyright 2004 Free Software Foundation, Inc.
>> GDB is free software, covered by the GNU General Public License, and you are
>> welcome to change it and/or distribute copies of it under certain conditions.
>> Type "show copying" to see the conditions.
>> There is absolutely no warranty for GDB. Type "show warranty" for details.
>> This GDB was configured as "i386-redhat-linux-gnu"...squid: No such file or
>> directory.
>> Core was generated by `(squid)'.
>> Program terminated with signal 6, Aborted.
>> #0 0x0054a7a2 in ?? ()
>> (gdb) where
>> #0 0x0054a7a2 in ?? ()
>> #1 0x0058f7a5 in ?? ()
>> #2 0x in ?? ()
>> (gdb)
>> So I was wondering if someone could point me to where I can find more
>> information on interpreting the coredumps.
>
> I think you would have to compile squid with debuging to see something 
> usefull... no?
> "squid: No such file or directory."
> If it is squid that says this, do you see anything in the logs about a 
> missing file or directory?
> Maybe try an strace and check what is the last access attempt before it 
> coredumps.

Depending on your situation, it may help to back up and do
some simple things, especially if you can't find any suspects that would cause
a problem to start a few days ago.
When I was just starting, I tried to invoke it in the foreground from the
command line, squid -f conf, and it either told me more or less what
was wrong or there was a log file telling me what it didn't like. (When
it works in the foreground but then mysteriously dies as a service, in my
case it turned out to be things like ownership or permissions of the swap
or log dirs.) I'm not sure how long it takes to produce your dump, but
invoking it in the foreground may be an option for testing once you have suspects or
can get failures in an hour or so.
A gradually worsening problem suggests something is getting used up;
it's hard to know what else you may have working with squid. This could be anything:
is someone changing permissions on directories you use, etc.
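
On the trace itself, a sketch of what usually makes it readable (paths and
configure options are examples and depend on how you build): rebuild with
symbols, then point gdb at the actual binary rather than letting it guess:

  ./configure CFLAGS="-g -O0" ...   # plus your usual options
  make && make install
  gdb /usr/local/squid/sbin/squid core.31033
  (gdb) bt full

The "squid: No such file or directory" line in your session suggests gdb never
found the binary, which alone would leave the frames as '??'.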







>
> JD
>
>
>
  

RE: [squid-users] Squid HTTP Headers

2009-11-27 Thread Mike Marchywka










> Date: Fri, 27 Nov 2009 09:08:17 -0600
> From: jhodges
> To: squid-users@squid-cache.org
> Subject: [squid-users] Squid HTTP Headers
>
> Hello and thanks for taking the time to read this post. All feedback is 
> welcome.
>
> I work for a cell phone company and we use squid along with some fancy NAT 
> and PAT in order to offer various proxying services for cell phone web 
> browsers. Thus far we have been quite successful. If anyone is curious of our 
> set up, just ask.
>
> I do have a challenge that I have been unable to overcome and I am hoping the 
> user group can help solve it.
>
> The "Web Content Provider" that serves ringtones, games, etc to our 
> subscribers requires that we have an http header inserted as the users surf 
> their site. The header is x-msisdn. The value should be the subscriber's 
> phone number (mdn). We have been unsuccessful at finding a good/solid 
> solution for retrieving the mdn and inserting an http header. The mdn (in our 
> case) is stored in a database, so the information is easily available. 
> However, we do not know how to make squid request the mdn and inject the http 
> request header.


How does this work without squid? We are using squid to proxy our own
custom browser (only on BB, since it uses a lot of RIM browser related classes)
and routinely insert various headers and send back browsing info.
We use squid to offload things like DNS lookups that can slow down the phone.

It sounds as if you are asking "how do we get the browser on the phone to
add an id header using squid?" Will cookies work for you, or don't you have
anyone to set the cookie?



>
> Here are some details:
> 1. The mdn is stored in a database with an association to the ip address the 
> subscriber was assigned. Optionally this same information is stored in a flat 
> text file (radius accounting detail log)
> 2. We only want to inject this http header for certain URLs.
>
> If anyone can offer suggestions on how we can make this happen, that would be 
> greatly appreciated.

What do you have now? Does it run without squid?


>
> Thanks in advance for your time and feedback.
>
> Regards,
> Jason P Hodges
>




Note: hotmail is now unusable for TEXT, I am moving to marchy...@gmail.com or 
also use
marchy...@yahoo.com. Thanks. NEVER USE HOTMAIL, EVER

Mike Marchywka
586 Saint James Walk
Marietta GA 30067-7165
415-264-8477 (w)<- use this
404-788-1216 (C)<- leave message
989-348-4796 (P)<- emergency only
marchy...@hotmail.com
Note: If I am asking for free stuff, I normally use for hobby/non-profit
information but may use in investment forums, public and private.
Please indicate any concerns if applicable.


>
  

RE: [squid-users] Squid HTTP Headers

2009-11-27 Thread Mike Marchywka








> Subject: RE: [squid-users] Squid HTTP Headers
> Date: Fri, 27 Nov 2009 09:46:12 -0600
> From: jhodges
> To: marchy...@hotmail.com; squid-users@squid-cache.org
>
> Today we are paying another company to proxy for our users. They use a
> commercial product to accomplish the http header injection.
> Unfortunately they charge us a ridiculous amount of money to do what
> seems like a simple job.
>
> The header injection needs to occur at the server/gateway/proxy.
> Cookies will not work for this.

Well, if the phone number is available on the phone, it could
be injected there if your browser app does this for you. Would that
violate a privacy or security paradigm? It seems you need to
verify the information against a DB maintained by the carrier who
assigned the IP and has the customer billing info.



>
> I understand the x-msisdn header to be common in the "mobile world". It
> is used to identify a subscriber in order to bill them without having to
> ask them who they are.
>
> Thanks for the feedback. Keep em comin'.
>
>
>
>
> Regards,
> Jason P Hodges
> Senior Network and Systems Architect
>
> Pocket Communications
> 2819 NW Loop 410
> San Antonio, Texas 78230
> 
> Email: jhod...@pocket.com
> Desk: 210-447-1220
> EFax: 210-678-8187
>
>
>
> -Original Message-
> From: Mike Marchywka [mailto:marchy...@hotmail.com]
> Sent: Friday, November 27, 2009 9:22 AM
> To: Jason Hodges; squid-users@squid-cache.org
> Subject: RE: [squid-users] Squid HTTP Headers
>
>
>
>
>
>
>
>
>
>
> 
>> Date: Fri, 27 Nov 2009 09:08:17 -0600
>> From: jhodges
>> To: squid-users@squid-cache.org
>> Subject: [squid-users] Squid HTTP Headers
>>
>> Hello and thanks for taking the time to read this post. All feedback
> is welcome.
>>
>> I work for a cell phone company and we use squid along with some fancy
> NAT and PAT in order to offer various proxying services for cell phone
> web browsers. Thus far we have been quite successful. If anyone is
> curious of our set up, just ask.
>>
>> I do have a challenge that I have been unable to overcome and I am
> hoping the user group can help solve it.
>>
>> The "Web Content Provider" that serves ringtones, games, etc to our
> subscribers requires that we have an http header inserted as the users
> surf their site. The header is x-msisdn. The value should be the
> subscriber's phone number (mdn). We have been unsuccessful at finding a
> good/solid solution for retrieving the mdn and inserting an http header.
> The mdn (in our case) is stored in a database, so the information is
> easily available. However, we do not know how to make squid request the
> mdn and inject the http request header.
>
>
> How does this work without squid? We are using squid to proxy our own
> custom browser ( only on BB since it uses a lot of RIM browser related
> classes)
> and routinely insert various headers and send back browsing info.
> We use squid to offload things like DNS lookups that can slow down the
> phone.
>
> It sounds as if you are asking "how do we get the browser on the phone
> to
> add an id header using squid?" Will cookies work for you or don't you
> have
> anyone to set the cookie?
>
>
>
>>
>> Here are some details:
>> 1. The mdn is stored in a database with an association to the ip
> address the subscriber was assigned. Optionally this same information is
> stored in a flat text file (radius accounting detail log)
>> 2. We only want to inject this http header for certain URLs.
>>
>> If anyone can offer suggestions on how we can make this happen, that
> would be greatly appreciated.
>
> What do you have now? Does it run without squid?
>
>
>>
>> Thanks in advance for your time and feedback.
>>
>> Regards,
>> Jason P Hodges
>>
>
>
>
>
> Note: hotmail is now unusable for TEXT, I am moving to
> marchy...@gmail.com or also use
> marchy...@yahoo.com. Thanks. NEVER USE HOTMAIL, EVER
>
> Mike Marchywka
> 586 Saint James Walk
> Marietta GA 30067-7165
> 415-264-8477 (w)<- use this
> 404-788-1216 (C)<- leave message
> 989-348-4796 (P)<- emergency only
> marchy...@hotmail.com
> Note: If I am asking for free stuff, I normally use for hobby/non-profit
> information but may use in investment forums, public and private.
> Please indicate any concerns if applicable.
>
>
>>
>


RE: [squid-users] Squid HTTP Headers

2009-11-27 Thread Mike Marchywka











> Date: Fri, 27 Nov 2009 10:10:05 -0600
> From: jhod...@pocket.com
> To: marchy...@hotmail.com; squid-users@squid-cache.org
> Subject: RE: [squid-users] Squid HTTP Headers
>
> In our case, we are the carrier. Unfortunately browsers for phones are very 
> antiquated and the less you require them to do, the better off you are. In 
> fact, they aren't even full blown browsers ... thus the need for a proxy. The 
> full blown browsers are on "smart phones" and more then 98% of the 
> subscribers do not use them.

Who keeps your subs list and assigns your IP addresses, LOL? Even on
smart phones, the strategy in many cases is to offload browser functions to
a server. I thought the carriers put the default
software on most phones; adding a header shouldn't be that tough, since
these are often branded and otherwise customized.




>
>
>
>
> Regards,
> Jason P Hodges
> Senior Network and Systems Architect
>
> Pocket Communications
> 2819 NW Loop 410
> San Antonio, Texas 78230
> 
> Email: jhod...@pocket.com
> Desk: 210-447-1220
> EFax: 210-678-8187
>
>
>
> -Original Message-
> From: Mike Marchywka [mailto:marchy...@hotmail.com]
> Sent: Friday, November 27, 2009 9:53 AM
> To: Jason Hodges; squid-users@squid-cache.org
> Subject: RE: [squid-users] Squid HTTP Headers
>
>
>
>
>
>
>
>
> 
>> Subject: RE: [squid-users] Squid HTTP Headers
>> Date: Fri, 27 Nov 2009 09:46:12 -0600
>> From: jhodges
>> To: marchy...@hotmail.com; squid-users@squid-cache.org
>>
>> Today we are paying another company to proxy for our users. They use a
>> commercial product to accomplish the http header injection.
>> Unfortunately they charge us a ridiculous amount of money to do what
>> seems like a simple job.
>>
>> The header injection needs to occur at the server/gateway/proxy.
>> Cookies will not work for this.
>
> Well, if the  phone number is available on the phone it could
> be injected there if your browser app does this for you. Would that
> violate a privacy or security paradigm ? It seems you need to
> verify the information with a DB maintained by a carrier who has
> assigned the IP and has the customer billing info.
>
>
>
>>
>> I understand the x-msisdn header to be common in the "mobile world". It
>> is used to identify a subscriber in order to bill them without having to
>> ask them who they are.
>>
>> Thanks for the feedback. Keep em comin'.
>>
>>
>>
>>
>> Regards,
>> Jason P Hodges
>> Senior Network and Systems Architect
>>
>> Pocket Communications
>> 2819 NW Loop 410
>> San Antonio, Texas 78230
>> 
>> Email: jhod...@pocket.com
>> Desk: 210-447-1220
>> EFax: 210-678-8187
>>
>>
>>
>> -Original Message-
>> From: Mike Marchywka [mailto:marchy...@hotmail.com]
>> Sent: Friday, November 27, 2009 9:22 AM
>> To: Jason Hodges; squid-users@squid-cache.org
>> Subject: RE: [squid-users] Squid HTTP Headers
>>
>>
>>
>>
>>
>>
>>
>>
>>
>>
>> 
>>> Date: Fri, 27 Nov 2009 09:08:17 -0600
>>> From: jhodges
>>> To: squid-users@squid-cache.org
>>> Subject: [squid-users] Squid HTTP Headers
>>>
>>> Hello and thanks for taking the time to read this post. All feedback
>> is welcome.
>>>
>>> I work for a cell phone company and we use squid along with some fancy
>> NAT and PAT in order to offer various proxying services for cell phone
>> web browsers. Thus far we have been quite successful. If anyone is
>> curious of our set up, just ask.
>>>
>>> I do have a challenge that I have been unable to overcome and I am
>> hoping the user group can help solve it.
>>>
>>> The "Web Content Provider" that serves ringtones, games, etc to our
>> subscribers requires that we have an http header inserted as the users
>> surf their site. The header is x-msisdn. The value should be the
>> subscriber's phone number (mdn). We have been unsuccessful at finding a
>> good/solid solution for retrieving the mdn and inserting an http header.
>> The mdn (in our case) is stored in a database, so the information is
>> easily available. However, we do not know how to make squid request the
>> mdn and inject the http request header.
>>
>>
>> How does this work without squid? We are using squid to proxy ou

RE: [squid-users] Need help with insert code into html body

2009-11-29 Thread Mike Marchywka










> Date: Sun, 29 Nov 2009 19:26:47 +0800
> From: lan.messerschm...@gmail.com
> To: ronaldo.z...@gmail.com
> CC: squid-users@squid-cache.org
> Subject: Re: [squid-users] Need help with insert code into html body
>
> On Sun, Nov 29, 2009 at 4:45 PM, Ronaldo Zhou  wrote:
>> Hi everyone,
>>
>>   I need you help with squid to insert some code to html body
>> returned to clients, for analytic reasons.
>>
>>
>
> Squid is designed that NO change to original HTTP response body,
> though you can insert some additional headers.

How does compression work? I thought someone said you could compress
results in some squid versions? Also, I've been playing with the ad zapper
and it clearly modifies results. There is no reason the modified script can't
get the target html and do something with it.



> Also eCap is squid-3.1's new feature.


  

RE: [squid-users] Need help with insert code into html body

2009-11-29 Thread Mike Marchywka


[lol, Hotmail seems to insist on non-text email,
which the squid list sanely rejected. I guess the
setting is specific to a given machine since I had
changed it before]

> Subject: Re: [squid-users] Need help with insert code into html body
> 
> On Sun, Nov 29, 2009 at 8:28 PM, Mike Marchywka  wrote:
> 
>>
>> How does compressin work? I thought someone said you could compress
>> results in some squid versions?
> 
> The official squid doesn't do that compression, it's maybe other
> organization's release, or with an external module.
> You could modify the source and implement that, go on, nobody will blame you.


Well, adzapper may be a better example then. Presumably the adzapper logic
could get the target url and modify it. Right now it just replaces it with a
generated image saying "zapped."

Note: while waiting for firefox on windoze to stop thrashing VM and echo even 1 of
my last 50 keystrokes, I retyped this message on a debian machine with no
hesitation or swearing required to get it to respond. It looks like the
other machine finally came back, but it doesn't matter now. In any case, editing
may be a bit rough as I transition to something that doesn't need a supercomputer
to type text. Hotmail won't run on my mom's Dell desktop without hanging every
few messages either. arrggghh.


  

RE: [squid-users] squid crashes by itself and reboots automatically

2009-11-29 Thread Mike Marchywka














> Date: Mon, 30 Nov 2009 00:24:38 +0100
> From: gkin...@gmail.com
> To: rag...@smartelecom.org
> CC: squid-users@squid-cache.org
> Subject: Re: [squid-users] squid crashes by itself and reboots automatically
>
> On Sun, Nov 29, 2009 at 8:25 PM, Ragheb Rustom  wrote:
>> Dear all,
>>
>> I have multiple servers running squid 2.7-stable7. Lately I have noticed
>> that most of these servers are crashing sometimes everyday creating a core
>> dump file and then after creating the file restart by itself. After reading
>> the core file with gdb and doing a backtrace I got the following info. Can
>> anyone please help me identify what is going on with these squid servers.
>> You help is very valuable.
>
> Is there anything in cache.log? Possibly just the few lines before the 
> restart.

The default config file seems good about logging stuff.
Was that sparse stack trace typical, or just the first
one you had? The only frame that was obvious was strcmp;
if it was blowing up there you either had pathological strings, memory
corruption, a timing issue, etc.
Not too informative until you at least know whether it is
a consistent failure point.




>
>
> --
> /kinkie
  

RE: [squid-users] squid crashes by itself and reboots automatically

2009-11-30 Thread Mike Marchywka








> From:
> To: gkin...@gmail.com
> CC: squid-users@squid-cache.org
> Date: Mon, 30 Nov 2009 07:15:27 +0200
> Subject: RE: [squid-users] squid crashes by itself and reboots automatically
>
> The only entry that I have in the cache.log are these lines before that I 
> have all storelocatevary warnings but nothing out of the ordinary.
>
> FATAL: Received Segment Violation...dying.
> 2009/11/29 22:45:48| storeDirWriteCleanLogs: Starting...

Is the fatal seg fault where your strcmp blew up? I'd guess a null pointer,
but it could just be passed garbage or non-zero junk.


> 2009/11/29 22:45:48| WARNING: Closing open FD 24
> 2009/11/29 22:45:48| commSetEvents: epoll_ctl(EPOLL_CTL_DEL): failed on 
> fd=24: (1) Operation not permitted
>
> Lines that are before this error are all similar to these

This sounds like it has an invalid object, which would suggest a reason for 
passing a silly param to strcmp?

>
> storeLocateVary: Not our vary marker object, 9884D2F9295EC65D66A36B329A3C7BBD 
> = 'http://b.static.ak.fbcdn.net/rsrc.php/zE010/hash/2kkc6o
> 96.css', 'accept-encoding="gzip,%20deflate"'/'gzip, deflate'
> 2009/11/29 22:45:47| storeLocateVary: Not our vary marker object, 
> DCBAB677A2364853342DA79508301881 = 
> 'http://static.ak.fbcdn.net/rsrc.php/z5WTN/hash/arvryq2p
> .css', 'accept-encoding="gzip,%20deflate"'/'gzip, deflate'
> 2009/11/29 22:45:47| storeLocateVary: Not our vary marker object, 
> FBD3324530E5C7D93199529A191361C0 = 
> 'http://static.ak.fbcdn.net/rsrc.php/zBICQ/hash/ryfrvghz
> .css', 'accept-encoding="gzip,deflate,sdch"'/'gzip,deflate,sdch'
> 2009/11/29 22:45:47| storeLocateVary: Not our vary marker object, 
> 498DE9ABC02057F4C59D6A60B2E515B1 = 
> 'http://static.ak.fbcdn.net/rsrc.php/z37DG/hash/e5d2uja2
> .css', 'accept-encoding="gzip,deflate,sdch"'/'gzip,deflate,sdch'
> 2009/11/29 22:45:47| storeLocateVary: Not our vary marker object, 
> 956DBDC557535910ECC2BDE74EEC30D4 = 
> 'http://b.static.ak.facebook.com/common/redirectiframe.h
> tml', 'accept-encoding="gzip,%20deflate"'/'gzip, deflate'
>
> Sincerely,
>
> Ragheb Rustom
>
> -Original Message-
> From: Kinkie [mailto:gkin...@gmail.com]
> Sent: Monday, November 30, 2009 1:25 AM
> To: Ragheb Rustom
> Cc: squid-users@squid-cache.org
> Subject: Re: [squid-users] squid crashes by itself and reboots automatically
>
> On Sun, Nov 29, 2009 at 8:25 PM, Ragheb Rustom  wrote:
>> Dear all,
>>
>> I have multiple servers running squid 2.7-stable7. Lately I have noticed
>> that most of these servers are crashing sometimes everyday creating a core
>> dump file and then after creating the file restart by itself. After reading
>> the core file with gdb and doing a backtrace I got the following info. Can
>> anyone please help me identify what is going on with these squid servers.
>> You help is very valuable.
>
> Is there anything in cache.log? Possibly just the few lines before the 
> restart.
>
>
> --
> /kinkie
>
>
  


RE: [squid-users] Squid HTTP Headers

2009-12-02 Thread Mike Marchywka







> Date: Wed, 2 Dec 2009 08:06:42 -0600
> From: jhod...@pocket.com
> To: squid-users@squid-cache.org
> Subject: RE: [squid-users] Squid HTTP Headers
>
> *bump*
Comments below on possible reasons for the lack
of response.

>
> Any help or guidance on this would be appreciated. I have heard a couple of 
> ideas but nothing has come to fruition yet.
>
>
IIRC, your issue was paying whoever maintains your
DB or RADIUS server. It isn't hard to write a ColdFusion
page or Java server to modify headers based on
a DB lookup, if that is all you need. I'm not sure of the
best approach if you want to rewrite headers
in squid, but I'm still not even sure which step is of
most concern: accessing a DB or header modification.

It may be that you haven't gotten an answer because
others skimmed your post with the same level of
detail; re-wording it might help :)




>
>
> Regards,
> Jason P Hodges
> Senior Network and Systems Architect
>
>
>
> -Original Message-
> From: Jason Hodges [mailto:jhod...@pocket.com]
> Sent: Friday, November 27, 2009 9:08 AM
> To: squid-users@squid-cache.org
> Subject: [squid-users] Squid HTTP Headers
>
> Hello and thanks for taking the time to read this post. All feedback is 
> welcome.
>
> I work for a cell phone company and we use squid along with some fancy NAT 
> and PAT in order to offer various proxying services for cell phone web 
> browsers. Thus far we have been quite successful. If anyone is curious of our 
> set up, just ask.
>
> I do have a challenge that I have been unable to overcome and I am hoping the 
> user group can help solve it.
>
> The "Web Content Provider" that serves ringtones, games, etc to our 
> subscribers requires that we have an http header inserted as the users surf 
> their site. The header is x-msisdn. The value should be the subscriber's 
> phone number (mdn). We have been unsuccessful at finding a good/solid 
> solution for retrieving the mdn and inserting an http header. The mdn (in our 
> case) is stored in a database, so the information is easily available. 
> However, we do not know how to make squid request the mdn and inject the http 
> request header.
>
> Here are some details:
> 1. The mdn is stored in a database with an association to the ip address the 
> subscriber was assigned. Optionally this same information is stored in a flat 
> text file (radius accounting detail log)
> 2. We only want to inject this http header for certain URLs.
>
> If anyone can offer suggestions on how we can make this happen, that would be 
> greatly appreciated.
>
> Thanks in advance for your time and feedback.
>
> Regards,
> Jason P Hodges
>
>
  

RE: [squid-users] Using MySQL for ips acl and urls

2009-12-02 Thread Mike Marchywka











> Date: Thu, 3 Dec 2009 00:00:29 +0100
> From: j...@jccm.es
> To: squ...@treenet.co.nz
> CC: squid-users@squid-cache.org
> Subject: Re: [squid-users] Using MySQL for ips acl and urls
>
> Amos Jeffries escribió:
>> On Wed, 02 Dec 2009 20:36:38 +0100, José Illescas Pérez 
>> wrote:
>>> Hello,
>>>
>>> I'm interesed in install squid for my organization.
>>>
>>> I want to configure large acl's of ip lists, 20.000 more o less.
>>>
>>> Can I use external acl with MySQL for create this acl ip list?. What's
>>> the performance in this case?.
>>>
>>> I want to configure large acl of url lists in MySQL too, for example a
>>> blacklist with categories. What's the performance in this case?.
>>>
>>> Perhaps, is more convenient use squidguard for blacklist of urls and
>>> create the group categories. Any ideas?.
>>>
>>> Greetings.
>>
>> Individual IPs with individual blocklists? this is extremely inefficient.
>>
>> If you must, you can easily use external_acl_type to pull details from
>> mysql during live traffic processing. Speed depends on the query efficiency
>> and network lag to mysql server.

We have Java servers for related tasks that maintain in-memory hashtables for
these lists. If the DB is not too dynamic this works well. You may need to either
signal the server to invalidate the in-memory acl cache or use
short expirations if the DB is more volatile, but this can be
much faster than a DB lookup on a remote machine.
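
On the squid side the hookup is just external_acl_type; a sketch with made-up
helper path and ttl, where the helper (mysql lookup, in-memory table, whatever)
answers OK or ERR per source IP:

  external_acl_type ipgroup ttl=300 children=5 %SRC /usr/local/bin/ipgroup_check
  acl allowedClients external ipgroup
  http_access allow allowedClients
  http_access deny all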





>>
>> If you find that too slow look at ufdbGuard.
>>
>> Amos
>>
>
> We have five or six ip groups, with permissions in categories of
> blacklist for each group. Each group contains between 1,000 and 10,000
> ip addresses.
>
> The blacklist categories can be urlblacklist, for example.
>
> Where can I configure this, in squid or squidguard?.
>
> Greetings.
>
> --
> _   __ __
> | |/ ___/ ___| \/ | Jose Illescas Perez. Linux User #73559
> _ | | | | | | |\/| | TFNO: +34 925 266 219 FAX: +34 925 266 300
> | |_| | |__| |___| | | | El Webteam de http://www.jccm.es
> \___/ \\|_| |_| Junta de Comunidades de Castilla-La Mancha
  

RE: [squid-users] Problem with squid corrupting page layout

2009-12-03 Thread Mike Marchywka










> Date: Thu, 3 Dec 2009 16:20:22 +0800
> From: gz...@hotmail.com
> To: pulpfictionst...@gmail.com
> CC: squid-users@squid-cache.org
> Subject: Re: [squid-users] Problem with squid corrupting page layout
>
> Marisa Giancarla :
>> I am trying to set up squid-3 for my caching proxy and I am having
>> trouble with the pages returned missing their graphics and other layout
>> elements being rendered in odd ways. Everything else seems to be running
>> fine, but this reformatting of web pages is a fly in the ointment to
>> this project. Can someone give me some suggestions on where to go to fix
>> this problem?
>>
>
> what squid version?
> And a copy of your squid.conf here is helpful much.

It may also help if you can more clearly state what appears to be wrong. Is it
picking up the wrong style sheets,
or does it appear to be formatting for the wrong user agent, etc.?
You might be able to use something like wget to fetch
each secondary script or css directly and through squid
and see if they differ. I'm not aware of a decent desktop
browser, however, that gives diagnostic messages
about what it is doing to render a page. It may be of
interest to readers here if anyone knows of something
like this (or an open source browser which could be
recompiled with diagnostic messages).
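
For example, something like this (the proxy address and URL are placeholders):

  # fetch a stylesheet directly and through squid, printing response headers
  wget -S -O direct.css http://www.example.com/style.css
  http_proxy=http://127.0.0.1:3128 wget -S -O proxied.css http://www.example.com/style.css
  diff direct.css proxied.css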



>
> --
> Regards,
> Yonghua Peng
  

[squid-users] anyone know off hand where squid helps browsing on debian?

2009-12-03 Thread Mike Marchywka

Anyone know offhand how much squid can contribute to browsing speed on
various platforms due to DNS caching? I just set up a debian system and
noticed while browsing that I had very low bandwidth at times. I suspected it
may have been doing lots of DNS lookups, since there were
lots of long delays (I guess I could have looked at packet traffic but
was too lazy). Anyway, apt-get install squid and setting up iceweasel to
use it seems to have done the trick. The bandwidth monitor shows I'm getting more
throughput and the browser responds a lot better. Anyone know offhand what
DNS or other caching debian has by default, and where an
out-of-the-box squid install is likely to fix anything?

Thanks.



Note: hotmail is now unusable for TEXT, I am moving to marchy...@gmail.com or 
also use
marchy...@yahoo.com. Thanks. NEVER USE HOTMAIL, EVER 

Mike Marchywka 
586 Saint James Walk 
Marietta GA 30067-7165 
415-264-8477 (w)<- use this
404-788-1216 (C)<- leave message 
989-348-4796 (P)<- emergency only 
marchy...@hotmail.com 
Note: If I am asking for free stuff, I normally use for hobby/non-profit
information but may use in investment forums, public and private.
Please indicate any concerns if applicable. 



  

[squid-users] j2me libs for converting get/post to connect

2009-12-04 Thread Mike Marchywka


I've got a J2ME app that benefits somewhat from squid, as it can
offload a lot of junk to a fixed server. But I'm just forwarding GET/POST requests
to the server instead of using CONNECT, because the Java classes don't support
this and I'm not sure how to write lower-level code or what benefits
I could get. Is there a FAQ somewhere on this? Presumably you could keep a
socket open longer etc., but I'm not sure what other
issues come up in a wireless situation.

Thanks.

Note: hotmail is now unusable for TEXT, I am moving to marchy...@gmail.com or 
also use
marchy...@yahoo.com. Thanks. NEVER USE HOTMAIL, EVER

Mike Marchywka
586 Saint James Walk
Marietta GA 30067-7165
415-264-8477 (w)<- use this
404-788-1216 (C)<- leave message
989-348-4796 (P)<- emergency only
marchy...@hotmail.com
Note: If I am asking for free stuff, I normally use for hobby/non-profit
information but may use in investment forums, public and private.
Please indicate any concerns if applicable.



  

RE: [squid-users] squid ceasing to function when interface goes down

2009-12-05 Thread Mike Marchywka











> Date: Sun, 6 Dec 2009 00:27:23 +1300
> From: squ...@treenet.co.nz
> To: squid-users@squid-cache.org
> Subject: Re: [squid-users] squid ceasing to function when interface goes down
>
> ty...@marieval.com wrote:
>> I'm using squid with a parent that's over a VPN. This VPN occasionally
>> times out or goes down; I'm having difficulty deducing which, but
>> theoretically my scripts should reboot it whenever the VPN goes up so I
>> suspect the former.
>>
>> The end result is that my downstream squid occasionally decides to
>> return nothing but 404's ever again until manually restarted. I'm
>> having to watch it like a hawk and can't keep that up all Christmas. If
>> it would just reopen the interface on its own, all would be well. Is
>> there any way to make it deal with this condition gracefully instead of
>> flipping out?
>>
>
> The normal way to handle these things is to use a) reliable network
> connections, or b) multiple interfaces.
>
> What Squid is this?
>
> Things should go back to normal when the interface actually comes back
> up again.

On my debian install, which I am making up as I go along and so may not
be a robust example of anything, I had a similar issue and had to use squid -k
to restart. I have two interfaces, a LAN card and a wireless card using ndiswrapper
with ipmasq. (Eventually I want to attach all my local computers in this room
to a router and use the wireless interface to reach the WAN via the wireless router on
the cable modem, giving me just one wireless connection and some way to monitor the packets from
all machines, and eventually install a wireless modem etc.)
In any case, when I boot, if I start the browsers before connecting to the WLAN it
doesn't work (duh), but even after connecting I had to restart squid.


>
> If the interface is up, the parent proxy contactable and Squid still
> sending out the error pages you need to take a good look at those error
> pages and see *why* Squid thinks they are still valid. Probably turn
> negative_ttl down to an HTTP compliant 0 seconds as well.

Do you have a FAQ page or link on how the peering is handled? I stumbled
onto something regarding the config for it, but it wasn't quite clear how all
these things work together in a fault-tolerant way. Thanks.


>
> Amos
> --
> Please be using
> Current Stable Squid 2.7.STABLE7 or 3.0.STABLE20
> Current Beta Squid 3.1.0.15
  

RE: [squid-users] Setting up two NICs with Squid/DANSGuardian

2009-12-14 Thread Mike Marchywka







> Date: Mon, 14 Dec 2009 14:47:06 +0100
> From: 
> To: squid-users@squid-cache.org
> Subject: [squid-users] Setting up two NICs with Squid/DANSGuardian
>
> Hi list,
>
> I have the following setup:
>
> Debian 5.0/Kernel 2.6.26-2-486
>
> Squid3 Stable 19
>
> Squid.conf excerpts
>
> http_port 127.0.0.1:3128
>
> acl DANS src 127.0.0.1
> http_access allow DANS
>
> *
>
> Dansguardian 2.9.9.4
>
> Dansguardian.conf excerpts
>
> filterip = 172.16.10.214
> filterport = 8080
>
> proxyip = 127.0.0.1
> proxyport = 3128
>
> *
>
> ifconfig output
>
> eth0 Link encap:Ethernet inet address:172.16.10.214
> eth1 Link encap:Ethernet inet address:172.16.10.225
>
> *
>
> Proxying is done explicitly. Currently the users connect to 
> 172.16.10.214:8080. I want to change the setup to make users connect to 
> 214:8080 which passes the connection 225:.
> Diagram:
>
> Currently:
>
> user --> eth0 (214:8080) --> DG --> Squid --> WAN
>
> Desired:
>
> user --> eth0 (214:8080) --> DG --> Squid --> eth1 (225:) --> WAN
>
> The whole point of doing this is to have two different mac adresses/ports 
> which can be used for vlan tagging.
>
> How do i do that?
> Using iptables?
> - Could you give me the rules for that?
> Using a bridge?
> - How do i set it up?
> Another possibility?
> Please give me some solutions.


I'm trying to do something along similar lines, but I'm not sure this relates
to squid too well. AFAIK, "ip" is supposed to replace some obsolete tools (based
on googling earlier this morning). I've got a debian
machine that I want to use to isolate the other machines in my office. The debian
box uses ndiswrapper
to support wlan0, which I want to be the only connection to the wireless router
that attaches to our cable modem. The other
machines in my office use a wired connection to a router attached to eth0. I'd
like to insert squid as a proxy
for http traffic to reduce redundant content and DNS lookups, but I also need to
know how to configure the
interface usage. Presumably I'd use lower-level tools for looking for
spurious or malware-related traffic.
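
On the original question, the squid-specific piece may just be
tcp_outgoing_address, i.e. have squid source its outbound connections from the
second NIC (a sketch using the address from the post; the vlan tagging and
routing are still an OS/iptables matter):

  tcp_outgoing_address 172.16.10.225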




>
>
> D. K.
> --
> IT-PARTNER - Martin U. Haneke
> Fichtestraße 26
> 10967 Berlin
> Tel: +49(30)200055-0
> Tel: +49(30)200055-39
  

RE: [squid-users] Custom rules to analyze HTTP headers

2009-12-15 Thread Mike Marchywka










> Date: Tue, 15 Dec 2009 10:14:35 -0300
> From:
> To
> CC: squid-users@squid-cache.org
> Subject: Re: [squid-users] Custom rules to analyze HTTP headers
>
> Jeff Pang escribió:
>> I have been using HttpWatch for doing this, a cool tool.
>>
> Hello Jeff!
>
> It appears that it's an extension for web browsers. I'm afraid I need to
> solve this stuff form the Squid size, I'm sure someone should have to
> fight about this in the past.

I'm not entirely sure what you are doing, but IIRC it is simple to make
an acl for a header. It took about 1 day of leaving an open squid
up to attract a hacker, and it took less time to add a simple
header to the intended app to authorize it. I don't recall
this being all that difficult, and I think I was able
to prevent squid from forwarding the header too, to minimize
the ability for anyone to spoof it.

Looking at the conf file, and IIRC, you should be able to make acls with "req_header"
and I think use "header_access deny all" to prevent the header from being
forwarded.

Corrections and qualifications appreciated.
Thanks.
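
Roughly what I had, with made-up header name and value (2.7 syntax; squid-3
splits header_access into request_header_access/reply_header_access):

  acl ourApp req_header X-Our-App -i ^secret-token$
  http_access allow ourApp
  http_access deny all
  # strip the header so it is not forwarded to origin servers
  header_access X-Our-App deny all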



>
>
> Greetings,
>
> Dererk
>
> --
> BOFH excuse #183:
> filesystem not big enough for Jumbo Kernel Patch
>
>
  

RE: [squid-users] Trying to authenticate a user only once per working day

2009-12-20 Thread Mike Marchywka










> Date: Sun, 20 Dec 2009 23:41:14 +1300
> From: squ...@treenet.co.nz
> To: squid-users@squid-cache.org
> Subject: Re: [squid-users] Trying to authenticate a user only once per 
> working day
>
> Rodrigo Castanheira wrote:
>> Hi,
>>
>> I wish to authenticate (NTLM) our users only once per working day:
>>
>> authenticate_ip_shortcircuit_ttl 8 hours
>>
>> When the user browses for the first time, he will be authenticated and his
>> IP will be cached so that, for the next 8 hours, Squid believes that
>> requests coming from this IP belong to that user. Now comes the tricky part:
>> if that user logs off and somebody else logs in before those 8 hours expire,
>> Squid would mistakenly associate the same IP with the previous identity.
>
Any way to use cookies here?


> This is the downside of IP-based authorization. (NOTE: this is NOT
> authentication).
>
>> As
>> our IE browsers are pre-configured with a standard home page, and the new
>> user couldn't avoid opening it before being able to go elsewhere, I tried
>> enforcing (re)authentication for the home page:
>>
>> acl HOME_PAGE url_regex -i homepage.intranet
>> authenticate_ip_shortcircuit_access deny HOME_PAGE
>>
>> It didn't work.
>> Does authenticate_ip_shortcircuit_access accept only IP acl's ?
>>
>
> One of the benefits of NTLM is that Windows can be configured to do it
> without generating the authentication popups ("single sign-on"). That is
> the best way to configure what you want. If you set it up that way the
> IP-based bypass does not need to be long.
>
> The short-circuit setting is a very risky bypass to reduce load on slow
> or overloaded auth servers. As you have seen, it allows people to
> trivially access resources under some other persons accounts. The longer
> its set to the more security risk you face.
>
> Amos
> --
> Please be using
> Current Stable Squid 2.7.STABLE7 or 3.0.STABLE20
> Current Beta Squid 3.1.0.15
  

RE: [squid-users] Streaming Media from ABC.com CBS.com etc...

2009-12-28 Thread Mike Marchywka










> From: kevin.bro...@mntc.org
> To: squid-users@squid-cache.org
> Date: Mon, 28 Dec 2009 18:33:00 -0600
> Subject: [squid-users] Streaming Media from ABC.com CBS.com etc...
>
> Hello everyone,
>
> I'm sure this is an oversight on my part, but for the life of me I cannot get 
> "Full Episodes" to play from any of the major network sites. I can stream 
> media from everywhere else (Netflix, YouTube, shoutcast, etc...). In an 
> effort to troubleshoot this I have set up a bare minimum install of Squid 3.0 
> Stable 18 and configured a bare bones squid.conf.
>
> (This is the complete squid.conf used for testing only)
> http_port 3128
> cache_effective_user squid
> cache_effective_group squid
> acl localhost src 127.0.0.1/255.255.255.255
> acl localnet src 192.168.0.0/16
> acl HTTP proto HTTP
> always_direct allow HTTP
> acl CONNECT method CONNECT
> http_access allow localnet
> coredump_dir /var/spool/squid
>
> With this configuration I can get as far as watching the "Commercial Ad" 
> portion of any of the sites, but the actually episodes never play (On any of 
> the network sites). Again, I'm sure this is something simple, but I've been 
> searching for an answer for going on a couple weeks now, and am finally 
> breaking down and asking for help.
>
> I'm using a Windows machine to access the Squid Box and I'm using IE. Any 
> help would be appreciated. I'd be more than happy to read through any 
> FAQ/Guide/etc.. that pertains to this issue, but I have had no luck finding 
> anything pertaining to this problem.

Well, it would help to get something like linux or cygwin, where you stand a
chance of getting useful information.
I use cygwin's wget for stuff like this. Hit the url that works and the one
that doesn't, and try to phrase your
question in terms of something that comes up at the IETF; they probably don't know
anything about streaming
media on CBS or ABC. In the past, I've noted that some places react to the user
agent and can respond with html links
that point to either 3gp files or an rtsp stream. Find out where your links
actually point and see how the server responds
when you try to hit it directly. I gather that if you are calling the media a "full
episode" you may not have looked at the underlying
links or response headers from the server.
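
For example (the user-agent string and URL are placeholders):

  wget -S --user-agent="Mozilla/5.0 (Windows NT 5.1)" -O episode.html 'http://www.example.com/full-episode'
  # then look for where the player is actually being sent
  grep -Eo "(rtsp|rtmp|http)://[^\"' ]+" episode.html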




>
> Thanks,
>
> Kevin
>
  

RE: [squid-users] Streaming Media from ABC.com CBS.com etc...

2009-12-29 Thread Mike Marchywka



> From:
> To: squid-users@squid-cache.org
> Date: Mon, 28 Dec 2009 21:47:37 -0600
> Subject: RE: [squid-users] Streaming Media from ABC.com CBS.com etc...
>
> I used the term "Full Episodes" just as a way to explain the links that are 
> on the various network websites.
>
> I'm not an expert obviously, but the issue seems to be something I'm not 
> configuring correctly in squid. If I bypass squid and directly connect to the 
> sites everything works fine. I trimmed down my squid.conf as much as I knew 
> how to eliminate any configuration errors that I could think of. Does anyone 
> who is using Squid as their network proxy have the ability to view any of the 
> videos on any of the major network sites?

Well, we have done some limited media testing on a mobile app, but I don't recall
"full episodes"; my point is that the label doesn't mean much because the details
matter.


>
> I originally thought the problem was related to how the sites require the use 
> of their own individual players to view the videos. They do this to prevent 
> people from using addons to download the videos directly. So I tried to 
> stream some Netflix content since they also use a propriety video player. 
> With netflix I didn't have any issuues. I've tried various versions of squid 
> assuming that the problem was perhaps related to my build. But the issue 
> seems universal.


Not all non-browsers are the same. You really need to look at the links in each
case, pretend to be the various
user agents, and see if the server is sending you something squid can handle.
You may be able to use
netstat while a player is loading, or tcpdump or something, to see what it
is doing. However, from what I have seen,
sometimes the pages contain rtsp links, sometimes http. Probably the shorter
ones are http and the longer ones rtsp, but
you need to at least look at your page source; that should be easy from any
browser. It is well worth your time
to get something like cygwin and learn how to use the tools, not just hunt
through menus and icons.
All these people keep changing their sites, and even if you get something up
today it is unlikely to be stable forever.
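
A rough sketch of the tcpdump idea (interface name is a guess; run as root while
the player is loading to see which hosts and ports it opens):

  tcpdump -n -i eth0 'tcp and (port 80 or port 443 or port 554 or port 1935)'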





>
> I've went through my squid logs (Access.log, cache.log, store.log) and no 
> errors show. In fact with the configuration below, the access.log isn't even 
> used because everything is a direct connection.
>
> Kevin
>
>>
>> Hello everyone,
>>
>> I'm sure this is an oversight on my part, but for the life of me I cannot 
>> get "Full Episodes" to play from any of the major network sites. I can 
>> stream media from everywhere else (Netflix, YouTube, shoutcast, etc...). In 
>> an effort to troubleshoot this I have set up a bare minimum install of Squid 
>> 3.0 Stable 18 and configured a bare bones squid.conf.
>>
>> (This is the complete squid.conf used for testing only)
>> http_port 3128
>> cache_effective_user squid
>> cache_effective_group squid
>> acl localhost src 127.0.0.1/255.255.255.255
>> acl localnet src 192.168.0.0/16
>> acl HTTP proto HTTP
>> always_direct allow HTTP
>> acl CONNECT method CONNECT
>> http_access allow localnet
>> coredump_dir /var/spool/squid
>>
>> With this configuration I can get as far as watching the "Commercial Ad" 
>> portion of any of the sites, but the actual episodes never play (on any of 
>> the network sites). Again, I'm sure this is something simple, but I've been 
>> searching for an answer for going on a couple weeks now, and am finally 
>> breaking down and asking for help.
>>
>> I'm using a Windows machine to access the Squid Box and I'm using IE. Any 
>> help would be appreciated. I'd be more than happy to read through any 
>> FAQ/Guide/etc.. that pertains to this issue, but I have had no luck finding 
>> anything pertaining to this problem.
>
> Well, it would help to get something like Linux or Cygwin, where you stand a
> chance of getting useful information. I use Cygwin's wget for stuff like this.
> Hit the URL that works and the one that doesn't, and try to phrase your
> question in terms of something that comes up in the IETF specs - people here
> probably don't know anything about streaming media on CBS or ABC. In the past
> I've noted that some sites react to the user agent and can respond with HTML
> links that point to either 3gp files or an rtsp stream. Find out where your
> links actually point and see how the server responds when you try to hit it
> directly. I gather that if you are calling the media a "full episode" you may
> not have looked at the underlying links or the response headers from the
> server.
>
>
>
>
>>
>> Thanks,
>>
>> Kevin
>>
>
> _
  
_
Hotmail: Trusted email with Microsoft’s powerful SPAM protection.
http://clk.atdmt.com/GBL/go/177141664/direct/01/

RE: [squid-users] content filter

2009-12-30 Thread Mike Marchywka






> Date: Wed, 30 Dec 2009 13:54:38 +0100
> From: 
> To: squid-users@squid-cache.org
> Subject: Re: [squid-users] content filter
>
> On 30.12.09 11:55, Jeff Peng wrote:
>> Is there a plugin for squid which can implement content filter?
>> for example, if the webpage includes the keyword of "sex", the plugin
>> will remove it or replace it to "***".
>

I thought this might turn into a discussion on the merits LOL.
I use ad zapper, but at least one piece of code (maybe ad zapper itself)
stated that the authors didn't want it used for censorship or for keeping
information from people. So, you want to replace "Taiwan" with "China"?

> Should that apply to "middlesex" as well as to "sextant"?
> I think that this is a perfect example how not to do content filtering.
>
> Should it also replace "ass" with "butt"? Because people making buttumptions
> will be embarbutted when they observe this mbuttive mistake.
>
> --
> Matus UHLAR - fantomas, uh...@fantomas.sk ; http://www.fantomas.sk/
> Warning: I wish NOT to receive e-mail advertising to this address.
> Varovanie: na tuto adresu chcem NEDOSTAVAT akukolvek reklamnu postu.
> Spam = (S)tupid (P)eople's (A)dvertising (M)ethod
  
_
Hotmail: Trusted email with powerful SPAM protection.
http://clk.atdmt.com/GBL/go/177141665/direct/01/

RE: [squid-users] report of subdomains

2010-01-03 Thread Mike Marchywka



> Date: Sun, 3 Jan 2010 00:06:41 -0300
> From:
> To: squid-users@squid-cache.org
> Subject: [squid-users] report of subdomains
>
> Hi people: does anyone know of a log analyzer (like sarg) that joins the
> subdomains in the reports, so you can know how much is consumed by
> domain? Without this it is impossible to know how much is transferred in
> rapidshare, facebook, etc.

I've never heard of a log analyzer per se. What standard things would it do?
Generally I start exploring and end up coding ad hoc stuff with sed, awk, and
grep in a bash script, or maybe perl. Someone may have a collection of scripts
to share; that is all I have ever used. I guess you could get a DB import tool
and then use whatever report generators exist - that seems to be a popular
approach these days.
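
For the subdomain-joining part, something like this crude sketch might do. It
assumes the default native access.log format (reply size in field 5, URL in
field 7) and naively keeps the last two labels of the hostname, which will be
wrong for things like .co.uk:

 awk '{
   url = $7
   sub(/^[a-z]*:\/\//, "", url)             # strip the scheme
   sub(/[\/:].*/, "", url)                  # keep just the hostname
   n = split(url, p, ".")
   dom = (n >= 2) ? p[n-1] "." p[n] : url   # collapse subdomains, crudely
   bytes[dom] += $5
 }
 END { for (d in bytes) printf "%12d %s\n", bytes[d], d }' /var/log/squid/access.log | sort -rn | head -20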




>
> Tnxs in advance.
>
  
_
Hotmail: Powerful Free email with security by Microsoft.
http://clk.atdmt.com/GBL/go/171222986/direct/01/

RE: [squid-users] Forward Cache not working

2010-01-05 Thread Mike Marchywka



> From:
> To: crobert...@gci.net; squid-users@squid-cache.org
> Date: Mon, 4 Jan 2010 22:12:56 -0600
> Subject: RE: [squid-users] Forward Cache not working
>
> I have attached a screenshot of the WGET header output with the "-S" option.

LOL, can you just email the text in a plain-text email? If I didn't know better
I'd think someone put you up to this - you are often stuck with GUI output from
which concise ASCII information cannot be extracted.


>
> I see nothing about "private" in the headers so I'm assuming this content
> should be getting cached. Yet, each time I run wget and then view the Squid
> access log it shows TCP_MISS on every attempt. I'll try the Ignore Private
> parameter in squid just to make sure that isn't the cause.


You can look at the IETF spec and grep it for each header key wget returned
(assuming you have an easy way to extract these from your jpg image, that
should be quite quick LOL). Text is interoperable; images would require you to
buy some wget-to-ietf GUI tool that converts the IETF spec into the same font
as your wget output and looks for matching blocks of pixels (sorry to beat this
to death, but it comes up a lot and creates a lot of problems in other
contexts).
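
Something like this keeps everything as text you can grep and paste straight
into a mail (the URL is the one from the log entry quoted below; add the
--http-user/--http-passwd options you were already using):

 # capture the response headers as text - wget -S writes them to stderr
 wget -S -O /dev/null http://www.sortmonster.net/master/Updates/test.xyz 2> headers.txt

 # then pull out the bits that control cacheability
 grep -iE 'cache-control|expires|pragma|vary|last-modified|etag' headers.txt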


>
> Very puzzling.
>
> Mike
>
> -Original Message-
> From: Chris Robertson [mailto:crobert...@gci.net]
> Sent: Monday, January 04, 2010 6:48 PM
> To: squid-users@squid-cache.org
> Subject: Re: [squid-users] Forward Cache not working
>
> Mike Makowski wrote:
>> Here is my basic config. Using defaults for everything else.
>>
>> acl localnet src 172.16.0.0/12
>> http_access allow local_net
>> maximum_object_size 25 MB
>>
>> Here is a log entry showing one connection from a LAN user through the
>> proxy. I am guessing that the TCP_MISS is significant. Perhaps the
>> original source is marked as Private as Chris suggested. Don't really know
>> how to even tell that though.
>
> Add a "-S" to wget to output the server headers.
>
> wget -S http://www.sortmonster.net/master/Updates/test.xyz -O test.new.gz
> --header=Accept-Encoding:gzip --http-user=myuserid --http-passwd=mypassword
>
>
>> Can squid be forced to cache regardless of
>> source settings?
>>
>
> Yes. http://www.squid-cache.org/Versions/v3/3.0/cfgman/refresh_pattern.html
>
> Keyword "ignore-private".
>
>> 1262645523.217 305633 172.17.0.152 TCP_MISS/200 11674081 GET
>> http://www.sortmonster.net/master/Updates/test.xyz - DIRECT/74.205.4.93
>> application/x-sortmonster 1262645523.464 122
>>
>> Mike
>
> Chris
  
_
Hotmail: Trusted email with powerful SPAM protection.
http://clk.atdmt.com/GBL/go/177141665/direct/01/

[squid-users] someone was asking about logging scripts,

2010-01-05 Thread Mike Marchywka



I got curious and took a quick look at the log I have, just from personal
usage on this machine. Really, I'm not sure you need much beyond one-liners.
For example, hit rate:

 more /var/log/squid/access.log | awk '{print $4}' | sort | uniq -c | sort -g -r
   7873 TCP_MISS/200
   2376 TCP_REFRESH_HIT/304
   1150 TCP_HIT/200
    885 TCP_MISS/302
    772 TCP_IMS_HIT/304
    608 TCP_REFRESH_MISS/200
    243 TCP_MISS/301
    198 TCP_REFRESH_HIT/200
    194 TCP_MISS/204
    158 TCP_MEM_HIT/200
     96 TCP_MISS/404
     93 TCP_NEGATIVE_HIT/404
     88 TCP_NEGATIVE_HIT/403
     62 TCP_MISS/304
     40 TCP_MISS/403
     24 TCP_HIT/302
     20 TCP_NEGATIVE_HIT/204
      9 TCP_MISS/502
      8 TCP_MISS/503
      2 TCP_MISS/504
      1 TCP_MISS/500

general domain frequencies:
 more /var/log/squid/access.log | awk '{print $7}' | sed -e 's/http...//' | sed -e 's/\/.*//' | sort | uniq -c | sort -g -r | more
   1005 ad.yieldmanager.com
    629 bl113w.blu113.mail.live.com
    627 blog.ostp.gov
    542 l.yimg.com
    444 supportforums.blackberry.com
    440 images.bloomberg.com
    428 bloomberg.com
    403 view.atdmt.com
    401 ad.doubleclick.net
    385 h.msn.com
    367 cdn.images.bloomberg.com
   
It is interesting that the largest hit count is for ads LOL.


etc

I'm sure you can make up your own quite easily.
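
For instance, a per-client bandwidth breakdown (again assuming the native log
format, with the client address in field 3 and the reply size in field 5):

 awk '{bytes[$3] += $5} END {for (c in bytes) printf "%12d %s\n", bytes[c], c}' /var/log/squid/access.log | sort -rn | head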



note new address
Mike Marchywka
1975 Village Round
Marietta GA 30064
415-264-8477 (w)<- use this
404-788-1216 (C)<- leave message
989-348-4796 (P)<- emergency only
marchy...@hotmail.com
Note: If I am asking for free stuff, I normally use for hobby/non-profit
information but may use in investment forums, public and private.
Please indicate any concerns if applicable.



  
_
Hotmail: Trusted email with Microsoft’s powerful SPAM protection.
http://clk.atdmt.com/GBL/go/177141664/direct/01/

RE: [squid-users] Squid and java.io.IOException: open HTTP connection failed

2010-01-21 Thread Mike Marchywka









> Date: Fri, 22 Jan 2010 11:53:44 +1300
> From: squ...@treenet.co.nz
> To: squid-users@squid-cache.org
> Subject: Re: [squid-users] Squid and java.io.IOException: open HTTP 
> connection failed
>
> Victor Javier Brizuela wrote:
>> On Wed, Jan 20, 2010 at 19:34, Amos Jeffries  wrote:
>>> Not surprising. That URL does not exist.
>>> I get an official 404 "Página no encontrada" response from the web server
>>> at ftweb-azul.zetti.com.ar when I try to retrieve it manually as well.
>>
>> Well, actually, it's supposed to be dynamically generated by Tomcat
>> when requested internally through some process, so it would give you
>> an error if you try to access it directly.
>
> There you probably have the problem. The 404 response I got matches the
> logs you showed from Squid.
>
> HTTP is stateless. Squid will be serving as much from cache as possible
> and may occasionally request these objects "directly" without some part
> of the initial request sequence.

You really need to put your question in terms of something that doesn't
involve a black-box applet requesting dynamically generated classes. Normally
you have a static jar file, not individual classes. I'm not sure whether squid
will cache 404s, but I guess it is possible that someone (maybe even the app,
in an otherwise irrelevant bug) makes a request that gets a 404, the 404 gets
cached, and then the good request comes in and is served the cached 404. It
could be anything. If your server returned a Vary header, that might give you
some idea what it was doing, or you could just check the server log when it
actually works. Maybe the server won't even generate anything, for security
reasons, if it sees a Via header - again, who knows.
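
If it turns out an error reply is being cached, one knob to look at - a sketch
only, I have not tried it against your setup - is the negative_ttl directive in
squid.conf, which controls how long error responses such as 404s are kept:

 # squid.conf: don't keep negative (error) replies at all
 negative_ttl 0 seconds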





>
> Amos
> --
> Please be using
> Current Stable Squid 2.7.STABLE7 or 3.0.STABLE21
> Current Beta Squid 3.1.0.15
  
_
Hotmail: Trusted email with Microsoft’s powerful SPAM protection.
http://clk.atdmt.com/GBL/go/196390706/direct/01/

RE: [squid-users] Error compiling squid 3.0 stable4

2010-02-25 Thread Mike Marchywka







> Date: Thu, 25 Feb 2010 11:38:52 +0530
> From: senthilkumaar2...@gmail.com
> To: squid-users@squid-cache.org
> Subject: [squid-users] Error compiling squid 3.0 stable4
>
> Hi All,
>
> When I compile squid 3.0 stable 4 I get the following error
>
> logfile.cc: In function ‘Logfile* logfileOpen(const char*, size_t, int)’:
> logfile.cc:105: error: invalid conversion from ‘const char*’ to ‘char*’
> logfile.cc:108: error: invalid conversion from ‘const char*’ to ‘char*’
> make[3]: *** [logfile.o] Error 1
> make[3]: Leaving directory `/home/senthil/Downloads/squid-3.0.STABLE9/src'
> make[2]: *** [all-recursive] Error 1
> make[2]: Leaving directory `/home/senthil/Downloads/squid-3.0.STABLE9/src'
> make[1]: *** [all] Error 2
> make[1]: Leaving directory `/home/senthil/Downloads/squid-3.0.STABLE9/src'
> make: *** [all-recursive] Error 1
>


Can you post the code from this file and your build options? As someone who
casts away const from time to time, LOL, there are ways you may be able to
satisfy the compiler, but the question would be how the code got checked in
and called stable. Sometimes there is confusion between a const pointer and a
const target, but I'm still not sure how that would matter here.
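
Just to illustrate the kind of thing I mean, a made-up minimal example (this is
not the actual logfile.cc code):

 // minimal illustration only - not the real squid source
 #include <cstring>

 struct Logfile { char *path; };

 void setPath(Logfile *lf, const char *name)
 {
     // lf->path = name;       // newer g++ rejects this: invalid conversion
                               //   from 'const char*' to 'char*'
     lf->path = strdup(name);  // one fix: take a writable copy
     // ...or declare the member as 'const char *path' if it is never written to
 }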



> I am using fedora 12
> Help me in solving the error
> Thank you
>
> Regards
> senthil
  
_
Hotmail: Trusted email with powerful SPAM protection.
http://clk.atdmt.com/GBL/go/201469227/direct/01/

RE: [squid-users] Error compiling squid 3.0 stable4

2010-02-25 Thread Mike Marchywka







> From: marchy...@hotmail.com
> To: senthilkumaar2...@gmail.com; squid-users@squid-cache.org
> Date: Thu, 25 Feb 2010 06:55:27 -0500
> Subject: RE: [squid-users] Error compiling squid 3.0 stable4
>
>
>
>
>
>
>
> 
>> Date: Thu, 25 Feb 2010 11:38:52 +0530
>> From: senthilkumaar2...@gmail.com
>> To: squid-users@squid-cache.org
>> Subject: [squid-users] Error compiling squid 3.0 stable4
>>
>> Hi All,
>>
>> When I compile squid 3.0 stable 4 I get the following error
>>
>> logfile.cc: In function ‘Logfile* logfileOpen(const char*, size_t, int)’:
>> logfile.cc:105: error: invalid conversion from ‘const char*’ to ‘char*’
>> logfile.cc:108: error: invalid conversion from ‘const char*’ to ‘char*’
>> make[3]: *** [logfile.o] Error 1
>> make[3]: Leaving directory `/home/senthil/Downloads/squid-3.0.STABLE9/src'
>> make[2]: *** [all-recursive] Error 1
>> make[2]: Leaving directory `/home/senthil/Downloads/squid-3.0.STABLE9/src'
>> make[1]: *** [all] Error 2
>> make[1]: Leaving directory `/home/senthil/Downloads/squid-3.0.STABLE9/src'
>> make: *** [all-recursive] Error 1
>>
>
>
> Can you post the code from this file and your build options? As someone who
> casts away const from time to time, LOL, there are ways you may be able to
> satisfy the compiler, but the question would be how the code got checked in
> and called stable. Sometimes there is confusion between a const pointer and a
> const target, but I'm still not sure how that would matter here.


It is still early here - what compiler are you using? I think I have run into
this with some compilers and pointer-vs-target const issues. Sorry if you
already posted this; again, I've seen stuff like this come up elsewhere and I
don't know anything about the squid source, but I am curious.

>
>
>
>> I am using fedora 12
>> Help me in solving the error
>> Thank you
>>
>> Regards
>> senthil
>
> _
> Hotmail: Trusted email with powerful SPAM protection.
> http://clk.atdmt.com/GBL/go/201469227/direct/01/
  
_
Your E-mail and More On-the-Go. Get Windows Live Hotmail Free.
http://clk.atdmt.com/GBL/go/201469229/direct/01/