[squid-users] Logging users

2008-02-22 Thread Sam Carleton
Is there any way to get squid-cache to log the user?  The request is
coming from the wife to keep tabs on the kids.

Also, does anyone know of any good *NIX firewall/proxy distros
designed to keep the home clean of the crap?

Sam


Re: [squid-users] Logging users

2008-02-22 Thread Sam Carleton
On Fri, Feb 22, 2008 at 12:43 PM, Odhiambo Washington
<[EMAIL PROTECTED]> wrote:
> What is the setup like? Have you heard about dansguardian?

My setup?  It is a Squid-Cache 2.6 install on my OpenBSD 3.5 firewall.
It is a transparent proxy: everything inbound on port 80 on the
intranet NIC is redirected to Squid-Cache.

Sam


Re: [squid-users] Logging users

2008-02-23 Thread Sam Carleton
On Sat, Feb 23, 2008 at 5:32 AM, Odhiambo Washington <[EMAIL PROTECTED]> wrote:
> OpenBSD 3.what?

I am sorry, I thought I said 3.5.

>  Did you inherit this box from someone? Are you a newbie trying to choose
> an OS but also biting off too much in the process? ;-)

No, not at all.  I am a Windows C/C++/C# programmer.  I have been
playing around with Linux and the BSDs now since about '94.  I have
not done much in the last few years other than run this firewall at
home.  I set this OpenBSD firewall up with squid-cache a few years ago
and it has been running very nicely ever since.  I think it was last
year I upgraded the squid-cache to 2.6s10.

>  Anyway, give us more details about your network setup - the Squid box
>  and the others on the network.

The network consists of 2 workstations and a file server, plus the
OpenBSD firewall.  At one point I was running squidGuard to block bad
content and the other day my wife asked me about that and about where
our kids were going on the web.  That is when I went to the firewall
and realized that I was not logging user info in the squid-cache log.

Someone else suggested DansGuardian and I will be looking into that,
but I am still curious about how to log who as well as where, without
forcing folks to log into the proxy server, since the proxy is
transparent right now.

Sam


Re: [squid-users] Logging users

2008-02-24 Thread Sam Carleton
On Sun, Feb 24, 2008 at 7:45 AM, Amos Jeffries <[EMAIL PROTECTED]> wrote:
>
> Sam Carleton wrote:
>  > On Fri, Feb 22, 2008 at 12:43 PM, Odhiambo Washington
>  > <[EMAIL PROTECTED]> wrote:
>  >> What is the setup like? Have you heard about dansguardian?
>  >
>  > My setup?  It is a Squid-Cache 2.6 install on my OpenBSD 3.5 firewall.
>  > It is a transparent proxy: everything inbound on port 80 on the
>  > intranet NIC is redirected to Squid-Cache.
>  >
>  > Sam
>
>  Well, that's not easily attained.  Transparency and authentication
>  are largely self-annihilating.

Amos,

I am not looking for authentication, just logging the user name.

Sam
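
A follow-up note for the archives: short of authentication, about the
only per-user detail Squid 2.6 can record for intercepted traffic is an
RFC 1413 ident lookup, and that only works if the client machines
actually run an ident daemon.  A minimal squid.conf sketch, with a
made-up LAN range:

# "lan" and the address range are placeholders; adjust to your network
acl lan src 192.168.1.0/24
ident_lookup_access allow lan

The ident result appears in the user (rfc931) field of access.log;
failing that, logging by client IP address is the practical way to tell
the kids' machines apart.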


[squid-users] subversion and REPORT requests

2007-02-25 Thread Sam Carleton

I am trying to check out the apache source code via subversion.  I
have been running into the following error:

Execute: Checkout
Error: Error while performing action: REPORT request failed on
'/repos/asf/!svn/vcc/default'
REPORT of '/repos/asf/!svn/vcc/default': 400 Bad Request (http://svn.apache.org)
Ready

It looks like the problem is that squid-cache is not allowing the
REPORT request through.  How do I go about reconfiguring squid-cache
to allow the REPORT request through?  What are the side effects of
doing something like that?

Sam
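
For anyone landing on this thread from a search: REPORT is not one of
the base HTTP/1.1 methods; it is a WebDAV/DeltaV method (RFC 3253) that
Subversion's HTTP protocol relies on, and Squid 2.5/2.6 rejects request
methods it does not recognize.  The usual fix, a sketch to be checked
against your own squid.conf, is to declare the extra methods:

# DeltaV methods used by Subversion clients
extension_methods REPORT MERGE MKACTIVITY CHECKOUT

Squid then passes those requests straight through to the origin server;
since they are not plain GETs the responses are not cached, so the side
effects should be minimal.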


[squid-users] Understanding a "REPORT request"

2007-02-27 Thread Sam Carleton

I posted a question two days ago and nobody responded, so I assume
nobody knows the answer, thus I thought I might ask it a little
differently.  I have learned that my squid-cache is blocking a "REPORT
request" when trying to pull source code from a subversion repository
that is served up by apache.  At first I thought the REPORT request
was an HTTP request method, but after looking at the specs, I don't see
anything about a REPORT request type.  Does anyone have any thoughts
on what squid-cache is actually blocking?  How best to troubleshoot
this issue?

Sam


[squid-users] upgrading from 2.5S6 to 2.6S10

2007-03-07 Thread Sam Carleton

I am upgrading after about two and a half years from 2.5 STABLE 6 to
2.6 STABLE 10.  I do run in accelerator mode.  After compiling the new
version I tried simply using the old conf file and it did not work, so
I thought I would simply copy over my settings into the new conf file.
Thus I did a diff on the old conf vs the old default conf.  I moved
over a number of things, but I discovered that the following lines,
which from what I can tell are the whole accelerator section, are in
the old config but not in the new one:

httpd_accel_host virtual
httpd_accel_port 80
httpd_accel_with_proxy on
httpd_accel_uses_host_header on

Is there a new way to set up accelerator mode or is that section simply
not in the current config by default?

Sam
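
For later readers: in Squid 2.6 the httpd_accel_* directives were
folded into options on http_port.  A rough equivalent of the four lines
above, worth checking against the 2.6 release notes, is:

# vhost covers httpd_accel_host virtual + httpd_accel_uses_host_header
http_port 80 accel vhost
# a second, plain port keeps ordinary proxy traffic working
# (there is no httpd_accel_with_proxy switch in 2.6)
http_port 3128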


[squid-users] upgrading from 2.5S6 to 2.6S10, part 2

2007-03-07 Thread Sam Carleton

I decided to do a bit more digging and realized that I was asking
about the wrong thing.  After looking at the documentation, it appears
that an httpd-accelerator is what I would need once I start hosting my
own busy web site and want a proxy to take some load off my web
server.  It is
*NOT* the feature I am using.  I believe it is called a transparent
proxy.  I have my firewall redirect all traffic coming in on the
internal NIC to port 3128 of the local machine/squid server.  With 2.6
STABLE 10, what do I need to do to get it running correctly?

Sam
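
On the transparent side, for later readers: interception also moved
onto http_port in 2.6.  The usual configuration for a setup like this,
sketched here rather than verified on this exact box, is a single
option on the listening port:

# tell Squid 2.6 that traffic on this port is intercepted, not proxied
http_port 3128 transparent

paired with the pf.conf redirect that is already in place:

rdr on $int_if proto tcp from any to any port 80 -> \
127.0.0.1 port 3128

None of the old httpd_accel_* lines are needed for this in 2.6.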


[squid-users] Is squid-cache the right tool?

2007-03-14 Thread Sam Carleton

I have been using squid-cache at home on my firewall for years now; I
use it for the normal, standard old stuff of simply caching the sites
we surf.

I am writing a new kiosk-based software package that has a GUI app as
the front end for the operator, and apache is serving up the pages to
the web browser clients.  What is being served up are images.  The
images that come into the GUI app are full-size images, 4 megapixels
and up.

Currently what I am doing is this: after my software copies the
full-size images onto the computer, it creates two different web
images, one a small thumbnail (120x180) and the other a larger image
for the screen (400x600).  When one is doing this to 200 images at a
time, it takes a while, too long in my opinion.

My first thought was to have the indexing page detect whether the
smaller images were there and create them, page by page, and then save
the smaller images so that next time it was snappy.  Then it dawned on
me: isn't that what things like squid-cache do?  Cache these processed
files?
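
A rough sketch of the generate-and-save idea, in PHP with the GD
extension (the paths and sizes below are placeholders, not my real
layout):

<?php
// thumb.php?img=foo.jpg -- create the small image on first request,
// save it, and serve the saved copy from then on
$name  = basename($_GET['img']);     // drop any path components
$src   = "/photos/full/$name";       // placeholder: the full-size originals
$cache = "/photos/small/$name";      // placeholder: writable cache directory

// serve the cached copy if it is newer than the original
if (file_exists($cache) && filemtime($cache) >= filemtime($src)) {
    header('Content-Type: image/jpeg');
    readfile($cache);
    exit;
}

// otherwise downsize once, save it, and serve it
list($w, $h) = getimagesize($src);
$scale = min(120 / $w, 180 / $h);    // fit inside 120x180
$tw = (int) round($w * $scale);
$th = (int) round($h * $scale);

$orig  = imagecreatefromjpeg($src);
$small = imagecreatetruecolor($tw, $th);
imagecopyresampled($small, $orig, 0, 0, 0, 0, $tw, $th, $w, $h);
imagejpeg($small, $cache, 85);
imagedestroy($orig);
imagedestroy($small);

header('Content-Type: image/jpeg');
readfile($cache);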

So the question is:  Is squid-cache (on Windows) the right tool to
cache these images?  I know that apache can be set up as a cache, but I
don't know anything about that.  Will I be better off using apache?

The other question I have to ask someone, more myself than anyone...
Am I making this too complicated by adding a proxy along with the web
server?

Thoughts and opinions?


[squid-users] understanding how Last-Modified is really used...

2007-03-16 Thread Sam Carleton

Folks,

If you have been following along with my other questions, I want to use a
proxy on my kiosk to cache downsized images.  The problem is that the
system operator can make changes to images, such as rotating them or
cropping them.

If I understand all this correctly, I can change the Last-Modified
header to reflect the last time the image was changed rather than when
the script was run to downsize the image.  Correct?

Assuming I can change the Last-Modified header, is there some type of
"pre-fetch" that the proxy does to JUST get the header info as to
knows whether or not to get the generated content?

Assuming there is some type of pre-fetch: is my impression correct
that I have to handle the pre-fetch correctly within my script (PHP,
by the way) and not generate the actual content when receiving a
pre-fetch request?  If my impression is correct about this whole
thing, how do I detect that the request is a pre-fetch and not a full
fetch, generically speaking, since I don't know if I am going to be
sticking with PHP? ;)

Sam


[squid-users] Re: understanding how Last-Modified is really used...

2007-03-16 Thread Sam Carleton

I found the answer:

HTTP/1.1 Specs --> 13.3 Validation Model
http://www.w3.org/Protocols/rfc2616/rfc2616-sec13.html#sec13.3

and

HTTP/1.1 Specs --> 14.24 If-Match
http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.24

Very, very cool!
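
In short, the "pre-fetch" I was imagining is really the cache
revalidating with a conditional GET: it sends If-Modified-Since with
the Last-Modified value it stored, and the script can answer 304 Not
Modified without regenerating anything.  A minimal sketch of the PHP
side (the paths are placeholders and ETags are ignored):

<?php
$src   = "/photos/full/foo.jpg";   // placeholder: the image this page derives from
$mtime = filemtime($src);          // change time of the source image, not of the resize run

header('Last-Modified: ' . gmdate('D, d M Y H:i:s', $mtime) . ' GMT');

// revalidation from the cache: compare against If-Modified-Since
if (isset($_SERVER['HTTP_IF_MODIFIED_SINCE']) &&
    strtotime($_SERVER['HTTP_IF_MODIFIED_SINCE']) >= $mtime) {
    header('HTTP/1.1 304 Not Modified');
    exit;                          // no body; the cache keeps its stored copy
}

// otherwise generate and serve the downsized image as usual
header('Content-Type: image/jpeg');
readfile("/photos/small/foo.jpg"); // stands in for the real resize-and-serve code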

I am about to go to beta, but this I will do for the RC1!  For those
that helped enlighten me, thanks!

Sam


[squid-users] OpenBSD and TCP Proxying

2004-09-06 Thread Sam Carleton
I am working on setting up squid-cache on my OpenBSD firewall.
When the web browser is configured to go through squid-cache (port
3128), everything works.  I added an entry to proxy everything
that is going through the firewall to the squid-cache with this
line:

rdr on $int_if proto tcp from any to any port 80 -> \
127.0.0.1 port 3128

Then I get an error.  I am assuming that it is an issue with the
pf.conf rather than squid-cache because squid-cache is working
otherwise.  Here is the error:

While trying to retrieve the URL: / 

The following error was encountered: 

Invalid URL 
Some aspect of the requested URL is incorrect. Possible problems: 

Missing or incorrect access protocol (should be `http://'' or
similar) 
Missing hostname 
Illegal double-escape in the URL-Path 
Illegal character in hostname; underscores are not allowed 
Your cache administrator is webmaster.
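
For anyone who hits the same "Invalid URL" page: it is the classic
symptom of intercepted requests reaching a Squid that has not been told
it is running transparently, so it cannot rebuild the full URL from the
Host header.  On a 2.5-era install the usual squid.conf additions (the
same httpd_accel_* lines that appear in the 2.6 upgrade thread earlier
in this archive) are:

httpd_accel_host virtual
httpd_accel_port 80
httpd_accel_with_proxy on
httpd_accel_uses_host_header on

Worth double-checking against your own 2.5 default config, but these
pair with the rdr rule shown above.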