...@henriknordstrom.net wrote:
Wed 2010-02-17 at 22:40 -0700, Alex Rousskov wrote:
On 02/16/2010 12:54 PM, Andres Salazar wrote:
Hello,
I am still having issues with SSLBump; apparently I am now getting
this error when I visit an https site with my browser explicitly
configured to use
...@treenet.co.nz wrote:
On Mon, 22 Feb 2010 15:48:57 -0600, Andres Salazar ndrsslz...@gmail.com
wrote:
Just confirming. You are telling me that I cannot configure a browser
with a proxy while at the same time squid is configured to SSLBump the
https requests?
Please confirm.. without proper docs
Hello Amos,
# /usr/local/sbin/squid -v
Squid Cache: Version 2.7.STABLE6
I am including the ACLs and the http_access lines:
acl msn_mime req_mime_type -i ^application/x-msn-messenger$
acl msn_gw url_regex -i gateway.dll
acl flash_mime rep_mime_type ^application/x-shockwave-flash$
acl
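For context, a minimal sketch of how ACLs like these are usually wired up on 2.7: req_mime_type is a request-side test and belongs in http_access, while rep_mime_type only matches in http_reply_access. The deny targets below are my assumption, not taken from the poster's config.
acl msn_mime req_mime_type -i ^application/x-msn-messenger$
acl flash_mime rep_mime_type ^application/x-shockwave-flash$
http_access deny msn_mime
http_reply_access deny flash_mime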
Jeffries squ...@treenet.co.nz wrote:
Andres Salazar wrote:
Hello,
I am using:
acl allowedurls url_regex /etc/squid/url.txt
and then only allowing localnet to access that acl.
a.) If a user behind localnet types:
http://www.facebook.com/@http://www.allowed.org/page.html they are
able
Hello,
I am still having issues with SSLBump; apparently I am now getting
this error when I visit an https site with my browser explicitly
configured to use the https_port.
2010/02/16 14:31:14| clientNegotiateSSL: Error negotiating SSL
connection on FD 8: error:1407609B:SSL
, Feb 15, 2010 at 11:07 PM, Amos Jeffries squ...@treenet.co.nz wrote:
Andres Salazar wrote:
Hello,
This time we can see I followed the original config. A page like
cnn.com takes about 60 seconds to load. Without the proxy it takes 10
seconds.
That sort of matches the relative number
Hello,
This time we can see I followed the original config. A page like
cnn.com takes about 60 seconds to load. Without the proxy it takes 10
seconds.
CentOS 5.4 fresh and clean base install with compiling tools.
Then I installed openssl-devel
./configure; make; make install also same problem
Hello,
I am using:
acl allowedurls url_regex /etc/squid/url.txt
and then only allowing localnet to access that acl.
a.) If a user behind localnet types:
http://www.facebook.com/@http://www.allowed.org/page.html they are
able to peek at some content of the disallowed website facebook. Is it
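A hedged guess at the cause (not a confirmed answer from the thread): with unanchored url_regex patterns, the allowed URL embedded after the '@' still matches somewhere inside the full request URL. Anchoring each pattern in /etc/squid/url.txt to the start of the URL avoids that, e.g.:
# patterns in the file anchored like: ^http://www\.allowed\.org/
acl allowedurls url_regex "/etc/squid/url.txt"
http_access allow localnet allowedurls
http_access deny localnet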
Hello,
I am trying to configure SSLBump so that I can use squid in transparent
mode and redirect ports 443 and 80 to squid with iptables/pf.
When using https_port (based on some mailing lists) it says that it isn't
recognized.
I also tried to use
http_port 3129 transparent sslBump
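For reference, a minimal sketch of the directives SslBump needs on a 3.1 build configured with --enable-ssl, with the browser explicitly pointed at the proxy rather than intercepted (the port and certificate path are placeholders, not values from this thread):
# placeholder port and cert path; 3.1 spells the option sslBump
http_port 3128 sslBump cert=/usr/local/squid/etc/proxy.pem
ssl_bump allow all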
Hello
On Sun, Feb 14, 2010 at 6:59 PM, Amos Jeffries squ...@treenet.co.nz wrote:
key=cert= ??
I saw some examples where they were separated. It should be only one
file, right? Does my process of creating a crt and key sound correct,
though?
Thank you.
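If it helps, my understanding (worth checking against squid.conf.documented for your version) is that when key= is omitted, squid expects the private key to be in the same PEM file as the certificate, so a single combined file is enough:
# placeholder path; the PEM holds both the certificate and the key
https_port 3129 cert=/usr/local/squid/etc/proxy.pem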
Hello,
I just installed 3.1.0.16 on CentOS 5.4. I noticed that in general
sites load very slowly; when I take the proxy off they load fine. I am the
only user. This is the cache log (I have the cache disabled):
2010/02/14 21:34:57| Starting Squid Cache version 3.1.0.16 for
i686-pc-linux-gnu...
2010/02/14
Hello,
While trying to:
source: 3.1.0.16
./configure --enable-ssl
make
I get this output at the end:
gawk -f ./mk-globals-c.awk ./globals.h globals.cc
gawk -f ./mk-string-arrays.awk ./enums.h string_arrays.c
/bin/sh ./repl_modules.sh lru repl_modules.cc
make all-recursive
make[2]:
On Tue, Oct 27, 2009 at 1:48 AM, Amos Jeffries squ...@treenet.co.nz wrote:
Now that the user is being logged successfully what does a row of the log
look like?
1256653823.243 168 172.16.2.35 TCP_REFRESH_HIT/304 295 GET
http://www.usa.net/images/burst.jpg admin
FIRST_UP_PARENT/74.55.186.130 -
Hello,
I have entries like these in my log:
1256612777.111 1145 66.199.62.74 TCP_MISS/200 17337 GET
http://www.bing.com/search? admin DIRECT/8.17.64.41 text/xml
1256612777.605 931 66.199.62.74 TCP_MISS/200 15785 GET
http://www.bing.com/search? admin DIRECT/8.17.64.8 text/xml
1256612778.321
You say other requests ... do you mean these ones are not? That's a
problem with squid not even receiving the requests.
With other requests, I meant every single request from all users and all IPs.
All I can think of is a wild guess that maybe something will change if the
jp log line goes
requests are not detailed!!
Same happens for other similar mechanisms.
Why?
Thanks
Andres
On Mon, Oct 26, 2009 at 10:09 PM, Andres Salazar ndrsslz...@gmail.com wrote:
Hello,
I have entries like these in my log:
1256612777.111 1145 66.199.62.74 TCP_MISS/200 17337 GET
http://www.bing.com
Greetings,
The goal is to manage a LAN of 50-100 users, dynamically controlling
with access lists the sites each user can see and the ones they can't.
I also need a simple way of controlling their internet route so that
they can be changed to use different IPs from different peer proxies
around the
Hello,
Is there any way I can randomize my outgoing_address other than setting
up ACLs with time?
I have a box with a high load of IPs and I want my requests to go out
totally random.
Thank you
Andres
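For reference, the time-ACL approach mentioned above (the one this question is trying to avoid) looks roughly like this; the addresses and time ranges are placeholders:
acl morning time 00:00-11:59
acl evening time 12:00-23:59
tcp_outgoing_address 203.0.113.10 morning
tcp_outgoing_address 203.0.113.11 evening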
Hello,
I have read the FAQ with regard to interpreting the access.log.
http://wiki.squid-cache.org/SquidFaq/SquidLogs#The_native_log_file_format
For example in this line:
1256339151.354 88 75.10.68.23 TCP_MISS/302 470 GET
http://www.investmentintelligencer.com/favicon.ico -
It is milliseconds; I must have missed that.
Andres
On Fri, Oct 23, 2009 at 6:13 PM, Andres Salazar ndrsslz...@gmail.com wrote:
Hello,
I have read the FAQ with regard to interpreting the access.log.
http://wiki.squid-cache.org/SquidFaq/SquidLogs#The_native_log_file_format
For example
Hello guys,
I understand that to forward all requests to another proxy I would do
something like this:
cache_peer Parent_proxy_IP parent port 0 no-query default
acl all src 0.0.0.0/0.0.0.0
http_access allow all
never_direct allow all
However, I want to be able to forward different src ips to
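One common way to express per-source routing, as a sketch (peer addresses, ports and subnets are placeholders): give each parent a name= and pair it with cache_peer_access rules keyed on src ACLs.
acl group_a src 192.168.1.0/24
acl group_b src 192.168.2.0/24
cache_peer 10.0.0.1 parent 3128 0 no-query name=peerA
cache_peer 10.0.0.2 parent 3128 0 no-query name=peerB
cache_peer_access peerA allow group_a
cache_peer_access peerA deny all
cache_peer_access peerB allow group_b
cache_peer_access peerB deny all
never_direct allow all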
Hello,
Squid user-based authentication is a big advantage when applying access
lists. I am, however, forced to run squid as a transparent proxy, but I
need some kind of authentication for the users passed to squid to manage
the ACLs (specific allow lists, reply body size, etc.).
Is there _any_ work
Hello,
I have 30 users who browse the internet all day through squid. My
version is squid-2.7.STABLE6. I started squid with 5024 file
descriptors, the cache is disabled, and it's a machine solely dedicated to
this: a PowerEdge 600SC with a P4 CPU and 1GB RAM.
The problem is that intermittently during
Would it be smart to benchmark squid during the day by running a
script that fetches a file every 5 seconds and times how long the
download takes?
I am sure there might be better ways.
Andres
On Fri, Oct 16, 2009 at 3:30 PM, Andres Salazar ndrsslz...@gmail.com wrote:
Hello,
I have
Hello,
I want to pass the tcp_outgoing_address option when I run the
command to refresh or reload the config file. This is so that every hour
I can rotate, with a cron job, the IP that squid uses to browse the
internet.
Is this possible? Or is there a better way than to create dozens of
config
:)
Cheers,
Pieter
- Original Message - From: Andres Salazar ndrsslz...@gmail.com
To: squid-users@squid-cache.org
Sent: Wednesday, October 14, 2009 05:19
Subject: [squid-users] Change tcp_outgoing_address every hour, best way to
do this?
Hello,
I want to pass the option
Hello,
I've been searching for ways to pass https through the transparent
mode of squid. This is because I'd like to use squid's ACLs, not so much
the caching, which obviously doesn't work with this protocol.
Are there ways I can proxy https? I've heard somebody mention that it
is possible by
Hello,
On Tue, Oct 13, 2009 at 7:22 PM, Amos Jeffries squ...@treenet.co.nz wrote:
Squid will not do what you want.
OK, is there any way that I can pass the https traffic through the
transparent proxy directly without any interference from squid?
The reason is that I'd like to take advantage of
Hello all,
I am new to squid and I have a successful config that utilizes NCSA
auth to allow per-user access.
However, I was told that I could have an allowed list of viewable sites
with a directive that points to a file outside the config.
How do I specify that for every user? I also need to
Can every squid user get a different IP and max_body_size limit? Or do
I need several instances of squid?
Thank you
--Andres
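A sketch of the per-user direction, assuming NCSA basic auth and two example users (the helper path, user names and addresses are placeholders, not from the thread); a single squid instance can do this with per-user ACLs:
auth_param basic program /usr/lib/squid/ncsa_auth /etc/squid/passwd
acl user_a proxy_auth alice
acl user_b proxy_auth bob
tcp_outgoing_address 203.0.113.10 user_a
tcp_outgoing_address 203.0.113.11 user_b
# reply_body_max_size also takes ACLs in 2.6+/3.x; the exact argument
# form differs between branches, so check squid.conf.documented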
Hello,
I've set up my first reverse proxy to accelerate a site. I've used wget
to spider the entire site several times and noticed that even after
running it, some files never get cached, like HTML files! I presume it
is because the HTML pages don't have the correct cache headers.
It didn't even want to
:43:11 -0500, Andres Salazar ndrsslz...@gmail.com
wrote:
Hello,
I am using Squid 2.7.STABLE3 and I am trying to set up a reverse proxy
for my site whatwould.org. I have followed the instructions at:
http://wiki.squid-cache.org/ConfigExamples/Reverse/BasicAccelerator
and other internet tutorials
I have isolated the issue to this .htaccess file:
AddHandler application/x-httpd-php .html .htm
AddType application/x-httpd-php .html .htm
Options +FollowSymLinks
RewriteEngine on
RewriteCond %{HTTP_HOST} ^protected-domain.com
RewriteRule ^(.*)$ http://www.protected-domain.com/$1 [R=301,L]
Hello,
I am using Squid 2.7.STABLE3 and I am trying to set up a reverse proxy
for my site whatwould.org. I have followed the instructions at:
http://wiki.squid-cache.org/ConfigExamples/Reverse/BasicAccelerator
and other internet tutorials, with no joy.
The error I am getting with that config is:
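The error text itself is cut off above; for reference, the basic shape of the wiki example the URL points at is roughly this (the origin IP and port are placeholders, not the poster's values):
http_port 80 accel defaultsite=whatwould.org
cache_peer 192.0.2.10 parent 80 0 no-query originserver name=myAccel
acl our_sites dstdomain whatwould.org
http_access allow our_sites
cache_peer_access myAccel allow our_sites
cache_peer_access myAccel deny all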
Hello,
I've noticed that when a computer is set to use a transparent proxy
(a forced one), or when the proxy is set in the browser's settings,
the contents of /etc/hosts on that particular computer are ignored. I
assume it is because the DNS request is actually performed by squid on
the remote
Hello,
I have been browsing http://www.squid-cache.org/Scripts/ trying to
find software (preferably non-commercial) that can give me a GUI to
identify which domains AND which pages on each domain each of my users is
viewing.
Most of what I find is console-based, or only provides the domain name
in