Re: [squid-users] uninterruptible proxy

2010-01-03 Thread Alex Braunegg
Hi,

Consider using VMware ESX/ESXi 4.0 with FT (Fault Tolerance).

I have successfully used, tested, and demonstrated VMware ESX/ESXi 4.0 with
VMware FT running web-scanning and proxy services. In the demonstration we
would simply pull the power plug on, or reboot, one of the ESX/ESXi servers.
The time to switch to the secondary system was in the millisecond range;
many times, when simply pinging the devices, the switch was not detectable.
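
If you want to reproduce that measurement, a crude check like this is enough
(my own ad hoc approach, not part of any VMware tooling; 192.168.1.10 stands
in for the proxy address):

# Timestamp every ping reply; gaps in the icmp_seq numbers bound the
# switchover time.
ping -i 0.2 192.168.1.10 | while read reply; do
    echo "$(date +%H:%M:%S.%N)  $reply"
done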

In each test all sessions remained active, including downloads of files such
as ISO images and streaming video.

Each server running Squid was CentOS based, running the latest kernel and
the VMware Tools. At the time I was using Squid 3.0 (the exact version I
can't recall, but it should not matter).

Let me know if you need any further information.

Best Regards,

Alex

On Sat, Jan 2, 2010 at 5:22 PM, Muhammad Sharfuddin wrote:
> People in my organization do online bidding. If browsing is
> interrupted/disconnected for any reason (e.g. hardware failure), even if
> it is restored within 50 or 60 seconds, their session
> terminates/disconnects and they lose their bids.
>
> We need uninterruptible web/proxy services, i.e. a highly available proxy
> infrastructure.
>
> What are the possible solutions?
>
> Are the following options viable?
> 1. Clustering Squid (I think sessions will break)
> 2. Clustering two virtual machines that run Squid, so that if one host
> machine goes down/crashes, the other host machine continues running Squid
> (I think in this case sessions won't break, because the other host
> machine will run the same Squid instance)
>
> Please suggest/recommend a solution.
>
> Regards
> --ms


Re: [squid-users] uninterruptible proxy

2010-01-03 Thread Genaro Flores

People in my organization do online bidding. If browsing is
interrupted/disconnected for any reason (e.g. hardware failure), even if it
is restored within 50 or 60 seconds, their session terminates/disconnects
and they lose their bids.


I think that with any normal proxy server, if the client terminates a
session or times out, the proxy will terminate the session too.


I suspect you are using the wrong terminology: what you need is not a highly
available proxy service, i.e. a proxy suited to heavy load and with very low
downtime, but rather a _persistent_ proxy. I'm not sure whether that's the
exact industry term, but I'm pretty convinced "highly available proxy" is
not what you need.


If anyone here can tell you about a persistent proxy--that is, a proxy that
keeps track of sessions and keeps them alive even if clients time out--then
you'll be golden. I for one don't know of such software.





Re: [squid-users] report of subdomains

2010-01-03 Thread Jason Healy
On Jan 2, 2010, at 10:06 PM, Guido Marino Lorenzutti wrote:

> Hi people: does anyone know of a log analyzer (like Sarg) that joins
> subdomains in its reports, so you can know how much is consumed per
> domain? Without this it is impossible to know how much is transferred via
> Rapidshare, Facebook, etc.

I think Calamaris does a "2nd-level domain" report, so you see the top-N
domains (e.g., "apple.com", "facebook.com", etc.). If that isn't quite what
you want, you could probably hack up one of the other scripts to include the
top- and second-level domain name instead of just the top; a sketch of that
idea follows below.

We used to use Calamaris, but have since switched to an in-house script that 
provides much of the same functionality.  The 2nd-level domain report is one of 
our staples...
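
For the archives, here's roughly what such a hack could look like against a
native-format access.log (a sketch only: the field positions assume the
stock log format, and two-label TLDs such as co.uk would get lumped
together):

# Sum bytes per 2nd-level domain; field 5 = bytes, field 7 = URL.
awk '{
    url = $7
    sub(/^[a-z]+:\/\//, "", url)   # strip the scheme, e.g. "http://"
    sub(/[\/:].*/, "", url)        # keep only the host (drop path or port)
    n = split(url, part, ".")
    dom = (n >= 2) ? part[n-1] "." part[n] : url
    bytes[dom] += $5
}
END {
    for (d in bytes)
        printf "%14d  %s\n", bytes[d], d
}' access.log | sort -rn | head -20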

Jason

--
Jason Healy|jhe...@logn.net|   http://www.logn.net/






[squid-users] squid transparent proxy setup

2010-01-03 Thread scatter

Hi everyone,

I'm currently attempting to set up a transparent proxy between my Linksys
router and my wifi access point.

The hardware setup is as follows:
Linksys Router --> Ubuntu PC (2 network cards) --> Switch --> Access Point
--> Laptop
On the Ubuntu PC, eth0 is connected to the router and eth1 is connected to
the switch.

I've read up on how to set up a transparent proxy using Squid and seem to
have done all the settings properly. When I was about to test it, I realized
that:
1) eth1 doesn't have an IP address
2) computers connected to the access point cannot obtain an IP address

So at this point I cannot test whether my transparent proxy is working or
not. Here are my questions:

1) How do I make the laptop that's connected to the access point obtain an
IP address from the router? Do I need to set up a DHCP service on the Ubuntu
PC and give out a different IP range?
2) Do I set a static IP for eth1? If I do, should I set it in the same range
as eth0? (eth0 is now in 192.168.1.0/24; should I set up eth1 within
192.168.1.0/24 or some other range like 192.168.200.0/24?)
3) On my Ubuntu box only Apache and Squid 2 are installed; am I missing any
applications? A rough sketch of what I think the configuration needs to look
like follows below; corrections welcome.

Thank you all in advance.



RE: [squid-users] report of subdomains

2010-01-03 Thread Mike Marchywka



> Date: Sun, 3 Jan 2010 00:06:41 -0300
> From:
> To: squid-users@squid-cache.org
> Subject: [squid-users] report of subdomains
>
> Hi people: does anyone know of a log analyzer (like Sarg) that joins
> subdomains in its reports, so you can know how much is consumed per
> domain? Without this it is impossible to know how much is transferred via
> Rapidshare, Facebook, etc.

I've never heard of a log analyzer per se. What standard things would it do?
Generally I start exploring and end up coding ad hoc stuff with sed, awk,
and grep in a bash script, or maybe use Perl; that is all I have ever used.
Someone may have a collection of scripts to share. I guess you could also
get a DB import tool and then use whatever report generators exist; that
seems to be a popular approach these days. Something along the lines of the
sketch below, for instance.
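
As a rough illustration of the DB-import idea (untested; it assumes the
default native access.log format, and URLs containing commas would break the
naive CSV step):

# Extract client, bytes, and URL, then report from SQL.
awk '{ print $3 "," $5 "," $7 }' access.log > access.csv
sqlite3 squid.db <<'EOF'
CREATE TABLE IF NOT EXISTS log (client TEXT, bytes INTEGER, url TEXT);
.mode csv
.import access.csv log
SELECT client, SUM(bytes) AS total
FROM log GROUP BY client ORDER BY total DESC LIMIT 10;
EOF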




>
> Thanks in advance.

[squid-users] "Homeserver", request for review

2010-01-03 Thread Heinz Diehl
Hi,

I'm planning to set up Squid (2.7.STABLE7) on a small home server which is
responsible for at most 4 people. The machine has 8 GB of RAM, and the Squid
cache_dir will reside on a second hard disk with 30 GB of space. This is
what I plan to configure:

cache_mem 512 MB                        # RAM reserved for the in-memory object cache
maximum_object_size_in_memory 10 MB     # largest object kept in cache_mem

memory_replacement_policy heap GDSF     # favours many small, popular objects in RAM
cache_replacement_policy heap LFUDA     # keeps frequently used objects on disk, regardless of size

minimum_object_size 0 KB                # no lower bound on cacheable object size
maximum_object_size 1 GB                # largest object cached on disk

# COSS store (10 GB) for small objects up to 3 MB; AUFS store (16 GB) for the rest
cache_dir coss /var/cache/squid/c1 10240 max-size=3145728 block-size=4096 maxfullbufs=10
cache_dir aufs /var/cache/squid/cacheA 16000 64 256


Does this sound reasonable, did I overlook something, or is all this just
crap? Any thoughts?

Thanks,
Heinz.