[squid-users] Blacklist

2010-07-31 Thread Francesco Collini
Hello, we currently use urlblacklist.com and are registered as provider-level users. The blacklist does not seem well maintained: updates often miss many sites that should be blocked. I tried writing to them but got no answer. Does anyone know of a good blacklist, commercial ones included, that can be integrated with Squid/DansGuardian?

[squid-users] How does Squid prevent stampeding during a cache miss?

2010-07-31 Thread Ryan Chan
For example, run Squid in reverse-proxy (accelerator) mode in front of http://www.example.com/heavyduty.php (expiry set to 1 hour, 10 s to generate). Just as the object expires, a large number of clients (e.g. 10,000) request that file from Squid at the same time. Does Squid apply any heuristics to avoid forwarding all of those requests to the origin server?

Re: [squid-users] Blacklist

2010-07-31 Thread Marcus Kool
Francesco, here is a biased answer: check out http://www.urlfilterdb.com

Marcus @ URLfilterDB

Francesco Collini wrote:
> Hello, we currently use urlblacklist.com and are registered as
> provider-level users. The blacklist does not seem well maintained:
> updates often miss many sites that should be blocked, [...]

Re: [squid-users] How does Squid prevent stampeding during a cache miss?

2010-07-31 Thread david robertson
Squid 2.x supports this:

    # TAG: collapsed_forwarding	(on|off)
    #	This option enables multiple requests for the same URI to be
    #	processed as one request. Normally disabled to avoid increased
    #	latency on dynamic content, but there can be benefit from enabling
    #	this in accelerator mode.
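To make the directive concrete, here is a minimal squid.conf sketch for the accelerator scenario described above. The backend address, port, and peer name are illustrative assumptions, not taken from the thread:

```
# squid.conf fragment (Squid 2.x accelerator setup; hostnames/ports are examples)
http_port 80 accel defaultsite=www.example.com
cache_peer 127.0.0.1 parent 8080 0 no-query originserver name=backend

# Merge concurrent requests for the same URI into a single backend
# fetch, so an expiring object does not trigger a stampede of
# simultaneous requests to the origin server.
collapsed_forwarding on
```

With this enabled, while one request for heavyduty.php is being refreshed, the other concurrent clients wait for that single response instead of each hitting the backend.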

Re: [squid-users] How much ram

2010-07-31 Thread Jose Ildefonso Camargo Tolosa
Hi! To answer your question: it varies from client to client. RAM ranges from 256 MB to 3 GB, the number of users ranges from 5 to 1000, and the disk cache size ranges from 1 GB to 8 GB.

Sincerely,
Ildefonso Camargo

2010/7/28 Tóth Tibor Péter:
> Hi Guys!
>
> How much RAM do you have in [...]
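The sizes mentioned above map onto a handful of squid.conf directives. A minimal sketch, with example values chosen from within the ranges the poster quotes (the path and all numbers are illustrative, not a recommendation):

```
# squid.conf fragment, illustrative values only
cache_mem 256 MB                             # memory reserved for hot objects
cache_dir ufs /var/spool/squid 8192 16 256   # 8 GB disk cache, 16x256 subdirs
maximum_object_size_in_memory 512 KB         # keep only small objects in RAM
```

Note that cache_mem only bounds the in-memory object cache; Squid's total process footprint is larger, since index metadata for the disk cache also consumes RAM roughly in proportion to cache_dir size.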

Re: [squid-users] How does Squid prevent stampeding during a cache miss?

2010-07-31 Thread Amos Jeffries
david robertson wrote:
> Squid 2.x supports this:
>
>     # TAG: collapsed_forwarding	(on|off)
>     #	This option enables multiple requests for the same URI to be
>     #	processed as one request. Normally disabled to avoid increased
>     #	latency on dynamic content, but there can be benefit from enabling
>     #	this in accelerator mode.