Hello,
I am running Squid Cache version 3.0.STABLE13 with NTLM auth, using
samba-3.2.10 and winbind, plus SquidGuard 1.4.
For the past few days I have noticed that Squid goes down (and then restarts
again without any problem) with the error:
2009/05/06 12:59:33| assertion failed: comm.cc:572: "
> thanks for the advice, I just increased the cache size to 300 GB
> (I have a 1 TB RAID HDD, so I don't mind the size)
> as for object size, I've set it to 15 MB. One question, though:
> I've read that there's a certain option that keeps cached
> objects in memory for quick retrieval..
Usually the ope
thanks for the advice, I just increased the cache size to 300 GB (I have a
1 TB RAID HDD, so I don't mind the size).
As for object size, I've set it to 15 MB. One question, though: I've read that
there's a certain option that keeps cached objects in memory for quick
retrieval..
I've got 6 GB of RAM,
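The option being asked about is most likely `cache_mem` (together with `maximum_object_size_in_memory`); a minimal squid.conf sketch with illustrative values, not a tuned recommendation:

```
# Disk cache: 300 GB (300000 MB) in /var/spool/squid, 16 L1 / 256 L2 dirs
cache_dir ufs /var/spool/squid 300000 16 256
# Largest object cached on disk
maximum_object_size 15 MB
# RAM set aside for hot objects (only part of the 6 GB; Squid itself needs headroom)
cache_mem 1024 MB
# Largest object kept in the in-memory cache
maximum_object_size_in_memory 512 KB
```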
hello, and thanks for the prompt reply,
sorry for not mentioning this earlier, but I continuously check my ISP's PRTG and
can see that we have maxed out our allowed bandwidth,
which is a waste, since we only browse specific sites that load specific static
images that may or may not change in none les
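Rarely-changing static images are exactly what `refresh_pattern` rules target; a hedged squid.conf sketch (times are in minutes and the values are illustrative only):

```
# Cache common image types for up to a week when the origin sends no expiry info
refresh_pattern -i \.(gif|png|jpg|jpeg|ico)$ 1440 50% 10080
# Default catch-all rule, as shipped in squid.conf
refresh_pattern . 0 20% 4320
```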
Hi All,
We have released an early version of an external program (plug-in) to
log Squid access to a MySQL database, using the logfile_daemon feature in
Squid 2.7.
The plug-in is available at:
http://www.visolve.com/squid/squid-mysqllog.php
Do send your comments for improvement.
Thanks,
ViS
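For reference, wiring a custom log daemon into Squid 2.7 looks roughly like this (the helper path is a placeholder; check the plug-in's own documentation for its exact directives):

```
# Tell Squid 2.7 which helper binary to spawn for daemon logging
logfile_daemon /usr/local/squid/libexec/logfile-daemon
# Route the access log through the daemon instead of writing the file directly
access_log daemon:/var/log/squid/access.log squid
```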
Hi,
On Sun, 10 May 2009, Roland Roland wrote:
> users on my network have been complaining of slow browsing sessions for a
> while now..
> I'm trying to figure out ways to speed sessions up without necessarily
> upgrading my current bandwidth plan...
Squid may help with this. However, you don
oh, OK, that makes sense..
thanks for the clarification, I appreciate it :)
One more question, though, if possible: is there anything else I could
do to speed up browsing, aside from what I mentioned earlier?
Keep in mind that I only added an allow ACL for my subnet... and that's it! Is
that enough?
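A subnet allow rule on its own is typically just these few lines; a minimal sketch (the subnet address is a placeholder):

```
# Hypothetical local subnet
acl localnet src 192.168.1.0/24
http_access allow localnet
# Deny everything that did not match above
http_access deny all
```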
Hello
I have a simple external Perl helper program:
#!/usr/bin/perl
$| = 1;
open(LOG, ">/tmp/squid.log");
print LOG "RUNNING\n";
close(LOG);
while (defined($line = <STDIN>)) {
    print "OK\n";
    open(LOG, ">>/tmp/squid.log");
    print LOG "Got: $line\n";
    close(LOG);
}
It should always return OK
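The same line-per-request, unbuffered-reply helper protocol can be sketched in Python (a hypothetical equivalent for comparison, not the poster's code):

```python
#!/usr/bin/env python
import sys

def handle(line):
    # A real helper would inspect the request line here;
    # this sketch, like the Perl version, always answers OK.
    return "OK"

def main():
    for line in sys.stdin:              # Squid sends one request per line
        sys.stdout.write(handle(line) + "\n")
        sys.stdout.flush()              # unbuffered replies, like $| = 1 in Perl

if __name__ == "__main__":
    main()
```

Flushing after every reply matters: Squid blocks waiting for the helper's answer, so a buffered reply stalls the request.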
Roland Roland wrote:
but while using Wireshark, I see that for each browsing session I
retrieve all "static" objects from the net! At the same time, the caching
logs show one hit after another...
is that normal?!
I assume you are using Wireshark to watch traffic between your Squid box
and the inte
Hi All,
Can someone help me understand why there is NONE:// [-] for the
Request Header in the logs?
logformat -> %ts.%03tu %tg %>a %ru [%>h] [%h] I get NONE:// [-] in the logs
Need help.
Regards,
Remy
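For comparison, a named logformat is declared and then attached to an access_log like this (the format string is copied from the message above; the name is a placeholder):

```
# Declare the custom format, then point an access_log at it by name
logformat withheaders %ts.%03tu %tg %>a %ru [%>h]
access_log /var/log/squid/access.log withheaders
```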
Hi All,
users on my network have been complaining of slow browsing sessions for a
while now..
I'm trying to figure out ways to speed sessions up without necessarily
upgrading my current bandwidth plan...
I've thought about Squid; I've set it up on CentOS and it's under testing.
so my question