Hi,
unfortunately I still haven't been able to get sibling caches to work.
However, I occasionally see the following line in access.log:
1162009609.788 0 192.168.1.136 TCP_MISS/200 299 GET
internal://rele132.relevad.lan/squid-internal-dynamic/netdb - NONE/- -
what does that mean exactly?
> I want to know whether or not a cache hierarchy (parent/sibling
> relation) is useful for Squid when it's running in accelerator mode.
> Any suggestion? Thanks.
hello, it seems nobody knows about this situation? We have some real servers which
run apache for web services; since the requests are high, I wan
Fri 2006-10-27 at 17:16 -0400, Nguyen, Khanh, INFOT wrote:
> If it is configured to operate in reverse proxy mode, it should not serve
> requests that come without a Host header, should it?
It's a matter of taste. HTTP/1.0 did not have Host headers. All the
defaultsite says (when used together with vh
Dear all, I use Squid with SquidGuard in order to block web sites for my
LAN users, but I can't get it to work. I configured the squidGuard.conf
file with the porn, aggressive, audio-video, ... blacklists downloaded
from the SquidGuard web site. But when I put the "redirect_program
/usr/bin/squi
If it is configured to operate in reverse proxy mode, it should not serve requests
that come without a Host header, should it? Perhaps it is just a matter of
opinion. I want to conserve my resources and only serve when I really need to.
If I configure more than one domain on the box (reverse proxy mode), if
Fri 2006-10-27 at 16:20 -0300, Alexandre Correa wrote:
> I'm using Fedora Core 3 64-bit with kernel 2.6.16-13 ...
Then the kernel is pretty much unlimited thanks to the 64-bit virtual memory
size, and ulimit is all you need to care about getting right.
Regards
Henrik
Fri 2006-10-27 at 20:03 +0530, Sekar wrote:
> http_port 80 vhost
Another comment: I would recommend adding a defaultsite= to the
http_port line, defining which site HTTP/1.0 clients not sending Host
headers should see.
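For example, the combined directive might look like this (a sketch only, using one of the hostnames quoted elsewhere in this thread; substitute whichever site HTTP/1.0 clients should land on):

    http_port 80 vhost defaultsite=aa1.mydomain.com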
Regards
Henrik
Fri 2006-10-27 at 15:33 +0200, Fabrizio Reale wrote:
> I would like to cache the same URL many times, once for each browser
> with a particular combination of Vary parameters.
>
> Is it possible with SQUID?
Done by default since Squid-2.5, and done a bit better in 2.6 thanks to
its ETag su
Javær wrote:
hi all,
I have Squid 2.5 set up as a reverse proxy for our mail servers (to
avoid having to expose our mail servers to the outside world). Squid is
acting only as a reverse proxy (accelerator), and I have
httpd_accel_with_proxy set to off.
Does anyone have any advice or tips for crea
Sekar wrote:
Hello all,
I am using squid-2.6.STABLE4 as a reverse proxy for multiple domains
which are hosted on 3 servers.
For example
Server A (192.168.1.101) :
aa1.mydomain.com
aa2.mydomain.com
Server B (192.168.1.102) :
aa3.mydomain.com
aa4.mydomain.com
Server C (192.168.1.103
I'm using Fedora Core 3 64-bit with kernel 2.6.16-13 ...
I recompiled the kernel, removing features that aren't used...
In the PAM limits I tried:
squid hard nofile 16384
squid hard rss 2048000
squid hard stack unlimited
squid
hi all,
I have Squid 2.5 set up as a reverse proxy for our mail servers (to
avoid having to expose our mail servers to the outside world). Squid is
acting only as a reverse proxy (accelerator), and I have
httpd_accel_with_proxy set to off.
Does anyone have any advice or tips for creating the ACLs
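One common starting point for accelerator ACLs is to allow only the published site(s) and deny everything else; a sketch, with mail.example.com standing in for the real mail host name (which isn't given in this thread):

    acl our_sites dstdomain mail.example.com
    http_access allow our_sites
    http_access deny all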
Dear all, I use Squid with SquidGuard in order to block web sites for my
LAN users, but I can't get it to work. I configured the squidGuard.conf
file with the porn, aggressive, audio-video, ... blacklists downloaded
from the SquidGuard web site. But when I put the "redirect_program
/usr/bin/squid
Hello all,
I am using squid-2.6.STABLE4 as a reverse proxy for multiple domains
which are hosted on 3 servers.
For example
Server A (192.168.1.101) :
aa1.mydomain.com
aa2.mydomain.com
Server B (192.168.1.102) :
aa3.mydomain.com
aa4.mydomain.com
Server C (192.168.1.103):
aa5.mydo
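A common way to wire this up in squid-2.6 is one cache_peer per backend plus cache_peer_domain, along these lines (a sketch built from the addresses and names quoted above; the original squid.conf is not shown here, so treat all values as illustrative):

    cache_peer 192.168.1.101 parent 80 0 no-query originserver name=serverA
    cache_peer 192.168.1.102 parent 80 0 no-query originserver name=serverB
    cache_peer 192.168.1.103 parent 80 0 no-query originserver name=serverC
    cache_peer_domain serverA aa1.mydomain.com aa2.mydomain.com
    cache_peer_domain serverB aa3.mydomain.com aa4.mydomain.com
    cache_peer_domain serverC aa5.mydomain.com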
Hi all,
I am using Squid as a reverse proxy and I have discovered that if a user
looks at an already cached page with an Accept-Language different from the
previous one, the cached copy is invalidated.
I would like to cache the same URL many times, once for each browser
with a particular combination of Vary parameters
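Whether Squid can keep separate copies depends on the backend marking the response as varying; a minimal illustration of the relevant response headers (the values are made up):

    HTTP/1.1 200 OK
    Content-Language: it
    Vary: Accept-Language

With such a Vary header, Squid 2.5 and later store one variant per distinct Accept-Language request value instead of replacing the cached copy.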
Yes, in front of the squid servers there are LVS and a firewall.
Thank you for the reply; I'd better find more information from the firewall.
On 27/10/06, Mark Elsen <[EMAIL PROTECTED]> wrote:
> This is an accelerator setup for internal webserver(s).
>
>
If there's nothing in squid's access.log, there may hav
Yes, that's it. Now I will try to understand it.
http_access rules are evaluated on a first-match basis;
therefore you must write an AND(ed) condition, stating
that both conditions must be met:
http_access allow internal_net domainusers
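Spelled out with the ACL definitions it relies on (the src range and the auth scheme below are placeholders, not taken from the original configuration):

    acl internal_net src 192.168.0.0/24
    acl domainusers proxy_auth REQUIRED
    # both ACLs on one http_access line means AND; separate lines would mean OR
    http_access allow internal_net domainusers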
M.
This is an accelerator setup for internal webserver(s).
If there's nothing in squid's access.log, there may have been a connection
attempt at the network level (only). I would however find that difficult to
reconcile with the fact that cachemgr did detect this.
Are your Squids protected by a firewalling setup
Yes, that's it. Now I will try to understand it.
Thank you.
Marcelo Koehler
Wed, 25 Oct 2006 16:35:17 +0200, "Mark Elsen" <[EMAIL PROTECTED]> wrote:
> >
> >http_access allow internalnet
> >http_access allow domainusers
> >...
>
> Try :
>
> http_access allow internalnet domainusers
>
This is an accelerator setup for internal webserver(s).
On 27/10/06, Mark Elsen <[EMAIL PROTECTED]> wrote:
> this data is just an example.
> I have more than 30 squid servers, and I sum all the data from them.
> if a client at 10:00 am has 1000 requests and at 10:05 am has 3000, then
> I can compute t
this data is just an example.
I have more than 30 squid servers, and I sum all the data from them.
If a client at 10:00 am has 1000 requests and at 10:05 am has 3000, then
I can compute that this client made 3000 - 1000 = 2000 requests in those 5
minutes.
Is this an accelerator setup for internal webserver(s),
this data is just an example.
I have more than 30 squid servers, and I sum all the data from them.
If a client at 10:00 am has 1000 requests and at 10:05 am has 3000, then
I can compute that this client made 3000 - 1000 = 2000 requests in those 5
minutes.
On 27/10/06, Mark Elsen <[EMAIL PROTECTED]> wrote:
> hell
hello:
I use this function to get the data from squid.
getdata(){
squidclient -T 5 -h $1 -p 80 cache_object://${1}/client_list
2>/dev/null|grep -e "Address:" -e "\"|awk
'BEGIN{RS="Address:"}{print $1" -"$4}'
}
And the data looks like this. I compare the requests every 5 minutes.
Addre
Hi,
I'm configuring a squid to behave as a surrogate over a pool of
webservers (PHP) and a filer (a web server for static documents only).
Some images and static documents are not available on the PHP servers but
only on the filer.
Here is some ascii art explaining the scheme:
squid
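The usual way to split this in Squid is one cache_peer per backend plus cache_peer_access rules keyed on the URL path; a sketch with assumed addresses and an assumed extension list (none of these values come from the original message):

    cache_peer 10.0.0.10 parent 80 0 no-query originserver name=phppool
    cache_peer 10.0.0.20 parent 80 0 no-query originserver name=filer
    # static documents go to the filer, everything else to the PHP pool
    acl static urlpath_regex -i \.(gif|jpg|png|css|js)$
    cache_peer_access filer allow static
    cache_peer_access phppool deny static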
hello:
I use this function to get the data from squid.
getdata(){
squidclient -T 5 -h $1 -p 80 cache_object://${1}/client_list
2>/dev/null|grep -e "Address:" -e "\"|awk
'BEGIN{RS="Address:"}{print $1" -"$4}'
}
And the data looks like this. I compare the requests every 5 minutes.
Address:
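For reference, a cleaned-up sketch of the function above, since the quoting was mangled by the mail archive; the second grep pattern, "HTTP Requests", is an assumption about what the lost pattern matched, based on the per-client counters that the cachemgr client_list report prints under each Address: block:

    # NOTE: "HTTP Requests" below is assumed; the original grep pattern was lost.
    getdata(){
        squidclient -T 5 -h $1 -p 80 cache_object://${1}/client_list 2>/dev/null \
            | grep -e "Address:" -e "HTTP Requests" \
            | awk 'BEGIN{RS="Address:"}{print $1" -"$4}'
    }

Called as, for example, getdata 192.168.1.101, it prints one "ip - count" pair per client seen by that Squid.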
Fri 2006-10-27 at 02:51 -0300, Alexandre Correa wrote:
> how to increase the amount of RAM a single process can use ???
Kernel parameters, and in some cases ulimit settings (ulimit can
optionally set the limit lower than the kernel limit, with "unlimited"
being limited by the kernel limit).
In mo
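As an illustration of the ulimit side (the numbers are examples only, not values recommended in this thread), the limits can be inspected and raised in the shell that starts Squid:

    # show current soft and hard limits for data segment size and open files
    ulimit -S -d; ulimit -H -d
    ulimit -S -n; ulimit -H -n
    # raise them (PAM/kernel limits permitting) before launching Squid
    ulimit -d unlimited
    ulimit -n 16384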
hello, list.
Using the cache manager I found that some IP had made more
than 1000 requests in 5 minutes. But I can't find any log entries for this
IP in my squid access.log, and I can't detect any packets related to this
IP using the snort or tcpdump tools.
Why? Who can help me!
Could yo
hello, list.
Using the cache manager I found that some IP had made more
than 1000 requests in 5 minutes. But I can't find any log entries for this
IP in my squid access.log, and I can't detect any packets related to this
IP using the snort or tcpdump tools.
Why? Who can help me!
--
Huang Ming
Right, finally it was an IE/IIS issue.
To whoever cares, I can confirm that it works OK since 2.6.
Thanks for your help.
On Thu, October 26, 2006 13:38, Joel CARNAT wrote:
>
> and it still does not work. I get the l/p dialog box from IE, then get
> "Impossible to show the page".
> But it works w
No, it does not work even when going direct in non-proxy mode.
I get the same thing - no messages.
Do you have any other ideas???
I'm stumped.
-----Original Message-----
From: Mark Elsen [mailto:[EMAIL PROTECTED]
Sent: Thursday, October 26, 2006 8:18 AM
To: Thomas Raef
Cc: squid-users@squid-cac