Hi.
On 20.04.2011 13:48, Helmut Hullen wrote:
# stat swap.state
95 4012314 -rw-r- 1 squid squid 16326000 10203960 "Apr 19
14:02:21 2011" "Apr 20 10:53:45 2011" "Apr 20 10:53:45 2011" "Apr 19
14:02:21 2011" 16384 19968 0 swap.state
What about deleting the old cache and restarting with ...
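If the idea is a full cache rebuild, here is a rough sketch of the usual
sequence, assuming the cache lives under /var/spool/squid (check the
cache_dir line in your squid.conf for the real path):

  squid -k shutdown              # stop squid cleanly
  rm -rf /var/spool/squid/*      # wipe the old cache, including swap.state
  squid -z                      # recreate the cache_dir swap structure
  squid                         # start squid again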
Not sure if it helps but here is an access.log entry for a non-working
sslbump+dynamicssl connection.
1303442234.277 0 192.168.1.107 NONE/000 0 CONNECT
gmail.google.com:443 - HIER_NONE/- -
Regards,
Will
On Wed, Apr 20, 2011 at 9:51 PM, Will Metcalf wrote:
> SSLBump+DynamicSSL was working f ...
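For context, NONE/000 with HIER_NONE in the log above means squid closed
the CONNECT without producing a reply or contacting any server; with
dynamic certificate generation this is often the ssl_crtd helper failing,
e.g. an uninitialised certificate database. A minimal sketch of the
3.2-beta-era pieces, with paths and the CA certificate as assumptions:

  # squid.conf: bump CONNECTs and forge per-host certificates
  http_port 3128 ssl-bump generate-host-certificates=on \
      dynamic_cert_mem_cache_size=4MB cert=/usr/local/squid/etc/myCA.pem
  ssl_bump allow all
  sslcrtd_program /usr/local/squid/libexec/ssl_crtd -s /usr/local/squid/var/lib/ssl_db -M 4MB

  # the certificate database must be initialised once, or bumping fails:
  /usr/local/squid/libexec/ssl_crtd -c -s /usr/local/squid/var/lib/ssl_db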
Problem solved.
Reason: CURL puts "no-cache" in the HTTP header by default, so
squid didn't cache the content.
Solution: It is possible to configure CURL's HTTP headers by
hand, but I chose to use the wget program instead of CURL, which is much
simpler.
-Henry
On Tue, Apr 19, 2 ...
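For what it's worth, staying with curl would also have worked: passing a
header name with an empty value tells curl to drop that default header
entirely, which matches the no-cache behaviour reported above (old curl
versions sent "Pragma: no-cache" by default). A sketch, URL as a
placeholder:

  # suppress the cache-busting request headers so squid may cache the reply
  curl -H 'Pragma:' -H 'Cache-Control:' http://www.example.com/file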
On 21/04/2011 5:29 PM, Ron Wheeler wrote:
On 21/04/2011 1:46 PM, Jawahar Balakrishnan (JB) wrote:
If you are thinking that it is dynamic content with query strings, then
it's not the case. The URLs will look like directory-structured
static content, but the back-end app server will translate the URL and
fetch the appropriate content from ...
Hello! I want to compile squid 3.1.12 on Debian Lenny but I don't know what
packages I need. Please help me.
Configure options:
./configure --prefix=/usr/local/squid/ --enable-removal-policies=heap,lru
--enable-delay-pools --disable-wccp --disable-wccpv2 --disable-snmp
--enable-arp-acl --disab ...
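On Lenny the quickest route is usually a compiler toolchain plus the build
dependencies of the packaged squid; a sketch (build-dep needs a deb-src
line in sources.list):

  apt-get install build-essential   # gcc, g++, make, libc headers
  apt-get build-dep squid3          # pull in what the packaged squid3 needs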
If you are thinking that it is dynamic content with query strings, then
it's not the case. The URLs will look like directory-structured
static content, but the back-end app server will translate the URL and
fetch the appropriate content from the CMS (Alfresco)
On Thu, Apr 21, 2011 at 1:30 PM, Ron ...
If you google "squid dynamic content" you will find that by default
squid does not cache dynamic content.
If it did, it would be useless as a proxy server since that would make
almost all dynamic sites unusable.
There are lots of instructions about how to trick squid into caching
content that ...
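Most of those instructions revolve around the refresh_pattern rules: the
stock configuration refuses to cache query/cgi URLs, and the "trick" is to
relax that. A sketch based on that era's defaults (tune the numbers to
your content):

  # stock rule: never cache cgi/query URLs; remove or relax it to cache them
  refresh_pattern -i (/cgi-bin/|\?) 0 0% 0
  # example catch-all: reuse objects for up to a day, revalidating at 20% of age
  refresh_pattern . 0 20% 1440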
First I will explain what I am trying to do.
I have a number of tests (executables and scripts) which run on
resources downloaded via HTTP, FTP etc. Some of these tests are third
party compiled executables which would be problematic to change. The
resources can potentially be any type of file
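If the goal is to cache those downloads without modifying the third-party
binaries, note that many HTTP/FTP clients honour the standard proxy
environment variables, so pointing the whole test run at a squid instance
may be enough. A sketch (the test driver name is hypothetical):

  export http_proxy=http://127.0.0.1:3128   # squid listening locally
  export ftp_proxy=http://127.0.0.1:3128    # squid can proxy ftp:// URLs too
  ./run-tests.sh                            # hypothetical driver inheriting the env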
It is all dynamic content going forward.
Scenarios where a cache flush would be required:
1) an article is updated
2) a category is updated with a list of articles.
We syndicate content to about 150 partners and will have the same
article/category under a different URL. Doesn't squid cache based on the
URL ...
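Squid does key cache entries on the full URL, so the same article
syndicated under 150 partner URLs is stored, and has to be flushed, once
per URL. Single objects can be evicted with the PURGE method once it is
enabled; a sketch (the article URL is a placeholder):

  # squid.conf: allow PURGE from localhost only
  acl PURGE method PURGE
  http_access allow PURGE localhost
  http_access deny PURGE

  # then, when an article changes:
  squidclient -m PURGE http://www.example.com/articles/123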
Are you sure that you need to do this?
Squid should be able to tell the difference between static and dynamic
content.
We have a dynamic JSR-168/286 portal based on Tomcat and Jetspeed
sitting behind Apache and Squid, and we have never had to intervene with
Squid in 3 years.
We also have lo ...
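The reason no intervention is needed is that squid honours standard HTTP
freshness headers, so the backend can simply declare what is cacheable.
For example, a portal response that may be served from cache for five
minutes (values are illustrative):

  HTTP/1.1 200 OK
  Content-Type: text/html
  Cache-Control: public, max-age=300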
I would rather not do a restart of anything unless absolutely required
Here are the challenges we face
1) We are trying to deploy Squid as a reverse-proxy in front of a CMS
2) We want to find a balance between keeping the content fresh
and not hurting performance by frequently expiring ...
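A minimal accelerator sketch for that layout, with hostname and backend
port as placeholders; freshness can then be tuned per path with
refresh_pattern rules rather than by flushing:

  # squid.conf: terminate client traffic and hand cache misses to the CMS
  http_port 80 accel defaultsite=www.example.com
  cache_peer 127.0.0.1 parent 8080 0 no-query originserver name=cms
  acl our_site dstdomain www.example.com
  http_access allow our_site
  cache_peer_access cms allow our_site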
Jorge,
I understand that you want to give users a maximum of 5 minutes' access to facebook.
There are various problems with implementing this requirement, but
one issue is that neither Squid nor any other software
has a way to determine whether a user visits facebook.com or visits
another website that has a facebo ...
Greetings,
I have a transparent squid in a private net with a 1-1 NAT. I'm trying to get
a good understanding of what my clients look like to the outside. What is the
default setting for "forwarded_for" if my system is running intercept?
To my understanding, if I leave the X-Forwarded-For head ...
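For reference, the default is forwarded_for on, in intercept mode as well:
squid appends the client's IP to the X-Forwarded-For header of the
outgoing request. The 3.1 series added stricter variants; a sketch:

  forwarded_for on        # default: upstream sees "X-Forwarded-For: <client-ip>"
  # other values (3.1+): off | transparent | delete | truncate
  #forwarded_for delete   # strip the header, hiding clients completely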
Indunil Jayasooriya wrote:
>
> On Thu, Apr 21, 2011 at 1:54 PM, EzyMike wrote:
>> Hi!
>>
>> I have a problem compiling squid 3.1.11 or 3.1.12 on an OpenBSD 4.8 box.
>> When preparing to replace an OpenBSD 4.6 box with a 4.8, the compilation of
>> squid brings this error:
>>
Howdy,
I've been checking the time condition but I didn't find what I need,
and I also don't know whether that's possible.
I'd like to set up an ACL to block facebook.com, but allow a quota of 300
seconds per day, on working days.
I saw that with the time condition I can specify it to run only on working ...
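The time ACL only matches wall-clock ranges, so it can express "working
days" but not a 300-seconds-per-day quota; the quota half needs an
external ACL or session helper on top. A sketch of the time-based half
(domain list assumed):

  acl workdays time MTWHF 09:00-18:00
  acl facebook dstdomain .facebook.com
  # deny during working hours; a per-day time quota would additionally
  # need an external_acl_type helper
  http_access deny facebook workdays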
@Amos,
Sorry for the late reply.
I experimented a bit in the meantime. I decided to first forward all
traffic from a single system to a different gateway.
Once this works fine I will go for filtering based on download size.
So, to allow all requests via a different gateway link, I did the following ...
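If the alternate gateway is reachable via a second local address on the
squid box, tcp_outgoing_address can pin that one system's traffic to it;
otherwise this becomes policy routing outside squid. A sketch with assumed
addresses:

  # requests from the test machine leave via the second uplink address
  acl testhost src 192.168.1.50
  tcp_outgoing_address 10.0.0.2 testhost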
On Thu, Apr 21, 2011 at 1:54 PM, EzyMike wrote:
> Hi!
>
> I have a problem compiling squid 3.1.11 or 3.1.12 on an OpenBSD 4.8 box.
> When preparing to replace an OpenBSD 4.6 box with a 4.8, the compilation of
> squid brings this error:
>
> Making all in lib
> cc1: warnings being treated as errors
cc wrote:
> Hi,
>
> I managed to get Squid 2.7 running OK for the past
> day, but I have come across a puzzling error.
>
> No matter what URL I type in, I get an error
> message:
>
> The requested URL could not be retrieved
>
> While trying to retrieve the URL:
>
> The following error was encountered ...
Hi,
I managed to get Squid 2.7 running OK for the past
day, but I have come across a puzzling error.
No matter what URL I type in, I get an error
message:
The requested URL could not be retrieved
While trying to retrieve the URL:
The following error was encountered:
Connection to Failed
Th ...
Hi!
I have a problem compiling squid 3.1.11 or 3.1.12 on an OpenBSD 4.8 box.
When preparing to replace an OpenBSD 4.6 box with a 4.8, the compilation of
squid brings this error:
Making all in lib
cc1: warnings being treated as errors
In file included from ../include/util.h:49,
from b ...
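The usual quick workaround when a newer compiler turns warnings into hard
errors is to disable squid's strict checking at configure time, assuming
your 3.1.x configure carries the flag:

  # rebuild without -Werror (keep your existing configure options)
  ./configure --disable-strict-error-checking ...
  make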