Is there any way to turn off reporting of unparseable HTTP headers for these?
I also get them all day, only for lijit.com. I know I can choose to block the
domain; I was just curious if there was a way to put something in the conf that
will prevent these from being logged. I searched through the
>Thanks! I missed that directive. And yes, when I lowered it I was able to
>decrease the initial lookup time to 5 seconds if I set the directive to 5
>seconds. I am wondering what tradeoff I will have, if anything else will
>break by lowering this too low.
>Is the timeout on lookup I exper
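The quoted text never names the directive being lowered; assuming it is Squid's dns_timeout (an assumption on my part), the change would look like this in squid.conf:

```
# squid.conf: give up on an unresponsive resolver after 5 seconds
# instead of the 30-second default (directive name is an assumption)
dns_timeout 5 seconds
```

The tradeoff is that a resolver that is merely slow, rather than dead, will now also be abandoned, so lookups that would have eventually succeeded will fail instead.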
>One last thing to check is what happens when you query for records
>on that domain. That is the major change between 3.1 and 3.2 DNS.
>What I get is:
>## time host -t A webapps.kattenlaw.com
>webapps.kattenlaw.com has address 38.98.128.19
>0.000u 0.004s 0:00.06 0.0% 0+0k 0+0io 0pf+0w
uses a recursive DNS record. (30 seconds to bring up site)
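The `time host` check above can also be scripted for repeated measurements; a minimal Python sketch using the system resolver (the function name is mine; localhost appears in the usage example only because it needs no network):

```python
import socket
import time

def lookup_seconds(name):
    # Time a single A-record resolution via the system resolver.
    start = time.monotonic()
    socket.getaddrinfo(name, None, family=socket.AF_INET)
    return time.monotonic() - start
```

A healthy resolver or cache should answer in milliseconds; repeated calls taking tens of seconds point at the resolver rather than at Squid.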
On 4/9/2013 10:46 PM, Duncan, Brian M. wrote:
> Thanks for the reply and further clarification,
>
> I still believe the issue I am reporting is specific to DNS and how Squid's
> internal DNS resolver works.
>
> I
>Sorry if I wasn't clear.
>I will try to rephrase the logic.
>I went from the bottom up.
>curl + wget + simple ruby script = slow response.
>notice that this address is a redirection.
>I am unsure now about the DNS issue that I have seen this morning.
>The main problem is not the page but the SSL
>Probably DNS issues, not related to the squid version in any way.
>This is a known issue that is not related to squid but I'm happy you
>posted about this issue.
>Regards,
>Eliezer
I appreciate the reply, but I don't follow your logic, sorry. You think my
issue is probably DNS related? I tried
Testing 3.1 and 3.2.9 on CentOS 6.4 64-bit.
Found an issue that I do not know how to resolve, and any searches I made of
the mailing list archive just turned up people saying to disable caching on
domains, which has nothing to do with this. What I am trying to do below
works fine on m
FATAL: Bungled (null) line 8: icap_retry deny all
Squid Cache (Version 3.2.8): Terminated abnormally.
squid3 -v
Did you ./configure squid using --enable-icap-client?
Resolved
The request was never even making it to the proxy server.
The workstation was using a PAC file that returned "DIRECT" for:
hotmail.com
live.com
login.live.com
Fmail.live.com
Not sure why they were interfering, but removing them from the pac file
made the problem go away.
A Hotmail account converted to Hotmail's new "Outlook style" email,
which I believe uses Silverlight, consistently fails to allow file
attachments when going through the proxy.
Any ideas?
Squid Cache: Version 3.1.19
We've been using ESI to include dynamic content into otherwise static
pages, but one of our web content authors tried using it to include
some static HTML and ran into a bit of a problem when refreshing a
page in his browser. I think I've tracked it down and found a way to
fix it, but I'd like some
Hi Mike,
Mike Marchywka wrote:
Anyone know off hand how much squid can contribute to browsing speed on
various platforms due to DNS caching? I just setup a debian system and
notice while browsing I had very low BW at times. I suspected it
may have been doing lots of DNS lookups since there
were
Hello,
Trying to build squid on a Sun T5120 (SPARC based server).
The INSTALL file for squid 3.0 STABLE 19 says to do the ./configure,
followed by make.
The machine came with gcc 4.0.4 and a Sun C compiler (V5.9) pre-installed.
I carefully set the path so I could try them one at a time. Neithe
>Hi everyone,
>I'm looking for a way to change the time format that appears in
>access.log to make it friendlier. Where can I change the code to do it?
>Thanks for answering.
A shell script:
$ cat squidtime.sh
#!/bin/sh
perl -p -e 's/^([0-9]*)/"[".localtime($1)."]"/e'
This lets me do:
$ cat access.
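A Python equivalent of the Perl one-liner above, in case Perl is not to hand (the function name is my own):

```python
import re
import time

def humanize(line):
    # Replace a leading epoch timestamp (as at the start of each
    # access.log line) with a bracketed local-time string, like the
    # Perl one-liner above.
    return re.sub(r'^(\d+)',
                  lambda m: "[" + time.ctime(int(m.group(1))) + "]",
                  line,
                  count=1)
```

Pipe access.log through it the same way: each line's leading timestamp is rewritten in place, and the fractional part, if present, is left after the bracket just as with the Perl version.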
Workaround: the ESI requests also pass through Squid, so I added a rule
to strip out the Content-Length header:
acl get_requests method GET
acl esi_content urlpath_regex -i /esi/.*\.aspx
request_header_access Content-Length deny get_requests esi_content
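A quick way to sanity-check which requests those two ACLs would match together, sketched in Python (the regex is taken from the urlpath_regex line above; the helper name is mine):

```python
import re

# Case-insensitive, matching the "-i" flag on the urlpath_regex ACL.
ESI_CONTENT = re.compile(r'/esi/.*\.aspx', re.IGNORECASE)

def content_length_stripped(method, urlpath):
    # Both ACLs must match for the deny rule to fire: the request
    # method is GET and the URL path looks like an ESI include.
    return method == "GET" and bool(ESI_CONTENT.search(urlpath))
```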
Duncan
proxy_yes = "PROXY proxy.baladia.gov.kw:3128";
function FindProxyForURL(url, host)
{
// variable strings to return
if (
shExpMatch(url, "http://www.baladia.gov.kw*") ||
shExpMatch(url, "http://host.kmun.gov.kw*") ||
shExpMatch(url, "http://km_online*")) {
ret
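The function is cut off at "ret"; presumably it returns proxy_yes on a match and "DIRECT" otherwise. Under that assumption, the same matching logic sketched in Python (PAC's shExpMatch() is glob-style, so fnmatch is a close stand-in):

```python
from fnmatch import fnmatchcase

PROXY = "PROXY proxy.baladia.gov.kw:3128"
PATTERNS = (
    "http://www.baladia.gov.kw*",
    "http://host.kmun.gov.kw*",
    "http://km_online*",
)

def find_proxy_for_url(url):
    # Glob-match the URL the way PAC's shExpMatch() does; anything
    # that matches no pattern presumably falls through to DIRECT.
    if any(fnmatchcase(url, p) for p in PATTERNS):
        return PROXY
    return "DIRECT"
```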
Amos Jeffries wrote:
Duncan Booth wrote:
I'm trying to use Squid 3 with ESI enabled, and while it works fine for
a few pages as soon as we put it under load it just crashes. I've tried
a variety of squid 3.0 versions: from STABLE6 (which is the one we used
on our development system) up to 3.0.STABLE13-20090212 and they all
behave the
box
>From what I can see it looks like the in house proxy server (squid) is
>requesting a username and password as the 172.16.1.190 address is the IP
>address of that server.
Thanks
Duncan Peacock
Systems Administrator
Adare
Park Mill
Clayton West
Huddersfield
West Yorkshire
HD8 9QQ
T
fatal error
c:\squid\libexec\squid_ldap_auth No such file or directory.
I just can't work out the correct syntax to use?
Thanks
Duncan Peacock
ain Server
auth_param basic credentialsttl 5 hours
But I get the following error from the squid.exe txt file:
FATAL: auth_param basic program /squid/libexec/squid_ldap_auth: (2) No such
file or directory
Any idea what could be wrong?
Thanks
Duncan Peacock
In a transparent proxy environment where I have no control over the
user's browser configuration settings, how do I handle requests for
https:// web sites? http:// sites are served up just fine, but https://
sites fail.
OS FreeBSD 6.1-STABLE
Squid Cache: Version 2.5.STABLE14
configure options:
en <[EMAIL PROTECTED]> wrote:
Duncan McQueen wrote:
> Hello,
>
> I am trying to do direct whois queries over squid. I can do such
> queries as HTTP requests, but was curious as to whether it is possible
> to do the requests directly (to port 43)? If so, what would be the
> co
1 etc", how would the similar command look for a whois
query?
Thanks,
Duncan McQueen
to authenticate to
the parent. I can see from the documents how one would
control access based on IP address.
Is there a way to handle this with a username/password
pair, and furthermore, manage that u/p pair through an
external helper (so I could store the u/p in MySQL for
example)
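If the goal is to validate username/password pairs with an external program, Squid's basic auth helper protocol fits: the helper reads one "username password" pair per line on stdin and answers OK or ERR, so the lookup behind it can be anything, including MySQL. A minimal sketch (the in-memory dict is a hypothetical stand-in for the database query, and real deployments receive URL-encoded credentials, which this ignores):

```python
import sys

# Hypothetical credential store standing in for a MySQL lookup.
CREDENTIALS = {"alice": "secret"}

def check(line):
    # Squid's basic helper protocol: one "username password" pair per
    # line on stdin, answered with OK or ERR on stdout.
    parts = line.split(None, 1)
    if len(parts) != 2:
        return "ERR"
    user, password = parts
    return "OK" if CREDENTIALS.get(user) == password.strip() else "ERR"

if __name__ == "__main__":
    for line in sys.stdin:
        print(check(line.rstrip("\n")), flush=True)
```

The helper is then wired in with an auth_param basic program line pointing at the script.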
increase your bandwidth will be the final answer should you still want to use
that list.
- Original Message -
From: <[EMAIL PROTECTED]>
To: <[EMAIL PROTECTED]>
Sent: Thursday, August 26, 2004 4:24 PM
Subject: [squid-users] Blacklist file
Hi Everybody
I want to use urlblacklist.com's
You have to use your ACL lists in squid.conf. I guess all you need is some
time with the documentation, say 15 minutes, and you can do it.
- Original Message -
From: <[EMAIL PROTECTED]>
To: <[EMAIL PROTECTED]>
Sent: Wednesday, August 25, 2004 5:56 PM
Subject: [squid-users] how to block Yahoo messen
mote users to one platform/ OS (One browser if it's
cross platform isn't too bad)
Cheers,
Duncan