Thank you, I saw the problem.
So now I have to deal with the Cache-Control: private header sent by IIS 7.5.
I don't know why IIS 7.5 always returns private; Google shows some reported bugs about this.
Thank you again, Mr Jeffries.
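For reference, Squid's refresh_pattern directive has an ignore-private option that can force caching of such responses. A hypothetical squid.conf sketch (the pattern is an assumption, and overriding Cache-Control violates HTTP caching rules, so test carefully before production use):

```
# Hypothetical sketch: cache responses that IIS 7.5 marks
# Cache-Control: private. This deliberately violates HTTP caching
# semantics -- use only for content you know is safe to share.
refresh_pattern -i \.aspx$  30 20% 4320 ignore-private override-expire
refresh_pattern .            0 20% 4320
```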
On Tue, Nov 26, 2013 at 2:14 PM, Amos Jeffries squ...@treenet.co.nz wrote:
Hi Kinkie,
yes, I made a capture but couldn't see the cause.
I'm sending you my traces.
Kind regards.
Marc
-----Original Message-----
From: Kinkie [mailto:gkin...@gmail.com]
Sent: Monday, 25 November 2013 15:45
To: Grooz, Marc (regio iT)
Cc: squid-users@squid-cache.org
Subject: Re:
Hi All,
I am doing a small test for bandwidth measurement of my test setup
while squid is running. I am running a script to pump the traffic from
client browser to Web-server via Squid box. The script creates
around 50 user sessions and tries to fetch, with wget, randomly selected
dynamic URLs.
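A minimal sketch of such a load script, assuming bash and wget; the proxy address and URL list are placeholders, not values from this thread:

```shell
#!/usr/bin/env bash
# Sketch of the load test described above: 50 parallel "user sessions",
# each fetching randomly chosen dynamic URLs through the Squid box.
# PROXY and URLS are assumptions -- substitute your own setup.
PROXY="http://squidbox:3128"
URLS=(
  "http://webserver/cgi-bin/a.cgi?id=1"
  "http://webserver/cgi-bin/b.cgi?id=2"
)

pick_url() {                  # print one URL chosen at random
  echo "${URLS[RANDOM % ${#URLS[@]}]}"
}

run_session() {               # one user session: a burst of fetches
  for _ in $(seq 1 10); do
    wget -q -O /dev/null -e use_proxy=yes -e http_proxy="$PROXY" "$(pick_url)"
  done
}

main() {
  for _ in $(seq 1 50); do
    run_session &             # launch sessions in parallel
  done
  wait                        # let all sessions finish
}

# Only generate traffic when explicitly asked, so the functions can be
# sourced or inspected safely:
if [ "${1:-}" = "run" ]; then
  main
fi
```

Invoked as `./loadtest.sh run`; bandwidth can then be read off the Squid box's interface counters while it runs.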
On Tuesday 26 November 2013 at 11:37, SaRaVanAn wrote:
Hi All,
I am doing a small test for bandwidth measurement of my test setup
while squid is running. I am running a script to pump the traffic from
client browser to Web-server via Squid box.
Er, do you really mean you are sending data
In my first case:
Squid request:
-MGET
/cgi-bin/upload_status.cgi?uid=060950223627files=:iso-27001-router-security-audit-checklist.xlsok=1
HTTP/1.1
Accept: text/html, application/xhtml+xml, */*
Referer: http://xyz/
Accept-Language: de-DE
User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64;
Hi,
CentOS / RHEL 6.4 runs natively on the Hyper-V platform. Just keep in
mind that I've never done an install with a desktop manager running, as
I generally just use the console / SSH. I manage several web
filtering servers based on squid running that distro (usually squid
3.3.9/10 on CentOS
On 27/11/2013 1:00 a.m., Grooz, Marc (regio iT) wrote:
In my first case:
Squid request:
-MGET
/cgi-bin/upload_status.cgi?uid=060950223627files=:iso-27001-router-security-audit-checklist.xlsok=1
HTTP/1.1
Accept: text/html, application/xhtml+xml, */*
Webserver answer:
[-MHTTP/1.1 200
On Tue, Nov 26, 2013 at 5:16 PM, Antony Stone
antony.st...@squid.open.source.it wrote:
On Tuesday 26 November 2013 at 11:37, SaRaVanAn wrote:
Hi All,
I am doing a small test for bandwidth measurement of my test setup
while squid is running. I am running a script to pump the traffic from
I've got it. I changed the forwarded_for option from off to delete and now both
websites get displayed through squid.
Kind regards
Marc
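For anyone hitting the same issue, the change described above amounts to one squid.conf line (forwarded_for delete strips any X-Forwarded-For header entirely, rather than sending "unknown" as off does):

```
# Strip the X-Forwarded-For header instead of sending "unknown",
# which some origin servers reject.
forwarded_for delete
```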
-----Original Message-----
From: Amos Jeffries [mailto:squ...@treenet.co.nz]
Sent: Tuesday, 26 November 2013 13:11
To: squid-users@squid-cache.org
Hi,
I found http://wiki.squid-cache.org/Features/HTTP2 and I wonder whether it
reflects the current state: is SPDY planned for squid 3.5, or is it already
implemented in the current version?
--
Regards
Dieter
--
I do not get viruses because I do not use MS software.
If you use Outlook then please
Hi,
as I understand from several messages on the squid-dev mailing list,
SPDY is not going to be supported.
The first HTTP/2.0-related code is being debated and worked on in these weeks.
If you are interested, you may want to join the squid-dev mailing
list. Contributions are always welcome :)
On 2013-11-27 04:20, Dieter Bloms wrote:
Hi,
I found http://wiki.squid-cache.org/Features/HTTP2 and I wonder whether it
reflects the current state: is SPDY planned for squid 3.5, or is it already
implemented in the current version?
SPDY is not planned at all. Unless the SPDY people re-write their
On Tue, Nov 26, 2013 at 5:30 AM, Amos Jeffries squ...@treenet.co.nz wrote:
On 26/11/2013 10:13 a.m., Ghassan Gharabli wrote:
Hi,
I have built a PHP script to cache HTTP 1.x 206 Partial Content (like
Windows Updates) and to allow seeking through YouTube and many other websites.
Ah. So you have written your
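The 206 Partial Content mechanics being discussed can be sketched in a few lines. This is an illustrative Python function (not the poster's PHP script) that slices a cached body according to a single bytes= Range header, per RFC 7233:

```python
import re

def serve_range(body, range_header):
    """Return (status, headers, payload) for a GET of `body`,
    honouring a single 'bytes=start-end' Range header (RFC 7233)."""
    if not range_header:
        return 200, {"Content-Length": str(len(body))}, body
    m = re.fullmatch(r"bytes=(\d*)-(\d*)", range_header.strip())
    if not m or m.group(1) == m.group(2) == "":
        # Malformed Range: fall back to a full 200 response.
        return 200, {"Content-Length": str(len(body))}, body
    start_s, end_s = m.groups()
    if start_s == "":                       # suffix range: last N bytes
        start = max(len(body) - int(end_s), 0)
        end = len(body) - 1
    else:
        start = int(start_s)
        end = int(end_s) if end_s else len(body) - 1
        end = min(end, len(body) - 1)       # clamp to the body size
    if start > end or start >= len(body):
        # Unsatisfiable range: report the full size back to the client.
        return 416, {"Content-Range": "bytes */%d" % len(body)}, b""
    chunk = body[start:end + 1]
    headers = {
        "Content-Range": "bytes %d-%d/%d" % (start, end, len(body)),
        "Content-Length": str(len(chunk)),
    }
    return 206, headers, chunk
```

A cache that answers ranges itself this way can satisfy seek requests (YouTube, Windows Update) from a single stored full object instead of storing each range separately.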
Hey Ghassan,
Moving from PHP to C++ is a nice idea.
I do not know the size of the cache or its limits, but here are a couple of things to
consider while implementing the cache:
* client latency
* server overload
* total cost
* cache efficiency
Bandwidth can cost lots of money in some cases and
Hi,
I want to use Squid as a reverse proxy (accel) for my main website, but
only for users who have authenticated - something like a captive portal (not
sure if that's the right phrase). By authenticated, I don't mean
Basic or Digest auth etc. I want to provide my own logon page (say, PHP) - I
can host another
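One common way to approximate this pattern is Squid's session helper plus deny_info. A hypothetical sketch only (hostnames, ports, paths and the login URL are all assumptions, and the login page itself would need to register clients with the helper's session database):

```
# Hypothetical sketch: accel setup that bounces unauthenticated
# visitors to a custom login page. Untested; adjust names and paths.
http_port 80 accel defaultsite=www.example.com
cache_peer 127.0.0.1 parent 8080 0 no-query originserver name=mainsite

# Treat clients with an active session as "logged in".
external_acl_type session ttl=300 %SRC /usr/lib/squid/ext_session_acl -t 3600
acl logged_in external session

# Anyone without a session is redirected to the logon page.
deny_info http://login.example.com/login.php logged_in
http_access deny !logged_in
http_access allow all
```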