[squid-users] R: Re: [squid-users] to test multiple http access

2014-05-28 Thread Riccardo Castellani
For example, I'd like to simulate, from my Windows client, about 20-30
simultaneous HTTP requests to different sites, to understand Squid's behaviour.
Can I do that with these tools?


>Original Message
>From: squ...@treenet.co.nz
>Date: 29-May-2014 5.16
>To: 
>Subject: Re: [squid-users] to test multiple http access
>
>On 29/05/2014 4:05 a.m., Riccardo Castellani wrote:
>> I'd like to simulate several HTTP access requests before putting my new
>> squid server into production. Do you know a way to test it before
>> connecting the server to the network?
>>
>
>* Connect your browser to it.
>* squidclient
>* ApacheBench
>* Web Polygraph
>* wget
>* curl
>* custom script in your favourite language
>
>Amos
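A rough sketch of how that could be driven from a command line, assuming the
proxy listens on 10.0.0.1:3128 and using placeholder URLs: ApacheBench for
repeated concurrent requests to one site, plus a small curl loop for a mix of
sites.

  ab -X 10.0.0.1:3128 -n 100 -c 25 http://www.example.com/

  for u in http://example.com/ http://example.org/ http://example.net/; do
    curl -s -o /dev/null -w "%{http_code} %{time_total}s $u\n" \
         -x http://10.0.0.1:3128 "$u" &
  done
  wait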




Re: [squid-users] to test multiple http access

2014-05-28 Thread Amos Jeffries
On 29/05/2014 4:05 a.m., Riccardo Castellani wrote:
> I'd like to simulate several HTTP access requests before putting my new
> squid server into production. Do you know a way to test it before
> connecting the server to the network?
> 

* Connect your browser to it.
* squidclient
* ApacheBench
* Web Polygraph
* wget
* curl
* custom script in your favourite language

Amos
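For a single quick request through the proxy, the bundled squidclient tool can
also be used; a minimal sketch, with placeholder proxy host and URL:

  squidclient -h 10.0.0.1 -p 3128 http://www.example.com/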



[squid-users] to test multiple http access

2014-05-28 Thread Riccardo Castellani
I'd like to simulate several HTTP access requests before putting my new squid
server into production. Do you know a way to test it before connecting the
server to the network?


[squid-users] Test 2. Please disregard

2013-01-10 Thread Amos Jeffries

This is a test message verifying report of squid-announce issues.


[squid-users] Test 1. Please disregard.

2013-01-10 Thread Amos Jeffries

This is a test message verifying report of squid-announce issues.


Re: [squid-users] Please help test a streaming problem through squid

2012-11-08 Thread Eliezer Croitoru

On 11/8/2012 2:53 PM, Peter Olsson wrote:

Thanks, I will set this up later.
I assume that the outside of the proxy
is the most interesting to capture?

From the client to the proxy/router.

Regards,
Eliezer

--
Eliezer Croitoru
https://www1.ngtech.co.il
IT consulting for Nonprofit organizations
eliezer  ngtech.co.il
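A capture of that client-to-proxy leg could be taken roughly like this
(interface name, client address and proxy port are only placeholders here):

  tcpdump -i eth0 -s 0 -w client-to-proxy.pcap 'host 192.168.1.50 and port 3128'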


Re: [squid-users] Please help test a streaming problem through squid

2012-11-08 Thread Peter Olsson
On Thu, Nov 08, 2012 at 02:44:17PM +0200, Eliezer Croitoru wrote:
> On 11/8/2012 2:13 PM, Peter Olsson wrote:
> > But why is the commercial part working in the same plugin?
> > Flash web clips work fine behind proxy for these commercials,
> > and also for complete video clips on other webs.
> Sorry, I can't answer you yet since I don't have data on it.
> If you are willing to capture the sessions with Wireshark or tcpdump
> (filtered), I will be happy to take a small look at it.
> 
> Regards,
> Eliezer

Thanks, I will set this up later.
I assume that the outside of the proxy
is the most interesting to capture?

Peter Olsson    p...@leissner.se


Re: [squid-users] Please help test a streaming problem through squid

2012-11-08 Thread Eliezer Croitoru

On 11/8/2012 2:13 PM, Peter Olsson wrote:

But why is the commercial part working in the same plugin?
Flash web clips work fine behind proxy for these commercials,
and also for complete video clips on other webs.

Sorry, I can't answer you yet since I don't have data on it.
If you are willing to capture the sessions with Wireshark or tcpdump
(filtered), I will be happy to take a small look at it.


Regards,
Eliezer

--
Eliezer Croitoru
https://www1.ngtech.co.il
IT consulting for Nonprofit organizations
eliezer  ngtech.co.il


Re: [squid-users] Please help test a streaming problem through squid

2012-11-08 Thread Peter Olsson
On Thu, Nov 08, 2012 at 02:06:46PM +0200, Eliezer Croitoru wrote:
> On 11/8/2012 1:40 PM, Peter Olsson wrote:
> > I'm sorry, I don't know much about streaming,
> > but their requirement for playing the streams
> > is Adobe Flash. And the commercial part which
> > works is played in the same plugin player as
> > the main part which doesn't work.
> >
> > Maybe I'm using the word stream in the wrong way?
> > Maybe video clip is a better phrase in this case?
> >
> > When I right click in the Flash plugin that is
> > failing to play the video clip, it says
> > "Qbrick Professional: 3.8.1.211" and
> > "OSMF Version: 1.0", if that has any relevance.
> >
> > The Flash plugin in my web browser is version
> > 11.3.300.270.
> >
> > These video clips are free to view so if you have
> > a proxy squid available you could try them.
> > I don't think they are limited to Swedish clients,
> > at least I don't see anything about limitations on
> > their web.
> >
> > http://www.dn.se/webbtv/
> >
> > Thanks!
> It's most likely RTMP.
> If you can only access the internet using the proxy, you do have a problem,
> since Flash doesn't really support proxy settings.
> There might be a way to create an RTMP proxy that uses CONNECT for RTMP, but
> it's a very big thing.
> 
> Regards,
> Eliezer

But why is the commercial part working in the same plugin?
Flash web clips work fine behind proxy for these commercials,
and also for complete video clips on other webs.

-- 
Peter Olsson    p...@leissner.se


Re: [squid-users] Please help test a streaming problem through squid

2012-11-08 Thread Eliezer Croitoru

On 11/8/2012 1:40 PM, Peter Olsson wrote:

I'm sorry, I don't know much about streaming,
but their requirement for playing the streams
is Adobe Flash. And the commercial part which
works is played in the same plugin player as
the main part which doesn't work.

Maybe I'm using the word stream in the wrong way?
Maybe video clip is a better phrase in this case?

When I right click in the Flash plugin that is
failing to play the video clip, it says
"Qbrick Professional: 3.8.1.211" and
"OSMF Version: 1.0", if that has any relevance.

The Flash plugin in my web browser is version
11.3.300.270.

These video clips are free to view so if you have
a proxy squid available you could try them.
I don't think they are limited to Swedish clients,
at least I don't see anything about limitations on
their web.

http://www.dn.se/webbtv/

Thanks!

It's most likely RTMP.
If you can only access the internet using the proxy, you do have a problem,
since Flash doesn't really support proxy settings.
There might be a way to create an RTMP proxy that uses CONNECT for RTMP, but
it's a very big thing.


Regards,
Eliezer


--
Eliezer Croitoru
https://www1.ngtech.co.il
IT consulting for Nonprofit organizations
eliezer  ngtech.co.il


Re: [squid-users] Please help test a streaming problem through squid

2012-11-08 Thread Peter Olsson
On Thu, Nov 08, 2012 at 05:14:41PM +1300, Amos Jeffries wrote:
> On 8/11/2012 4:33 p.m., Peter Olsson wrote:
> > Hello!
> >
> > We run squid as a caching proxy. No transparency
> > or intercept in any form, and the only way out is
> > through the squid proxy server. Web browsers use
> > either wpad or hardcoded proxy configuration.
> >
> > Streams from www.dn.se/webbtv don't work. The commercial
> > part first in every stream works fine, but when it's
> > time to switch to the main stream it just stops and the
> > screen goes black.
> >
> > Our production squid runs 3.1.21, and in a lab server
> > I have tried with squid 2.7.STABLE9, 3.2.3 and
> > 3.3.0.1-20121107-r12377. Same problem in all versions.
> > The configuration in the lab server squids have been
> > exactly as they were default installed, except that I
> > enabled cache_dir ufs in all of them.
> >
> > Any ideas about this?
> 
> Can you define "stream" in terms of the actual protocol taking place?
> 
> There are many types of protocol involved with streaming. Squid only 
> supports the HTTP and ICY streaming protocols. RTSP, RTMP, VoIP, VoD, 
> SPDY and WebSockets streaming are not specifically supported by Squid 
> (may require CONNECT tunnel access, but that is as close as it gets).
> 
> and what agent is being used as a client?
> 
> Sadly, not all applets or embedded clients are capable of using an HTTP proxy.
> 
> 
> Amos

I'm sorry, I don't know much about streaming,
but their requirement for playing the streams
is Adobe Flash. And the commercial part which
works is played in the same plugin player as
the main part which doesn't work.

Maybe I'm using the word stream in the wrong way?
Maybe video clip is a better phrase in this case?

When I right click in the Flash plugin that is
failing to play the video clip, it says
"Qbrick Professional: 3.8.1.211" and
"OSMF Version: 1.0", if that has any relevance.

The Flash plugin in my web browser is version
11.3.300.270.

These video clips are free to view so if you have
a proxy squid available you could try them.
I don't think they are limited to Swedish clients,
at least I don't see anything about limitations on
their web.

http://www.dn.se/webbtv/

Thanks!

-- 
Peter Olsson    p...@leissner.se


Re: [squid-users] Please help test a streaming problem through squid

2012-11-07 Thread Amos Jeffries

On 8/11/2012 4:33 p.m., Peter Olsson wrote:

Hello!

We run squid as a caching proxy. No transparency
or intercept in any form, and the only way out is
through the squid proxy server. Web browsers use
either wpad or hardcoded proxy configuration.

Streams from www.dn.se/webbtv don't work. The commercial
part first in every stream works fine, but when it's
time to switch to the main stream it just stops and the
screen goes black.

Our production squid runs 3.1.21, and in a lab server
I have tried with squid 2.7.STABLE9, 3.2.3 and
3.3.0.1-20121107-r12377. Same problem in all versions.
The configuration in the lab server squids have been
exactly as they were default installed, except that I
enabled cache_dir ufs in all of them.

Any ideas about this?


Can you define "stream" in terms of the actual protocol taking place?

There are many types of protocol involved with streaming. Squid only 
supports the HTTP and ICY streaming protocols. RTSP, RTMP, VoIP, VoD, 
SPDY and WebSockets streaming are not specifically supported by Squid 
(may require CONNECT tunnel access, but that is as close as it gets).



and what agent is being used as a client?

Sadly, not all applets or embedded clients are capable of using an HTTP proxy.


Amos
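One rough way to see what the Flash client actually attempts through the proxy
is to watch the access log for CONNECT requests while reproducing the problem
(the log path below is the common default and may differ per install):

  tail -f /var/log/squid/access.log | grep -w CONNECT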


[squid-users] Please help test a streaming problem through squid

2012-11-07 Thread Peter Olsson
Hello!

We run squid as a caching proxy. No transparency
or intercept in any form, and the only way out is
through the squid proxy server. Web browsers use
either wpad or hardcoded proxy configuration.

Streams from www.dn.se/webbtv don't work. The commercial
part first in every stream works fine, but when it's
time to switch to the main stream it just stops and the
screen goes black.

Our production squid runs 3.1.21, and in a lab server
I have tried with squid 2.7.STABLE9, 3.2.3 and
3.3.0.1-20121107-r12377. Same problem in all versions.
The configuration in the lab server squids have been
exactly as they were default installed, except that I
enabled cache_dir ufs in all of them.

Any ideas about this?
(I will ask their support what the difference is in
the streaming methods of the commercial part and the
main part.)

Thanks!

-- 
Peter Olsson    p...@leissner.se


Re: [squid-users] Error to test connectivity to internal MS Exchange server

2012-05-27 Thread Amos Jeffries

On 23/05/2012 10:08 a.m., Ruiyuan Jiang wrote:

Hi, all

I am trying to setup MS webmail over rpc Exchange server access through squid 
(squid 3.1.19, SPARC, Solaris 10) from internet. Here is my pilot squid 
configuration (squid.conf):

https_port 156.146.2.196:443 accel 
cert=/opt/squid-3.1.19/ssl.crt/webmail_juicycouture_com.crt 
key=/opt/squid-3.1.19/ssl.crt/webmail_juicycouture_com.key 
cafile=/opt/apache2.2.21/conf/ssl.crt/DigiCertCA.crt 
defaultsite=webmail.juicycouture.com

cache_peer 10.150.2.15 parent 443 0 no-query originserver login=PASS ssl 
sslcert=/opt/squid-3.1.19/ssl.crt/webmail_katespade_com.crt 
sslkey=/opt/squid-3.1.19/ssl.crt/webmail_katespade_com.key 
sslcafile=/opt/apache2.2.21/conf/ssl.crt/DigiCertCA.crt name=exchangeServer



2012/05/22 17:44:15| fwdNegotiateSSL: Error negotiating SSL connection on FD 
13: error:14090086:SSL routines:SSL3_GET_SERVER_CERTIFICATE:certificate verify 
failed (1/-1/0)
2012/05/22 17:44:15| TCP connection to 10.150.2.15/443 failed
2012/05/22 17:44:15| fwdNegotiateSSL: Error negotiating SSL connection on FD 
13: error:14090086:SSL routines:SSL3_GET_SERVER_CERTIFICATE:certificate verify 
failed (1/-1/0)

From the packet capture, the internal Exchange server resets the connection
from the squid proxy server, after the initial HTTPS handshake between squid
and the Exchange server, with either "Alert (Level: Fatal, Description:
Unknown CA)" when I used the official certificates above, or "Alert (Level:
Fatal, Description: Certificate Unknown)" when I used an internal CA-signed
certificate. Can anyone tell me how to correctly configure the cache_peer
statement to make it work?


In case you did not figure this out already... Squid is unable to
validate the Exchange server's certificate, using either the OpenSSL
library's trusted CA certificates or the certificate given via the
sslcafile= parameter.


* Check that your OpenSSL library's trusted CA certificates are up to date on
the Squid machine - this is the most common cause of validation errors.


* Check that your /opt/apache2.2.21/conf/ssl.crt/DigiCertCA.crt file on
the Squid machine contains the CA used to sign the Exchange server's
certificate.


Amos
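One possible way to check the second point from the Squid box itself, reusing
the peer address and CA file from the configuration above, is an openssl test
connection:

  openssl s_client -connect 10.150.2.15:443 \
      -CAfile /opt/apache2.2.21/conf/ssl.crt/DigiCertCA.crt </dev/null | grep 'Verify return code'

A return code of 0 (ok) suggests the CA file would also satisfy Squid's
verification; anything else points at a missing or wrong CA certificate.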


[squid-users] Error to test connectivity to internal MS Exchange server

2012-05-22 Thread Ruiyuan Jiang
Hi, all

I am trying to setup MS webmail over rpc Exchange server access through squid 
(squid 3.1.19, SPARC, Solaris 10) from internet. Here is my pilot squid 
configuration (squid.conf):

https_port 156.146.2.196:443 accel 
cert=/opt/squid-3.1.19/ssl.crt/webmail_juicycouture_com.crt 
key=/opt/squid-3.1.19/ssl.crt/webmail_juicycouture_com.key 
cafile=/opt/apache2.2.21/conf/ssl.crt/DigiCertCA.crt 
defaultsite=webmail.juicycouture.com

cache_peer 10.150.2.15 parent 443 0 no-query originserver login=PASS ssl 
sslcert=/opt/squid-3.1.19/ssl.crt/webmail_katespade_com.crt 
sslkey=/opt/squid-3.1.19/ssl.crt/webmail_katespade_com.key 
sslcafile=/opt/apache2.2.21/conf/ssl.crt/DigiCertCA.crt name=exchangeServer

cache_peer_access exchangeServer allow all

http_access allow all

miss_access allow all

From the access log of squid:

1337723055.845  7 207.46.14.63 TCP_MISS/503 3905 RPC_IN_DATA 
https://webmail.juicycouture.com/rpc/rpcproxy.dll - 
FIRST_UP_PARENT/exchangeServer text/html
1337723055.934  5 207.46.14.63 TCP_MISS/503 3932 RPC_IN_DATA 
https://webmail.juicycouture.com/rpc/rpcproxy.dll - 
FIRST_UP_PARENT/exchangeServer text/html


From the cache.log of the squid:

2012/05/22 17:33:28| Starting Squid Cache version 3.1.19 for 
sparc-sun-solaris2.10...
2012/05/22 17:33:28| Process ID 7071
2012/05/22 17:33:28| With 256 file descriptors available
2012/05/22 17:33:28| Initializing IP Cache...
2012/05/22 17:33:28| DNS Socket created at [::], FD 8
2012/05/22 17:33:28| DNS Socket created at 0.0.0.0, FD 9
2012/05/22 17:33:28| Adding domain fifthandpacific.com from /etc/resolv.conf
2012/05/22 17:33:28| Adding nameserver 12.127.17.71 from /etc/resolv.conf
2012/05/22 17:33:28| Adding nameserver 12.127.16.67 from /etc/resolv.conf
2012/05/22 17:33:28| Adding nameserver 156.146.2.190 from /etc/resolv.conf
2012/05/22 17:33:28| Unlinkd pipe opened on FD 14
2012/05/22 17:33:28| Store logging disabled
2012/05/22 17:33:28| Swap maxSize 0 + 262144 KB, estimated 20164 objects
2012/05/22 17:33:28| Target number of buckets: 1008
2012/05/22 17:33:28| Using 8192 Store buckets
2012/05/22 17:33:28| Max Mem  size: 262144 KB
2012/05/22 17:33:28| Max Swap size: 0 KB
2012/05/22 17:33:28| Using Least Load store dir selection
2012/05/22 17:33:28| Current Directory is /opt/squid-3.1.19/var/logs
2012/05/22 17:33:28| Loaded Icons.
2012/05/22 17:33:28| Accepting HTTPS connections at 156.146.2.196:443, FD 15.
2012/05/22 17:33:28| HTCP Disabled.
2012/05/22 17:33:28| Configuring Parent 10.150.2.15/443/0
2012/05/22 17:33:28| Squid plugin modules loaded: 0
2012/05/22 17:33:28| Ready to serve requests.
2012/05/22 17:33:29| storeLateRelease: released 0 objects
-BEGIN SSL SESSION PARAMETERS-
MIGNAgEBAgIDAQQCAC8EIAj2TdmdLmNKL8/+V0D37suIYsli5OZLvCZu6u1+voNA
BDAy5uGQ23i/G+ozoVu/RDjm8yMq3zAJAWiXKz+U537Fd5uMDJeCmo30/cy9WPeF
6fmhBgIET7wIr6IEAgIBLKQCBACmGgQYd2VibWFpbC5qdWljeWNvdXR1cmUuY29t
-END SSL SESSION PARAMETERS-
-BEGIN SSL SESSION PARAMETERS-
MIGNAgEBAgIDAQQCAC8EILcgJcTbarlfw3jpifpmpBZQpBYheYouh2NZp9eoPJUy
BDBs6l+2LMOMI4D/RPQG3mOYbZ7OBcpanTJFaa8zCBV4s6AxtTpIFL2LnxRoJ0uB
I/WhBgIET7wIr6IEAgIBLKQCBACmGgQYd2VibWFpbC5qdWljeWNvdXR1cmUuY29t
-END SSL SESSION PARAMETERS-
2012/05/22 17:44:15| fwdNegotiateSSL: Error negotiating SSL connection on FD 
13: error:14090086:SSL routines:SSL3_GET_SERVER_CERTIFICATE:certificate verify 
failed (1/-1/0)
2012/05/22 17:44:15| TCP connection to 10.150.2.15/443 failed
2012/05/22 17:44:15| fwdNegotiateSSL: Error negotiating SSL connection on FD 
13: error:14090086:SSL routines:SSL3_GET_SERVER_CERTIFICATE:certificate verify 
failed (1/-1/0)

From the packet capture, the internal Exchange server resets the connection
from the squid proxy server, after the initial HTTPS handshake between squid
and the Exchange server, with either "Alert (Level: Fatal, Description:
Unknown CA)" when I used the official certificates above, or "Alert (Level:
Fatal, Description: Certificate Unknown)" when I used an internal CA-signed
certificate. Can anyone tell me how to correctly configure the cache_peer
statement to make it work?


Thanks in advance.

Ryan Jiang






[squid-users] Re: Unable to test HTTP PUT-based file upload via Squid Proxy

2012-05-16 Thread Harry

Amos Jeffries-2 wrote
> 
> On 16.05.2012 00:39, Harry Simons wrote:
> 
>>
>> **Request:**
>>
>> PUT http://WEB-SERVER/upload/sample.put HTTP/1.1
>> User-Agent: curl/7.15.5 (i686-redhat-linux-gnu) libcurl/7.15.5
>> OpenSSL/0.9.8b zlib/1.2.3 libidn/0.6.5
>> Host: WEB-SERVER
>> Pragma: no-cache
>> Accept: */*
>> Proxy-Connection: Keep-Alive
>> Transfer-Encoding: chunked
>> Expect: 100-continue
>>
>> **Response:**
>>
>> HTTP/1.0 501 Not Implemented
>> Server: squid/2.6.STABLE21
>> Date: Sun, 13 May 2012 02:11:39 GMT
>> Content-Type: text/html
>> Content-Length: 1078
>> Expires: Sun, 13 May 2012 02:11:39 GMT
>> X-Squid-Error: ERR_UNSUP_REQ 0
>> X-Cache: MISS from SQUID-PROXY-FQDN
>> X-Cache-Lookup: NONE from SQUID-PROXY-FQDN:3128
>> Via: 1.0 SQUID-PROXY-FQDN:3128 (squid/2.6.STABLE21)
>> Proxy-Connection: close
>>
> 
> 
> Curl is attempting to use HTTP/1.1 features which 2.6 does not support 
> (Expect:100-continue, Transfer-Encoding:chunked), and is too old to even 
> have proper workarounds for broken clients. Your request won't work due 
> to these even if PUT was okay.
> 
> Please upgrade. squid-2.7/3.1 are still HTTP/1.0 but have some hacks to 
> workaround the HTTP/1.1 features curl is asking for. Squid-3.2 (beta) 
> has HTTP/1.1 support.
> 
> Amos
> 

I have not upgraded Squid (yet). But, now, when I try to issue a simple,
manually constructed PUT request via socat (shown below), there's no output
from socat.

1. First, I tested my PUT request (and WEB-SERVER's ability to accept it) by
issuing it straight to the WEB-SERVER, like so:

$ cat put.req    # The PUT request to send a 3-character file containing 'xyz'
PUT http://WEB-SERVER/upload/sample.put HTTP/1.0
Host: WEB-SERVER
Content-Type: text/plain
Content-Length: 3

xyz
$ cat put.req | socat - TCP:WEB-SERVER:80
HTTP/1.1 201 Created
Date: Thu, 17 May 2012 05:34:03 GMT
Server: Apache/2.2.22 (Fedora)
Location: http://WEB-SERVER/upload/sample.put
Content-Length: 263
Connection: close
Content-Type: text/html; charset=ISO-8859-1



201 Created


Created

Resource /upload/sample.put has been created.

Apache/2.2.22 (Fedora) Server at WEB-SERVER Port 80

$ 

2. Then, I verified the file and its contents on the WEB-SERVER: the file
did in fact get put successfully.

$ cat /var/www/html/upload/sample.put 
xyz$
$

3. Then, I issued the same PUT but this time routing it via the Squid proxy,
like so:

$ cat put.req | socat - TCP:SQUID-PROXY:3128
$ 

As you can see, there was no output this time.

Would appreciate if you or someone could tell me what I'm doing wrong here.

Regards,
/HS




[squid-users] Re: Unable to test HTTP PUT-based file upload via Squid Proxy

2012-05-16 Thread Harry
Thanks, Amos.



Re: [squid-users] Unable to test HTTP PUT-based file upload via Squid Proxy

2012-05-15 Thread Amos Jeffries

On 16.05.2012 00:39, Harry Simons wrote:



**Request:**

PUT http://WEB-SERVER/upload/sample.put HTTP/1.1
User-Agent: curl/7.15.5 (i686-redhat-linux-gnu) libcurl/7.15.5
OpenSSL/0.9.8b zlib/1.2.3 libidn/0.6.5
Host: WEB-SERVER
Pragma: no-cache
Accept: */*
Proxy-Connection: Keep-Alive
Transfer-Encoding: chunked
Expect: 100-continue

**Response:**

HTTP/1.0 501 Not Implemented
Server: squid/2.6.STABLE21
Date: Sun, 13 May 2012 02:11:39 GMT
Content-Type: text/html
Content-Length: 1078
Expires: Sun, 13 May 2012 02:11:39 GMT
X-Squid-Error: ERR_UNSUP_REQ 0
X-Cache: MISS from SQUID-PROXY-FQDN
X-Cache-Lookup: NONE from SQUID-PROXY-FQDN:3128
Via: 1.0 SQUID-PROXY-FQDN:3128 (squid/2.6.STABLE21)
Proxy-Connection: close




Curl is attempting to use HTTP/1.1 features which 2.6 does not support 
(Expect:100-continue, Transfer-Encoding:chunked), and is too old to even 
have proper workarounds for broken clients. Your request won't work due 
to these even if PUT was okay.


Please upgrade. squid-2.7/3.1 are still HTTP/1.0 but have some hacks to 
workaround the HTTP/1.1 features curl is asking for. Squid-3.2 (beta) 
has HTTP/1.1 support.


Amos
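If upgrading is not immediately possible, one untested workaround sketch
(using the placeholder host names from this thread) is to let curl make a
plain HTTP/1.0 upload from a regular file, so it sends a Content-Length
instead of chunked encoding and no Expect: 100-continue header:

  echo "[$(date)] file contents." > sample.put
  curl -0 -H 'Expect:' -x http://SQUID-PROXY:3128 -T sample.put \
       http://WEB-SERVER/upload/sample.put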



[squid-users] Unable to test HTTP PUT-based file upload via Squid Proxy

2012-05-15 Thread Harry Simons
Hello,

I can upload a file to my Apache web server using Curl just fine:

echo "[$(date)] file contents." | curl -T -
http://WEB-SERVER/upload/sample.put

However, if I put a Squid proxy server in between, then I am not able to:

echo "[$(date)] file contents." | curl -x http://SQUID-PROXY:3128 -T -
http://WEB-SERVER/upload/sample.put

Curl reports the following error:

*Note: The following error response was in HTML format, but I've removed
the tags for ease of reading.*

ERROR: The requested URL could not be retrieved

ERROR
The requested URL could not be retrieved

While trying to retrieve the URL:
http://WEB-SERVER/upload/sample.put

The following error was encountered:
Unsupported Request Method and Protocol

Squid does not support all request methods for all access protocols.
For example, you can not POST a Gopher request.
Your cache administrator is root.

My `squid.conf` doesn't seem to be having any ACL/rule that should disallow
based on the `src` or `dst` IP addresses, or the `protocol`, or the HTTP
`method`... **as I can do an `HTTP POST` just fine between the same client
and the web server, with the same proxy sitting in between.**

In case of the failing `HTTP PUT` case, to see the request and response
traffic that was actually occurring, I placed a `netcat` process in between
Curl and Squid, and this is what I saw:

**Request:**

PUT http://WEB-SERVER/upload/sample.put HTTP/1.1
User-Agent: curl/7.15.5 (i686-redhat-linux-gnu) libcurl/7.15.5
OpenSSL/0.9.8b zlib/1.2.3 libidn/0.6.5
Host: WEB-SERVER
Pragma: no-cache
Accept: */*
Proxy-Connection: Keep-Alive
Transfer-Encoding: chunked
Expect: 100-continue

**Response:**

HTTP/1.0 501 Not Implemented
Server: squid/2.6.STABLE21
Date: Sun, 13 May 2012 02:11:39 GMT
Content-Type: text/html
Content-Length: 1078
Expires: Sun, 13 May 2012 02:11:39 GMT
X-Squid-Error: ERR_UNSUP_REQ 0
X-Cache: MISS from SQUID-PROXY-FQDN
X-Cache-Lookup: NONE from SQUID-PROXY-FQDN:3128
Via: 1.0 SQUID-PROXY-FQDN:3128 (squid/2.6.STABLE21)
Proxy-Connection: close



*Note: I have anonymized the IP addresses and server names throughout for
readability reasons.*

*Note: I had posted this question on [StackOverflow also][1], but got no
helpful response. Posting it here, in case people on StackOverflow are
seeing this as a non-programming question and not taking interest.*

Regards,
/HS

  [1]:
http://stackoverflow.com/questions/10568655/unable-to-test-http-put-based-file-upload-via-squid-proxy


Re: [squid-users] Prefetch patch test

2012-02-16 Thread Helmut Hullen
Hello, anita.sivakumar,

You wrote on 16.02.12:

> Sorry Amos. But where else do I post this ? I thought I can mail it
> to this mail id squid-users@squid-cache.org. But if there is some
> other place, please let me know.

[full quote deleted - don't top post, please, don't full quote, please]

The address is ok.
But when you want to ask a new question, you shouldn't reply to an existing
thread and just change the subject line. Your mail reader can compose a
"new" mail too.

Best regards!
Helmut


RE: [squid-users] Prefetch patch test

2012-02-15 Thread anita.sivakumar
Sorry Amos. But where else do I post this ? I thought I can mail it to this 
mail id squid-users@squid-cache.org. But if there is some other place, please 
let me know.

- Anita

-Original Message-
From: Amos Jeffries [mailto:squ...@treenet.co.nz]
Sent: 15 February 2012 18:17
To: squid-users@squid-cache.org
Subject: Re: [squid-users] Prefetch patch test

Before we start: please do not hijack other topics' discussions. It ruins
the group's archive threading and the threaded mailer tools many of us use to
track the group mail. Thank you.

On 15/02/2012 5:24 p.m., anita.sivakumar wrote:
> Hi,
>
> Has anyone used and tested the squid prefetch patch available in the squid 
> website?
> For me it apparently gave a segmentation fault when I tried to prefetch. It 
> works normally for other requests though.

I assume you mean the prefetch project patch from
devel.squid-cache.org? That was last updated for one of the 3.0
PRE-releases (5 or 6 by the looks of it).
It was not accepted into mainline for some reason unknown to me.

Apart from ESI support, all body content filtering and adaptations have
been pushed off to ICAP and eCAP processors. The whole devel.* site is
now outdated, all projects there are in the deprecated bin. If you would
like to revive one please get in touch with squid-dev about joining
development and be prepared for a fair bit of hacking to get it ported
to current 3.HEAD in BZR.

There are other tools (such as "squid-prefetch") which can do prefetch
for any version of Squid without patching which you may want to
investigate first.

Although be aware that in most instances pre-fetching at the proxy level
has usually been found to be a large waste of bandwidth and cache
resources, with little benefit (or none) to offset the costs. Modern
browsers do a different kind of pre-fetch themselves, which has a far more
efficient algorithm for calculating what resources to fetch early. Squid
and other proxies do not have access to enough of the user's information
to do it efficiently.

Amos



Re: [squid-users] Prefetch patch test

2012-02-15 Thread Amos Jeffries
Before we start: please do not hijack other topics' discussions. It ruins
the group's archive threading and the threaded mailer tools many of us use to
track the group mail. Thank you.


On 15/02/2012 5:24 p.m., anita.sivakumar wrote:

Hi,

Has anyone used and tested the squid prefetch patch available in the squid 
website?
For me it apparently gave a segmentation fault when I tried to prefetch. It 
works normally for other requests though.


I assume you mean the prefetch project patch from 
devel.squid-cache.org? That was last updated for one of the 3.0 
PRE-releases (5 or 6 by the looks of it).

It was not accepted into mainline for some reason unknown to me.

Apart from ESI support, all body content filtering and adaptations have 
been pushed off to ICAP and eCAP processors. The whole devel.* site is 
now outdated, all projects there are in the deprecated bin. If you would 
like to revive one please get in touch with squid-dev about joining 
development and be prepared for a fair bit of hacking to get it ported 
to current 3.HEAD in BZR.


There are other tools (such as "squid-prefetch") which can do prefetch 
for any version of Squid without patching which you may want to 
investigate first.


Although be aware that in most instances pre-fetching at the proxy level
has usually been found to be a large waste of bandwidth and cache
resources, with little benefit (or none) to offset the costs. Modern
browsers do a different kind of pre-fetch themselves, which has a far more
efficient algorithm for calculating what resources to fetch early. Squid
and other proxies do not have access to enough of the user's information
to do it efficiently.


Amos


[squid-users] Prefetch patch test

2012-02-14 Thread anita.sivakumar
Hi,

Has anyone used and tested the squid prefetch patch available in the squid 
website?
For me it apparently gave a segmentation fault when I tried to prefetch. It 
works normally for other requests though.

I had integrated squid 3.1.16 with the prefetch patch.

My test file:
I have a pf1.html whose contents are (just 1 line):
http://l0.0.21.993/pf2.html"/>

pf2.html is a dummy file.

When I try to get this pf1.html using squidclient, I get a segmentation fault.

Any idea why? Or is my test file wrong?

- Anita



[squid-users] test

2011-09-27 Thread Dayo

sorry


Re: [squid-users] any badly broken webserver to test corner cases?

2011-06-28 Thread Amos Jeffries

On 28/06/11 23:32, Tomasz Chmielewski wrote:

I was wondering what Squid developers use to test the proxy (or
generally, any other software which core usage is interacting with web
servers, like a web browser)?


You should probably ask the developers that. They/we hang out on the
squid-dev mailing list. This is a general helpdesk kind of list for
admin folk.



Speaking for myself I use a perl script to generate custom replies in 
various states of brokenness matching whatever I'm fixing. Or the real 
servers people are having trouble with in their bug reports.




Is there any "badly broken webserver" out there, which could be used to
test corner case Squid usage?

I mean broken by design, with configurable broken stuff, randomly
resetting connections, feeding rubbish in headers,
improper/unexpected/slow replies etc.



Sounds like you want to speak to The Measurement Factory. They have 
quite a range of HTTP testing systems.


PS. I'm interested in the options here too if anyone else out there has 
more info.



Amos
--
Please be using
  Current Stable Squid 2.7.STABLE9 or 3.1.12
  Beta testers wanted for 3.2.0.9 and 3.1.12.3
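For small ad-hoc cases, a deliberately broken "server" can also be faked with
netcat, for example a reply whose Content-Length does not match the body (the
flags below are for the traditional/Debian netcat and vary between variants):

  printf 'HTTP/1.1 200 OK\r\nContent-Length: 999\r\n\r\nshort body' | nc -l -p 8080 -q 1

Pointing Squid (or a browser through Squid) at that host and port then
exercises the proxy's handling of a truncated origin response.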


[squid-users] any badly broken webserver to test corner cases?

2011-06-28 Thread Tomasz Chmielewski
I was wondering what Squid developers use to test the proxy (or 
generally, any other software which core usage is interacting with web 
servers, like a web browser)?



Is there any "badly broken webserver" out there, which could be used to 
test corner case Squid usage?



I mean broken by design, with configurable broken stuff, randomly 
resetting connections, feeding rubbish in headers, 
improper/unexpected/slow replies etc.




--
Tomasz Chmielewski
http://wpkg.org


Re: [squid-users] Can squid test whether a redirect target is up?

2011-03-28 Thread Amos Jeffries

On Mon, 28 Mar 2011 11:27:16 -0400, David Guertin wrote:

On 2011-03-22 17:25, Amos Jeffries wrote:
The preferred alternative is cache_peer link(s) to the origin
server(s) or app(s). Squid 'tests' these during each connection setup
and can fail over between several of them or 'DIRECT' Internet DNS
details as needed.


Usage is detailed under reverse-proxy where they are commonly used:
http://wiki.squid-cache.org/ConfigExamples/Reverse/BasicAccelerator


Thanks, this has helped a lot, and I'm much closer now. However, now
I'm running into other limitations of my Squid knowledge. Following
the example of that link, my cache_peer statements are working
correctly for the site I'm interested in, i.e. URLs of the form
/somepage.html are cached. However, URLs to any other
site, i.e /index.html are also all getting caught
and are failing, because Squid is converting them to
"/index.html". In other words, I haven't yet found out 
how

to catch only requests for a single domain.

Here's the head of my config file:

http_port 80 accel defaultsite=
cache_peer  parent 80 0 no-query originserver
name=myAccel proxy-only
acl proxy_sites dstdomain 
acl all src 0.0.0.0/0.0.0.0
http_access allow proxy_sites
cache_peer_access myAccel allow proxy_sites
cache_peer_access myAccel deny all

I have tried other configurations as well, including additional
cache_peer statements, but nothing I've tried has worked. The requests
don't fail over to the other cache_peer statements.

What am I missing?


You've just reached the next level of complexity :)

Basic reverse-proxy assumes that all traffic arriving at the port is 
for one website (the defaultsite=...).


Add the "vhost" option to http_port to enable virtual hosting support 
for multiple domains.


Amos
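A minimal sketch of that combination, with placeholder hostnames standing in
for the values that were stripped from the quoted configuration above:

  http_port 80 accel vhost defaultsite=www.example.com
  cache_peer origin.example.com parent 80 0 no-query originserver name=myAccel proxy-only
  acl proxy_sites dstdomain www.example.com
  http_access allow proxy_sites
  cache_peer_access myAccel allow proxy_sites
  cache_peer_access myAccel deny all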


Re: [squid-users] Can squid test whether a redirect target is up?

2011-03-28 Thread David Guertin

On 2011-03-22 17:25, Amos Jeffries wrote:
The preferred alternative is cache_peer link(s) to the origin server(s)
or app(s). Squid 'tests' these during each connection setup and can
fail over between several of them or 'DIRECT' Internet DNS details as
needed.


Usage is detailed under reverse-proxy where they are commonly used:
http://wiki.squid-cache.org/ConfigExamples/Reverse/BasicAccelerator


Thanks, this has helped a lot, and I'm much closer now. However, now I'm 
running into other limitations of my Squid knowledge. Following the 
example of that link, my cache_peer statements are working correctly for 
the site I'm interested in, i.e. URLs of the form 
/somepage.html are cached. However, URLs to any other site, 
i.e /index.html are also all getting caught and are 
failing, because Squid is converting them to "/index.html". 
In other words, I haven't yet found out how to catch only requests for a 
single domain.


Here's the head of my config file:

http_port 80 accel defaultsite=
cache_peer  parent 80 0 no-query originserver name=myAccel 
proxy-only

acl proxy_sites dstdomain 
acl all src 0.0.0.0/0.0.0.0
http_access allow proxy_sites
cache_peer_access myAccel allow proxy_sites
cache_peer_access myAccel deny all

I have tried other configurations as well, including additional 
cache_peer statements, but nothing I've tried has worked. The requests 
don't fail over to the other cache_peer statements.


What am I missing?

Dave


Re: [squid-users] Can squid test whether a redirect target is up?

2011-03-22 Thread Amos Jeffries

On Tue, 22 Mar 2011 08:35:38 -0400, David Guertin wrote:

On 2011-03-21 19:38, Amos Jeffries wrote:

"if site A is up, redirect to site A, but if it's
down, redirect to site B."

Is there any way to do this? Is squid the correct tool for this?
Would a different redirector that squidGuard be a better choice?



Using a redirector for this is not a good choice. Redirectors only 
pass a URL to Squid to inform the client to try there. It is up to the 
redirector to test


Thanks for the help. I've sort of been coming to this conclusion
as I learn my way around Squid. It looks like your reply was cut off.
What would be a better alternate strategy? The remote site is a bunch
of database-driven forms with confidential data, which we are not able
to store securely (which is why they are off-site in the first place).
Would it be a better idea (or possible) to cache the forms, even if we
do not cache the data?

Thanks,
Dave


Yes there was more on that reply...

The preferred alternative is cache_peer link(s) to the origin server(s)
or app(s). Squid 'tests' these during each connection setup and can
fail over between several of them or 'DIRECT' Internet DNS details as
needed.


Usage is detailed under reverse-proxy where they are commonly used:
http://wiki.squid-cache.org/ConfigExamples/Reverse/BasicAccelerator

NP: This will also make available the "proxy-only" flag on the 
cache_peer line. Which prevents squid storing anything fetched from 
there without bothering about fancy cache rules.


FWIW,  by default Squid does not cache the body portion of POST 
requests. So if they are doing normal forms the data will not be cached. 
The empty form page is a GET so may be cached if they let it. Whether 
the reply page after submission is cacheable depends on what 3xx status 
code and HTTP headers they respond with.



Amos
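A minimal sketch of that failover idea, with placeholder hostnames (with no
extra selection options, Squid prefers the first configured parent that is
considered alive and falls back to the next one):

  cache_peer a.example.com parent 80 0 no-query originserver name=siteA proxy-only
  cache_peer b.example.com parent 80 0 no-query originserver name=siteB proxy-only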




Re: [squid-users] Can squid test whether a redirect target is up?

2011-03-22 Thread David Guertin

On 2011-03-21 19:38, Amos Jeffries wrote:

"if site A is up, redirect to site A, but if it's
down, redirect to site B."

Is there any way to do this? Is squid the correct tool for this?
Would a different redirector that squidGuard be a better choice?



Using a redirector for this is not a good choice. Redirectors only 
pass a URL to Squid to inform the client to try there. It is up to the 
redirector to test


Thanks for the help. I've sort of been coming to this conclusion as 
I learn my way around Squid. It looks like your reply was cut off. What 
would be a better alternate strategy? The remote site is a bunch of 
database-driven forms with confidential data, which we are not able to 
store securely (which is why they are off-site in the first place). 
Would it be a better idea (or possible) to cache the forms, even if we 
do not cache the data?


Thanks,
Dave


Re: [squid-users] Can squid test whether a redirect target is up?

2011-03-21 Thread Amos Jeffries

On Mon, 21 Mar 2011 15:17:38 -0400, David Guertin wrote:

Hello,

One of our remote web sites has a habit of going offline frequently.
I am trying to configure squid to act as a proxy for this site so 
that

if the site is down, the browser gets redirected to an alternate page
instead of getting a generic "server error" page.


Okay.



I have configured squid and squidGuard to handle redirects, but there
doesn't seem to be any way to add any kind of conditional statements
to the config, i.e. "if site A is up, redirect to site A, but if it's
down, redirect to site B."

Is there any way to do this? Is squid the correct tool for this?
Would a different redirector that squidGuard be a better choice?


Using a redirector for this is not a good choice. Redirectors only pass 
a URL to Squid to inform the client to try there. It is up to the 
redirector to test


[squid-users] Can squid test whether a redirect target is up?

2011-03-21 Thread David Guertin

Hello,

One of our remote web sites has a habit of going offline frequently. I 
am trying to configure squid to act as a proxy for this site so that if 
the site is down, the browser gets redirected to an alternate page 
instead of getting a generic "server error" page.


I have configured squid and squidGuard to handle redirects, but there 
doesn't seem to be any way to add any kind of conditional statements to 
the config, i.e. "if site A is up, redirect to site A, but if it's down, 
redirect to site B."


Is there any way to do this? Is squid the correct tool for this? Would a 
different redirector that squidGuard be a better choice?


Thanks,
Dave


[squid-users] test post ::please delete::

2010-11-29 Thread donovan jeffrey j
testing for bounces
-j


Re: [squid-users] ANN: stress test tool

2010-10-19 Thread Josip Almasi

Amos Jeffries wrote:

On 20/10/10 01:08, Josip Almasi wrote:

Hi all,

I've just deployed a few squid boxes.
Had an unusual requirement - each box is to handle 1000 connections (not
requests) per second.
Also had some 173 million URLs taken from access logs.

Couldn't find what I needed to test it, so I wrote it.

So if you need to squeeze your squid, here it is:
http://sf.net/projects/spizd

Regards...



ApacheBench and Web Polygraph are the standard tools. There are probably 
others I'm not aware of too.


Right. However, ab measures number of requests rather than connections, 
and polygraph is... not exactly easy to use.



Thanks for this one. Added to the profiling FAQ page.
http://wiki.squid-cache.org/SquidFaq/SquidProfiling


Thanks for adding it. Slight typo though: SPIZD, not SPITZD.

Regards...



Re: [squid-users] ANN: stress test tool

2010-10-19 Thread Amos Jeffries

On 20/10/10 01:08, Josip Almasi wrote:

Hi all,

I've just deployed a few squid boxes.
Had an unusual requirement - each box is to handle 1000 connections (not
requests) per second.
Also had some 173 million URLs taken from access logs.

Couldn't find what I needed to test it, so I wrote it.

So if you need to squeeze your squid, here it is:
http://sf.net/projects/spizd

Regards...



ApacheBench and Web Polygraph are the standard tools. There are probably 
others I'm not aware of too.


Thanks for this one. Added to the profiling FAQ page.
http://wiki.squid-cache.org/SquidFaq/SquidProfiling

Amos
--
Please be using
  Current Stable Squid 2.7.STABLE9 or 3.1.8
  Beta testers wanted for 3.2.0.2


[squid-users] ANN: stress test tool

2010-10-19 Thread Josip Almasi

Hi all,

I've just deployed a few squid boxes.
Had an unusual requirement - each box is to handle 1000 connections (not 
requests) per second.

Also had some 173 million URLs taken from access logs.

Couldn't find what I needed to test it, so I wrote it.

So if you need to squeeze your squid, here it is:
http://sf.net/projects/spizd

Regards...



Re: [squid-users] how to simulate big file workload test for squid

2010-09-21 Thread Henrik Nordström
Tue 2010-09-21 at 03:27 +, Amos Jeffries wrote:

> Your own access.log history is probably the best source of such info.
> The actual file contents does not matter to Squid so can be synthesized to
> fit the URLs logged transfer sizes, with the URLs themselves re-written to
> match for the testing.
> (Have not done this on a large scale or with web-polygraph myself)

Supported directly by web-polygraph if you have a log of the workload
you want to test.

http://www.web-polygraph.org/docs/userman/access2poly.html


If you do not have logs of the workload but a good idea of the content
distribution then it's not too hard to define your own content
distribution.

http://www.web-polygraph.org/docs/reference/models/traffic.html#Sect:4

Don't forget to set the working set size to a reasonable value as well.

Regards
Henrik



Re: [squid-users] how to simulate big file workload test for squid

2010-09-20 Thread Amos Jeffries
On Tue, 21 Sep 2010 10:27:12 +0800, du du  wrote:
> Hi
> I want to make a workload test for squid. I want to use the tool
> web-polygraph, which has some standard workload models. But in these
> workloads, the content size is too small for me.
> 
> Does anyone have a modified workload with a big content size, or another
> method to simulate big-file transactions?
> 
> My file size is 1-5 MB

Your own access.log history is probably the best source of such info.
The actual file contents do not matter to Squid, so they can be synthesized to
fit the logged transfer sizes, with the URLs themselves rewritten to
match for the testing.
(Have not done this on a large scale or with web-polygraph myself)

Amos
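For example (paths and hostnames below are placeholders), a synthetic 5 MB
object can be dropped onto the test web server and then fetched through the
proxy with curl to measure transfer times:

  dd if=/dev/urandom of=/var/www/html/big-test.bin bs=1M count=5
  curl -s -o /dev/null -x http://127.0.0.1:3128 \
       -w '%{size_download} bytes in %{time_total}s\n' http://WEB-SERVER/big-test.bin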



[squid-users] how to simulate big file workload test for squid

2010-09-20 Thread du du
Hi
I want to make a workload test for squid. I want to use the tool
web-polygraph, which has some standard workload models. But in these
workloads, the content size is too small for me.

Does anyone have a modified workload with a big content size, or another
method to simulate big-file transactions?

My file size is 1-5 MB

--
thanks,
Ergod


[squid-users] mailer test

2010-05-30 Thread Amos Jeffries

Sorry folks. Just testing the list server is still working.

Amos
--
Please be using
  Current Stable Squid 2.7.STABLE9 or 3.1.3


Re: [squid-users] to Amos Jeffries,you said squid performance could be up to 300,000 rps in lab test.

2010-04-07 Thread Amos Jeffries

wang.gao...@zte.com.cn wrote:
I read this at the end of 
http://www.squid-cache.org/mail-archive/squid-users/201002/0795.html
I want to use squid as a reverse proxy, so I am interested in squid's
performance.

Can you post a detailed result for this lab test?
Was the test on a single machine or a cluster?
The record for aiCache is just 25,000 rps, so your record is very
impressive.

Can you give me some viewpoint about squid and aiCache?
Thank you.



As I said, it was for a lab test and _very_ artificial. The 300K result
was specifically from testing of the new accept() handler for Squid-3.1,
since I was facing complaints that it could not get more than 5 concurrent
requests.
The 3rps was achieved by fetching google front page image (non 
cacheable, ~4KB remote object).


I achieved that by using Squid-3.1 with a RAM cache, fetching a single 
1KB object pre-stored in memory, with very short headers on both reply 
and request. Using apachebench via the localhost interface (64KB RSS, 
almost zero network stack IO delay) at some high concurrency just below 
the cap point where Squid starts slowing from too many concurrent 
requests (I forget exactly what that is right now, maybe 400-500 
concurrency?). It took a few trials and that was what ab reported, give 
or take a few Krps.


As soon as any real networking is attached, ie fetching from a box next 
door, the rate drops to something around that 30Krps for the same 
artificial memory-cached small object. I suspect that is simply due to 
the kernel network stacks and buffering.


With real remote objects and URLs added in, thus incurring more
processing delays, it drops down to below 1Krps, in line with the real
benchmarks that are starting to appear for Squid.


I guess, in theory Squid could process that many new requests in real 
use, but time to supply would be vastly inflated as transfer resources 
went into accepting new requests.


The point was that lab tests produce a wide variety of results, 
depending on what is tested.


Amos
--
Please be using
  Current Stable Squid 2.7.STABLE9 or 3.1.1
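For reference, the kind of apachebench run described above would look roughly
like this (object name, proxy port and concurrency are illustrative only):

  ab -X 127.0.0.1:3128 -n 100000 -c 400 -k http://127.0.0.1/1kb.bin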


[squid-users] to Amos Jeffries,you said squid performance could be up to 300,000 rps in lab test.

2010-04-07 Thread wang . gaohao
I read this at the end of 
http://www.squid-cache.org/mail-archive/squid-users/201002/0795.html
I want to use squid as a reverse proxy, so I am interested in squid's
performance.
Can you post a detailed result for this lab test?
Was the test on a single machine or a cluster?
The record for aiCache is just 25,000 rps, so your record is very
impressive.
Can you give me some viewpoint about squid and aiCache?
Thank you.





[squid-users] Extreme Slow Resposne from Squid ( Test environment only 4 users at the moment)

2010-03-25 Thread GIGO .

From the multiple-instance setup using Squid 3stable25 I have shifted to the
squid3stable1 packaged with Ubuntu 8.04 LTS. However, I am unable to understand
why it is so slow. What's wrong? Please, anybody, help out. Is it something to
do with the operating system? Or does Squid initially run that slowly? I feel
helpless. Please guide me.
 
My Hardware:
Physical Server IBM 3650
Physical RAID 1 + a volume disk, each of 73 GB size. Currently I am doing
caching on RAID 1.
RAM 4GB
 
My Conf File:
 
visible_hostname squidLhr
unique_hostname squidDefault
pid_filename /var/run/squid3.pid
http_port 10.1.82.53:8080
icp_port 0
snmp_port 0
access_log /var/log/squid3/access.log squid
cache_log /var/log/squid3/cache.log
cache_peer 10.1.82.205  parent 8080 0 default no-digest no-query
#cache_peer 127.0.0.1 parent 3128 0 default no-digest no-query proxy-only no-delay -- use in the multiple setup
#temporarily Directive
never_direct allow all
#prefer_direct off -- use in the multiple setup; ponder on the above directive as well, as it may not be needed with direct internet access.
cache_dir aufs /var/spool/squid3 1 32 320
coredump_dir /var/spool/squid3
cache_swap_low 75
cache_mem 100 MB
range_offset_limit 0 KB
maximum_object_size 4096 MB
minimum_object_size 0 KB
quick_abort_min 16 KB
cache_replacement_policy lru
refresh_pattern ^ftp:           1440    20%     10080
refresh_pattern ^gopher:        1440    0%      1440
refresh_pattern .               0       20%     4320
#specific for youtube, below
refresh_pattern (get_video\?|videoplayback\?|videodownload\?) 5259487 % 5259487
refresh_pattern -i (/cgi-bin/|\?) 0 0% 0
acl manager proto cache_object
acl localhost src 127.0.0.1/32
acl to_localhost dst 127.0.0.0/8
#Define Local Network.
acl FcUsr src "/etc/squid3/FcUsr.conf"
acl PUsr src "/etc/squid3/PUsr.conf"
acl RUsr src "/etc/squid3/RUsr.conf"
#Define Local Servers
acl localServers dst 10.0.0.0/8
#Defining & allowing ports section
acl SSL_ports port 443  #https
acl Safe_ports port 80  # http
acl Safe_ports port 21  # ftp
acl Safe_ports port 443 # https
acl Safe_ports port 70  # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535  # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl CONNECT method CONNECT
# Only allow cachemgr access from localhost
http_access allow manager localhost
http_access deny manager
# Deny request to unknown ports
http_access deny !Safe_ports
# Deny request to other than SSL ports
http_access deny CONNECT !SSL_ports
#Allow access from localhost
http_access allow localhost
# Local servers should never be forwarded to neighbour/peers and they should never be cached.
always_direct allow localservers
cache deny LocalServers
# Windows Update Section...
acl windowsupdate dstdomain windowsupdate.microsoft.com
acl windowsupdate dstdomain .update.microsoft.com
acl windowsupdate dstdomain download.windowsupdate.com
acl windowsupdate dstdomain redir.metaservices.microsoft.com
acl windowsupdate dstdomain images.metaservices.microsoft.com
acl windowsupdate dstdomain c.microsoft.com
acl windowsupdate dstdomain www.download.windowsupdate.com
acl windowsupdate dstdomain wustat.windows.com
acl windowsupdate dstdomain crl.microsoft.com
acl windowsupdate dstdomain sls.microsoft.com
acl windowsupdate dstdomain productactivation.one.microsoft.com
acl windowsupdate dstdomain ntservicepack.microsoft.com
acl wuCONNECT dstdomain www.update.microsoft.com
acl wuCONNECT dstdomain sls.microsoft.com
http_access allow CONNECT wuCONNECT FcUsr
http_access allow CONNECT wuCONNECT PUsr
http_access allow CONNECT wuCONNECT RUsr
http_access allow CONNECT wuCONNECT localhost
http_access allow windowsupdate all
http_access allow windowsupdate localhost
acl workinghours time MTWHF 09:00-12:59
acl workinghours time MTWHF 15:00-17:00
acl BIP dst "/etc/squid3/Blocked.conf"
#Definitions for BlockingRules#
###Definition of MP3/MPEG
acl FTP proto FTP
acl MP3url urlpath_regex \.mp3(\?.*)?$
acl Movies rep_mime_type video/mpeg
acl MP3s rep_mime_type audio/mpeg
###Definition of Flash Video
acl deny_rep_mime_flashvideo rep_mime_type video/flv
###Definition of  Porn
acl Sex urlpath_regex sex
acl PornSites url_regex "/etc/squid3/pornlist"
#Definition of YouTube.
## The videos come from several domains
acl youtube_domains dstdomain .youtube.com .googlevideo.com .ytimg.com
###Definition of FaceBook
acl facebook_sites dstdomain .facebook.com
#Definition of MSN Messenger
acl msn urlpath_regex -i gateway.dll
acl msnd dstdomain messenger.msn.com gateway.messenger.hotmail.com
acl msn1 req_mime_type application/x-msn-messenger
#Definition of Skype
acl numeric_IPs url_regex 
^(([0-9]+\.[0-9]+\.[0-9]+\.[0-9]+)|(\[([0-9af]+)?:([0-9af:]+)?:([0-9af]+)?\])):443
acl Skype_UA browser ^skype^
##Definition of Yahoo! Messenger
acl ym dst

Re: [squid-users] Re: Squid 3.1.0.13 Speed Test - Upload breaks?

2010-01-22 Thread Irvan Adrian K
Mine too. Using Squid version 3.1.0.15-20091212, running in a TPROXY 4
configuration, I have a problem with uploads: squid often errors when
uploading a Facebook photo, attaching a file to an email in Gmail or Yahoo, etc.


Irvan

On 1/23/2010 5:53 AM, Amos Jeffries wrote:

Linda Walsh wrote:

jay60103 wrote:
I'm using  Version 3.1.0.6 and speakeasy.net doesn't work for me 
either.
Download test okay, but when it starts the upload part it fails with 
"Upload
test returned an error while trying to read the upload file." 


    FWIW, this speed test works for me using 3.HEAD.BZR (head
version of 3.1.0.15).


Sorry, correction. 3.HEAD-BZR is the 3.2.0.0 alpha code.
Still very similar to 3.1, but some more advanced features. Though 
none which would affect



Amos




Re: [squid-users] Re: Squid 3.1.0.13 Speed Test - Upload breaks?

2010-01-22 Thread Amos Jeffries

Linda Walsh wrote:

jay60103 wrote:

I'm using  Version 3.1.0.6 and speakeasy.net doesn't work for me either.
Download test okay, but when it starts the upload part it fails with 
"Upload
test returned an error while trying to read the upload file." 


    FWIW, this speed test works for me using 3.HEAD.BZR (head
version of 3.1.0.15).


Sorry, correction. 3.HEAD-BZR is the 3.2.0.0 alpha code.
Still very similar to 3.1, but some more advanced features. Though none 
which would affect



Amos
--
Please be using
  Current Stable Squid 2.7.STABLE7 or 3.0.STABLE21
  Current Beta Squid 3.1.0.15


[squid-users] Re: Squid 3.1.0.13 Speed Test - Upload breaks?

2010-01-22 Thread Linda Walsh

jay60103 wrote:

I'm using  Version 3.1.0.6 and speakeasy.net doesn't work for me either.
Download test okay, but when it starts the upload part it fails with "Upload
test returned an error while trying to read the upload file." 


    FWIW, this speed test works for me using 3.HEAD.BZR (head
version of 3.1.0.15).

I have pipeline_prefetch set to 'on'.

-l



Re: [squid-users] Squid 3.1.0.13 Speed Test - Upload breaks?

2010-01-04 Thread jay60103

I'm using  Version 3.1.0.6 and speakeasy.net doesn't work for me either.
Download test okay, but when it starts the upload part it fails with "Upload
test returned an error while trying to read the upload file." 

http://bigcartel.com fails also. Possibly related?


Adrian Chadd-3 wrote:
> 
> The pipelining used by speedtest.net and such won't really get a
> benefit from the current squid pipelining support.
> 
> 
> 
> Adrian
> 
> 2009/8/15 Daniel :
>> Henrik,
>>
>>        I added 'pipeline_prefetch on' to my squid.conf and it still isn't
>> working right. I've pasted my entire squid.conf below, if you have
>> anything extra turned on/off or et cetera than please let me know and
>> I'll try it.  Thanks!
>>
>> acl manager proto cache_object
>> acl localhost src 127.0.0.1/32
>> acl to_localhost dst 127.0.0.0/8
>> acl TestPoolIPs src lpt-hdq-dmtqq31 wksthdq88w
>> acl localnet src 10.0.0.0/8     # RFC1918 possible internal network
>> acl localnet src 172.16.0.0/12  # RFC1918 possible internal network
>> acl localnet src 192.168.0.0/16 # RFC1918 possible internal network
>> acl sclthdq01w src 10.211.194.187/32    # custom acl for apache/cache
>> manager
>> acl SSL_ports port 443
>> acl Safe_ports port 80          # http
>> acl Safe_ports port 21          # ftp
>> acl Safe_ports port 443         # https
>> acl Safe_ports port 70          # gopher
>> acl Safe_ports port 210         # wais
>> acl Safe_ports port 1025-65535  # unregistered ports
>> acl Safe_ports port 280         # http-mgmt
>> acl Safe_ports port 488         # gss-http
>> acl Safe_ports port 591         # filemaker
>> acl Safe_ports port 777         # multiling http
>> acl CONNECT method CONNECT
>> http_access allow manager localhost
>> http_access allow manager sclthdq01w
>> http_access deny manager
>> http_access deny !Safe_ports
>> http_access deny CONNECT !SSL_ports
>> #http_access allow localnet
>> http_access allow localhost
>> http_access allow TestPoolIPs
>> http_access deny all
>> http_port 3128
>> hierarchy_stoplist cgi-bin ?
>> coredump_dir /usr/local/squid/var/cache
>> cache_mem 512 MB
>> pipeline_prefetch on
>> refresh_pattern ^ftp:           1440    20%     10080
>> refresh_pattern ^gopher:        1440    0%      1440
>> refresh_pattern -i (/cgi-bin/|\?) 0     0%      0
>> refresh_pattern .               0       20%     4320
>>
>> -Original Message-
>> From: Henrik Lidström [mailto:free...@lidstrom.eu]
>> Sent: Monday, August 10, 2009 8:16 PM
>> To: Daniel
>> Cc: squid-users@squid-cache.org
>> Subject: Re: [squid-users] Squid 3.1.0.13 Speed Test - Upload breaks?
>>
>>> Daniel wrote:
>>> Kinkie,
>>>
>>>       I'm using the default settings, so I don't have any specific max
>>> request sizes specified. I guess I'll hold out until someone else
>>> running 3.1 can test this.
>>>
>>> Thanks!
>>>
>>> -Original Message-
>>> From: Kinkie [mailto:gkin...@gmail.com]
>>> Sent: Saturday, August 08, 2009 6:44 AM
>>> To: squid-users@squid-cache.org
>>> Subject: Re: [squid-users] Squid 3.1.0.13 Speed Test - Upload breaks?
>>>
>>> Maybe the failure could depend on some specific settings, such as max
>>> request size?
>>>
>>> On 8/8/09, Heinz Diehl  wrote:
>>>
>>>> On 08.08.2009, Daniel wrote:
>>>>
>>>>
>>>>> Would anyone else using Squid mind doing this same bandwidth test and
>>>>> seeing
>>>>> if they have the same issue(s)?
>>>>>
>>>> It works flawlessly using both 2.7-STABLE6 and 3.0-STABLE18 here.
>>>>
>>>>
>>>>
>>>
>>>
>>>
>> Squid Cache: Version 3.1.0.13
>>
>> Working without a problem, tested multiple sites on the list.
>> Nothing special in the config except maybe "pipeline_prefetch on"
>>
>> /Henrik
>>
>>
> 
> 

-- 
View this message in context: 
http://old.nabble.com/Squid-3.1.0.13-Speed-Test---Upload-breaks--tp24868479p27018936.html
Sent from the Squid - Users mailing list archive at Nabble.com.



[squid-users] QA/Test machines sought!

2009-08-18 Thread Robert Collins
Hi, a few of us dev's have been working on getting a build-test
environment up and running. We're still doing fine tuning on it but the
basic facility is working.

We'd love it if users of squid, both individuals and corporates, would
consider contributing a test machine to the buildfarm.

The build farm is at http://build.squid-cache.org/ with docs about it at
http://wiki.squid-cache.org/BuildFarm.

What we'd like is to have enough machines that are available to run test
builds, that we can avoid having last-minute scrambles to fix things at
releases.

If you have some spare bandwidth and CPU cycles you can easily
volunteer. 

We don't need test slaves to be on all the time - if they aren't on they
won't run tests, but they will when they come on. We'd prefer machines
that are always on over sometimes-on ones.

We only do test builds on volunteer machines after a 'master' job has
passed on the main server. This avoids using resources up when something
is clearly busted in the main source code.

Each version of squid we test takes about 150MB on disk when idle, and
when a test is going on up to twice that (because of the build test
scripts).

We currently test
2.HEAD
3.0
3.1
3.HEAD

and I suspect we'll add 2.7 to that list. So I guess we'll use about
750MB of disk if a given slave is testing all those versions.

Hudson, our build test software, can balance out the machines though -
if we have two identical platforms they will each get some of the builds
to test.

So, if your favorite operating system is not currently represented in
the build farm, please let us know - drop a mail here or to noc @
squid-cache.org - we'll be delighted to hear from you, and it will help
ensure that squid is building well on your OS!

-Rob


signature.asc
Description: This is a digitally signed message part


Re: [squid-users] Squid 3.1.0.13 Speed Test - Upload breaks?

2009-08-16 Thread Adrian Chadd
The pipelining used by speedtest.net and such won't really get a
benefit from the current squid pipelining support.



Adrian

2009/8/15 Daniel :
> Henrik,
>
>        I added 'pipeline_prefetch on' to my squid.conf and it still isn't 
> working right. I've pasted my entire squid.conf below, if you have anything 
> extra turned on/off or et cetera than please let me know and I'll try it.  
> Thanks!
>
> acl manager proto cache_object
> acl localhost src 127.0.0.1/32
> acl to_localhost dst 127.0.0.0/8
> acl TestPoolIPs src lpt-hdq-dmtqq31 wksthdq88w
> acl localnet src 10.0.0.0/8     # RFC1918 possible internal network
> acl localnet src 172.16.0.0/12  # RFC1918 possible internal network
> acl localnet src 192.168.0.0/16 # RFC1918 possible internal network
> acl sclthdq01w src 10.211.194.187/32    # custom acl for apache/cache manager
> acl SSL_ports port 443
> acl Safe_ports port 80          # http
> acl Safe_ports port 21          # ftp
> acl Safe_ports port 443         # https
> acl Safe_ports port 70          # gopher
> acl Safe_ports port 210         # wais
> acl Safe_ports port 1025-65535  # unregistered ports
> acl Safe_ports port 280         # http-mgmt
> acl Safe_ports port 488         # gss-http
> acl Safe_ports port 591         # filemaker
> acl Safe_ports port 777         # multiling http
> acl CONNECT method CONNECT
> http_access allow manager localhost
> http_access allow manager sclthdq01w
> http_access deny manager
> http_access deny !Safe_ports
> http_access deny CONNECT !SSL_ports
> #http_access allow localnet
> http_access allow localhost
> http_access allow TestPoolIPs
> http_access deny all
> http_port 3128
> hierarchy_stoplist cgi-bin ?
> coredump_dir /usr/local/squid/var/cache
> cache_mem 512 MB
> pipeline_prefetch on
> refresh_pattern ^ftp:           1440    20%     10080
> refresh_pattern ^gopher:        1440    0%      1440
> refresh_pattern -i (/cgi-bin/|\?) 0     0%      0
> refresh_pattern .               0       20%     4320
>
> -Original Message-
> From: Henrik Lidström [mailto:free...@lidstrom.eu]
> Sent: Monday, August 10, 2009 8:16 PM
> To: Daniel
> Cc: squid-users@squid-cache.org
> Subject: Re: [squid-users] Squid 3.1.0.13 Speed Test - Upload breaks?
>
> Daniel wrote:
>> Kinkie,
>>
>>       I'm using the default settings, so I don't have any specific max 
>> request sizes specified. I guess I'll hold out until someone else running 
>> 3.1 can test this.
>>
>> Thanks!
>>
>> -Original Message-
>> From: Kinkie [mailto:gkin...@gmail.com]
>> Sent: Saturday, August 08, 2009 6:44 AM
>> To: squid-users@squid-cache.org
>> Subject: Re: [squid-users] Squid 3.1.0.13 Speed Test - Upload breaks?
>>
>> Maybe the failure could depend on some specific settings, such as max
>> request size?
>>
>> On 8/8/09, Heinz Diehl  wrote:
>>
>>> On 08.08.2009, Daniel wrote:
>>>
>>>
>>>> Would anyone else using Squid mind doing this same bandwidth test and
>>>> seeing
>>>> if they have the same issue(s)?
>>>>
>>> It works flawlessly using both 2.7-STABLE6 and 3.0-STABLE18 here.
>>>
>>>
>>>
>>
>>
>>
> Squid Cache: Version 3.1.0.13
>
> Working without a problem, tested multiple sites on the list.
> Nothing special in the config except maybe "pipeline_prefetch on"
>
> /Henrik
>
>


RE: [squid-users] Squid 3.1.0.13 Speed Test - Upload breaks?

2009-08-14 Thread Daniel
Henrik,

I added 'pipeline_prefetch on' to my squid.conf and it still isn't 
working right. I've pasted my entire squid.conf below, if you have anything 
extra turned on/off or et cetera than please let me know and I'll try it.  
Thanks!

acl manager proto cache_object
acl localhost src 127.0.0.1/32
acl to_localhost dst 127.0.0.0/8
acl TestPoolIPs src lpt-hdq-dmtqq31 wksthdq88w
acl localnet src 10.0.0.0/8 # RFC1918 possible internal network
acl localnet src 172.16.0.0/12  # RFC1918 possible internal network
acl localnet src 192.168.0.0/16 # RFC1918 possible internal network
acl sclthdq01w src 10.211.194.187/32# custom acl for apache/cache manager
acl SSL_ports port 443
acl Safe_ports port 80  # http
acl Safe_ports port 21  # ftp
acl Safe_ports port 443 # https
acl Safe_ports port 70  # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535  # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl CONNECT method CONNECT
http_access allow manager localhost
http_access allow manager sclthdq01w
http_access deny manager
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
#http_access allow localnet
http_access allow localhost
http_access allow TestPoolIPs
http_access deny all
http_port 3128
hierarchy_stoplist cgi-bin ?
coredump_dir /usr/local/squid/var/cache
cache_mem 512 MB
pipeline_prefetch on
refresh_pattern ^ftp:   144020% 10080
refresh_pattern ^gopher:14400%  1440
refresh_pattern -i (/cgi-bin/|\?) 0 0%  0
refresh_pattern .   0   20% 4320

-Original Message-
From: Henrik Lidström [mailto:free...@lidstrom.eu] 
Sent: Monday, August 10, 2009 8:16 PM
To: Daniel
Cc: squid-users@squid-cache.org
Subject: Re: [squid-users] Squid 3.1.0.13 Speed Test - Upload breaks?

Daniel wrote:
> Kinkie,
>
>   I'm using the default settings, so I don't have any specific max 
> request sizes specified. I guess I'll hold out until someone else running 3.1 
> can test this.
>
> Thanks!
>
> -Original Message-
> From: Kinkie [mailto:gkin...@gmail.com] 
> Sent: Saturday, August 08, 2009 6:44 AM
> To: squid-users@squid-cache.org
> Subject: Re: [squid-users] Squid 3.1.0.13 Speed Test - Upload breaks?
>
> Maybe the failure could depend on some specific settings, such as max
> request size?
>
> On 8/8/09, Heinz Diehl  wrote:
>   
>> On 08.08.2009, Daniel wrote:
>>
>> 
>>> Would anyone else using Squid mind doing this same bandwidth test and
>>> seeing
>>> if they have the same issue(s)?
>>>   
>> It works flawlessly using both 2.7-STABLE6 and 3.0-STABLE18 here.
>>
>>
>> 
>
>
>   
Squid Cache: Version 3.1.0.13

Working without a problem, tested multiple sites on the list.
Nothing special in the config except maybe "pipeline_prefetch on"

/Henrik



Re: [squid-users] Squid 3.1.0.13 Speed Test - Upload breaks?

2009-08-10 Thread Henrik Lidström

Daniel wrote:

Kinkie,

I'm using the default settings, so I don't have any specific max 
request sizes specified. I guess I'll hold out until someone else running 3.1 
can test this.

Thanks!

-Original Message-
From: Kinkie [mailto:gkin...@gmail.com] 
Sent: Saturday, August 08, 2009 6:44 AM

To: squid-users@squid-cache.org
Subject: Re: [squid-users] Squid 3.1.0.13 Speed Test - Upload breaks?

Maybe the failure could depend on some specific settings, such as max
request size?

On 8/8/09, Heinz Diehl  wrote:
  

On 08.08.2009, Daniel wrote:



Would anyone else using Squid mind doing this same bandwidth test and
seeing
if they have the same issue(s)?
  

It works flawlessly using both 2.7-STABLE6 and 3.0-STABLE18 here.






  

Squid Cache: Version 3.1.0.13

Working without a problem, tested multiple sites on the list.
Nothing special in the config except maybe "pipeline_prefetch on"

/Henrik


RE: [squid-users] Squid 3.1.0.13 Speed Test - Upload breaks?

2009-08-10 Thread Daniel
Kinkie,

I'm using the default settings, so I don't have any specific max 
request sizes specified. I guess I'll hold out until someone else running 3.1 
can test this.

Thanks!

-Original Message-
From: Kinkie [mailto:gkin...@gmail.com] 
Sent: Saturday, August 08, 2009 6:44 AM
To: squid-users@squid-cache.org
Subject: Re: [squid-users] Squid 3.1.0.13 Speed Test - Upload breaks?

Maybe the failure could depend on some specific settings, such as max
request size?

On 8/8/09, Heinz Diehl  wrote:
> On 08.08.2009, Daniel wrote:
>
>> Would anyone else using Squid mind doing this same bandwidth test and
>> seeing
>> if they have the same issue(s)?
>
> It works flawlessly using both 2.7-STABLE6 and 3.0-STABLE18 here.
>
>


-- 
/kinkie



Re: [squid-users] Squid 3.1.0.13 Speed Test - Upload breaks?

2009-08-08 Thread Kinkie
Maybe the failure could depend on some specific settings, such as max
request size?

On 8/8/09, Heinz Diehl  wrote:
> On 08.08.2009, Daniel wrote:
>
>> Would anyone else using Squid mind doing this same bandwidth test and
>> seeing
>> if they have the same issue(s)?
>
> It works flawlessly using both 2.7-STABLE6 and 3.0-STABLE18 here.
>
>


-- 
/kinkie


[squid-users] Re: Squid 3.1.0.13 Speed Test - Upload breaks?

2009-08-08 Thread Heinz Diehl
On 08.08.2009, Daniel wrote: 

> Would anyone else using Squid mind doing this same bandwidth test and seeing
> if they have the same issue(s)?

It works flawlessly using both 2.7-STABLE6 and 3.0-STABLE18 here.



RE: [squid-users] Squid 3.1.0.13 Speed Test - Upload breaks?

2009-08-07 Thread Daniel
Ok thanks for those who have tried. Are there any 3.1 testers out there
willing to try this? It'd be great if someone was running 3.1.0.13 like
myself to do the test.

Thanks again in advance.

-Daniel

-Original Message-
From: Chris Robertson [mailto:crobert...@gci.net] 
Sent: Friday, August 07, 2009 3:37 PM
To: Squid-Users@Squid-Cache.org
Subject: Re: [squid-users] Squid 3.1.0.13 Speed Test - Upload breaks?

Daniel wrote:
> I was curious about the speed differences between our production ISA 2000
> server and our new Squid Cache 3.1.0.13 testing server.
>
> When I go to http://www.speakeasy.net/speedtest/ and do the speed tests,
the
> speedtest works 100% of the time for ISA & Squid. However, I do not
believe
> it has ever successfully ran the upload test with squid (but has worked
100%
> with ISA).
>
> One of two (2) things always happens with Squid: 
>
> 1) It will sit and hang on the 'PREPARING UPLOAD TEST.' until it finally
> times out after about 5 minutes.
> 2) It will sit and hang on the 'TESTING UPLOAD SPEED.' right about where
you
> think it should finish. It will eventually time out around 5~ minutes
later.
>
> The end result appears to always be the same. There's an HTTP 502 error
> message one it times out.
>
> Would anyone else using Squid mind doing this same bandwidth test and
seeing
> if they have the same issue(s)?
>   

Works for me with Squid 2.7STABLE6 (on a X86_64 CentOS 5 install):

1249673256.072   3363 209.165.141.254 TCP_MISS/200 17817236 GET 
http://sea.speakeasy.net/speedtest/random3000x3000.jpg? - 
DIRECT/66.93.87.2 image/jpeg
1249673257.822   5111 209.165.141.254 TCP_MISS/200 17817236 GET 
http://sea.speakeasy.net/speedtest/random3000x3000.jpg? - 
DIRECT/66.93.87.2 image/jpeg
1249673258.010 29 209.165.141.254 TCP_MISS/200 675 GET 
http://sea.speakeasy.net/crossdomain.xml - DIRECT/66.93.87.2 application/xml
1249673258.422405 209.165.141.254 TCP_MISS/200 319 POST 
http://sea.speakeasy.net/speedtest/upload.php? - DIRECT/66.93.87.2 text/html
1249673258.764746 209.165.141.254 TCP_MISS/200 319 POST 
http://sea.speakeasy.net/speedtest/upload.php? - DIRECT/66.93.87.2 text/html

> Thanks!
>
> -Daniel
>   

Chris



Re: [squid-users] Squid 3.1.0.13 Speed Test - Upload breaks?

2009-08-07 Thread Chris Robertson

Daniel wrote:

I was curious about the speed differences between our production ISA 2000
server and our new Squid Cache 3.1.0.13 testing server.

When I go to http://www.speakeasy.net/speedtest/ and do the speed tests, the
speedtest works 100% of the time for ISA & Squid. However, I do not believe
it has ever successfully ran the upload test with squid (but has worked 100%
with ISA).

One of two (2) things always happens with Squid: 


1) It will sit and hang on the 'PREPARING UPLOAD TEST.' until it finally
times out after about 5 minutes.
2) It will sit and hang on the 'TESTING UPLOAD SPEED.' right about where you
think it should finish. It will eventually time out around 5~ minutes later.

The end result appears to always be the same. There's an HTTP 502 error
message one it times out.

Would anyone else using Squid mind doing this same bandwidth test and seeing
if they have the same issue(s)?
  


Works for me with Squid 2.7STABLE6 (on a X86_64 CentOS 5 install):

1249673256.072   3363 209.165.141.254 TCP_MISS/200 17817236 GET 
http://sea.speakeasy.net/speedtest/random3000x3000.jpg? - 
DIRECT/66.93.87.2 image/jpeg
1249673257.822   5111 209.165.141.254 TCP_MISS/200 17817236 GET 
http://sea.speakeasy.net/speedtest/random3000x3000.jpg? - 
DIRECT/66.93.87.2 image/jpeg
1249673258.010 29 209.165.141.254 TCP_MISS/200 675 GET 
http://sea.speakeasy.net/crossdomain.xml - DIRECT/66.93.87.2 application/xml
1249673258.422    405 209.165.141.254 TCP_MISS/200 319 POST 
http://sea.speakeasy.net/speedtest/upload.php? - DIRECT/66.93.87.2 text/html
1249673258.764    746 209.165.141.254 TCP_MISS/200 319 POST 
http://sea.speakeasy.net/speedtest/upload.php? - DIRECT/66.93.87.2 text/html



Thanks!

-Daniel
  


Chris



[squid-users] Squid 3.1.0.13 Speed Test - Upload breaks?

2009-08-07 Thread Daniel
I was curious about the speed differences between our production ISA 2000
server and our new Squid Cache 3.1.0.13 testing server.

When I go to http://www.speakeasy.net/speedtest/ and do the speed tests, the
speedtest works 100% of the time for ISA & Squid. However, I do not believe
it has ever successfully run the upload test with squid (but has worked 100%
with ISA).

One of two (2) things always happens with Squid: 

1) It will sit and hang on the 'PREPARING UPLOAD TEST.' until it finally
times out after about 5 minutes.
2) It will sit and hang on the 'TESTING UPLOAD SPEED.' right about where you
think it should finish. It will eventually time out around 5~ minutes later.

The end result appears to always be the same. There's an HTTP 502 error
message once it times out.

Would anyone else using Squid mind doing this same bandwidth test and seeing
if they have the same issue(s)?

Thanks!

-Daniel
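
For what it's worth, the upload leg can also be exercised outside the browser
with curl; the proxy address and upload URL below are only placeholders, so
substitute your own:

# create a ~3 MB dummy file and POST it through the proxy as a multipart upload
dd if=/dev/zero of=/tmp/upload-test.bin bs=1k count=3000
curl -v -x http://127.0.0.1:3128 \
     -F "upload=@/tmp/upload-test.bin" \
     http://example.com/upload.php

# a hang or a 502 here, with the browser out of the picture, points at the proxy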





[squid-users] test message

2009-07-10 Thread Abdul Khan

Please disregard it.



Re: [squid-users] Need definitive test between ICAP server as defined in squid.conf

2009-02-17 Thread Amos Jeffries

da...@davidwbrown.name wrote:

Hello squid users all, I am currently trying to get an ICAP server and Squid to 
communicate. I have run: ./squid -N -X and the output looks OK if not at least 
promising. The target ICAP server is responding to: telnet and an ICAP client 
as expected. Squid of course is doing its normal job. Squid is running 
transparent and both servers are on the same box. If there are any ICAP gurus 
out there that have a good idea for real-time testing of these two daemons 
please speak your mind. Regards, David.


What version of Squid?

Amos
--
Please be using
  Current Stable Squid 2.7.STABLE6 or 3.0.STABLE13
  Current Beta Squid 3.1.0.5


[squid-users] Test

2009-02-17 Thread Thomas
Hi there.

Test to see if I can now post to the list.

Regards
Thomas





[squid-users] Need definitive test between ICAP server as defined in squid.conf

2009-02-17 Thread david
Hello squid users all, I am currently trying to get an ICAP server and Squid to 
communicate. I have run: ./squid -N -X and the output looks OK, or at least 
promising. The target ICAP server is responding to: telnet and an ICAP client 
as expected. Squid of course is doing its normal job. Squid is running 
transparent and both servers are on the same box. If there are any ICAP gurus 
out there that have a good idea for real-time testing of these two daemons 
please speak your mind. Regards, David.
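
If it helps, a minimal REQMOD hookup for a quick end-to-end check looks roughly
like the sketch below; Squid 3.0 directive names are assumed here, and the
service URI is only an example:

icap_enable on
icap_service service_req reqmod_precache 0 icap://127.0.0.1:1344/reqmod
icap_class class_req service_req
icap_access class_req allow all

"squid -k parse" will complain if these names do not match the installed
version; once they do, requests pushed through the proxy should show up on the
ICAP server's side in real time.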


[squid-users] Free-test russian xxx site

2009-01-20 Thread metro5

Free-test russian xxx site  http://xxx.gamapa.ru http://xxx.gamapa.ru 
-- 
View this message in context: 
http://www.nabble.com/Free-test-russian-xxx-site-tp21568452p21568452.html
Sent from the Squid - Users mailing list archive at Nabble.com.



[squid-users] Squid 3.0 STABLE11-RC1 (test) is available

2008-12-02 Thread Amos Jeffries

The Squid HTTP Proxy team is pleased to announce the
availability of the Squid-3.0.STABLE11-RC1 test release!


This release is labeled RC1 due to the experimental nature of two
major alterations:

 * A minor patch introduced in 3.0.STABLE10 was found to cause
   certain objects to be stored in cache despite bad content.
   Changing away from STABLE10 to any other version may cause
   a large number of warnings to be displayed and some response
   delay as these objects are cleared from the cache.

* Bug 2526: ACLChecklist was found to contain an implicit allow on
  certain error cases in the code. This does not affect any of the
  main security controls. But the effects of closing this hole on
  minor controls is not yet widely tested.
  If any strange behavior of access controls is noted on this release
  please notify squid-dev mailing list immediately.


This release also fixes a few minor regressions and bugs found in the 
last release.

  - Fixes regression: access.log request size tag
  - Fixes cache_peer forceddomainname=X option


Please refer to the release notes at
http://www.squid-cache.org/Versions/v3/3.0/RELEASENOTES.html
if and when you are ready to make the switch to Squid-3.

This new release can be downloaded from our HTTP or FTP servers

 http://www.squid-cache.org/Versions/v3/3.0/
 ftp://ftp.squid-cache.org/pub/squid-3/STABLE/

or the mirrors. For a list of mirror sites see

 http://www.squid-cache.org/Download/http-mirrors.dyn
 http://www.squid-cache.org/Download/mirrors.dyn

If you encounter any issues with this release please file a bug report.
 http://bugs.squid-cache.org/


Amos Jeffries


[squid-users] test

2008-06-04 Thread Carl Fox


_
NEW! Get Windows Live FREE.
http://www.get.live.com/wl/all


[squid-users] cache directive test page

2008-05-08 Thread Christian Seifert
I created a little test page that makes it easy to check the cache 
configuration of a proxy. Maybe it is of use to some of you:

http://www.mcs.vuw.ac.nz/~cseifert/cache/test.html



  

Be a better friend, newshound, and 
know-it-all with Yahoo! Mobile.  Try it now.  
http://mobile.yahoo.com/;_ylt=Ahu06i62sR8HDtDypao8Wcj9tAcJ


[squid-users] test !!! Ignore this message

2007-09-26 Thread josse wang
test !!! Ignore this message


Re: [squid-users] How to test squid using squid client

2007-07-13 Thread Andreas Pettersson

ying lcs wrote:

Thanks. But how can I tell squidclient is retrieved the content from
the cache if I execute the same url the second time?


You should see a header like:
X-Cache: HIT from your.proxy.server

and have a TCP_HIT in access.log instead of TCP_MISS.

--
Andreas
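
A concrete way to check this from the command line (the proxy address, port and
log path below are assumptions; adjust to your install):

# fetch the same URL twice; the second response should report a HIT
squidclient -h 127.0.0.1 -p 3128 http://www.squid-cache.org/ | grep X-Cache
squidclient -h 127.0.0.1 -p 3128 http://www.squid-cache.org/ | grep X-Cache

# or watch the log: TCP_MISS on the first request, TCP_HIT on the second
tail -f /var/log/squid/access.log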




Re: [squid-users] How to test squid using squid client

2007-07-12 Thread ying lcs

Thanks. But how can I tell whether squidclient retrieved the content from
the cache when I request the same URL the second time?

Thank you.

On 7/12/07, John Yatsko, Jr. <[EMAIL PROTECTED]> wrote:

Did you set up the acl in your squid.conf? Squid is configured to block all
connections by default.

Look for these lines in your squid.conf:
---
# Example rule allowing access from your local networks. Adapt
# to list your (internal) IP networks from where browsing should
# be allowed
#acl our_networks src 192.168.1.0/24 192.168.2.0/24
#http_access allow our_networks

# And finally deny all other access to this proxy
http_access deny all
---

EDIT THESE:

#acl our_networks src 192.168.1.0/24 192.168.2.0/24
#http_access allow our_networks

Thank you,

John Yatsko, Jr.
Technology Assistant
Erie County Public Library
160 East Front St
Erie PA 16507
(814) 451-7307
- Original Message -
From: "ying lcs" <[EMAIL PROTECTED]>
To: 
Sent: Thursday, July 12, 2007 12:00 PM
Subject: [squid-users] How to test squid using squid client


> Hi,
>
> I am reading "Test Squid" section here:
> http://www.deckle.co.za/squid-users-guide/Starting_Squid
>
> I am able to get squid running, but when I do the step of "
> ./squidclient http://www.squid-cache.org/"; to test the squid, I get
> the following permission denied error.
>
> Can you please me how I can test  a running squid the first time?
>
>
> $ ./squidclient http://www.squid-cache.org/
> HTTP/1.0 403 Forbidden
> Server: squid/2.6.STABLE13
> Date: Thu, 12 Jul 2007 15:54:53 GMT
> Content-Type: text/html
> Content-Length: 1073
> Expires: Thu, 12 Jul 2007 15:54:53 GMT
> X-Squid-Error: ERR_ACCESS_DENIED 0
> X-Cache: MISS from [EMAIL PROTECTED]
> Via: 1.0 [EMAIL PROTECTED]:3128 (squid/2.6.STABLE13)
> Proxy-Connection: close
>
>  "http://www.w3.org/TR/html4/loose.dtd";>
> 
> ERROR: The requested URL could not be retrieved
>  
type="text/css"><!--BODY{background-color:#ff;font-family:verdana,sans-serif}PRE{font-family:sans-serif}-->
> 
> ERROR
> The requested URL could not be retrieved
> 
> 
> While trying to retrieve the URL:
> http://www.squid-cache.org/";>http://www.squid-cache.org/
> 
> The following error was encountered:
> 
> 
> 
> Access Denied.
> 
> 
> Access control configuration prevents your request from
> being allowed at this time.  Please contact your service provider if
> you feel this is incorrect.
> 
> Your cache administrator is mailto:webmaster";>webmaster.
>
>
> 
> 
> 
> Generated Thu, 12 Jul 2007 15:54:53 GMT by [EMAIL PROTECTED]
> (squid/2.6.STABLE13)
> 
> 




Re: [squid-users] How to test squid using squid client

2007-07-12 Thread John Yatsko, Jr.
Did you set up the acl in your squid.conf? Squid is configured to block all 
connections by default.


Look for these lines in your squid.conf:
---
# Example rule allowing access from your local networks. Adapt
# to list your (internal) IP networks from where browsing should
# be allowed
#acl our_networks src 192.168.1.0/24 192.168.2.0/24
#http_access allow our_networks

# And finally deny all other access to this proxy
http_access deny all
---

EDIT THESE:

#acl our_networks src 192.168.1.0/24 192.168.2.0/24
#http_access allow our_networks

Thank you,

John Yatsko, Jr.
Technology Assistant
Erie County Public Library
160 East Front St
Erie PA 16507
(814) 451-7307
- Original Message - 
From: "ying lcs" <[EMAIL PROTECTED]>

To: 
Sent: Thursday, July 12, 2007 12:00 PM
Subject: [squid-users] How to test squid using squid client



Hi,

I am reading "Test Squid" section here:
http://www.deckle.co.za/squid-users-guide/Starting_Squid

I am able to get squid running, but when I do the step of "
./squidclient http://www.squid-cache.org/"; to test the squid, I get
the following permission denied error.

Can you please me how I can test  a running squid the first time?


$ ./squidclient http://www.squid-cache.org/
HTTP/1.0 403 Forbidden
Server: squid/2.6.STABLE13
Date: Thu, 12 Jul 2007 15:54:53 GMT
Content-Type: text/html
Content-Length: 1073
Expires: Thu, 12 Jul 2007 15:54:53 GMT
X-Squid-Error: ERR_ACCESS_DENIED 0
X-Cache: MISS from [EMAIL PROTECTED]
Via: 1.0 [EMAIL PROTECTED]:3128 (squid/2.6.STABLE13)
Proxy-Connection: close

http://www.w3.org/TR/html4/loose.dtd";>

ERROR: The requested URL could not be retrieved
<tt>type="text/css"><!--BODY{background-color:#ff;font-family:verdana,sans-serif}PRE{font-family:sans-serif}-->


ERROR
The requested URL could not be retrieved


While trying to retrieve the URL:
http://www.squid-cache.org/";>http://www.squid-cache.org/

The following error was encountered:



Access Denied.


Access control configuration prevents your request from
being allowed at this time.  Please contact your service provider if
you feel this is incorrect.

Your cache administrator is mailto:webmaster";>webmaster.





Generated Thu, 12 Jul 2007 15:54:53 GMT by [EMAIL PROTECTED] 
(squid/2.6.STABLE13)
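
As a concrete example of the edit John describes, a minimal sketch assuming the
machine running squidclient sits in 192.168.1.0/24 (substitute your own
network):

acl our_networks src 192.168.1.0/24
http_access allow our_networks
# keep the final deny as the last http_access rule
http_access deny all

Then run "squid -k reconfigure" and retry the squidclient command.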


 




[squid-users] How to test squid using squid client

2007-07-12 Thread ying lcs

Hi,

I am reading "Test Squid" section here:
http://www.deckle.co.za/squid-users-guide/Starting_Squid

I am able to get squid running, but when I do the step of
"./squidclient http://www.squid-cache.org/" to test the squid, I get
the following permission denied error.

Can you please tell me how I can test a running squid the first time?


$ ./squidclient http://www.squid-cache.org/
HTTP/1.0 403 Forbidden
Server: squid/2.6.STABLE13
Date: Thu, 12 Jul 2007 15:54:53 GMT
Content-Type: text/html
Content-Length: 1073
Expires: Thu, 12 Jul 2007 15:54:53 GMT
X-Squid-Error: ERR_ACCESS_DENIED 0
X-Cache: MISS from [EMAIL PROTECTED]
Via: 1.0 [EMAIL PROTECTED]:3128 (squid/2.6.STABLE13)
Proxy-Connection: close

http://www.w3.org/TR/html4/loose.dtd";>

ERROR: The requested URL could not be retrieved
<!--BODY{background-color:#ff;font-family:verdana,sans-serif}PRE{font-family:sans-serif}-->

ERROR
The requested URL could not be retrieved


While trying to retrieve the URL:
http://www.squid-cache.org/";>http://www.squid-cache.org/

The following error was encountered:



Access Denied.


Access control configuration prevents your request from
being allowed at this time.  Please contact your service provider if
you feel this is incorrect.

Your cache administrator is mailto:webmaster";>webmaster.





Generated Thu, 12 Jul 2007 15:54:53 GMT by [EMAIL PROTECTED] 
(squid/2.6.STABLE13)




[squid-users] Just a test, please ignore

2007-07-09 Thread Henrik Nordstrom
This is just a test of the mail server. Please ignore.

Regards
Henrik


[squid-users] test

2007-07-08 Thread Adrian Chadd
testing, please ignore.



[squid-users] test 2

2007-07-08 Thread Adrian Chadd
test 2.




[squid-users] Test, please ignore

2007-06-11 Thread Henrik Nordstrom
Just a stupid test of the mail server. Please ignore.

Regards
Henrik


signature.asc
Description: This is a digitally signed message part


[squid-users] Polygraph test help

2006-10-12 Thread Sekar

Hello all,

In polygraph, will a persistent connection be used for a particular domain, 
or will it use the same connection for more than one domain?




Thanks in advance
Sekar


Re: [squid-users] Test

2006-10-02 Thread Mernoz Rostangi
Hi,

make sure you are not using HTML when composing your mail! It must be in plain 
text, otherwise the mailer daemon will discard it.

:-)
./m


- Original Message -
From: George Levy [mailto:[EMAIL PROTECTED]]
To: squid-users@squid-cache.org
Sent: Mon, 02 Oct 2006 00:34:36 +0200
Subject: [squid-users] Test


> My posts bounce back. Here is the message:
> 
> ezmlm-reject: fatal: Sorry, a message part has an unacceptable MIME
> Content-Type: text/html (#5.2.3)
> 
> I am just testing to see if the problem is in the content of the post.
> George
> 


Re: [squid-users] Test

2006-10-01 Thread George Levy

Thank you Henrik, it worked
George

Henrik Nordstrom wrote:


Sun 2006-10-01 at 15:34 -0700, George Levy wrote:
 


My posts bounce back. Here is the message:

ezmlm-reject: fatal: Sorry, a message part has an unacceptable MIME 
Content-Type: text/html (#5.2.3)

I am just testing to see if the problem is in the content of the post.
   



Yes, it is. The list server only accepts plain text email, not HTML
formatted email. But it appears you got the settings correct now as your
message was accepted.

Regards
Henrik
 





Re: [squid-users] Test

2006-10-01 Thread Henrik Nordstrom
Sun 2006-10-01 at 15:34 -0700, George Levy wrote:
> My posts bounce back. Here is the message:
> 
> ezmlm-reject: fatal: Sorry, a message part has an unacceptable MIME 
> Content-Type: text/html (#5.2.3)
> 
> I am just testing to see if the problem is in the content of the post.

Yes, it is. The list server only accepts plain text email, not HTML
formatted email. But it appears you got the settings correct now as your
message was accepted.

Regards
Henrik


signature.asc
Description: This is a digitally signed message part


[squid-users] Test

2006-10-01 Thread George Levy

My posts bounce back. Here is the message:

ezmlm-reject: fatal: Sorry, a message part has an unacceptable MIME 
Content-Type: text/html (#5.2.3)

I am just testing to see if the problem is in the content of the post.
George


[squid-users] test

2006-05-26 Thread nima sadeghian

test

--
Best Regards
NIMA SADEGHIAN


Re: [squid-users] Setting up the poly graph test for squid

2006-04-07 Thread Henrik Nordstrom
Thu 2006-03-30 at 11:32 -0800, Balu wrote:

> But while starting the polyclient with the command 
> 

> 000.06| fyi: no real host addresses for Robot side
> specified; will not attempt to create agent addresses
> 000.07| created 0 agents total
> bin/polyclt: no Robot matches local interface

the station where you attempted to run polyclt does not have any
configured IP addresses matching your test profile...

Please follow the instructions & how-tos for the workload you selected
carefully. They contain fairly detailed instructions on every detail you
need to think of..

  http://www.web-polygraph.org/docs/workloads/
  http://www.measurement-factory.com/docs/

Regards
Henrik


signature.asc
Description: This is a digitally signed message part
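
One way to satisfy that is to give the client box addresses out of the Robot
range quoted in the thread before starting polyclt; a minimal sketch, assuming
Linux and an interface named eth0 (both assumptions):

# add the 10.1.1.1 - 10.1.1.60 Robot addresses as aliases on the client interface
for i in $(seq 1 60); do
    ip addr add 10.1.1.$i/24 dev eth0
done

The alternative is to edit the workload's address map so the Robots use
addresses the host already has configured.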


[squid-users] Setting up the poly graph test for squid

2006-03-30 Thread Balu
I did not get any reply from the polygraph mailing list;
thought somebody from the squid mailing list may help 

Hello All,

Need help to start with

I am trying to do a performance test of squid. 

1) I was able to start the squid server
2) I was able to start the polysrv

But while starting the polyclient with the command 

#bin/polyclt --config workloads/vicache.pg --log
/tmp/sample_run1 --verb_lvl 10 --proxy
192.10.10.43:3128 --ports 6000:6

It fails with the following information. The last few lines from
the error message are given below:

=
000.06| group-id: 0dc807b9.1ee80f45:0004 pid: 3909
000.06| current time: 1143558666.624116 or Tue, 28 Mar
2006 15:11:06 GMT
000.06| registered client-side session watches: 0
000.06| registered client-side data filters: 0
000.06| fyi: PGL configuration stored (4902bytes)
000.06| fyi: no real host addresses for Robot side
specified; will not attempt to create agent addresses
000.07| created 0 agents total
bin/polyclt: no Robot matches local interface
addresses
Robot addresses:['10.1.1.1', '10.1.1.2', '10.1.1.3',
'10.1.1.4', '10.1.1.5', '10.1.1.6', '10.1.1.7',
'10.1.1.8', '10.1.1.9', '10.1.1.10', '10.1.1.11',
'10.1.1.12', '10.1.1.13', '10.1.1.14', '10.1.1.15',
'10.1.1.16', '10.1.1.17', '10.1.1.18', '10.1.1.19',
'10.1.1.20', '10.1.1.21', '10.1.1.22', '10.1.1.23',
'10.1.1.24', '10.1.1.25', '10.1.1.26', '10.1.1.27',
'10.1.1.28', '10.1.1.29', '10.1.1.30', '10.1.1.31',
'10.1.1.32', '10.1.1.33', '10.1.1.34', '10.1.1.35',
'10.1.1.36', '10.1.1.37', '10.1.1.38', '10.1.1.39',
'10.1.1.40', '10.1.1.41', '10.1.1.42', '10.1.1.43',
'10.1.1.44', '10.1.1.45', '10.1.1.46', '10.1.1.47',
'10.1.1.48', '10.1.1.49', '10.1.1.50', '10.1.1.51',
'10.1.1.52', '10.1.1.53', '10.1.1.54', '10.1.1.55',
'10.1.1.56', '10.1.1.57', '10.1.1.58', '10.1.1.59',
'10.1.1.60']
local addresses: ['127.0.0.1', '192.10.10.41',
'10.1.129.1', '10.1.129.2', '10.1.129.3',
'10.1.129.4', '10.1.129.5', '10.1.129.6',
'10.1.129.7', '10.1.129.8', '10.1.129.9',
'10.1.129.10', '10.1.129.11', '10.1.129.12',
'10.1.129.13', '10.1.129.14', '10.1.129.15',
'10.1.129.16', '10.1.129.17', '10.1.129.18',
'10.1.129.19', '10.1.129.20', '10.1.129.21',
'10.1.129.22', '10.1.129.23', '10.1.129.24',
'10.1.129.25', '10.1.129.26', '10.1.129.27',
'10.1.129.28', '10.1.129.29', '10.1.129.30',
'10.1.129.31', '10.1.129.32', '10.1.129.33',
'10.1.129.34', '10.1.129.35', '10.1.129.36',
'10.1.129.37', '10.1.129.38', '10.1.129.39',
'10.1.129.40', '10.1.129.41', '10.1.129.42',
'10.1.129.43', '10.1.129.44', '10.1.129.45',
'10.1.129.46', '10.1.129.47', '10.1.129.48',
'10.1.129.49', '10.1.129.50', '10.1.129.51',
'10.1.129.52', '10.1.129.53', '10.1.129.54',
'10.1.129.55', '10.1.129.56', '10.1.129.57',
'10.1.129.58', '10.1.129.59', '10.1.129.60',
'10.1.129.61', '10.1.129.62', '10.1.129.63',
'10.1.129.64', '10.1.129.65', '10.1.129.66',
'10.1.129.67', '10.1.129.68', '10.1.129.69',
'10.1.129.70', '10.1.129.71', '10.1.129.72',
'10.1.129.73', '10.1.129.74', '10.1.129.75',
'10.1.129.76', '10.1.129.77', '10.1.129.78',
'10.1.129.79', '10.1.129.80', '10.1.129.81',
'10.1.129.82', '10.1.129.83', '10.1.129.84',
'10.1.129.85', '10.1.129.86', '10.1.129.87',
'10.1.129.88', '10.1.129.89', '10.1.129.90',
'10.1.129.91', '10.1.129.92', '10.1.129.93',
'10.1.129.94', '10.1.129.95', '10.1.129.96',
'10.1.129.97', '10.1.129.98', '10.1.129.99',
'10.1.129.100

Re: [squid-users] DNS test on start squidNT

2005-12-15 Thread Wadi
Hello Guillaume,

Wednesday, December 14, 2005, 10:05:14 PM, you wrote:

> hi, guys!
> I have a new problem on a squidNT proxy, when the server start, in
> cache.log i found these lines:
> #
> #2005/12/13 14:41:34| Performing DNS Tests...
> #2005/12/13 14:41:34| DNS name lookup tests failed...
> #
> At this moment, my internet link was down, and squid refused to start.

> 2 questions :)

> 1] what are these DNS lookups
> &
> 2] what can i do to let the service start even if the internet link is down

> thanks for your replies
> Guillaume

The DNS lookup test is used to check whether your server can connect to the
network (internet) and resolve the addresses of the domains specified by
dns_testnames in squid.conf.

Specified in squid.conf
# MISCELLANEOUS
# -

#  TAG: dns_testnames
#   The DNS tests exit as soon as the first site is successfully looked up
#
#   This test can be disabled with the -D command line option.
#
#Default:
# dns_testnames netscape.com internic.net nlanr.net microsoft.com


-- 
Best regards,
 Wadi    mailto:[EMAIL PROTECTED]
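
For the second question, two sketches for getting past the failed test at
startup (names and paths depend on your install):

# 1) skip the startup DNS test entirely with the -D switch
#    (for the NT service build this has to be part of the command line
#    the service was installed with)
squid -D

# 2) or point dns_testnames at a name the box can always resolve locally,
#    so the test passes even when the internet link is down, e.g.:
dns_testnames localhost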



Re: [squid-users] DNS test on start squidNT

2005-12-14 Thread Mark Elsen
> hi, guys!
> I have a new problem on a squidNT proxy, when the server start, in
> cache.log i found these lines:
> #
> #2005/12/13 14:41:34| Performing DNS Tests...
> #2005/12/13 14:41:34| DNS name lookup tests failed...
> #
> At this moment, my internet link was down, and squid refused to start.
>
> 2 questions :)
>
> 1] what are these DNS lookups
> &
> 2] what can i do to let the service start even if the internet link is down
>
> thanks for your replies
> Guillaume
>

 Check the :

  dns_testnames

 directive in squid.conf.

 Try making the list empty.

 M.


[squid-users] DNS test on start squidNT

2005-12-14 Thread Guillaume
hi, guys!
I have a new problem on a squidNT proxy, when the server start, in
cache.log i found these lines:
#
#2005/12/13 14:41:34| Performing DNS Tests...
#2005/12/13 14:41:34| DNS name lookup tests failed...
#
At this moment, my internet link was down, and squid refused to start.

2 questions :)

1] what are these DNS lookups
&
2] what can i do to let the service start even if the internet link is down

thanks for your replies
Guillaume


[squid-users] test

2005-12-13 Thread Peter Zechmeister
unsubscribe



Re: [squid-users] syntax to test ldap groups?

2005-11-07 Thread Henrik Nordstrom



On Mon, 7 Nov 2005, Derrick MacPherson wrote:


Just following up on this, all is working except I'm not sure what I
need for syntax in referring to an AD group with a space in the name,


For this to work you need to place the group in an external file. In 
external files each line is read as a group name, including any spaces or 
other odd characters..


Regards
Henrik
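
A small sketch of what that looks like in practice (file path and group name
are only examples):

# /etc/squid/inet_groups -- one group name per line, spaces allowed
Internet Access

# squid.conf -- a value enclosed in double quotes is read as a filename
acl InetAccess external InetGroup "/etc/squid/inet_groups"
http_access allow InetAccess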


Re: [squid-users] syntax to test ldap groups?

2005-11-07 Thread Derrick MacPherson
Just following up on this, all is working except I'm not sure what I
need for syntax in referring to an AD group with a space in the name,
i've tried:

Internet Access
'Internet Access'
`Internet Access`
Internet%20Access

all without working and

"Internet Access" refers to an external file.


What have I missed?



Re: [squid-users] syntax to test ldap groups?

2005-11-05 Thread Henrik Nordstrom



On Fri, 4 Nov 2005, Derrick MacPherson wrote:


acl InetAccess external InetGroup eng


This means either of the groups InetGroup or eng.


InetGroup comes from :

external_acl_type InetGroup %LOGIN


Right, just me being tired... the group is eng, nothing else.

Regards
Henrik


Re: [squid-users] syntax to test ldap groups?

2005-11-04 Thread Henrik Nordstrom



On Fri, 4 Nov 2005, Derrick MacPherson wrote:


whats the syntax to test lookups for groups on squid_ldap_group?


username groupname

If the group name contains spaces enclose the group name in double quotes

username "group name"

Regards
Henrik
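
To try lookups by hand, the helper can be started interactively; the path below
is an example, and "..." stands for the same options used on your
external_acl_type line in squid.conf:

/usr/lib/squid/squid_ldap_group ...

# then type lookups on stdin, one per line:
#   derrick eng
#   derrick "Internet Access"
# the helper answers OK for a match and ERR otherwise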


[squid-users] syntax to test ldap groups?

2005-11-04 Thread Derrick MacPherson
whats the syntax to test lookups for groups on squid_ldap_group?


thanks.



[squid-users] test

2005-11-03 Thread rs
test



Re: [squid-users] error in the test with wbinfo but authentication working with squid

2005-09-29 Thread Henrik Nordstrom

On Thu, 29 Sep 2005 [EMAIL PROTECTED] wrote:


bash-2.05# wbinfo -a domain\\bj%
plaintext password authentication failed
error code was NT_STATUS_TRUSTED_DOMAIN_FAILURE (0xc18c)
error messsage was: Trusted domain failure
Could not authenticate user domain\\bj% with plaintext password
challenge/response password authentication failed
error code was NT_STATUS_TRUSTED_DOMAIN_FAILURE (0xc18c)
error messsage was: Trusted domain failure
Could not authenticate user domain\\bj with challenge/response

still working with squid and ntlm auth


Odd indeed.


More info for me ?


Not my field. This is a Samba question, not a Squid question (nothing of 
Squid involved in the above).


Is your Samba version up to date?

Regards
Henrik


Re: [squid-users] error in the test with wbinfo but authentication working with squid

2005-09-29 Thread Arno . STREULI

and having that :

bash-2.05# wbinfo -a domain\\bj%
plaintext password authentication failed
error code was NT_STATUS_TRUSTED_DOMAIN_FAILURE (0xc18c)
error messsage was: Trusted domain failure
Could not authenticate user domain\\bj% with plaintext password
challenge/response password authentication failed
error code was NT_STATUS_TRUSTED_DOMAIN_FAILURE (0xc18c)
error messsage was: Trusted domain failure
Could not authenticate user domain\\bj with challenge/response

still working with squid and ntlm auth.
Any more info for me?
Here is the config of smb.conf (sorry, I can't ask the samba mailing list;
I was never able to subscribe to it!?)

[global]
workgroup = D-CI3
server string = penelope proxy %v
security = DOMAIN
password server = 10.17.12.56 10.17.12.57
client NTLMv2 auth = Yes
client lanman auth = No
client plaintext auth = No
name resolve order = wins host
wins server = 10.17.12.9, 10.17.17.8
idmap uid = 1-2
idmap gid = 1-2

Arno Streuli




  
From: Henrik Nordstrom <[EMAIL PROTECTED]>
To: [EMAIL PROTECTED]
Cc: squid-users@squid-cache.org
Date: 29.09.2005 13:50
Subject: Re: [squid-users] error in the test with wbinfo but authentication working with squid




On Thu, 29 Sep 2005 [EMAIL PROTECTED] wrote:

> bash-2.05# wbinfo -a domain\\bj%
> plaintext password authentication failed
> error code was NT_STATUS_TRUSTED_DOMAIN_FAILURE (0xc18c)
> error messsage was: Trusted domain failure
> Could not authenticate user domain\\bj% with plaintext password
>
> but if I use squid and ntlm_auth every thing is working fine !??!
> Any one can explain ?

Maybe the security policy of the domain does not allow plain text
authentication? But I would expect another error code if this was the
case..

Regards
Henrik






**
DISCLAIMER - E-MAIL
---
The information contained in this E-Mail is intended for the named
recipient(s). It may  contain certain  privileged and confidential
information, or  information  which  is  otherwise  protected from
disclosure. If  you  are  not the intended recipient, you must not
copy,distribute or take any action in reliance on this information
**


Re: [squid-users] error in the test with wbinfo but authentication working with squid

2005-09-29 Thread Henrik Nordstrom

On Thu, 29 Sep 2005 [EMAIL PROTECTED] wrote:


bash-2.05# wbinfo -a domain\\bj%
plaintext password authentication failed
error code was NT_STATUS_TRUSTED_DOMAIN_FAILURE (0xc18c)
error messsage was: Trusted domain failure
Could not authenticate user domain\\bj% with plaintext password

but if I use squid and ntlm_auth every thing is working fine !??!
Any one can explain ?


Maybe the security policy of the domain does not allow plain text 
authentication? But I would expect another error code if this was the 
case..


Regards
Henrik


[squid-users] error in the test with wbinfo but authentication working with squid

2005-09-29 Thread Arno . STREULI
Hi,
I have some strange stuff in my config:
when I do some tests with wbinfo on one of my proxies all tests are ok (list of
trusted domains, list of users, groups and check of the rpc key). But when I
try to test a user it doesn't work:

bash-2.05# wbinfo -a domain\\bj%
plaintext password authentication failed
error code was NT_STATUS_TRUSTED_DOMAIN_FAILURE (0xc18c)
error messsage was: Trusted domain failure
Could not authenticate user domain\\bj% with plaintext password

but if I use squid and ntlm_auth every thing is working fine !??!
Any one can explain ?
thanks

regards,


Arno Streuli


PS: I'm using solaris 8 and squid 2.5S9, and samba 3.0.14a



**
DISCLAIMER - E-MAIL
---
The information contained in this E-Mail is intended for the named
recipient(s). It may  contain certain  privileged and confidential
information, or  information  which  is  otherwise  protected from
disclosure. If  you  are  not the intended recipient, you must not
copy,distribute or take any action in reliance on this information
**


Re: [squid-users] Test mail

2005-09-15 Thread Pankaj Karna
On Thu, 2005-09-15 at 18:23 +0545, Pankaj Karna wrote:
> Dear all 
> 
> Hello,
> 
> This is check mail .
> Hope u recivied my check mail.
> 
> Thanks 
> Pankaj
> 
> 
> 



[squid-users] Test mail

2005-09-15 Thread Pankaj Karna
Dear all 

Hello,

This is a check mail.
Hope you received my check mail.

Thanks 
Pankaj




[squid-users] test - pls ignore

2005-08-02 Thread Hendro Susanto
test


[squid-users] Anyone using the feature to test a group wuith NTLM or I'm the only one ? is that a bug ?

2005-06-21 Thread Arno . STREULI
Hi,
I am trying to find someone who knows how to configure wbinfo_group.pl as an
external helper.
I have squid 2.5 STABLE9 running on solaris 8 and the authentication is
working with an NT domain (the user auth is working fine)

here is my config:
## basic auth
auth_param basic program /opt/samba/bin/ntlm_auth
--helper-protocol=squid-2.5-ba
sic
auth_param basic children 64
auth_param basic credentialsttl 2 hours
auth_param basic realm CAI Internet access control Genève
## NTLM auth
auth_param ntlm program /opt/samba/bin/ntlm_auth
--helper-protocol=squid-2.5-ntl
mssp
auth_param ntlm children 64
auth_param ntlm max_challenge_lifetime 30 minutes
auth_param ntlm max_challenge_reuses 0

authenticate_cache_garbage_interval 10 minute
authenticate_ttl 10 minute
external_acl_type NT_global_group %LOGIN /opt/squid/libexec/wbinfo_group.pl

acl techuser external NT_global_group D-CH-BI1\SurfeursWebCAICH-T
acl webuser external NT_global_group D-CH-BI1\SurfeursWebCAICH
D-CH-BI1\SurfeursWebCAICH-T

http_access deny ftp !techuser
http_access allow cai-auth webuser
http_access deny all

but that doesn't work: wbinfo_group.pl only tests the first group,
not the second or the third. Here is the output for a test user (he is a
member of SurfeursWebCAICH-T):

here is the debug I have on cache.log
Got d-ch-bi1\\bi9yj D-CH-BI1\\SurfeursWebCAICH D-CH-BI1\\SurfeursWebCAICH-T
from squid
User:  -d-ch-bi1\bi9yj-
Group: -D-CH-BI1\SurfeursWebCAICH-
SID:   -S-1-5-21-907243726-1387878072-1859928627-9560 Domain Group (2)-
GID:   -10013-
Sending ERR to squid

but if I do a wbinfo -r d-ch-bi1\\bi9yj
here is my group:
1
10001
10002
10003
10004
10005
10006
10007
10008
10009
10010
10011
10012

So wbinfo_group.pl only tests the first group it receives from squid, not
the others.
How can I make it work?

thanks for any help
Arno




**
DISCLAIMER - E-MAIL
---
The information contained in this E-Mail is intended for the named
recipient(s). It may  contain certain  privileged and confidential
information, or  information  which  is  otherwise  protected from
disclosure. If  you  are  not the intended recipient, you must not
copy,distribute or take any action in reliance on this information
**

