[squid-users] ##palin AW: [squid-users] #Can't access certain webpages

2013-11-26 Thread Grooz, Marc (regio iT)
I've got it. I set the option "forwarded_for" from off to delete, and now both
websites get displayed through Squid.
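
In squid.conf that is a one-line change; a minimal sketch (the forwarded_for directive accepts on, off, transparent, truncate, and delete; with "off" Squid still sends "X-Forwarded-For: unknown", which is what the traces further down show, while "delete" strips the header before the request goes upstream):

    # squid.conf: remove the X-Forwarded-For header from outgoing requests
    # ("off" would still send "X-Forwarded-For: unknown" upstream)
    forwarded_for delete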

Kind regards
Marc




Re: [squid-users] ##palin AW: [squid-users] #Can't access certain webpages

2013-11-26 Thread Amos Jeffries
On 27/11/2013 1:00 a.m., Grooz, Marc (regio iT) wrote:
> In my first case:
> 
> Squid request:
> 
> -MGET /cgi-bin/upload_status.cgi?uid=060950223627&files=:iso-27001-router-security-audit-checklist.xls&ok=1 HTTP/1.1
> Accept: text/html, application/xhtml+xml, */*
> 
> Webserver answer:
> [-MHTTP/1.1 200 OK
> Date: Mon, 25 Nov 2013 12:48:57 GMT

>> Squid sends the first request again and again.
> 
> Direct request without squid:
> 
> Gm/GET /cgi-bin/upload_status.cgi?uid=318568766743&files=:aukirche.JPG&ok=1 HTTP/1.1
> 
> Webserver answer:
> GmHTTP/1.1 200 OK

> 
>> Website gets displayed.
> 


Are those "-M" and "Gm/" characters really in front of the GET method name
and the HTTP/1.1 response version label?

It looks like you may be receiving SOCKS protocol traffic.

Amos
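
A quick way to answer that from the capture, as a sketch: export the raw client-side TCP payload to a file (hypothetically named client_payload.bin here) and inspect its leading bytes. SOCKS4 requests begin with the version byte 0x04 and SOCKS5 with 0x05, while a plain HTTP request begins with an ASCII method name:

    # Inspect the leading bytes of the captured payload: SOCKS4 starts
    # with 0x04, SOCKS5 with 0x05, plain HTTP with an ASCII method.
    with open("client_payload.bin", "rb") as f:  # hypothetical export from the pcap
        head = f.read(8)
    if head[:1] in (b"\x04", b"\x05"):
        print("looks like a SOCKS handshake:", head.hex())
    elif head.startswith((b"GET ", b"POST", b"HEAD", b"PUT ")):
        print("plain HTTP request line")
    else:
        print("unexpected leading bytes:", head.hex())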



[squid-users] ##palin AW: [squid-users] #Can't access certain webpages

2013-11-26 Thread Grooz, Marc (regio iT)
In my first case:

Squid request:

-MGET /cgi-bin/upload_status.cgi?uid=060950223627&files=:iso-27001-router-security-audit-checklist.xls&ok=1 HTTP/1.1
Accept: text/html, application/xhtml+xml, */*
Referer: http://xyz/
Accept-Language: de-DE
User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64; Trident/7.0; rv:11.0) like Gecko
Accept-Encoding: gzip, deflate
Host: xyz
X-Forwarded-For: unknown, unknown
Cache-Control: max-age=0
Connection: keep-alive

Webserver answer:
[-MHTTP/1.1 200 OK
Date: Mon, 25 Nov 2013 12:48:57 GMT
Server: Apache/2.2.22 (Linux/SUSE)
Expires: Mon, 26 Jul 1997 05:00:00 GMT
Pragma: no-cache
Keep-Alive: timeout=15, max=100
Connection: Keep-Alive
Transfer-Encoding: chunked
Content-Type: text/html

> Squid sends the first request again and again.

Direct request without squid:

Gm/GET /cgi-bin/upload_status.cgi?uid=318568766743&files=:aukirche.JPG&ok=1 HTTP/1.1
Accept: text/html, application/xhtml+xml, */*
Referer: http://xyz/
Accept-Language: de-DE
User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64; Trident/7.0; rv:11.0) like Gecko
Accept-Encoding: gzip, deflate
Host: xyz
DNT: 1
Connection: Keep-Alive

Webserver answer:
GmHTTP/1.1 200 OK
Date: Tue, 26 Nov 2013 10:36:25 GMT
Server: Apache/2.2.22 (Linux/SUSE)
Expires: Mon, 26 Jul 1997 05:00:00 GMT
Pragma: no-cache
Keep-Alive: timeout=15, max=100
Connection: Keep-Alive
Transfer-Encoding: chunked
Content-Type: text/html

> Website gets displayed.



In my second case:

Squid request:

SGET / HTTP/1.1
Accept: text/html, application/xhtml+xml, */*
Accept-Language: de-DE
User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64; Trident/7.0; rv:11.0) like Gecko
Accept-Encoding: gzip, deflate
If-Modified-Since: Tue, 26 Nov 2013 10:52:01 GMT
DNT: 1
Host: xyz
Pragma: no-cache
X-Forwarded-For: unknown, unknown
Cache-Control: max-age=259200
Connection: keep-alive

> No answer from the host.

Direct request without squid:

S   GET / HTTP/1.1
Accept: text/html, application/xhtml+xml, */*
Accept-Language: de-DE
User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64; Trident/7.0; rv:11.0) like Gecko
Accept-Encoding: gzip, deflate
Host: xyz
If-Modified-Since: Tue, 26 Nov 2013 10:52:01 GMT
DNT: 1
Connection: Keep-Alive

> Successful answer from the webserver.
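
In both cases the proxied request carries the "X-Forwarded-For: unknown, unknown" header that the direct request lacks, so a small A/B test from the Squid box can check whether that header alone makes the server go silent. A sketch (Python standard library only; "xyz" stands in for the placeholder hostname used in the traces above):

    # Send the same GET with and without the X-Forwarded-For header
    # Squid was adding, and see whether the server answers both times.
    import http.client

    HOST = "xyz"  # placeholder hostname from the traces above

    for xff in (None, "unknown, unknown"):
        headers = {"Accept": "text/html, application/xhtml+xml, */*"}
        if xff is not None:
            headers["X-Forwarded-For"] = xff
        conn = http.client.HTTPConnection(HOST, timeout=10)
        try:
            conn.request("GET", "/", headers=headers)
            resp = conn.getresponse()
            print(f"X-Forwarded-For={xff!r}: HTTP {resp.status}")
        except OSError as exc:  # timeout or reset: the server stayed silent
            print(f"X-Forwarded-For={xff!r}: no answer ({exc})")
        finally:
            conn.close()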

Kind regards,
Marc




[squid-users] ##palin AW: [squid-users] #Can't access certain webpages

2013-11-26 Thread Grooz, Marc (regio iT)
Hi Kinkie,

Yes, I made a capture but don't see the cause.

I'll send you my traces.

Kind regards.

Marc

-----Original Message-----
From: Kinkie [mailto:gkin...@gmail.com]
Sent: Monday, 25 November 2013 15:45
To: Grooz, Marc (regio iT)
Cc: squid-users@squid-cache.org
Subject: Re: [squid-users] #Can't access certain webpages

On Mon, Nov 25, 2013 at 3:21 PM, Grooz, Marc (regio iT)  
wrote:
> Hi,
>
> Currently I use Squid 3.3.8, and I can't access two webservers through Squid.
> If I bypass Squid, these websites work fine.
>
> One of these websites is a file upload/download site with a generated
> download link. When I upload a file, I receive the following Squid log entries:
>
> TCP_MISS/200 398 GET http://w.y.x.z/cgi-bin/upload_status.cgi?
> .
> .
> TCP_MISS_ABORTED/000 0 GET http://w.y.x.z/cgi-bin/upload_status.cgi?
> TCP_MISS/200 398 GET http://w.y.x.z/cgi-bin/upload_status.cgi?
>
> And the download link never gets generated.
>
>
> In the second case you never get a webpage back from Squid. If I use lynx
> from the command line of the Squid system, the webpage loads.
> With a tcpdump I see that when Squid makes the request, the webserver
> doesn't answer.

Well, this is consistent with the behavior in squid's logs.
Have you tried accessing the misbehaving server from a client running on the 
squid box, and comparing the differences in the network traces?


-- 
/kinkie

