Re: [squid-users] Squid transparent not caching apt requests from deb.debian.org

2020-04-08 Thread Matus UHLAR - fantomas

On 4/7/20 8:48 PM, zrm wrote:

https://www.trustiosity.com/squid/cache-debug.log.xz



On 4/8/20 10:46, Alex Rousskov wrote:

I found the reason for the difference.

After the destination IP address of your apt requests fails Host header
validation, Squid marks the request as "not cachable":


On 08.04.20 13:01, zrm wrote:
I checked the DNS query apt is making to see why it's different. It's 
making an SRV query for _http._tcp.deb.debian.org and then resolving the 
target name (prod.debian.map.fastly.net) returned in the SRV record. By 
contrast, squid does an A record query for deb.debian.org and gets a 
CNAME for debian.map.fastly.net. The names are almost the same, but 
since it's a CDN with many IP addresses, the two lookups can return 
different addresses, and then validation fails.


Meanwhile wget does the same A record query as squid and gets the same 
address.


The question then becomes what to do about it. Maybe if squid fails 
the validation for the A query, it should try the SRV query and accept 
the address as valid if it matches that. Another possibility would be 
a config option to have squid completely ignore the address the client 
used and always use the address it gets by doing its own DNS query for 
the host, in which case the result would be safe to cache.


But these are obviously changes requiring a new version of squid. Is 
there any way to make it work without that?


I'd contact the debian.org DNS masters. I believe the CDN wasn't designed to
cause this kind of issue.

--
Matus UHLAR - fantomas, uh...@fantomas.sk ; http://www.fantomas.sk/
Warning: I wish NOT to receive e-mail advertising to this address.
Varovanie: na tuto adresu chcem NEDOSTAVAT akukolvek reklamnu postu.
Despite the cost of living, have you noticed how popular it remains?
___
squid-users mailing list
squid-users@lists.squid-cache.org
http://lists.squid-cache.org/listinfo/squid-users


Re: [squid-users] Squid transparent not caching apt requests from deb.debian.org

2020-04-08 Thread zrm

On 4/8/20 10:46, Alex Rousskov wrote:

On 4/7/20 8:48 PM, zrm wrote:


https://www.trustiosity.com/squid/cache-debug.log.xz


I found the reason for the difference.

After the destination IP address of your apt requests fails Host header
validation, Squid marks the request as "not cachable":


I checked the DNS query apt is making to see why it's different. It's 
making an SRV query for _http._tcp.deb.debian.org and then resolving the 
target name (prod.debian.map.fastly.net) returned in the SRV record. By 
contrast, squid does an A record query for deb.debian.org and gets a 
CNAME for debian.map.fastly.net. The names are almost the same, but since 
it's a CDN with many IP addresses, the two lookups can return different 
addresses, and then validation fails.


Meanwhile wget does the same A record query as squid and gets the same 
address.


The question then becomes what to do about it. Maybe if squid fails the 
validation for the A query, it should try the SRV query and accept the 
address as valid if it matches that. Another possibility would be a 
config option to have squid completely ignore the address the client 
used and always use the address it gets by doing its own DNS query for 
the host, in which case the result would be safe to cache.


But these are obviously changes requiring a new version of squid. Is 
there any way to make it work without that?
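The relaxed check proposed above could be sketched like this (hypothetical logic, not anything Squid currently implements; the IP addresses are illustrative values taken from this thread):

```python
def srv_name(host, service="http", proto="tcp"):
    # RFC 2782 owner-name form for an SRV lookup, e.g. _http._tcp.deb.debian.org
    return f"_{service}._{proto}.{host}"

def validate_dest_ip(dest_ip, a_record_ips, srv_target_ips):
    """Hypothetical relaxed validation: accept the client's destination IP
    if it matches either the proxy's own A-record answers for the Host
    header or the A records of the SRV targets (the set apt consults)."""
    return dest_ip in (set(a_record_ips) | set(srv_target_ips))

# squid's A lookup and apt's SRV-based lookup each return one CDN address:
a_ips = {"151.101.202.133"}    # via CNAME debian.map.fastly.net
srv_ips = {"151.101.248.204"}  # via SRV target prod.debian.map.fastly.net

print(srv_name("deb.debian.org"))                           # _http._tcp.deb.debian.org
print(validate_dest_ip("151.101.248.204", a_ips, srv_ips))  # True
print(validate_dest_ip("203.0.113.9", a_ips, srv_ips))      # False
```

With apt's destination accepted against the SRV-derived set, the response would remain cacheable even when the two lookups hit different CDN nodes.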



Re: [squid-users] Squid transparent not caching apt requests from deb.debian.org

2020-04-08 Thread Alex Rousskov
On 4/7/20 8:48 PM, zrm wrote:

> https://www.trustiosity.com/squid/cache-debug.log.xz

I found the reason for the difference.

After the destination IP address of your apt requests fails Host header
validation, Squid marks the request as "not cachable":

> hostHeaderIpVerify: IP 151.101.248.204:80 does not match from-Host IP 
> 151.101.202.133
> hostHeaderIpVerify: FAILED to validate IP 151.101.248.204:80
> clientInterpretRequestHeaders: REQ_CACHABLE = NOT SET


After the destination IP address of your wget requests passes Host
header validation, Squid marks the request as "cachable":

> hostHeaderIpVerify: validated IP 151.101.202.133:80
> clientInterpretRequestHeaders: REQ_CACHABLE = SET


N.B. The log lines above have been slightly adjusted for readability
(this particular raw output is rather difficult to interpret correctly
IMO), but you can easily find raw lines if you look for the preserved
function names.
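Conceptually, the check that fails here works like the following sketch (a simplified illustration of the idea behind hostHeaderIpVerify, not Squid's actual code; fake_resolver stands in for the proxy's own DNS lookup, using the addresses from the log lines above):

```python
import socket

def host_header_ip_ok(dest_ip, host, resolver=socket.getaddrinfo):
    """Simplified sketch of Host-header IP verification for intercepted
    traffic: the request passes only if the TCP destination IP chosen by
    the client is among the addresses the proxy itself resolves for the
    Host header. Not Squid's actual implementation."""
    try:
        infos = resolver(host, 80, proto=socket.IPPROTO_TCP)
    except OSError:
        return False
    # getaddrinfo yields (family, type, proto, canonname, sockaddr) tuples;
    # sockaddr[0] is the IP address.
    return dest_ip in {info[4][0] for info in infos}

def fake_resolver(host, port, proto=None):
    # Stand-in for the proxy's own DNS answer, using the IP from the log.
    return [(socket.AF_INET, socket.SOCK_STREAM, 6, '',
             ('151.101.202.133', port))]

# wget's destination matches the proxy's lookup; apt's does not.
print(host_header_ip_ok('151.101.202.133', 'deb.debian.org', fake_resolver))  # True
print(host_header_ip_ok('151.101.248.204', 'deb.debian.org', fake_resolver))  # False
```

The second call mirrors the failing apt case: the client connected to a CDN address the proxy's own lookup never returned.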


I hope others on the list will guide you towards a resolution of this
problem.


HTH,

Alex.

> On 4/6/20 11:49, Alex Rousskov wrote:
>> On 4/4/20 8:02 PM, zrm wrote:
>>> Attached cache.log excerpt for wget-wget-apt-apt-wget-wget. It answers
>>> the apt requests from the cache once it's in there; it just won't cache
>>> it to begin with when apt makes the request.
>>
>> Thank you for sharing this log. I agree with your conclusion. The apt
>> query results in cache revalidation and does not purge the already
>> cached copy. This conclusion eliminates a few suspects.
>>
>> There is probably something special about the combination of an apt
>> request and a 200 OK miss response that prevents Squid from caching that
>> response. I do not see anything wrong in the logs you have already
>> posted. Perhaps others will spot something.
>>
>> If you get no better responses, please post a link to a compressed
>> apt-apt-wget-wget log, starting from a cache that does not contain the
>> response in question and after enabling elevated Squid debugging with
>> "squid -k debug" or similar. You can find more instructions about
>> debugging individual transactions at
>> https://wiki.squid-cache.org/SquidFaq/BugReporting#Debugging_a_single_transaction
>>
>>
>> A detailed apt-apt-wget-wget log should tell us why Squid is refusing to
>> cache a 200 OK response to the apt query but caches a very similar
>> response to a very similar wget query.
>>
>>
>> Thank you,
>>
>> Alex.
>>
>>
>>
>>> [wget] 1586041686.600    725 192.168.111.55 TCP_MISS/200 1281195 GET
>>> http://deb.debian.org/debian/pool/main/v/vim/vim_8.1.0875-5_amd64.deb -
>>> ORIGINAL_DST/199.232.66.133 application/x-debian-package
>>> [wget] 1586041710.518    107 192.168.111.55 TCP_REFRESH_UNMODIFIED/200
>>> 1281232 GET
>>> http://deb.debian.org/debian/pool/main/v/vim/vim_8.1.0875-5_amd64.deb -
>>> ORIGINAL_DST/199.232.66.133 application/x-debian-package
>>> [apt] 1586041733.058 69 192.168.111.55 TCP_REFRESH_UNMODIFIED/200
>>> 1281234 GET
>>> http://deb.debian.org/debian/pool/main/v/vim/vim_8.1.0875-5_amd64.deb -
>>> ORIGINAL_DST/151.101.200.204 application/x-debian-package
>>> [apt] 1586041753.971    101 192.168.111.55 TCP_REFRESH_UNMODIFIED/200
>>> 1281234 GET
>>> http://deb.debian.org/debian/pool/main/v/vim/vim_8.1.0875-5_amd64.deb -
>>> ORIGINAL_DST/151.101.200.204 application/x-debian-package
>>> [wget] 1586041769.162    160 192.168.111.55 TCP_REFRESH_UNMODIFIED/200
>>> 1281232 GET
>>> http://deb.debian.org/debian/pool/main/v/vim/vim_8.1.0875-5_amd64.deb -
>>> ORIGINAL_DST/199.232.66.133 application/x-debian-package
>>> [wget] 1586041786.916 71 192.168.111.55 TCP_REFRESH_UNMODIFIED/200
>>> 1281232 GET
>>> http://deb.debian.org/debian/pool/main/v/vim/vim_8.1.0875-5_amd64.deb -
>>> ORIGINAL_DST/151.101.250.133 application/x-debian-package
>>>
 BTW, you probably do not need to make ALL,2 logs pretty -- we can
 figure out what happens based on Squid messages if you submit one
 transaction at a time and disclose transaction sequence. You can just
 post (a link to) raw (or sanitized) logs. Compress them if they are
 too big.
>>>
 Before sharing the logs, please double check that the problem you want
 to address was reproduced during the test.
>>>
>>> In this case we start with wget and then it is already in the cache for
>>> the requests made by apt. The problem is the data not being cached when
>>> apt makes the request and it isn't already there. The apt requests do
>>> get answered from the cache if it is already there.
>>>
>>> The headers from the previous email show what happens when apt makes the
>>> request and it's not already in the cache.
>>>
> Last-Modified: Sat, 15 Jun 2019 17:46:35 GMT
> ETag: "1389dc-58b605823fa6e"
> Cache-Control: public, max-age=2592000
> Content-Length: 1280476
> Age: 4248100

 FWIW: The object is 4'248'100 seconds old. The object max-age is
 2'592'000 seconds. Your Squid is probably using an internal max-age of
 259'200 seconds, so Squid will 
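The arithmetic behind the quoted headers can be checked directly (a sketch; the 259'200 s figure is the local cap inferred above, and the decision rule is a simplification of the real freshness algorithm):

```python
def is_fresh(age_s, origin_max_age_s, local_cap_s):
    """Sketch of the freshness decision described above: a cached response
    is served without revalidation only while its Age stays below both the
    origin's max-age and any local cap (e.g. a refresh_pattern maximum)."""
    return age_s < min(origin_max_age_s, local_cap_s)

# Values from the quoted headers and the local cap mentioned above:
print(is_fresh(4_248_100, 2_592_000, 259_200))  # False: Squid must revalidate
```

An Age of 4'248'100 s exceeds both limits, which matches the TCP_REFRESH_UNMODIFIED entries in the access log: every hit triggers a revalidation.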

Re: [squid-users] sometimes intermediate certificates were not downloaded when using sslbump

2020-04-08 Thread Dieter Bloms
Hello Louis,

thank you for your answer.

It is not my webserver; I am a user who wants to connect to it.
I know that the certificate chain is incomplete.
As far as I know squid should be able to fetch the missing intermediate
certificates on its own with the help of Authority Information Access (AIA) to 
get the complete list.
So squid should be able to verify the server certificate even if the
webserver doesn't deliver the intermediate certificates.

On Wed, Apr 08, L.P.H. van Belle wrote:

> This is a simple one. 
> 
> The certificate chain of that website is incorrect. 
> As shown here : 
> https://www.ssllabs.com/ssltest/analyze.html?d=www.formulare%2dbfinv.de
>  
> 
> Check your webserver first and correct your ciphers in your Apache webserver. 
> 
> Greetz, 
> 
> Louis
>  
> 
> > -Oorspronkelijk bericht-
> > Van: squid-users 
> > [mailto:squid-users-boun...@lists.squid-cache.org] Namens Dieter Bloms
> > Verzonden: woensdag 8 april 2020 13:37
> > Aan: squid-users@lists.squid-cache.org
> > Onderwerp: [squid-users] sometimes intermediate certificates 
> > were not downloaded when using sslbump
> > 
> > Hello,
> > 
> > I use a self-compiled squid 4.10, built as follows:
> > 
> > ~# squid --version
> > Squid Cache: Version 4.10
> > Service Name: squid
> > 
> > This binary uses OpenSSL 1.1.1d  10 Sep 2019. For legal 
> > restrictions on distribution see 
> > https://www.openssl.org/source/license.html
> > 
> > configure options:  '--prefix=/usr' '--sysconfdir=/etc/squid' 
> > '--bindir=/usr/sbin' '--sbindir=/usr/sbin' 
> > '--localstatedir=/var' '--libexecdir=/usr/sbin' 
> > '--datadir=/usr/share/squid' '--mandir=/usr/share/man' 
> > '--with-default-user=squid' '--with-filedescriptors=131072' 
> > '--with-logdir=/var/log/squid' '--disable-auto-locale' 
> > '--disable-auth-negotiate' '--disable-auth-ntlm' 
> > '--disable-eui' '--disable-carp' '--disable-htcp' 
> > '--disable-ident-lookups' '--disable-loadable-modules' 
> > '--disable-translation' '--disable-wccp' '--disable-wccpv2' 
> > '--enable-async-io=128' '--enable-auth' 
> > '--enable-auth-basic=LDAP NCSA' '--enable-auth-digest=LDAP 
> > file' '--enable-epoll' '--enable-log-daemon-helpers=file' 
> > '--enable-icap-client' '--enable-inline' '--enable-snmp' 
> > '--enable-disk-io=AIO,DiskThreads,IpcIo,Blocking' 
> > '--enable-storeio=ufs,aufs,rock' '--enable-referer-log' 
> > '--enable-useragent-log' '--enable-large-cache-files' 
> > '--enable-removal-policies=lru,heap' 
> > '--enable-follow-x-forwarded-for' '--enable-ssl-crtd' '--with-openssl'
> > 
> > in squid.conf I set the following acl at the very beginning of the acl section:
> > 
> > # allow fetching of missing intermediate certificates
> > acl fetch_intermediate_certificate transaction_initiator 
> > certificate-fetching
> > cache allow fetch_intermediate_certificate
> > cache deny all
> > http_access allow fetch_intermediate_certificate
> > 
> > and squid fetches intermediate certificates for websites 
> > like: https://incomplete-chain.badssl.com/
> > But squid doesn't fetch the intermediate certificates for the 
> > site https://www.formulare-bfinv.de/
> > and I don't know why.
> > 
> > I checked all AIA entries in the certificates and they look good to me.
> > 
> > Can anybody try the site https://www.formulare-bfinv.de/ with 
> > sslbump enabled, so I can see whether my installation is broken 
> > or the webserver configuration is incorrect?
> > 
> > Thank you very much.
> > 
> > -- 
> > Best regards
> > 
> >   Dieter Bloms
> > 
> 

-- 
Gruß

  Dieter

--
I do not get viruses because I do not use MS software.
If you use Outlook then please do not put my email address in your
address-book so that WHEN you get a virus it won't use my address in the
From field.


Re: [squid-users] sometimes intermediate certificates were not downloaded when using sslbump

2020-04-08 Thread L . P . H . van Belle
This is a simple one. 

The certificate chain of that website is incorrect. 
As shown here : 
https://www.ssllabs.com/ssltest/analyze.html?d=www.formulare%2dbfinv.de 

Check your webserver first and correct your ciphers in your Apache webserver. 

Greetz, 

Louis
 

> -Oorspronkelijk bericht-
> Van: squid-users 
> [mailto:squid-users-boun...@lists.squid-cache.org] Namens Dieter Bloms
> Verzonden: woensdag 8 april 2020 13:37
> Aan: squid-users@lists.squid-cache.org
> Onderwerp: [squid-users] sometimes intermediate certificates 
> were not downloaded when using sslbump
> 
> Hello,
> 
> I use a self-compiled squid 4.10, built as follows:
> 
> ~# squid --version
> Squid Cache: Version 4.10
> Service Name: squid
> 
> This binary uses OpenSSL 1.1.1d  10 Sep 2019. For legal 
> restrictions on distribution see 
> https://www.openssl.org/source/license.html
> 
> configure options:  '--prefix=/usr' '--sysconfdir=/etc/squid' 
> '--bindir=/usr/sbin' '--sbindir=/usr/sbin' 
> '--localstatedir=/var' '--libexecdir=/usr/sbin' 
> '--datadir=/usr/share/squid' '--mandir=/usr/share/man' 
> '--with-default-user=squid' '--with-filedescriptors=131072' 
> '--with-logdir=/var/log/squid' '--disable-auto-locale' 
> '--disable-auth-negotiate' '--disable-auth-ntlm' 
> '--disable-eui' '--disable-carp' '--disable-htcp' 
> '--disable-ident-lookups' '--disable-loadable-modules' 
> '--disable-translation' '--disable-wccp' '--disable-wccpv2' 
> '--enable-async-io=128' '--enable-auth' 
> '--enable-auth-basic=LDAP NCSA' '--enable-auth-digest=LDAP 
> file' '--enable-epoll' '--enable-log-daemon-helpers=file' 
> '--enable-icap-client' '--enable-inline' '--enable-snmp' 
> '--enable-disk-io=AIO,DiskThreads,IpcIo,Blocking' 
> '--enable-storeio=ufs,aufs,rock' '--enable-referer-log' 
> '--enable-useragent-log' '--enable-large-cache-files' 
> '--enable-removal-policies=lru,heap' 
> '--enable-follow-x-forwarded-for' '--enable-ssl-crtd' '--with-openssl'
> 
> in squid.conf I set the following acl at the very beginning of the acl section:
> 
> # allow fetching of missing intermediate certificates
> acl fetch_intermediate_certificate transaction_initiator 
> certificate-fetching
> cache allow fetch_intermediate_certificate
> cache deny all
> http_access allow fetch_intermediate_certificate
> 
> and squid fetches intermediate certificates for websites 
> like: https://incomplete-chain.badssl.com/
> But squid doesn't fetch the intermediate certificates for the 
> site https://www.formulare-bfinv.de/
> and I don't know why.
> 
> I checked all AIA entries in the certificates and they look good to me.
> 
> Can anybody try the site https://www.formulare-bfinv.de/ with 
> sslbump enabled, so I can see whether my installation is broken 
> or the webserver configuration is incorrect?
> 
> Thank you very much.
> 
> -- 
> Best regards
> 
>   Dieter Bloms
> 
> 



[squid-users] sometimes intermediate certificates were not downloaded when using sslbump

2020-04-08 Thread Dieter Bloms
Hello,

I use a self-compiled squid 4.10, built as follows:

~# squid --version
Squid Cache: Version 4.10
Service Name: squid

This binary uses OpenSSL 1.1.1d  10 Sep 2019. For legal restrictions on 
distribution see https://www.openssl.org/source/license.html

configure options:  '--prefix=/usr' '--sysconfdir=/etc/squid' 
'--bindir=/usr/sbin' '--sbindir=/usr/sbin' '--localstatedir=/var' 
'--libexecdir=/usr/sbin' '--datadir=/usr/share/squid' '--mandir=/usr/share/man' 
'--with-default-user=squid' '--with-filedescriptors=131072' 
'--with-logdir=/var/log/squid' '--disable-auto-locale' 
'--disable-auth-negotiate' '--disable-auth-ntlm' '--disable-eui' 
'--disable-carp' '--disable-htcp' '--disable-ident-lookups' 
'--disable-loadable-modules' '--disable-translation' '--disable-wccp' 
'--disable-wccpv2' '--enable-async-io=128' '--enable-auth' 
'--enable-auth-basic=LDAP NCSA' '--enable-auth-digest=LDAP file' 
'--enable-epoll' '--enable-log-daemon-helpers=file' '--enable-icap-client' 
'--enable-inline' '--enable-snmp' 
'--enable-disk-io=AIO,DiskThreads,IpcIo,Blocking' 
'--enable-storeio=ufs,aufs,rock' '--enable-referer-log' 
'--enable-useragent-log' '--enable-large-cache-files' 
'--enable-removal-policies=lru,heap' '--enable-follow-x-forwarded-for' 
'--enable-ssl-crtd' '--with-openssl'

in squid.conf I set the following acl at the very beginning of the acl section:

# allow fetching of missing intermediate certificates
acl fetch_intermediate_certificate transaction_initiator certificate-fetching
cache allow fetch_intermediate_certificate
cache deny all
http_access allow fetch_intermediate_certificate

and squid fetches intermediate certificates for websites like: 
https://incomplete-chain.badssl.com/
But squid doesn't fetch the intermediate certificates for the site 
https://www.formulare-bfinv.de/
and I don't know why.

I checked all AIA entries in the certificates and they look good to me.

Can anybody try the site https://www.formulare-bfinv.de/ with sslbump enabled,
so I can see whether my installation is broken or the webserver configuration 
is incorrect?

Thank you very much.

-- 
Best regards

  Dieter Bloms
