Re: [squid-users] chunked transfer over sslbump

2024-01-19 Thread Arun Kumar
 Sorry, due to organization policy it is not possible to upload the debug logs.
Is there anything to look for specifically in the debug logs? Also, please
suggest whether we can tweak the sslbump configuration below to make the
chunked transfer work seamlessly.
http_port tcpkeepalive=60,30,3 ssl-bump generate-host-certificates=on 
dynamic_cert_mem_cache_size=20MB tls-cert= tls-key= 
cipher=... options=NO_TLSv1,... tls_dh=prime256v1:
ssl_bump stare all

PS: Is there any documentation or video available to understand bump/stare/peek/splice
better? I am not understanding much from the squid-cache.org content.
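
For reference, the kind of staged layout I am trying to understand looks like
this (taken from generic examples, not our config; the acl names and domain
are placeholders):

acl step1 at_step SslBump1
acl noBumpSites ssl::server_name .example.net
# peek reads the TLS client hello (SNI) while keeping the option to splice later;
# stare reads the server certificate while keeping the option to bump later.
ssl_bump peek step1
ssl_bump splice noBumpSites
ssl_bump bump all
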
On Friday, January 12, 2024 at 02:10:40 PM EST, Alex Rousskov wrote:
 On 2024-01-12 09:21, Arun Kumar wrote:
> On Wednesday, January 10, 2024 at 11:09:48 AM EST, Alex Rousskov wrote:
> 
> 
> On 2024-01-10 09:21, Arun Kumar wrote:
>  >> i) Retry seems to fetch one chunk of the response and not the complete response.
>  >> ii) Enabling sslbump and turning ICAP off is not helping.
>  >> iii) gcc version is 7.3.1 (Red Hat 7.3.1-17)
> 
>  >GCC v7 has insufficient C++17 support. I recommend installing GCC v9 or
> better and then trying with Squid v6.6 or newer.
> 
> Arun: Compiled Squid 6.6 with gcc 11.4 and still seeing the same issue.

Glad you were able to upgrade to Squid v6.6!


>  > FWIW, if the problem persists in Squid v6, sharing debugging logs would
> be the next recommended step.
> 
> Arun: debug_options ALL,6 is giving too much log. Is there any particular
> option we can use to debug this issue?


Please share[^1] a pointer to a compressed ALL,9 cache.log collected while 
reproducing the problem with Squid v6.6:

https://wiki.squid-cache.org/SquidFaq/BugReporting#debugging-a-single-transaction
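
For example, one way to capture such a log (a sketch only; the port, URL, and
file locations are placeholders, adjust for your setup):

# squid.conf: temporarily enable maximum debugging, rotating the old log away
debug_options rotate=1 ALL,9
# then:  squid -k reconfigure
# reproduce the problem with a single transaction, e.g.
#   curl -x http://127.0.0.1:3128 https://a.b.com/xyz -o /dev/null
# revert debug_options, reconfigure again, and compress the log:
#   xz -9 /var/log/squid/cache.log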

Debugging logs are for developers. Developers can deal with large 
volumes of debugging information. You can use services like DropBox to 
share large compressed logs. That said, the better you can isolate the 
problem/traffic, the higher the chances that a developer will (have 
the time to) find the answer to your question in the noisy log.

[^1]: Please feel free to share privately if needed, especially if you 
are using sensitive configuration or transactions.

Alex.


>  > Also want to point out that squid connects to another non-squid proxy
>  > to reach the internet.
>  > cache_peer  parent  0 no-query default
>  >
>  > On Tuesday, January 9, 2024 at 02:18:14 PM EST, Alex Rousskov wrote:
>  >
>  >
>  > On 2024-01-09 11:51, Zhang, Jinshu wrote:
>  >
>  >  > Client got below response headers and body. Masked few details.
>  >
>  > Thank you.
>  >
>  >
>  >  > Retry seems to fetch data remaining.
>  >
>  > I would expect a successful retry to fetch the entire response, not just
>  > the remaining bytes, but perhaps that is what you meant. Thank you for
>  > sharing this info.
>  >
>  >
>  >  > Want to point out that removing sslbump everything is working fine,
>  >  > but we wanted to keep it for ICAP scanning.
>  >
>  > What if you keep SslBump enabled but disable any ICAP analysis
>  > ("icap_enable off")? This test may tell us if the problem is between
>  > Squid and the origin server or Squid and the ICAP service...
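>  >
>  > A minimal sketch of that test (only one directive changes; everything
>  > else, including the ssl_bump rules, stays as-is):
>  >
>  >   # squid.conf: keep SslBump but skip all ICAP processing
>  >   icap_enable off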
>  >
>  >
>  >  > We tried compiling 6.x in Amazon linux, using latest gcc, but facing
>  > similar error -
>  > https://lists.squid-cache.org/pipermail/squid-users/2023-July/026016.html
>  > ([squid-users] compile error in squid v6.1)
>  >
>  > What is the "latest gcc" version in your environment? I suspect it is
>  > not the latest GCC version available to folks running Amazon Linux, but
>  > you may need to install some packages to get a more recent GCC version.
>  > Unfortunately, I cannot give specific instructions for Amazon Linux
>  > right now.
>  >
>  >
>  > HTH,
>  >
>  > Alex.
>  >
>  >
>  >  > HTTP/1.1 200 OK
>  >  > Date: Tue, 09 Jan 2024 15:41:33 GMT
>  >  > Server: Apache/mod_perl/2.0.10 Perl
>  >  > Content-Type: application/download
>  >  > X-Cache: MISS from ip-x-y-z
>  >  > Transfer-Encoding: chunked
>  >  > Via: xxx (ICAP)
>  >  > Connection: keep-alive
>  >  >
>  >  > 1000
>  >  > File-Id: xyz.zip
>  >  > Local-Path: x/y/z.txt
>  >  > Content-Size: 2967
>  >  > < binary content >
>  >  >
>  >  >
>  >  > Access log (1st attempt):
>  >  > 1704814893.695    138 x.y.0.2 NONE_NONE/200 0 CONNECT a.b.com:443 - FIRSTUP_PARENT/10.x.y.z -
>  >  > 1704814900.491  6779 172.17.0.2 TCP_MISS/200 138996535 POST https://a.b.com/xyz - FIRSTUP_PARENT/10.x.y.z application/download
>  >  >
>  >  > Retry after 5 mins:
>  >  > 1704815201.530    189 x.y.0.2 NONE_NONE/200 0 CONNECT a.b.com:443 - FIRSTUP_PARENT/10.x.y.z -
>  >  > 1704815208.438  6896 x.y.0.2 TCP_MISS/200 138967930 POST https://a.b.com/xyz - 

Re: [squid-users] offline mode not working for me

2024-01-19 Thread Robin Carlisle
Thanks for the explanations Amos, much appreciated.

On Thu, 18 Jan 2024 at 16:24, Amos Jeffries  wrote:

> On 19/01/24 03:53, Robin Carlisle wrote:
> > Hi, Hoping someone can help me with this issue that I have been
> > struggling with for days now.   I am setting up squid on an ubuntu PC to
> > forward HTTPS requests to an API and an s3 bucket under my control on
> > amazon AWS.  The reason I am setting up the proxy is two-fold...
> >
> > 1) To reduce costs from AWS.
> > 2) To provide content to the client on the ubuntu PC if there is a
> > networking issue somewhere in between the ubuntu PC and AWS.
> >
> > Item 1 is going well so far.   Item 2 is not going well.   Setup details
> ...
> >
> ...
>
> >
> > When network connectivity is BAD, I get errors and a cache MISS.   In
> > this test case I unplugged the ethernet cable from the back of the
> > ubuntu-pc ...
> >
> > *# /var/log/squid/access.log*
> > 1705588717.420 11 127.0.0.1 NONE_NONE/200 0 CONNECT stuff.amazonaws.com:443 - HIER_DIRECT/3.135.162.228 -
> > 1705588717.420  0 127.0.0.1 NONE_NONE/503 4087 GET https://stuff.amazonaws.com/api/v1/stuff/stuff.json - HIER_NONE/- text/html
> >
> > *# extract from /usr/bin/proxy-test output*
> > < HTTP/1.1 503 Service Unavailable
> > < Server: squid/5.7
> > < Mime-Version: 1.0
> > < Date: Thu, 18 Jan 2024 14:38:37 GMT
> > < Content-Type: text/html;charset=utf-8
> > < Content-Length: 3692
> > < X-Squid-Error: ERR_CONNECT_FAIL 101
> > < Vary: Accept-Language
> > < Content-Language: en
> > < X-Cache: MISS from ubuntu-pc
> > < X-Cache-Lookup: NONE from ubuntu-pc:3129
> > < Via: 1.1 ubuntu-pc (squid/5.7)
> > < Connection: close
> >
> > I have also seen it error in a different way with a 502 but with the
> > same ultimate result.
> >
> > My expectation/hope is that squid would return the cached object on any
> > network failure in between ubuntu-pc and the AWS endpoint - and continue
> > to return this cached object forever.   Is this something squid can do?
> >It would seem that offline_mode should do this?
> >
>
>
> FYI,  offline_mode is not a guarantee that a URL will always HIT. It is
> simply a form of "greedy" caching - where Squid will take actions to
> ensure that full-size objects are fetched whenever it lacks one, and
> serve things as stale HITs when a) it is not specifically prohibited,
> and b) a refresh/fetch is not working.
>
>
> The URL you are testing with should meet your expected behaviour due to
> the "Cache-Control: public, stale-if-error" header alone, regardless of
> offline_mode configuration.
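>
> For reference, the "greedy"/stale-serving behaviour offline_mode enables is
> usually complemented by directives along these lines (a sketch only, not a
> drop-in config; the values are illustrative):
>
>    offline_mode on
>    # allow serving cached objects well past their freshness lifetime
>    max_stale 1 month
>    # consider objects without explicit expiry fresh for at least a week,
>    # capped at one year (values are in minutes)
>    refresh_pattern . 10080 100% 525600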
>
>
> That said, getting a 5xx response when there is an object already in
> cache seems like something is buggy to me.
>
> A high level cache.log will be needed to figure out what is going on
> (see https://wiki.squid-cache.org/SquidFaq/BugReporting#full-debug-output
> ).
> Be aware this list does not permit large posts, so please provide a
> download link in your reply, not an attachment.
>
>
> Cheers
> Amos
___
squid-users mailing list
squid-users@lists.squid-cache.org
https://lists.squid-cache.org/listinfo/squid-users


Re: [squid-users] offline mode not working for me

2024-01-19 Thread Robin Carlisle
Hi, thanks so much for the detailed response.  I chose to test option 2
from your recommendations as I am new to squid and I do not understand how
to set it up as a reverse proxy anyway.  I made the change to my squid.conf:


#ssl_bump peek step1
ssl_bump bump step1
ssl_bump bump all


This made it work - which is great news.   My curl requests are now
satisfied by the cache when the PC is offline!


I do have one follow-up question which I think is unrelated; let me know if
etiquette demands I create a new post for this.  When I test using the
chromium browser, chromium sends OPTIONS requests, which I think is
something to do with CORS.   These always cause a cache MISS from squid, I
think because the return code is 204?


1705669236.776113 ::1 TCP_MISS/204 680 OPTIONS
https://stuff.amazonaws.com/api/v1/stuff/stuff.json - HIER_DIRECT/
3.135.146.17 application/json


I can prevent my chromium instance from making these (pointless?) OPTIONS
calls using the following args, but I would rather not have to do this.


--disable-web-security  --disable-features=IsolateOrigins,site-per-process


Any way I can get squid to cache these calls?


Thanks again and all the best,


Robin





On Thu, 18 Jan 2024 at 16:03, Alex Rousskov <
rouss...@measurement-factory.com> wrote:

> On 2024-01-18 09:53, Robin Carlisle wrote:
>
> > My expectation/hope is that squid would return the cached object on
> > any network failure in between ubuntu-pc and the AWS endpoint - and
> > continue to return this cached object forever.   Is this something
> > squid can do? It would seem that offline_mode should do this?
>
> Yes and yes. The errors you are getting are not related to cache hits or
> misses. Those errors happen _before_ Squid gets the requested resource URL
> and looks up that resource in the Squid cache.
>
> > ssl_bump peek step1
> > ssl_bump bump all
>
> To get that URL (in your configuration), Squid must bump the connection.
> To bump the connection at step2, Squid must contact the origin server.
> When the cable is unplugged, Squid obviously cannot do that: The attempt
> to open a Squid-AWS connection fails.
>
>  > .../200 0 CONNECT stuff.amazonaws.com:443 - HIER_DIRECT
>  > .../503 4087 GET https://stuff.amazonaws.com/api/... - HIER_NONE
>
> Squid reports bumping errors to the client using HTTP responses. To do
> that, Squid remembers the error response, bumps the client connection,
> receives GET from the client on that bumped connection, and sends that
> error response to the client. This is why you see both CONNECT/200 and
> GET/503 access.log records. Note that Squid does not check whether the
> received GET request would have been a cache hit in this case -- the
> response to that request has been preordained by the earlier bumping
> failure.
>
>
> Solution candidates to consider include:
>
> * Stop bumping: https_port 443 cert=/etc/squid/stuff.pem
>
> Configure Squid as (a reverse HTTPS proxy for) the AWS service. Use
> https_port. No SslBump rules/options! The client would think that it is
> sending HTTPS requests directly to the service. Squid will forward
> client requests to the service. If this works (and I do not have enough
> information to know that this will work in your specific environment),
> then you will get a much simpler setup.
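>
> A rough sketch of that idea (the accel/cache_peer details below are my
> guesses for illustration, untested in your environment):
>
>    https_port 443 accel cert=/etc/squid/stuff.pem defaultsite=stuff.amazonaws.com
>    cache_peer stuff.amazonaws.com parent 443 0 no-query originserver tls name=aws
>    cache_peer_access aws allow all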
>
>
> * Bump at step1, before Squid contacts AWS: ssl_bump bump all
>
> Bugs notwithstanding, there will be no Squid-AWS connection for cache
> hits. The resulting certificate will not be based on AWS service info,
> but it looks like your client is ignorant enough to ignore related
> certificate problems.
>
>
> HTH,
>
> Alex.
>
>
> > Hi, Hoping someone can help me with this issue that I have been
> > struggling with for days now.   I am setting up squid on an ubuntu PC to
> > forward HTTPS requests to an API and an s3 bucket under my control on
> > amazon AWS.  The reason I am setting up the proxy is two-fold...
> >
> > 1) To reduce costs from AWS.
> > 2) To provide content to the client on the ubuntu PC if there is a
> > networking issue somewhere in between the ubuntu PC and AWS.
> >
> > Item 1 is going well so far.   Item 2 is not going well.   Setup details
> ...
> >
> > *# squid - setup cache folder*
> > mkdir -p /var/cache/squid
> > chown -R proxy:proxy  /var/cache/squid
> >
> > *# ssl - generate key*
> > apt --yes install squid-openssl libnss3-tools
> > openssl req -new -newkey rsa:2048 -days 365 -nodes -x509 \
> >    -subj "/C=US/ST=Denial/L=Springfield/O=Dis/CN=www.example.com" \
> >    -keyout /etc/squid/stuff.pem -out /etc/squid/stuff.pem
> > chown root:proxy /etc/squid/stuff.pem
> > chmod 644  /etc/squid/stuff.pem
> >
> > *# ssl - ssl DB*
> > mkdir -p /var/lib/squid
> > rm -rf /var/lib/squid/ssl_db
> > /usr/lib/squid/security_file_certgen -c -s /var/lib/squid/ssl_db -M 4MB
> > chown -R proxy:proxy /var/lib/squid/ssl_db
> >
> > *# /etc/squid/squid.conf :*
> > acl to_aws dstdomain .amazonaws.com 
> > acl 

[squid-users] Ubuntu 22.04 LTS repository for Squid 6.6 (rebuilt from sources in Debian unstable)

2024-01-19 Thread Rafael Akchurin
Hello everyone,

An online repository with the latest Squid 6.6 (rebuilt from sources in Debian
unstable) for Ubuntu 22.04 LTS 64-bit is available at https://squid66.diladele.com/.
The GitHub repo https://github.com/diladele/squid-ubuntu/tree/master/src/ubuntu22
contains all the scripts we used to make this build.

Here are simple instructions on how to use the repo. For more information, see
the readme at https://github.com/diladele/squid-ubuntu .

# add diladele apt key
wget -qO - https://packages.diladele.com/diladele_pub.asc | sudo apt-key add -

# add new repo
echo "deb https://squid66.diladele.com/ubuntu/ jammy main" \
> /etc/apt/sources.list.d/squid66.diladele.com.list

# and install
apt-get update && apt-get install -y \
squid-common \
squid-openssl \
squidclient \
libecap3 libecap3-dev

This version of Squid will now be part of Web Safety 9.0, coming out in March
2024. If you have some spare time and are interested in an Admin UI for Squid
and ICAP web filtering, consider downloading an appliance for VMware
ESXi/vSphere or Microsoft Hyper-V, or even deploying directly on Microsoft
Azure and Amazon AWS.

Hope you will find this useful.

Best regards,
Rafael Akchurin
Diladele B.V.

___
squid-users mailing list
squid-users@lists.squid-cache.org
https://lists.squid-cache.org/listinfo/squid-users