[squid-users] bye from me for now
Hi all,

Just want to say goodbye from me for now, as the company I work for has stopped using the Squid setup I implemented for them. They made good use of it (since March 2020), but they found an alternative called Ericom ZTEdge, which went live in Q4 2023.

In a way I'm sad/upset/happy: I worked so hard to research, learn and implement it, and then, once they finally realised how good what I had done was, they got a third-party company to offer their alternative service. They are using ZTEdge for application call-home activations (like what I did with Squid) and also for end users to browse the web (they didn't want me to implement that part, although I could have).

I attach my script if anyone is interested.

Thanks,
Rob

--
Regards,

Robert K Wild.

squid.sh
Description: Binary data

___
squid-users mailing list
squid-users@lists.squid-cache.org
https://lists.squid-cache.org/listinfo/squid-users
Re: [squid-users] log_db_daemon errors
> > Use of uninitialized value $DBI::errstr in concatenation (.) or
> > string at /usr/lib64/squid/log_db_daemon line 403.
>
> You're trying to use an uninitialized variable when outputting(?) the
> error message. Fix that first. I'm guessing you're using the `errstr`
> function wrong there; see the official documentation for hints:
> https://metacpan.org/pod/DBD::MariaDB
>
> > Cannot connect to database: at /usr/lib64/squid/log_db_daemon line
> > 403.
>
> And then you should see what error you're actually getting here. My
> guess is that it will be a permission issue: user not allowed to
> connect from this host, or process not allowed to access the socket, or
> something similar.

My apologies, I missed that this might not be a script you've written. I take it it is a ready-made script?
Re: [squid-users] log_db_daemon errors
> Use of uninitialized value $DBI::errstr in concatenation (.) or
> string at /usr/lib64/squid/log_db_daemon line 403.

You're trying to use an uninitialized variable when outputting(?) the
error message. Fix that first. I'm guessing you're using the `errstr`
function wrong there; see the official documentation for hints:
https://metacpan.org/pod/DBD::MariaDB

> Cannot connect to database: at /usr/lib64/squid/log_db_daemon line
> 403.

And then you should see what error you're actually getting here. My
guess is that it will be a permission issue: user not allowed to
connect from this host, or process not allowed to access the socket, or
something similar.
Re: [squid-users] Outgoing traffic through certain device instead of IP?
> AFAIK, dante can detect which IP address different interfaces use and
> use those addresses (so you can define "eth0" instead of 192.0.2.1), ...

How would that work? `tcp_outgoing_address` and `udp_outgoing_address` only accept IP addresses; I'm not seeing another option which would match those and allow using a device name instead (my original requirement).
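For what it's worth, mapping a device name to its current IPv4 address is something a wrapper script could do before generating squid.conf. A minimal, Linux-only sketch using the `SIOCGIFADDR` ioctl (the interface name `lo` below is just for illustration):

```python
import fcntl
import socket
import struct

SIOCGIFADDR = 0x8915  # Linux ioctl: get interface IPv4 address


def iface_ipv4(ifname: str) -> str:
    """Return the IPv4 address currently bound to a network device."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        # The ioctl fills in a struct ifreq; the address bytes sit at 20..24.
        packed = fcntl.ioctl(
            s.fileno(), SIOCGIFADDR,
            struct.pack("256s", ifname[:15].encode())
        )
    return socket.inet_ntoa(packed[20:24])


# e.g. iface_ipv4("lo") on Linux normally yields "127.0.0.1"
```

A helper like this could regenerate the `tcp_outgoing_address` lines whenever the tunnel addresses change.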
Re: [squid-users] Outgoing traffic through certain device instead of IP?
> Squid is limited to selecting certain details of the TCP packets.
> Device routing details are up to the operating system.

I thought as much, thanks for clarifying.

> You can have Squid set dst-IP or TOS/QoS marking on packets. The
> OS routing services should be able to use those to do its selection.

That would also be a possibility, yes. That would be `tcp_outgoing_mark` and `tcp_outgoing_tos`, right? I'm not seeing an option to do that for UDP, though (DNS requests).
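For the TCP side, the mark-based approach could look roughly like this. A sketch only, not tested config: the `vpn_traffic` ACL name, the mark value 0x10, the table number 100 and the device `tun0` are all illustrative assumptions.

```
# squid.conf: mark the connections that should leave via the tunnel
tcp_outgoing_mark 0x10 vpn_traffic

# Linux policy routing: send packets carrying that mark out of tun0
ip rule add fwmark 0x10 table 100
ip route add default dev tun0 table 100
```

DNS/UDP would still need handling outside Squid, since, as noted in the thread, there is no UDP equivalent of `tcp_outgoing_mark`.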
[squid-users] Outgoing traffic through certain device instead of IP?
I'd like to send all the outgoing traffic from Squid through a certain network device instead of an IP. There are `tcp_outgoing_address` and `udp_outgoing_address`, which only accept an IP as parameter, but there's no way to use a certain device? I just wanted to verify that there is currently no way to specify a network device, because I couldn't find anything about it in the documentation.

My use case here is that I have multiple OpenVPN tunnels open and use Squid to funnel traffic through them (including DNS queries, which works great!). These OpenVPN tunnels all have their own network device, but the IP address might or might not change at some point, and when that happens Squid won't be able to forward traffic anymore. Of course I can work around that (OpenVPN `--ipchange` fires a script when the IP changes), but I just wanted to check whether I've missed something here.
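The `--ipchange` workaround could be as small as a hook that rewrites the relevant `tcp_outgoing_address` line and reloads Squid. A self-contained sketch of just the rewrite step (the `vpn1` ACL name and addresses are made up for illustration; a real hook would take the new IP from OpenVPN's arguments and then run `squid -k reconfigure`):

```shell
# Work on a throwaway copy of the config for demonstration purposes
CONF=$(mktemp)
printf 'tcp_outgoing_address 10.8.0.2 vpn1\n' > "$CONF"

NEW_IP="10.8.0.99"   # OpenVPN passes the new address to the --ipchange hook
sed -i "s|^tcp_outgoing_address .* vpn1\$|tcp_outgoing_address ${NEW_IP} vpn1|" "$CONF"

cat "$CONF"
# A real hook would now run: /usr/local/squid/sbin/squid -k reconfigure
```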
[squid-users] Making squid into socks proxy
Hi all,

Sorry for the dumb question, but what is the point of making Squid into a SOCKS5 proxy, and if I have already built Squid, how do I add that to it?

https://serverfault.com/questions/820578/how-to-enable-socks5-for-squid-proxy#820612

Thanks,
Rob
Re: [squid-users] correct regular expression to use to capture all
Thank you guys. Tbh I didn't think regex would be more CPU-intense, so thanks for that.

On Sun, 9 Jul 2023, 16:04 Matus UHLAR - fantomas, wrote:

> On 08.07.23 13:07, robert k Wild wrote:
> >True but I don't want to create two ACL lists, one for "ssl name" and one
> >for "ssl name regex"
>
> Try to only create one for ssl name. You rarely need regex.
> Performance will thank you.
>
> >On Sat, 8 Jul 2023, 12:57 Matus UHLAR - fantomas, wrote:
> >
> >> On 07.07.23 21:13, robert k Wild wrote:
> >> >i know ive been talking about this before but i want to understand why i
> >> >cant use this regex
> >> >
> >> >(^|.*)redshift3d.com$
> >> >
> >> >instead i have to use this
> >> >
> >> >(^|\.)redshift3d.com$ OR
> >> >
> >> >(^|\.)redshift3d\.com$
> >> >
> >> >for strings
> >> >
> >> >www.redshift3d.com
> >> >activate.redshift3d.com
> >> >redshift3d.com
> >>
> >> AFAIK "dstdomain .redshift3d.com"
> >>
> >> matches the same, but without complicated, cpu-expensive and hardly
> >> readable regular expressions
>
> --
> Matus UHLAR - fantomas, uh...@fantomas.sk ; http://www.fantomas.sk/
> Warning: I wish NOT to receive e-mail advertising to this address.
> Varovanie: na tuto adresu chcem NEDOSTAVAT akukolvek reklamnu postu.
> Depression is merely anger without enthusiasm.
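For SNI-based splicing, the non-regex equivalent keeps everything in a single ACL. A sketch (the ACL name is made up; for plain-text CONNECT requests the analogous type would be `dstdomain`, as suggested in the thread):

```
acl nobump_domains ssl::server_name .redshift3d.com
ssl_bump splice nobump_domains
```

The leading dot makes it a suffix match, covering redshift3d.com and every subdomain without a regular expression.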
Re: [squid-users] correct regular expression to use to capture all
So will this: (^|\.)redshift3d\.com$

I know it will match blah.redshift3d.com and redshift3d.com, but what about blah.blah.redshift3d.com?

On Sat, 8 Jul 2023, 22:32 Alex Rousskov, wrote:

> On 7/8/23 06:14, Stuart Henderson wrote:
> > On 2023-07-07, robert k Wild wrote:
> >> hi all,
> >>
> >> i know ive been talking about this before but i want to understand why i
> >> cant use this regex
> >>
> >> (^|.*)redshift3d.com$
> >
> > this matches anythingredshift3d[any single character or nothing]com
>
> Correction: A regular expression "." does _not_ match "nothing" (i.e.
> zero characters).
>
> In some contexts, "." may not even match some special single characters,
> but those contexts are probably not applicable to this thread scope.
>
> >> instead i have to use this
> >>
> >> (^|\.)redshift3d.com$ OR
> >
> > this matches redshift3d[any single character or nothing]com
> > or anything.redshift3d[any single character or nothing]com
> > ... or .redshift3d.com
>
> Same "nothing" correction here.
>
> >> (^|\.)redshift3d\.com$
> >
> > this only matches redshift3d.com or anything.redshift3d.com
> > ... or .redshift3d.com
>
> Alex.
>
> > So, if you only want to match on things exactly in the redshift3d.com
> > domain and no others, you need the last one.
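The difference is easy to check mechanically. A quick Python sketch (Python's `re` behaves like Squid's regexes for these patterns) showing that the fully escaped form matches any depth of subdomain, including blah.blah.redshift3d.com, while rejecting look-alike hosts:

```python
import re

# The recommended pattern: escaped dots, anchored at the end.
safe = re.compile(r"(^|\.)redshift3d\.com$", re.IGNORECASE)
# The problematic pattern: unescaped "." matches any character.
loose = re.compile(r"(^|.*)redshift3d.com$", re.IGNORECASE)

for host in ("redshift3d.com", "www.redshift3d.com",
             "blah.blah.redshift3d.com"):
    assert safe.search(host)          # all legitimate hosts match

assert not safe.search("evilredshift3d.com")  # look-alike rejected
assert loose.search("evilredshift3d.com")     # but the loose form accepts it
assert loose.search("redshift3dXcom")         # unescaped "." matches "X"
```

So yes, blah.blah.redshift3d.com matches: `(^|\.)` only requires a literal dot (or start-of-string) immediately before `redshift3d.com`, however many labels precede it.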
Re: [squid-users] correct regular expression to use to capture all
True, but I don't want to create two ACL lists, one for "ssl name" and one for "ssl name regex".

On Sat, 8 Jul 2023, 12:57 Matus UHLAR - fantomas, wrote:

> On 07.07.23 21:13, robert k Wild wrote:
> >i know ive been talking about this before but i want to understand why i
> >cant use this regex
> >
> >(^|.*)redshift3d.com$
> >
> >instead i have to use this
> >
> >(^|\.)redshift3d.com$ OR
> >
> >(^|\.)redshift3d\.com$
> >
> >for strings
> >
> >www.redshift3d.com
> >activate.redshift3d.com
> >redshift3d.com
>
> AFAIK "dstdomain .redshift3d.com"
>
> matches the same, but without complicated, cpu-expensive and hardly
> readable regular expressions
>
> --
> Matus UHLAR - fantomas, uh...@fantomas.sk ; http://www.fantomas.sk/
> Warning: I wish NOT to receive e-mail advertising to this address.
> Varovanie: na tuto adresu chcem NEDOSTAVAT akukolvek reklamnu postu.
> Despite the cost of living, have you noticed how popular it remains?
[squid-users] correct regular expression to use to capture all
Hi all,

I know I've been talking about this before, but I want to understand why I can't use this regex:

(^|.*)redshift3d.com$

Instead I have to use this:

(^|\.)redshift3d.com$ OR

(^|\.)redshift3d\.com$

for the strings:

www.redshift3d.com
activate.redshift3d.com
redshift3d.com

Thanks,
Rob

--
Regards,

Robert K Wild.
Re: [squid-users] squid 6.1 new lines when i do squid -k reconfigure
Thanks Alex

On Fri, 7 Jul 2023, 15:40 Alex Rousskov, wrote:

> On 7/7/23 09:50, robert k Wild wrote:
>
> > i think this is a new feature in squid as i never saw it in squid 4/5
>
> It is, essentially, a bug fix: Depending on the Squid version, these
> lines were previously either logged to unlocked cache.log or not logged
> at all.
>
> > basically when you do
> >
> > /usr/local/squid/sbin/squid -k reconfigure
> >
> > you get back a blank line but now you get information regarding the
> > reconfigure which is a nice touch but how do you get rid of it, ie only
> > show back errors
>
> Unfortunately, Squid developers lack consensus regarding which lines
> should be logged at debug levels 0 and 1. Fortunately, you can now
> control many of these messages using the cache_log_message directive:
>
> http://www.squid-cache.org/Doc/config/cache_log_message/
>
> For example, the following configuration will raise the level of the
> reconfiguration messages sampled below to 2:
>
> cache_log_message id=4 level=2
> cache_log_message id=68 level=2
>
> > 2023/07/07 09:43:54| Processing Configuration File: /usr/local/squid/etc/squid.conf (depth 0)
> > 2023/07/07 09:43:54| Processing Configuration File: /usr/local/squid/etc/squidrules.conf (depth 1)
>
> The above messages have ID 68.
>
> > 2023/07/07 09:43:54| Set Current Directory to /usr/local/squid/var/cache/squid
>
> The above message has ID 4.
>
> See doc/debug-messages.dox for message IDs. If the message you want to
> control is not covered, see
>
> https://wiki.squid-cache.org/SquidFaq/AboutSquid#how-to-add-a-new-squid-feature-enhance-of-fix-something
>
> Please note that messages printed before Squid configuration is parsed
> for the very first time are not controlled by that configuration. We
> plan to address that limitation by accepting "early" configuration
> options from the command line, but I cannot give you an ETA for that
> feature delivery because the corresponding project is currently dormant.
>
> HTH,
>
> Alex.
[squid-users] squid 6.1 new lines when i do squid -k reconfigure
Hi all,

I think this is a new feature in Squid, as I never saw it in Squid 4/5. Basically, when you do

/usr/local/squid/sbin/squid -k reconfigure

you used to get back a blank line, but now you get information regarding the reconfigure, which is a nice touch. But how do you get rid of it, i.e. only show back errors?

/usr/local/squid/sbin/squid -k reconfigure
2023/07/07 09:43:54| Processing Configuration File: /usr/local/squid/etc/squid.conf (depth 0)
2023/07/07 09:43:54| Processing Configuration File: /usr/local/squid/etc/squidrules.conf (depth 1)
2023/07/07 09:43:54| Set Current Directory to /usr/local/squid/var/cache/squid

--
Regards,

Robert K Wild.
Re: [squid-users] make URL bypass squid proxy
:DE:C3:51:48:59:46:71:1F:B5:9B > Timestamp : Dec 14 15:34:39.124 2022 GMT > Extensions: none > Signature : ecdsa-with-SHA256 > > 30:45:02:20:2C:E2:85:9C:A6:54:1B:1C:31:E5:F8:37: > > E9:CD:09:8B:D8:26:29:E4:C7:65:94:9C:FF:32:D2:41: > > CD:16:A3:51:02:21:00:A0:2F:C3:F7:A6:55:3B:21:EB: > > 9B:CA:6E:4E:07:A2:8C:40:4B:E2:27:D6:82:44:0F:09: > C9:F7:7D:1B:72:6F:13 > Signature Algorithm: sha256WithRSAEncryption > 25:bd:bb:de:57:c0:7f:07:5e:18:62:2e:0b:d3:03:54:a7:45: > ab:c6:1f:e2:f6:58:ff:6e:8e:6b:4f:09:9a:87:66:32:81:7f: > f4:35:4f:7e:65:e5:6a:04:d6:62:62:ff:d9:3a:f2:6f:19:ba: > fa:e6:35:0e:2a:44:5c:3b:ee:9d:97:72:05:86:0c:4c:01:c1: > f0:8c:21:c1:c4:84:54:d8:a8:05:25:18:72:db:f7:53:9b:f1: > 13:d6:0b:bc:92:6e:01:e3:fd:de:a1:45:e9:29:37:e1:2e:64: > 36:b4:4d:38:c1:60:02:6a:17:3d:87:a2:5f:33:3b:86:eb:0d: > cc:dd:fa:d4:43:58:50:43:e7:b7:ec:0a:4f:86:72:15:e5:30: > c9:bb:5f:0b:83:9c:26:6f:60:49:dd:1a:7c:92:45:45:4e:b5: > ce:cd:64:8c:12:83:e9:3d:5c:6b:65:97:75:99:4c:66:eb:d0: > 3a:ca:18:62:8a:08:07:16:ab:09:66:bd:65:43:94:00:d9:79: > 3e:84:b4:60:7d:7e:f9:09:3c:fe:2d:ad:98:94:17:0c:24:8f: > e1:a2:74:b6:3b:68:c0:01:f9:67:e8:b9:d2:6a:65:9e:99:a3: > 4a:5f:39:31:ae:c1:59:02:7b:ef:db:b2:94:06:f8:1a:74:c1: > d7:5b:5b:6a > > So the DNS names it will check are: > Subject: CN = activate.redshift3d.com > DNS:activate.redshift3d.com, DNS: > www.activate.redshift3d.com > > So to summarize the checks of ssl::server_name/ will be done on: > * activate.redshift3d.com > * activate.redshift3d.com > * www.activate.redshift3d.com > > So . redshift3d.com ssl::server_name should match the certificate. 
> If for any reason it doesn't work you can try ssl::server_name_regex with > something like: > (^|\.)activate\.redshift3d\.com$ > > Or just to verify if there is a bug in squid code try: > (^|\.)activate\.redshift3d\.com > > Now, the splice should be able to take into account also dstdomain and > dstdom_regex but it should match them only if they exist in a plain text > form like in any simple forward proxy CONNECT request. > If for any reason it doesn’t work we should investigate what might cause > this issue. > > I hope the scroll I wrote make sense to you and with hopes it will clear > out the doubts about the wiki article you mentioned. > I believe this is considered a summary of the subject and if Alex and > others might think so it can be converted into an example article in the > wiki. > > Let me know if this makes sense and resolve the issue. > > Yours, > Eliezer > > From: robert k Wild > Sent: Thursday, June 29, 2023 12:18 > To: ngtech1...@gmail.com > Cc: Squid Users > Subject: Re: [squid-users] make URL bypass squid proxy > > very clever, so you bunch all the acls up > > acl NoSSLInterceptAnyOf any-of NoSSLInterceptDstDom > NoSSLInterceptDstDomFile NoSSLInterceptRegEx NoSSLInterceptRegExFile > > the key word is "any-of" ie if the url hits any one do that first > > what about instead of making it > > ssl::server_name_regex > > i make it > > dstdom_regex > > On Thu, 29 Jun 2023 at 01:38, <mailto:ngtech1...@gmail.com> wrote: > Hey Rob, > > The first thing is to allow the domain in the http_acces just to be sure > and use a basic deny all bottom line. 
> Let me try to simplify your squid.conf > In a link: > https://gist.github.com/elico/b49f4a28d4b5db5ba882b10d40872d5e > > In plain text: > ## START OF FILE > # SSL Interception basic rules > acl DiscoverSNIHost at_step SslBump1 > > acl NoSSLInterceptRegEx ssl::server_name_regex (^|.*\.)redshift3d\.com$ > acl NoSSLInterceptRegExFile ssl::server_name_regex > "/usr/local/squid/etc/no-intercept-ssl-regex.txt" > > acl NoSSLInterceptDstDom ssl::server_name .redshift3d.com > acl NoSSLInterceptDstDomFile ssl::server_name > "/usr/local/squid/etc/no-intercept-ssl-dstdom.txt" > > ## Any of will test what ever rule match first in a first match/hit fasion > acl NoSSLInterceptAnyOf any-of NoSSLInterceptDstDom > NoSSLInterceptDstDomFile NoSSLInterceptRegEx NoSSLInterceptRegExFile > > ssl_bump peek DiscoverSNIHost > ssl_bump splice NoSSLInterceptAnyOf > ssl_bump bump all > > #SSL Bump port > http_port 3128 ssl-bump cert=/usr/local/squid/etc/ssl_cert/myCA.pem > generate-host-certificates=on dynamic_cert_mem_cache_size=4MB > sslcrtd_program /usr/local/squid/libexec/security_file_certgen -s > /var/lib/ssl_db -M 4MB > > ## http_access acls, will apply
Re: [squid-users] change squid proxy into intercept instead
Great, thanks Amos for that clarification :)

On Wed, 5 Jul 2023, 18:28 Amos Jeffries, wrote:

> On 4/07/23 06:02, robert k Wild wrote:
> > hi all,
> >
> > running squid proxy with SSL-bump and some sites dont like squid being
> > MITM and it breaks the SSL cert, so i do peek/splice/bump so the SSL
> > cert goes straight to the client without squid inspecting it
> >
> > do you think if i make squid into an intercept proxy it will resolve the
> > peek/splice/bump issue ie i wont need it anymore
>
> It will not have any effect on your problem.
>
> Cheers
> Amos
[squid-users] change squid proxy into intercept instead
Hi all,

I'm running a Squid proxy with SSL bump, and some sites don't like Squid being a MITM; it breaks the SSL cert. So I do peek/splice/bump, so that the SSL cert goes straight to the client without Squid inspecting it.

Do you think that if I make Squid into an intercepting proxy it will resolve the peek/splice/bump issue, i.e. I won't need it anymore?

Thanks,
Rob

--
Regards,

Robert K Wild.
Re: [squid-users] acl follow_x_forwarded_for
Thanks Antony

On Mon, 3 Jul 2023 at 11:54, Antony Stone wrote:

> On Monday 03 July 2023 at 11:46:20, robert k Wild wrote:
>
> > hi all,
> >
> > im reading this acl
> >
> > http://www.squid-cache.org/Doc/config/follow_x_forwarded_for/
> >
> > is this to fool the dst server to think its coming from the client pc
> > instead of squid proxy
>
> No; it tells Squid to accept connection requests from other proxies which
> present the correct X-Forwarded-For header:
>
> "If a request reaches us from a source that is allowed by this directive,
> then we trust the information it provides regarding the IP of the client it
> received from (if any)."
>
> Antony.
>
> --
> Wanted: telepath. You know where to apply.
>
> Please reply to the list; please *don't* CC me.

--
Regards,

Robert K Wild.
[squid-users] acl follow_x_forwarded_for
Hi all,

I'm reading about this ACL:

http://www.squid-cache.org/Doc/config/follow_x_forwarded_for/

Is this to fool the destination server into thinking the request is coming from the client PC instead of the Squid proxy?

Thanks,
Rob

--
Regards,

Robert K Wild.
Re: [squid-users] Getting ping to work via proxy
Yes, I understand, as Squid is an HTTP/HTTPS proxy only. Even so, what if I make it into an intercepting proxy? Is there a way I can make some traffic use iptables to get to the URL instead of going through Squid?

On Sun, 2 Jul 2023, 10:02 Matus UHLAR - fantomas, wrote:

> On 01.07.23 23:29, robert k Wild wrote:
> >Will it work if I make my squid into a intercept proxy instead like below
>
> no, because of the same reason.
>
> >>> On Saturday 01 July 2023 at 22:59:43, robert k Wild wrote:
> >>> > Is there a way to get ping to work via the proxy.
>
> >> On Sat, 1 Jul 2023, 23:10 Antony Stone, <antony.st...@squid.open.source.it> wrote:
> >>> There is no such thing as an ICMP proxy.
>
> --
> Matus UHLAR - fantomas, uh...@fantomas.sk ; http://www.fantomas.sk/
> Warning: I wish NOT to receive e-mail advertising to this address.
> Varovanie: na tuto adresu chcem NEDOSTAVAT akukolvek reklamnu postu.
> The early bird may get the worm, but the second mouse gets the cheese.
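If client traffic is being intercepted via NAT rules, a bypass for specific destinations is normally done in netfilter before the redirect, not in Squid. A hedged sketch (the interface `eth1`, Squid's intercept port 3129 and the 203.0.113.0/24 destination are placeholders for whatever setup is actually in place):

```
# Let traffic to this destination pass untouched (rule inserted first) ...
iptables -t nat -I PREROUTING -i eth1 -d 203.0.113.0/24 -p tcp --dport 443 -j RETURN
# ... while everything else on port 443 is still handed to Squid
iptables -t nat -A PREROUTING -i eth1 -p tcp --dport 443 -j REDIRECT --to-port 3129
```

Note that ICMP never reaches Squid in either mode, so ping only works if ordinary routing/NAT (e.g. a MASQUERADE rule) carries it to the destination.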
Re: [squid-users] Getting ping to work via proxy
Will it work if I make my Squid into an intercepting proxy instead, like below?

https://wiki.squid-cache.org/SquidFaq/InterceptionProxy
https://wiki.squid-cache.org/ConfigExamples/Intercept/SslBumpWithIntermediateCA

Thanks

On Sat, 1 Jul 2023, 23:15 robert k Wild, wrote:

> So you can't get clients that go through the proxy server to ping
> destination servers
>
> On Sat, 1 Jul 2023, 23:10 Antony Stone, wrote:
>
>> On Saturday 01 July 2023 at 22:59:43, robert k Wild wrote:
>>
>> > Hi all,
>> >
>> > Is there a way to get ping to work via the proxy.
>>
>> There is no such thing as an ICMP proxy.
>>
>> Antony.
>>
>> --
>> "Can you keep a secret?"
>> "Well, I shouldn't really tell you this, but... no."
>>
>> Please reply to the list; please *don't* CC me.
Re: [squid-users] Getting ping to work via proxy
So you can't get clients that go through the proxy server to ping destination servers?

On Sat, 1 Jul 2023, 23:10 Antony Stone, wrote:

> On Saturday 01 July 2023 at 22:59:43, robert k Wild wrote:
>
> > Hi all,
> >
> > Is there a way to get ping to work via the proxy.
>
> There is no such thing as an ICMP proxy.
>
> Antony.
>
> --
> "Can you keep a secret?"
> "Well, I shouldn't really tell you this, but... no."
>
> Please reply to the list; please *don't* CC me.
[squid-users] Getting ping to work via proxy
Hi all,

Is there a way to get ping to work via the proxy?

Thanks,
Rob
Re: [squid-users] make URL bypass squid proxy
very clever, so you bunch all the acls up acl NoSSLInterceptAnyOf any-of NoSSLInterceptDstDom NoSSLInterceptDstDomFile NoSSLInterceptRegEx NoSSLInterceptRegExFile the key word is "any-of" ie if the url hits any one do that first what about instead of making it ssl::server_name_regex i make it *dstdom_regex* On Thu, 29 Jun 2023 at 01:38, wrote: > Hey Rob, > > The first thing is to allow the domain in the http_acces just to be sure > and use a basic deny all bottom line. > Let me try to simplify your squid.conf > In a link: > https://gist.github.com/elico/b49f4a28d4b5db5ba882b10d40872d5e > > In plain text: > ## START OF FILE > # SSL Interception basic rules > acl DiscoverSNIHost at_step SslBump1 > > acl NoSSLInterceptRegEx ssl::server_name_regex (^|.*\.)redshift3d\.com$ > acl NoSSLInterceptRegExFile ssl::server_name_regex > "/usr/local/squid/etc/no-intercept-ssl-regex.txt" > > acl NoSSLInterceptDstDom ssl::server_name .redshift3d.com > acl NoSSLInterceptDstDomFile ssl::server_name > "/usr/local/squid/etc/no-intercept-ssl-dstdom.txt" > > ## Any of will test what ever rule match first in a first match/hit fasion > acl NoSSLInterceptAnyOf any-of NoSSLInterceptDstDom > NoSSLInterceptDstDomFile NoSSLInterceptRegEx NoSSLInterceptRegExFile > > ssl_bump peek DiscoverSNIHost > ssl_bump splice NoSSLInterceptAnyOf > ssl_bump bump all > > #SSL Bump port > http_port 3128 ssl-bump cert=/usr/local/squid/etc/ssl_cert/myCA.pem > generate-host-certificates=on dynamic_cert_mem_cache_size=4MB > sslcrtd_program /usr/local/squid/libexec/security_file_certgen -s > /var/lib/ssl_db -M 4MB > > ## http_access acls, will apply on incomming requests and not on responses > acl special_url_regex url_regex https?://(^|.*\.)redshift3d\.com\/ > acl special_url_regex_file url_regex > "/usr/local/squid/etc/special_url_regex.txt" > > acl special_url_dst_dom dstdomain .redshift3d.com > acl special_url_dst_dom_file dstdomain > "/usr/local/squid/etc/special_url_dstdom.txt" > > acl special_url_any_of any-of 
special_url_dst_dom special_url_dst_dom_file > special_url_regex special_url_regex_file > > acl localnet src 192.168.0.0/16 > acl localnet src 10.0.0.0/8 > > http_access allow localnet special_url_any_of > http_access deny all > ## END OF FILE > > Once the above will work try to add other http_access rule like reply > access rules > > Let me know what happens, > Eliezer > > From: robert k Wild > Sent: Tuesday, June 27, 2023 09:36 > To: ngtech1...@gmail.com > Cc: Squid Users > Subject: Re: [squid-users] make URL bypass squid proxy > > Hi Eliezer, > > this is a snippet of my whitelist and no intercept SSL config > > #SSL Interception > acl DiscoverSNIHost at_step SslBump1 > acl NoSSLIntercept ssl::server_name_regex > "/usr/local/squid/etc/interceptssl.txt" > ssl_bump peek DiscoverSNIHost > ssl_bump splice NoSSLIntercept > ssl_bump bump all > # > #SSL Bump > http_port 3128 ssl-bump cert=/usr/local/squid/etc/ssl_cert/myCA.pem > generate-host-certificates=on dynamic_cert_mem_cache_size=4MB > sslcrtd_program /usr/local/squid/libexec/security_file_certgen -s > /var/lib/ssl_db -M 4MB > # > #deny up MIME types > acl upmime req_mime_type "/usr/local/squid/etc/mimedeny.txt" > # > #deny URL links > acl url_links url_regex "/usr/local/squid/etc/linksurl.txt" > # > #allow special URL paths > acl special_url url_regex "/usr/local/squid/etc/urlspecial.txt" > # > #deny down MIME types > acl downmime rep_mime_type "/usr/local/squid/etc/mimedeny.txt" > # > http_reply_access allow special_url > http_reply_access deny downmime > #http_access deny upmime > #http_access deny url_links > # > #HTTP_HTTPS whitelist websites > acl whitelist ssl::server_name_regex "/usr/local/squid/etc/urlwhite.txt" > # > http_access allow activation whitelist > http_access deny all > > so basically no SSL interception > > #SSL Interception > acl DiscoverSNIHost at_step SslBump1 > acl NoSSLIntercept ssl::server_name_regex > "/usr/local/squid/etc/interceptssl.txt" > ssl_bump peek DiscoverSNIHost > ssl_bump 
splice NoSSLIntercept > ssl_bump bump all > > and whitelisting > > #HTTP_HTTPS whitelist websites > acl whitelist ssl::server_name_regex "/usr/local/squid/etc/urlwhite.txt" > > in both txt files ie > > /usr/local/squid/etc/interceptssl.txt > /usr/local/squid/etc/urlwhite.txt > > i have a URL that first i have to whitelist and then if i want squid not > to inspect the url traffic i put it in the SSL interception (i do this as > some websites dont like MITM )
Re: [squid-users] make URL bypass squid proxy
OK, I've literally commented out "http_access deny all", so the proxy isn't blocking anything and is allowing everything:

http_access allow activation whitelist
#http_access deny all

And still it's not allowing this specific URL to go through the proxy:

activate.redshift3d.com

Well, it is but it isn't: as it's an activation URL, it isn't activating the app via the proxy, but as soon as I pop the PC on the internet directly, it activates the app.

Any ideas guys?

Thanks,
Rob

On Tue, 27 Jun 2023, 07:36 robert k Wild, wrote:

> Hi Eliezer,
>
> this is a snippet of my whitelist and no intercept SSL config
>
> #SSL Interception
> acl DiscoverSNIHost at_step SslBump1
> acl NoSSLIntercept ssl::server_name_regex "/usr/local/squid/etc/interceptssl.txt"
> ssl_bump peek DiscoverSNIHost
> ssl_bump splice NoSSLIntercept
> ssl_bump bump all
> #
> #SSL Bump
> http_port 3128 ssl-bump cert=/usr/local/squid/etc/ssl_cert/myCA.pem generate-host-certificates=on dynamic_cert_mem_cache_size=4MB
> sslcrtd_program /usr/local/squid/libexec/security_file_certgen -s /var/lib/ssl_db -M 4MB
> #
> #deny up MIME types
> acl upmime req_mime_type "/usr/local/squid/etc/mimedeny.txt"
> #
> #deny URL links
> acl url_links url_regex "/usr/local/squid/etc/linksurl.txt"
> #
> #allow special URL paths
> acl special_url url_regex "/usr/local/squid/etc/urlspecial.txt"
> #
> #deny down MIME types
> acl downmime rep_mime_type "/usr/local/squid/etc/mimedeny.txt"
> #
> http_reply_access allow special_url
> http_reply_access deny downmime
> #http_access deny upmime
> #http_access deny url_links
> #
> #HTTP_HTTPS whitelist websites
> acl whitelist ssl::server_name_regex "/usr/local/squid/etc/urlwhite.txt"
> #
> http_access allow activation whitelist
> http_access deny all
>
> so basically no SSL interception
>
> #SSL Interception
> acl DiscoverSNIHost at_step SslBump1
> acl NoSSLIntercept ssl::server_name_regex "/usr/local/squid/etc/interceptssl.txt"
> ssl_bump peek DiscoverSNIHost
> ssl_bump splice NoSSLIntercept
> ssl_bump bump all
> and whitelisting > > #HTTP_HTTPS whitelist websites > acl whitelist ssl::server_name_regex "/usr/local/squid/etc/urlwhite.txt" > > in both txt files ie > > /usr/local/squid/etc/interceptssl.txt > /usr/local/squid/etc/urlwhite.txt > > i have a URL that first i have to whitelist and then if i want squid not > to inspect the url traffic i put it in the SSL interception (i do this as > some websites dont like MITM ) > > but even putting the URL in question in both files im still having issues > with this website ie its still being detected that its passing through a > proxy > > thanks, > rob > > On Mon, 26 Jun 2023 at 23:35, wrote: > >> Hey Robert, >> >> >> >> I am not sure what forward proxy setup you have there. >> >> A simple forward proxy? >> >> What tool are you using for whitelisting? >> >> You can use an external acl helper to allow dynamic updates of the >> whitelists or >> to periodic update your lists and reload. >> It will depend on the size of your lists. >> What OS are you using for your squid proxy? >> >> >> >> More details will help us help you. >> >> >> >> Eliezer >> >> >> >> *From:* squid-users *On >> Behalf Of *robert k Wild >> *Sent:* Monday, June 26, 2023 22:25 >> *To:* Squid Users >> *Subject:* [squid-users] make URL bypass squid proxy >> >> >> >> hi all, >> >> >> >> i have set up squid for url whitelisting and no intercept SSL (see below) >> >> >> >> https://wiki.squid-cache.org/ConfigExamples/Caching/AdobeProducts >> >> >> >> but some websites i want the client to bypass the squid proxy and go >> straight to the website as i think this is why a url isnt working even when >> i add the url to both files ie urlwhite and no intercept SSL >> >> >> >> >> >> >> >> thanks, >> >> rob >> >> >> -- >> >> Regards, >> >> Robert K Wild. >> > > > -- > Regards, > > Robert K Wild. > ___ squid-users mailing list squid-users@lists.squid-cache.org http://lists.squid-cache.org/listinfo/squid-users
Re: [squid-users] make URL bypass squid proxy
Hi Eliezer,

this is a snippet of my whitelist and no-intercept SSL config:

#SSL Interception
acl DiscoverSNIHost at_step SslBump1
acl NoSSLIntercept ssl::server_name_regex "/usr/local/squid/etc/interceptssl.txt"
ssl_bump peek DiscoverSNIHost
ssl_bump splice NoSSLIntercept
ssl_bump bump all
#
#SSL Bump
http_port 3128 ssl-bump cert=/usr/local/squid/etc/ssl_cert/myCA.pem generate-host-certificates=on dynamic_cert_mem_cache_size=4MB
sslcrtd_program /usr/local/squid/libexec/security_file_certgen -s /var/lib/ssl_db -M 4MB
#
#deny up MIME types
acl upmime req_mime_type "/usr/local/squid/etc/mimedeny.txt"
#
#deny URL links
acl url_links url_regex "/usr/local/squid/etc/linksurl.txt"
#
#allow special URL paths
acl special_url url_regex "/usr/local/squid/etc/urlspecial.txt"
#
#deny down MIME types
acl downmime rep_mime_type "/usr/local/squid/etc/mimedeny.txt"
#
http_reply_access allow special_url
http_reply_access deny downmime
#http_access deny upmime
#http_access deny url_links
#
#HTTP_HTTPS whitelist websites
acl whitelist ssl::server_name_regex "/usr/local/squid/etc/urlwhite.txt"
#
http_access allow activation whitelist
http_access deny all

so basically no SSL interception:

#SSL Interception
acl DiscoverSNIHost at_step SslBump1
acl NoSSLIntercept ssl::server_name_regex "/usr/local/squid/etc/interceptssl.txt"
ssl_bump peek DiscoverSNIHost
ssl_bump splice NoSSLIntercept
ssl_bump bump all

and whitelisting:

#HTTP_HTTPS whitelist websites
acl whitelist ssl::server_name_regex "/usr/local/squid/etc/urlwhite.txt"

in both txt files, i.e.

/usr/local/squid/etc/interceptssl.txt
/usr/local/squid/etc/urlwhite.txt

I have a URL that I first have to whitelist, and then if I want Squid not to inspect its traffic I also put it in the SSL interception file (I do this as some websites don't like MITM).

But even with the URL in question in both files, I'm still having issues with this website, i.e. it is still being detected that it's passing through a proxy.

thanks,
rob

On Mon, 26 Jun 2023 at 23:35, wrote:
> Hey Robert,
>
> I am not sure what forward proxy setup you have there.
> A simple forward proxy?
> What tool are you using for whitelisting?
>
> You can use an external acl helper to allow dynamic updates of the whitelists, or periodically update your lists and reload.
> It will depend on the size of your lists.
> What OS are you using for your squid proxy?
>
> More details will help us help you.
>
> Eliezer
>
> *From:* squid-users *On Behalf Of *robert k Wild
> *Sent:* Monday, June 26, 2023 22:25
> *To:* Squid Users
> *Subject:* [squid-users] make URL bypass squid proxy
>
> hi all,
>
> I have set up squid for URL whitelisting and no-intercept SSL (see below):
>
> https://wiki.squid-cache.org/ConfigExamples/Caching/AdobeProducts
>
> But for some websites I want the client to bypass the squid proxy and go straight to the website, as I think this is why a URL isn't working even when I add it to both files, i.e. urlwhite and no-intercept SSL.
>
> thanks,
> rob
>
> --
> Regards,
> Robert K Wild.

--
Regards,
Robert K Wild.
[squid-users] make URL bypass squid proxy
hi all,

I have set up squid for URL whitelisting and no-intercept SSL (see below):

https://wiki.squid-cache.org/ConfigExamples/Caching/AdobeProducts

But for some websites I want the client to bypass the squid proxy and go straight to the website, as I think this is why a URL isn't working even when I add it to both files, i.e. urlwhite and no-intercept SSL.

thanks,
rob

--
Regards,
Robert K Wild.
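[Editor's note: one client-side way to get the true bypass asked for above is a PAC (proxy auto-config) file: the browser or OS decides per host whether to use the proxy at all, so the bypassed host never touches Squid. A minimal sketch; the proxy address and host list are placeholders, and a production PAC would usually use the built-in `dnsDomainIs()` helper rather than plain string methods:]

```javascript
// Minimal PAC sketch. "squidproxy.example.local:3128" and the host
// list are illustrative values, not ones taken from this thread.
function FindProxyForURL(url, host) {
  var directHosts = ["redshift3d.com"]; // domains that must bypass Squid
  for (var i = 0; i < directHosts.length; i++) {
    var d = directHosts[i];
    if (host === d || host.endsWith("." + d)) {
      return "DIRECT"; // straight out, no proxy
    }
  }
  return "PROXY squidproxy.example.local:3128"; // everything else via Squid
}
```

Serve the file from a web server (or point the browser at it directly) and the proxy exception is enforced before Squid ever sees the connection.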
Re: [squid-users] Are you using ESI?
ESI in Squid hasn't had investment for nigh on twenty years now. Not a bad run.

I have no idea if there are users out there... Certainly no one has reached out to me about it for many years.

I think newer models of edge compute like WASM and ICAP are generally better: narrower interfaces that unlock huge potential.

The implementation in Squid should be salvageable if folk are interested - I don't remember it being all that bad - but I certainly don't have time to offer to do that.

HTH,
Rob

On Fri, 24 Feb 2023 at 23:50, Alex Rousskov <rouss...@measurement-factory.com> wrote:
> Hello,
>
> ESI support in Squid has been a significant source of problems for many years. One of the biggest problems is affecting a lot of code that has nothing to do with the ESI module! I see no signs of a significant ESI user base, but some users may still exist. Before proposing to remove ESI support from Squid, I would like to better estimate the negative impact of that removal on existing Squid installations.
>
> If your Squid installation uses ESI features, please respond (in private if necessary). How would ESI removal affect your Squids? Would you be willing and able to rewrite the ESI module integration with Squid primary APIs (or hire a developer capable of such a serious project)?
>
> Thank you,
>
> Alex.
Re: [squid-users] repos/packages to install squid from source
sorry, I forgot to mention: it's Alma Linux

On Thu, 16 Feb 2023 at 19:38, Alex Rousskov <rouss...@measurement-factory.com> wrote:
> On 2/16/23 11:41, robert k Wild wrote:
>
> > installing squid from source and just want to know what default repos/packages i should have before i install
> >
> > atm i have these
> >
> > #install epel repository
> > dnf install -y epel-release
> > #install squid packages
> > dnf install -y gcc-c++ gcc g++ binutils make sudo wget tar automake autoconf perl libxml2-devel libcap-devel openssl openssl-devel libarchive
> > #install clamAV packages
> > dnf install -y clamd clamav clamav-data clamav-devel clamav-lib clamav-update
> >
> > do i need anymore?
>
> Squid does not really have a concept of "default" dependencies. In practice, the list of dependencies is determined by what you need your Squid to do.
>
> FWIW, some of the Squid CI tests running on Ubuntu install the following packages, but this is not a complete list of everything that a particular Squid may need or may use:
>
> comerr-dev:amd64
> debhelper
> dh-apparmor
> dh-autoreconf
> dh-strip-nondeterminism
> dwz
> gettext
> intltool-debian
> krb5-multidev:amd64
> libarchive-zip-perl
> libcap-dev:amd64
> libcppunit-1.15-0:amd64
> libcppunit-dev:amd64
> libdebhelper-perl
> libecap3-dev:amd64
> libecap3:amd64
> libevent-2.1-7:amd64
> libfile-stripnondeterminism-perl
> libfl2:amd64
> libgnutls-dane0:amd64
> libgnutls-openssl27:amd64
> libgnutls28-dev:amd64
> libgnutlsxx28:amd64
> libgssrpc4:amd64
> libidn2-dev:amd64
> libkadm5clnt-mit12:amd64
> libkadm5srv-mit12:amd64
> libkdb5-10:amd64
> libkrb5-dev:amd64
> libldap-dev:amd64
> libldap2-dev
> libnetfilter-conntrack-dev:amd64
> libnfnetlink-dev
> libosp5
> libp11-kit-dev:amd64
> libpam0g-dev:amd64
> libsasl2-dev
> libsub-override-perl
> libsystemd-dev:amd64
> libtasn1-6-dev:amd64
> libtdb-dev:amd64
> libunbound8:amd64
> linuxdoc-tools
> nettle-dev:amd64
> opensp
> po-debconf
> sgml-base
> sgml-data
> xml-core
>
> Most of the above come from "apt-get build-dep squid" executed in that CI environment.
>
> HTH,
>
> Alex.

--
Regards,
Robert K Wild.
[squid-users] repos/packages to install squid from source
hi all,

installing squid from source, and I just want to know what default repos/packages I should have before I install.

atm I have these:

#install epel repository
dnf install -y epel-release
#install squid packages
dnf install -y gcc-c++ gcc g++ binutils make sudo wget tar automake autoconf perl libxml2-devel libcap-devel openssl openssl-devel libarchive
#install clamAV packages
dnf install -y clamd clamav clamav-data clamav-devel clamav-lib clamav-update

do I need any more?

thanks,
rob

--
Regards,
Robert K Wild.
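[Editor's note: a few additions commonly turn out to be needed for this kind of build on EL-family systems. These are assumptions based on typical ./configure failures, not a verified list: pkg-config for dependency detection, libtool/libtool-ltdl-devel for c-icap modules, and on Alma/Rocky 9 the CRB repository, which hosts many -devel packages:]

```
# Hypothetical additions - verify against your own ./configure output
dnf install -y dnf-plugins-core
dnf config-manager --set-enabled crb   # Alma/Rocky 9 ("powertools" on 8)
dnf install -y pkgconfig libtool libtool-ltdl-devel bzip2-devel
```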
[squid-users] packages for alma linux
hi all,

before I compile squid with ssl bump, c-icap and squidclamav, I want to get the prerequisite packages first. Let me know if I have missed any out:

#install epel repository
dnf install -y epel-release
#install squid packages
dnf install -y gcc-c++ gcc g++ binutils make sudo wget tar automake autoconf perl libxml2-devel libcap-devel openssl openssl-devel
#install clamAV packages
dnf install -y clamd clamav clamav-data clamav-devel clamav-lib clamav-update

thanks,
rob

--
Regards,
Robert K Wild.
Re: [squid-users] server_name_regex acl doesnt work anymore
I've sorted it: I had to put quotes around my file path to the URL whitelist.

On Thu, 12 Jan 2023, 15:22 robert k Wild, wrote:
> hi all,
>
> I have no idea why, but my acl for the URL whitelist doesn't work anymore.
>
> This is the output of my parse:
>
> /usr/local/squid/sbin/squid -k parse
> 2023/01/12 15:10:56| Startup: Initializing Authentication Schemes ...
> 2023/01/12 15:10:56| Startup: Initialized Authentication Scheme 'basic'
> 2023/01/12 15:10:56| Startup: Initialized Authentication Scheme 'digest'
> 2023/01/12 15:10:56| Startup: Initialized Authentication Scheme 'negotiate'
> 2023/01/12 15:10:56| Startup: Initialized Authentication Scheme 'ntlm'
> 2023/01/12 15:10:56| Startup: Initialized Authentication.
> 2023/01/12 15:10:56| Processing Configuration File: /usr/local/squid/etc/squid.conf (depth 0)
> 2023/01/12 15:10:56| Processing: acl localnet src 0.0.0.1-0.255.255.255 # RFC 1122 "this" network (LAN)
> 2023/01/12 15:10:56| Processing: acl localnet src 10.0.0.0/8 # RFC 1918 local private network (LAN)
> 2023/01/12 15:10:56| Processing: acl localnet src 100.64.0.0/10 # RFC 6598 shared address space (CGN)
> 2023/01/12 15:10:56| Processing: acl localnet src 169.254.0.0/16 # RFC 3927 link-local (directly plugged) machines
> 2023/01/12 15:10:56| Processing: acl localnet src 172.16.0.0/12 # RFC 1918 local private network (LAN)
> 2023/01/12 15:10:56| Processing: acl localnet src 192.168.0.0/16 # RFC 1918 local private network (LAN)
> 2023/01/12 15:10:56| Processing: acl localnet src fc00::/7 # RFC 4193 local private network range
> 2023/01/12 15:10:56| Processing: acl localnet src fe80::/10 # RFC 4291 link-local (directly plugged) machines
> 2023/01/12 15:10:56| Processing: acl SSL_ports port 443
> 2023/01/12 15:10:56| Processing: acl Safe_ports port 80 # http
> 2023/01/12 15:10:56| Processing: acl Safe_ports port 21 # ftp
> 2023/01/12 15:10:56| Processing: acl Safe_ports port 443 # https
> 2023/01/12 15:10:56| Processing: acl Safe_ports port 70 # gopher
> 2023/01/12 15:10:56| Processing: acl Safe_ports port 210 # wais
> 2023/01/12 15:10:56| Processing: acl Safe_ports port 1025-65535 # unregistered ports
> 2023/01/12 15:10:56| Processing: acl Safe_ports port 280 # http-mgmt
> 2023/01/12 15:10:56| Processing: acl Safe_ports port 488 # gss-http
> 2023/01/12 15:10:56| Processing: acl Safe_ports port 591 # filemaker
> 2023/01/12 15:10:56| Processing: acl Safe_ports port 777 # multiling http
> 2023/01/12 15:10:56| Processing: acl CONNECT method CONNECT
> 2023/01/12 15:10:56| Processing: http_access allow localhost manager
> 2023/01/12 15:10:56| Processing: http_access deny manager
> 2023/01/12 15:10:56| Processing: include /usr/local/squid/etc/squidrules.conf
> 2023/01/12 15:10:56| Processing Configuration File: /usr/local/squid/etc/squidrules.conf (depth 1)
> 2023/01/12 15:10:56| Processing: acl DiscoverSNIHost at_step SslBump1
> 2023/01/12 15:10:56| Processing: acl NoSSLIntercept ssl::server_name_regex /usr/local/squid/etc/pubkey.txt
> 2023/01/12 15:10:56| Processing: ssl_bump peek DiscoverSNIHost
> 2023/01/12 15:10:56| Processing: ssl_bump splice NoSSLIntercept
> 2023/01/12 15:10:56| Processing: ssl_bump bump all
> 2023/01/12 15:10:56| Processing: http_port 3128 ssl-bump cert=/usr/local/squid/etc/ssl_cert/myCA.pem generate-host-certificates=on dynamic_cert_mem_cache_size=4MB
> 2023/01/12 15:10:56| Processing: sslcrtd_program /usr/local/squid/libexec/security_file_certgen -s /var/lib/ssl_db -M 4MB
> 2023/01/12 15:10:56| Processing: acl upmime req_mime_type /usr/local/squid/etc/mimedeny.txt
> 2023/01/12 15:10:56| Processing: acl url_links url_regex /usr/local/squid/etc/linksurl.txt
> 2023/01/12 15:10:56| Processing: acl special_url url_regex /usr/local/squid/etc/urlspecial.txt
> 2023/01/12 15:10:56| Processing: acl downmime rep_mime_type /usr/local/squid/etc/mimedeny.txt
> 2023/01/12 15:10:56| Processing: http_reply_access allow special_url
> 2023/01/12 15:10:56| Processing: http_reply_access deny downmime
> 2023/01/12 15:10:56| Processing: acl whitelist ssl::server_name_regex /usr/local/squid/etc/urlwhite.txt
> 2023/01/12 15:10:56| Processing: acl activation port 80 443
> 2023/01/12 15:10:56| Processing: http_access allow activation whitelist
> 2023/01/12 15:10:56| Processing: http_access deny all
> 2023/01/12 15:10:56| Processing: http_access allow localnet
> 2023/01/12 15:10:56| Processing: http_access allow localhost
> 2023/01/12 15:10:56| Processing: http_access deny all
> 2023/01/12 15:10:56| Processing: coredum
[squid-users] server_name_regex acl doesnt work anymore
2023/01/12 15:10:56| Processing: icap_send_client_username on
2023/01/12 15:10:56| Processing: icap_client_username_header X-Authenticated-User
2023/01/12 15:10:56| Processing: icap_service service_req reqmod_precache bypass=0 icap://127.0.0.1:1344/squidclamav
2023/01/12 15:10:56| Processing: adaptation_access service_req allow all
2023/01/12 15:10:56| Processing: icap_service service_resp respmod_precache bypass=0 icap://127.0.0.1:1344/squidclamav
2023/01/12 15:10:56| Processing: adaptation_access service_resp allow all
2023/01/12 15:10:56| Initializing https:// proxy context
2023/01/12 15:10:56| Initializing http_port [::]:3128 TLS contexts
2023/01/12 15:10:56| Using certificate in /usr/local/squid/etc/ssl_cert/myCA.pem
2023/01/12 15:10:56| Using certificate chain in /usr/local/squid/etc/ssl_cert/myCA.pem
2023/01/12 15:10:56| Adding issuer CA: /C=XX/L=Default City/O=Default Company Ltd
2023/01/12 15:10:56| Using key in /usr/local/squid/etc/ssl_cert/myCA.pem

The acl is:

acl whitelist ssl::server_name_regex /usr/local/squid/etc/urlwhite.txt

and in the URL whitelist file there is:

adobe.com
(^|\.)adobe.com$

But when I try to access "adobe.com" in my browser, I get the proxy "access denied" page.

Can anyone shed some light, as I'm struggling to sort this out?

thanks,
rob

--
Regards,
Robert K Wild.
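[Editor's note: the fix Rob reports in his follow-up (quoting the file path) is consistent with how Squid parses file-backed ACLs: without double quotes, the token after `ssl::server_name_regex` is treated as a literal regex pattern rather than a file to read patterns from. A minimal sketch of the difference; treat this as the working form rather than an authoritative explanation of the parser:]

```
# Treated as one literal regex pattern (matches no real SNI):
#acl whitelist ssl::server_name_regex /usr/local/squid/etc/urlwhite.txt

# Reads regex patterns from the file, one per line:
acl whitelist ssl::server_name_regex "/usr/local/squid/etc/urlwhite.txt"
```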
[squid-users] eicar website not reachable
hi all,

I have made a whitelist to go to some websites. At the moment I have temporarily commented out "http_access deny all", to allow going to any website.

When I browse the web I can go to Gmail, O365, HSBC, Facebook, etc., but when I try to go to "eicar.org" I get the following error:

The following error was encountered while trying to retrieve the URL: https://www.eicar.org/*

Connection to 2a00:1828:1000:2497::2 failed.

The system returned: (101) Network is unreachable

The remote host or network may be down. Please try the request again.

Your cache administrator is webmaster.

I'm running squid 5.7. Any help would be greatly appreciated.

thanks,
rob

--
Regards,
Robert K Wild.
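[Editor's note: the error shows Squid reaching www.eicar.org via an IPv6 address (2a00:1828:...) on a host with no working IPv6 route. The clean fix is to give the proxy host real IPv6 connectivity; a blunt host-level workaround, if IPv6 is genuinely unused on that network, is to disable it via sysctl. This is a general-Linux sketch, not Squid-specific advice:]

```
# /etc/sysctl.d/90-disable-ipv6.conf -- blunt workaround, only if the
# network really has no IPv6 connectivity; apply with "sysctl --system"
net.ipv6.conf.all.disable_ipv6 = 1
net.ipv6.conf.default.disable_ipv6 = 1
```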
Re: [squid-users] ubuntu ecap clamAV adapter
Thanks Rafael,

Do you think I should compile all packages from source, i.e. squid, ecap, and the ecap clamAV module, or just download the first two via apt and compile only the last one?

Also, I imagine I need to install clamAV from apt as well.

Thanks,
rob

On Thu, 24 Nov 2022, 18:35 Rafael Akchurin, wrote:
> Hello Robert,
>
> Maybe this will be of some help; this is how we compile the eCAP adapter for Squid:
> https://github.com/diladele/websafety/blob/master/core.ubuntu20/03_clamav.sh
>
> If you need to compile Squid too, also look at:
> https://github.com/diladele/squid-ubuntu
>
> Best regards,
> Rafael
>
> *From:* squid-users *On Behalf Of *robert k Wild
> *Sent:* Thursday, November 24, 2022 7:23 PM
> *To:* Squid Users
> *Subject:* [squid-users] ubuntu ecap clamAV adapter
>
> hi all,
>
> I'm trying to install squid and ecap with the clamAV adapter.
>
> I noticed that when I install squid it already comes with libecap, so all I need to do is install the clamAV adapter.
>
> So do I just need to do this:
>
> wget https://www.e-cap.org/archive/ecap_clamav_adapter-2.0.0.tar.gz
>
> But when I run:
>
> ./configure
>
> I get this error:
>
> configure: error: in `/root/ecap_clamav_adapter-2.0.0':
> configure: error: The pkg-config script could not be found or is too old. Make sure it
> is in your PATH or set the PKG_CONFIG environment variable to the full
> path to pkg-config.
>
> Alternatively, you may set the environment variables LIBECAP_CFLAGS
> and LIBECAP_LIBS to avoid the need to call pkg-config.
> See the pkg-config man page for more details.
>
> To get pkg-config, see <http://pkg-config.freedesktop.org/>.
> See `config.log' for more details
>
> I've looked at their website, but it's for the sample adapter.
>
> any help would be great
>
> thanks,
> rob
>
> --
> Regards,
> Robert K Wild.
[squid-users] ubuntu ecap clamAV adapter
hi all,

I'm trying to install squid and ecap with the clamAV adapter.

I noticed that when I install squid it already comes with libecap, so all I need to do is install the clamAV adapter.

So do I just need to do this:

wget https://www.e-cap.org/archive/ecap_clamav_adapter-2.0.0.tar.gz

But when I run:

./configure

I get this error:

configure: error: in `/root/ecap_clamav_adapter-2.0.0':
configure: error: The pkg-config script could not be found or is too old. Make sure it
is in your PATH or set the PKG_CONFIG environment variable to the full
path to pkg-config.

Alternatively, you may set the environment variables LIBECAP_CFLAGS
and LIBECAP_LIBS to avoid the need to call pkg-config.
See the pkg-config man page for more details.

To get pkg-config, see <http://pkg-config.freedesktop.org/>.
See `config.log' for more details

I've looked at their website, but it's for the sample adapter.

any help would be great

thanks,
rob

--
Regards,
Robert K Wild.
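[Editor's note: the configure error above is usually cured by installing pkg-config plus the libecap development package, and, if libecap was built from source, by telling pkg-config where its .pc file lives. A sketch for Ubuntu; the package names come from the Ubuntu archive, and the /usr/local path is an assumption for a source-built libecap:]

```
apt install -y pkg-config libecap3-dev clamav libclamav-dev
# if libecap was instead compiled from source into /usr/local:
export PKG_CONFIG_PATH=/usr/local/lib/pkgconfig
./configure
```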
Re: [squid-users] ubuntu squid - proxy refusing
LOL, ok, sorted: I uncommented "http_access allow localnet", reloaded the service, and job done.

On Thu, 24 Nov 2022 at 17:40, robert k Wild wrote:
> hi all,
>
> I have installed squid on ubuntu 22.04 and haven't made any changes; all I have done is:
>
> apt install squid
> systemctl start squid
>
> It's obviously not like centos 7, where out of the box it just works, i.e. doesn't block anything.
>
> On certain websites I get the squid "access is denied" page, but most of the time I get:
>
> The proxy server is refusing connections
>
> Why is this, please?
>
> thanks,
> rob
>
> --
> Regards,
> Robert K Wild.

--
Regards,
Robert K Wild.
[squid-users] ubuntu squid - proxy refusing
hi all,

I have installed squid on ubuntu 22.04 and haven't made any changes; all I have done is:

apt install squid
systemctl start squid

It's obviously not like centos 7, where out of the box it just works, i.e. doesn't block anything.

On certain websites I get the squid "access is denied" page, but most of the time I get:

The proxy server is refusing connections

Why is this, please?

thanks,
rob

--
Regards,
Robert K Wild.
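[Editor's note: as the follow-up in this thread shows, the stock Ubuntu squid.conf ships with `http_access allow localnet` commented out, so LAN clients are denied. "The proxy server is refusing connections", by contrast, usually means the browser is pointed at the wrong host or port (Squid listens on 3128 by default). A minimal sketch of the relevant squid.conf lines, assuming a 192.168.0.0/16 LAN:]

```
acl localnet src 192.168.0.0/16   # adjust to your LAN range
http_access allow localnet
http_access allow localhost
http_access deny all
http_port 3128
```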
Re: [squid-users] moving squid from centos 7 to ubuntu 22.04
thanks Amos,

sorry, I'm so used to downloading and compiling packages, as centos never comes with the latest packages.

But I still need to download/install the clamAV package, as I imagine the clamAV adapter still needs the clamAV package itself.

On Mon, 21 Nov 2022 at 14:08, Matus UHLAR - fantomas wrote:
> On 21.11.22 13:55, robert k Wild wrote:
> > so to get ecap working instead of icap
> >
> > from this
> >
> > wget http://www.squid-cache.org/Versions/v4/squid-4.17.tar.gz
> > wget http://sourceforge.net/projects/c-icap/files/c-icap/0.5.x/c_icap-0.5.10.tar.gz --no-check-certificate
> > wget http://sourceforge.net/projects/c-icap/files/c-icap-modules/0.5.x/c_icap_modules-0.5.5.tar.gz --no-check-certificate
> > wget https://sourceforge.net/projects/squidclamav/files/squidclamav/7.1/squidclamav-7.1.tar.gz --no-check-certificate
> >
> > to this
> >
> > wget http://www.squid-cache.org/Versions/v4/squid-4.17.tar.gz
> > wget https://www.e-cap.org/archive/libecap-1.0.0.tar.gz
> > wget https://www.e-cap.org/archive/ecap_clamav_adapter-2.0.0.tar.gz
> >
> > which guide is best to follow for ecap, as there are two:
> >
> > https://wiki.squid-cache.org/Features/eCAP
> > https://wiki.squid-cache.org/ConfigExamples/ContentAdaptation/eCAP
>
> if you are moving to ubuntu, shouldn't you check if the squid package in ubuntu doesn't already support it?
>
> https://packages.ubuntu.com/jammy/squid
> seems to support icap/ecap and is a newer version
>
> https://packages.ubuntu.com/jammy/libc-icap-mod-virus-scan
> seems to support clamav and icap
>
> > On Fri, 18 Nov 2022 at 07:33, robert k Wild wrote:
> >> thats fine, youve been more than helpful, thank you
> >> this is where i learnt how to run squid with c-icap
> >> https://squidclamav.darold.net/documentation.html
> >> have you got a good how-to about running squid with eCAP
> >> whats the difference anyway between ICAP and eCAP?
> >
> >>> On 17/11/2022 9:21 pm, robert k Wild wrote:
> >>> > Wow thanks Amos so much for this,
> >>> > You think if I build it on rocky Linux, it would be easier?
> >
> >> On Thu, 17 Nov 2022 at 09:29, Amos Jeffries wrote:
> >>> I am not familiar with Rocky Linux beyond its existence.
> >>> I expect it would be similar to CentOS since both are in the RHEL family.
>
> --
> Matus UHLAR - fantomas, uh...@fantomas.sk ; http://www.fantomas.sk/
> Warning: I wish NOT to receive e-mail advertising to this address.
> Varovanie: na tuto adresu chcem NEDOSTAVAT akukolvek reklamnu postu.
> Micro$oft random number generator: 0, 0, 0, 4.33e+67, 0, 0, 0...

--
Regards,
Robert K Wild.
> > >>> On 17/11/2022 9:21 pm, robert k Wild wrote: > >>> > Wow thanks Amos so much for this, > >>> > > >>> > You think if I build it on rocky Linux, it would be easier? > > >> On Thu, 17 Nov 2022 at 09:29, Amos Jeffries > wrote: > >>> I am not familiar with Rocky Linux beyond its existence. > >>> I expect it would be similar to CentOS since both are in the RHEL > family. > > > -- > Matus UHLAR - fantomas, uh...@fantomas.sk ; http://www.fantomas.sk/ > Warning: I wish NOT to receive e-mail advertising to this address. > Varovanie: na tuto adresu chcem NEDOSTAVAT akukolvek reklamnu postu. > Micro$oft random number generator: 0, 0, 0, 4.33e+67, 0, 0, 0... > ___ > squid-users mailing list > squid-users@lists.squid-cache.org > http://lists.squid-cache.org/listinfo/squid-users > -- Regards, Robert K Wild. ___ squid-users mailing list squid-users@lists.squid-cache.org http://lists.squid-cache.org/listinfo/squid-users
Re: [squid-users] moving squid from centos 7 to ubuntu 22.04
so to get ecap working instead of icap, from this:

wget http://www.squid-cache.org/Versions/v4/squid-4.17.tar.gz
wget http://sourceforge.net/projects/c-icap/files/c-icap/0.5.x/c_icap-0.5.10.tar.gz --no-check-certificate
wget http://sourceforge.net/projects/c-icap/files/c-icap-modules/0.5.x/c_icap_modules-0.5.5.tar.gz --no-check-certificate
wget https://sourceforge.net/projects/squidclamav/files/squidclamav/7.1/squidclamav-7.1.tar.gz --no-check-certificate

to this:

wget http://www.squid-cache.org/Versions/v4/squid-4.17.tar.gz
wget https://www.e-cap.org/archive/libecap-1.0.0.tar.gz
wget https://www.e-cap.org/archive/ecap_clamav_adapter-2.0.0.tar.gz

Which guide is best to follow for ecap, as there are two:

https://wiki.squid-cache.org/Features/eCAP

https://wiki.squid-cache.org/ConfigExamples/ContentAdaptation/eCAP

On Fri, 18 Nov 2022 at 07:33, robert k Wild wrote:
> Hi Amos,
>
> thats fine, you've been more than helpful, thank you
>
> this is where I learnt how to run squid with c-icap:
>
> https://squidclamav.darold.net/documentation.html
>
> Have you got a good how-to about running squid with eCAP?
>
> What's the difference anyway between ICAP and eCAP?
>
> On Thu, 17 Nov 2022 at 09:29, Amos Jeffries wrote:
>> On 17/11/2022 9:21 pm, robert k Wild wrote:
>> > Wow thanks Amos so much for this,
>> >
>> > You think if I build it on rocky Linux, it would be easier?
>>
>> I am not familiar with Rocky Linux beyond its existence.
>> I expect it would be similar to CentOS since both are in the RHEL family.
>>
>> Amos

--
Regards,
Robert K Wild.
Re: [squid-users] moving squid from centos 7 to ubuntu 22.04
Hi Amos,

that's fine, you've been more than helpful, thank you.

This is where I learnt how to run squid with c-icap:

https://squidclamav.darold.net/documentation.html

Have you got a good how-to about running squid with eCAP?

What's the difference anyway between ICAP and eCAP?

On Thu, 17 Nov 2022 at 09:29, Amos Jeffries wrote:
> On 17/11/2022 9:21 pm, robert k Wild wrote:
> > Wow thanks Amos so much for this,
> >
> > You think if I build it on rocky Linux, it would be easier?
>
> I am not familiar with Rocky Linux beyond its existence.
> I expect it would be similar to CentOS since both are in the RHEL family.
>
> Amos

--
Regards,
Robert K Wild.
Re: [squid-users] moving squid from centos 7 to ubuntu 22.04
Wow, thanks Amos so much for this.

You think if I build it on rocky Linux, it would be easier?

On Thu, 17 Nov 2022, 06:07 Amos Jeffries, wrote:
> On 16/11/2022 6:31 am, robert k Wild wrote:
> > hi all,
> >
> > atm i have written a script: once you have built a centos 7 VM, you just run the script and after the reboot it is a complete running squidclamAV server.
> >
> > i'm going to be moving the script to an ubuntu server as centos 7 is dead now (and as i run clamAV on it, clamAV will stop getting virus definitions in 2024; i use this for virus scanning of internet packets).
> >
> > i just want to know what lines i need to adjust to work with ubuntu instead of centos; obviously i know it is apt install instead of yum install.
>
> My comments below assume that you want to keep the exact versions as-is and custom build.
>
> Otherwise, if you are okay following Ubuntu's official packages and security fixes things could be a lot different (and simpler).
>
> > heres my long script
> >
> > #!/bin/bash
> > #
> > #this script will download/install and configure the following packages
> > #
> > #squid - proxy server
> > #squid ssl bump - intercept HTTPS traffic
> > #clamAV - antivirus engine inc trojans,viruses,malware
> > #c-icap - icap server
> > #squidclamav - that integrates all the above in squid
>
> You may not be aware squidclamav has been replaced with the eCAP ClamAV module:
> <https://www.e-cap.org/downloads/>
>
> Ubuntu provides a libecap package and Squid has support auto-enabled for it.
> So all you should need to do is build the ecap-clamav adapter and configure it for use.
>
> > #whitelist URL's
> > #deny MIME types
> > #
> > #on the PROD host you only need squid
> > #
> > #first things first lets disable firewalld and SElinux
> > #
> > systemctl stop firewalld
> > systemctl disable firewalld
> > sed -i -e 's/SELINUX=enforcing/SELINUX=disabled/g' /etc/selinux/config
> > #
> > #squid packages
> > #
> > yum install -y epel-release screen rsync net-tools ethtool swaks sed tar zip unzip curl telnet openssl openssl-devel bzip2-devel libarchive libarchive-devel perl perl-Data-Dumper gcc gcc-c++ binutils autoconf automake make sudo wget libxml2-devel libcap-devel libtool-ltdl-devel
> > #
>
> Drop "epel-release" as irrelevant on Ubuntu.
>
> Ubuntu developer packages have a "-dev" suffix instead of "-devel". So all those should change.
>
> To get access to simpler source building I recommend altering the apt configuration like so:
>
> sudo sed --in-place -E 's/# (deb-src.*updates main)/ \1/g' /etc/apt/sources.list
> sudo apt-get --quiet=2 update
>
> There are some trivial package naming differences. When apt complains about not finding a package you can use <https://packages.ubuntu.com/search> to search for the Ubuntu naming and/or any alternatives.
>
> Many of those are not related to Squid in any way. Perhaps separate them into a different install command?
>
> After the above deb-src change the packages needed to build Squid for Ubuntu can be installed like so:
>
> sudo apt-get --quiet=2 build-dep squid
>
> Similar commands also for clamav, c-icap and any others which Ubuntu provides packages for.
>
> After that build-dep command you only need to install dependencies if the Ubuntu package lacks support.
> For example, Ubuntu older than 21.10 lacks openssl natively, so "apt install libssl-dev" may be needed specially.
>
> > #clamAV packages
> > #
> > yum install -y clamav-server clamav-data clamav-update clamav-filesystem clamav clamav-scanner-systemd clamav-devel clamav-lib clamav-server-systemd
> > #
> > #download and compile from source
> > #
> > cd /tmp
> > wget http://www.squid-cache.org/Versions/v4/squid-4.17.tar.gz
> > wget http://sourceforge.net/projects/c-icap/files/c-icap/0.5.x/c_icap-0.5.10.tar.gz --no-check-certificate
> > wget http://sourceforge.net/projects/c-icap/files/c-icap-modules/0.5.x/c_icap_modules-0.5.5.tar.gz --no-check-certificate
> > wget https://sourceforge.net/projects/squidclamav/files/squidclamav/7.1/squidclamav-7.1.tar.gz --no-check-certificate
> > #
> > for f in *.tar.gz; do tar xf "$f"; done
> > #
> > cd /tmp/squid-4.17
> > ./configure --with-openssl --enable-ssl-crtd --enable-icap-client --enable-http-violations &&
[squid-users] moving squid from centos 7 to ubuntu 22.04
l/cron/root
echo "*/15 * * * * /usr/local/squid/sbin/squid -k reconfigure" >> /var/spool/cron/root
echo "0 21 * * 0 /usr/local/squid/sbin/squid -k rotate" >> /var/spool/cron/root

#
#c-icap and c-icap modules
#
sed -i -e 's%#.*User wwwrun%User root%g' /etc/c-icap.conf
sed -i -e 's%#.*Group nogroup%Group root%g' /etc/c-icap.conf
sed -i -e 's%#.*Service echo_service srv_echo.so%Service squidclamav squidclamav.so%g' /etc/c-icap.conf
sed -i -e 's%DebugLevel 1%DebugLevel 0%g' /etc/c-icap.conf
sed -i -e 's%StartServers 3%StartServers 1%g' /etc/c-icap.conf
sed -i -e 's%MaxServers 10%MaxServers 20%g' /etc/c-icap.conf
sed -i -e 's%MaxRequestsPerChild 0%MaxRequestsPerChild 100%g' /etc/c-icap.conf
sed -i '520iacl localhost src 127.0.0.1/255.255.255.255' /etc/c-icap.conf
sed -i '521iacl PERMIT_REQUESTS type REQMOD RESPMOD' /etc/c-icap.conf
sed -i '522iicap_access allow localhost PERMIT_REQUESTS' /etc/c-icap.conf
sed -i '523iicap_access deny all' /etc/c-icap.conf
echo "clamav_mod.TmpDir /var/tmp" >> /etc/clamav_mod.conf
echo "clamav_mod.MaxFilesInArchive 1000" >> /etc/clamav_mod.conf
echo "clamav_mod.MaxScanSize 5M" >> /etc/clamav_mod.conf
echo "clamav_mod.HeuristicScanPrecedence on" >> /etc/clamav_mod.conf
echo "clamav_mod.OLE2BlockMacros on" >> /etc/clamav_mod.conf
echo "virus_scan.ScanFileTypes TEXT DATA EXECUTABLE ARCHIVE DOCUMENT" >> /etc/virus_scan.conf
echo "virus_scan.SendPercentData 5" >> /etc/virus_scan.conf
echo "virus_scan.PassOnError on" >> /etc/virus_scan.conf
echo "virus_scan.MaxObjectSize 5M" >> /etc/virus_scan.conf
echo "virus_scan.DefaultEngine clamav" >> /etc/virus_scan.conf
echo "Include clamav_mod.conf" >> /etc/virus_scan.conf
echo "Include virus_scan.conf" >> /etc/c-icap.conf

#
#make c-icap service
#
echo "[Unit]" >> /usr/lib/systemd/system/c-icap.service
echo "Description=c-icap service" >> /usr/lib/systemd/system/c-icap.service
echo "After=network.target" >> /usr/lib/systemd/system/c-icap.service
echo "[Service]" >> /usr/lib/systemd/system/c-icap.service
echo "Type=forking" >> /usr/lib/systemd/system/c-icap.service
echo "PIDFile=/var/run/c-icap/c-icap.pid" >> /usr/lib/systemd/system/c-icap.service
echo "ExecStart=/usr/local/bin/c-icap -f /etc/c-icap.conf" >> /usr/lib/systemd/system/c-icap.service
echo "KillMode=process" >> /usr/lib/systemd/system/c-icap.service
echo "[Install]" >> /usr/lib/systemd/system/c-icap.service
echo "WantedBy=multi-user.target" >> /usr/lib/systemd/system/c-icap.service

systemctl enable c-icap
reboot

thanks,
rob

--
Regards,
Robert K Wild.
___
squid-users mailing list
squid-users@lists.squid-cache.org
http://lists.squid-cache.org/listinfo/squid-users
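Side note on the service file: the long run of `echo ... >>` appends can be collapsed into a single heredoc, which is easier to read and harder to get wrong. A minimal sketch, writing to the current directory for illustration (on the real host the file would go to /usr/lib/systemd/system/c-icap.service; the binary and config paths are taken from the script above):

```shell
# Generate the same c-icap unit file with one quoted heredoc instead of
# repeated echo >> appends. Writes ./c-icap.service; copy it into
# /usr/lib/systemd/system/ on the target machine.
cat > c-icap.service <<'EOF'
[Unit]
Description=c-icap service
After=network.target

[Service]
Type=forking
PIDFile=/var/run/c-icap/c-icap.pid
ExecStart=/usr/local/bin/c-icap -f /etc/c-icap.conf
KillMode=process

[Install]
WantedBy=multi-user.target
EOF
```

After copying it into place, `systemctl daemon-reload` followed by `systemctl enable c-icap` picks it up, as in the original script.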
Re: [squid-users] regex for normal websites
wow thanks Eliezer so much for that video, that website looks awesome, ive bookmarked it already On Thu, 4 Aug 2022 at 09:59, wrote: > Hey Robert, > > > > I recorded this video for you: > > https://cloud1.ngtech.co.il/static/squid-data/regex-for-robert.mp4 > > > > This is what I did when I reviewed the question. > > I hope it will help you and others use this tool: > > https://rubular.com/ > > > > and squid. > > > > If you have any question regarding REGEX here we are welcoming every > question. > > > > All The Bests and Hope This Helps, > > Eliezer > > > > > > > > Eliezer Croitoru > > NgTech, Tech Support > > Mobile: +972-5-28704261 > > Email: ngtech1...@gmail.com > > Web: https://ngtech.co.il/ > > My-Tube: https://tube.ngtech.co.il/ > > > > *From:* squid-users *On > Behalf Of *robert k Wild > *Sent:* Wednesday, 3 August 2022 14:52 > *To:* Squid Users > *Subject:* Re: [squid-users] regex for normal websites > > > > thanks Amos for this greatly appreciated > > > > On Wed, 3 Aug 2022 at 09:35, Matus UHLAR - fantomas > wrote: > > On 03.08.22 14:12, Amos Jeffries wrote: > >IMO, what you are looking for is actually this ACL definition: > > > > acl adobe ssl::server_name .adobe.com > > > >or its regex equivalent, > > > > acl adobe ssl::server_name_regex (^|\.)adobe\.com$ > > this is what I was searching for. Squid FAQ says: > > > https://wiki.squid-cache.org/SquidFaq/SquidAcl#Squid_doesn.27t_match_my_subdomains > > www.example.com matches the exact host www.example.com, while .example.com > matches the entire domain example.com (including example.com alone) > > > but I wasn't sure if this matching also applies to ssl::server_name. > > thanks > -- > Matus UHLAR - fantomas, uh...@fantomas.sk ; http://www.fantomas.sk/ > Warning: I wish NOT to receive e-mail advertising to this address. > Varovanie: na tuto adresu chcem NEDOSTAVAT akukolvek reklamnu postu. > I don't have lysdexia. The Dog wouldn't allow that. 
--
Regards,
Robert K Wild.
Re: [squid-users] regex for normal websites
thanks Amos for this greatly appreciated On Wed, 3 Aug 2022 at 09:35, Matus UHLAR - fantomas wrote: > On 03.08.22 14:12, Amos Jeffries wrote: > >IMO, what you are looking for is actually this ACL definition: > > > > acl adobe ssl::server_name .adobe.com > > > >or its regex equivalent, > > > > acl adobe ssl::server_name_regex (^|\.)adobe\.com$ > > this is what I was searching for. Squid FAQ says: > > > https://wiki.squid-cache.org/SquidFaq/SquidAcl#Squid_doesn.27t_match_my_subdomains > > www.example.com matches the exact host www.example.com, while .example.com > matches the entire domain example.com (including example.com alone) > > > but I wasn't sure if this matching also applies to ssl::server_name. > > thanks > -- > Matus UHLAR - fantomas, uh...@fantomas.sk ; http://www.fantomas.sk/ > Warning: I wish NOT to receive e-mail advertising to this address. > Varovanie: na tuto adresu chcem NEDOSTAVAT akukolvek reklamnu postu. > I don't have lysdexia. The Dog wouldn't allow that. > ___ > squid-users mailing list > squid-users@lists.squid-cache.org > http://lists.squid-cache.org/listinfo/squid-users > -- Regards, Robert K Wild. ___ squid-users mailing list squid-users@lists.squid-cache.org http://lists.squid-cache.org/listinfo/squid-users
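As a quick way to sanity-check the pattern Amos suggested before putting it into squid.conf: Squid's `*_regex` ACLs use POSIX extended regular expression syntax, so `grep -E` gives a close approximation (the hostnames below are just examples):

```shell
# Test Amos's (^|\.)adobe\.com$ against sample server names with grep -E,
# whose POSIX extended syntax matches what Squid's *_regex ACLs use.
pattern='(^|\.)adobe\.com$'
for host in adobe.com www.adobe.com account.adobe.com hackadobe.com adobe.com.evil.test; do
    if printf '%s\n' "$host" | grep -Eq "$pattern"; then
        echo "$host: match"
    else
        echo "$host: no match"
    fi
done
```

Only adobe.com and its subdomains match; hackadobe.com and a forged adobe.com.evil.test prefix do not, which is exactly the behaviour the thread was after.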
Re: [squid-users] regex for normal websites
Mmm, maybe I should try dstdom_regex instead of ssl::server_name_regex. But when you're using SSL bump in your squid.conf, isn't it best to use ssl::server_name_regex?

On Tue, 2 Aug 2022, 17:21 Matus UHLAR - fantomas, wrote:
> On 02.08.22 16:41, robert k Wild wrote:
> >thats incorrect as
> >
> >adobe\.com$ works but
>
> it works for hackadobe.com too.
>
> >.adobe\.com$ or
> >\.adobe\.com$
> >
> >doesnt work so i just want to know why
>
> these two don't match adobe.com.
> the first matches hadobe.com, the second matches anything .adobe.com
>
> so, again you must use:
>
> ^adobe\.com$
> \.adobe\.com$
>
> first will match "adobe.com", the second will match its subdomains.
>
> again, from what I remember, using "dstdomain" .adobe.com should match
> both (haven't tried), and I was hoping it should apply for
> ssl::server_name (or however that is named).
>
> >On Tue, 2 Aug 2022 at 16:32, Antony Stone <antony.st...@squid.open.source.it> wrote:
> >
> >> On Tuesday 02 August 2022 at 17:23:51, robert k Wild wrote:
> >>
> >> > mmm... so i just want to know and really sorry for the dumb question, so
> >> >
> >> > adobe\.com$
> >> >
> >> > works but then again if a website was eg
> >> >
> >> > hackadobe\.com$
> >> >
> >> > that would work as well probably, so i want to do something like this
> >> >
> >> > \.adobe\.com$
> >> >
> >> > ie put a dot . in front of adobe so
> >> >
> >> > www.adobe.com or
> >> > account.adobe.com
> >> >
> >> > would work but then
> >> >
> >> > hackadobe\.com$
> >> >
> >> > would no longer work
> >>
> >> ... and neither would plain "adobe.com", missing the leading dot.
>
> --
> Matus UHLAR - fantomas, uh...@fantomas.sk ; http://www.fantomas.sk/
> Warning: I wish NOT to receive e-mail advertising to this address.
> Varovanie: na tuto adresu chcem NEDOSTAVAT akukolvek reklamnu postu.
> I don't have lysdexia. The Dog wouldn't allow that.
Re: [squid-users] regex for normal websites
no problem Eliezer im just doing few test of my own on this to see why On Tue, 2 Aug 2022 at 16:41, wrote: > Hey Robert, > > > > It’s not a dumb question. > > It’s a really fine question. > > I want to answer to your question but I have couple obligations. > > If you are willing to wait couple days I will probably be much free and > will be able to sit and understand what the answer and then to answer > properly. > > > > For a great question deserves a great answer. > > > > Yours, > > Eliezer > > > > > > Eliezer Croitoru > > NgTech, Tech Support > > Mobile: +972-5-28704261 > > Email: ngtech1...@gmail.com > > Web: https://ngtech.co.il/ > > My-Tube: https://tube.ngtech.co.il/ > > > > *From:* robert k Wild > *Sent:* Tuesday, 2 August 2022 18:24 > *To:* Eliezer Croitoru > *Cc:* Squid Users > *Subject:* Re: [squid-users] regex for normal websites > > > > mmm... so i just want to know and really sorry for the dumb question, so > > > > adobe\.com$ > > > > works but then again if a website was eg > > > > hackadobe\.com$ > > > > that would work as well probably, so i want to do something like this > > > > \.adobe\.com$ > > > > ie put a dot . infront of adobe so > > > > www.adobe.com or > > account.adobe.com > > > > would work but then > > > > hackadobe\.com$ > > > > would no longer work > > > > > > > > > > On Tue, 2 Aug 2022 at 15:27, wrote: > > Hey Robert, > > > > I will test this with latest squid and my Apps helper and will verify. 
> > > > Thanks, > > Eliezer > > > > > > Eliezer Croitoru > > NgTech, Tech Support > > Mobile: +972-5-28704261 > > Email: ngtech1...@gmail.com > > Web: https://ngtech.co.il/ > > My-Tube: https://tube.ngtech.co.il/ > > > > *From:* robert k Wild > *Sent:* Tuesday, 2 August 2022 15:15 > *To:* Eliezer Croitoru > *Cc:* Squid Users > *Subject:* Re: [squid-users] regex for normal websites > > > > ok i have tested and this works > > > > adobe\.com$ > > > > i found it weird this didnt work > > > > \.adobe\.com > > > > just curious thats all > > > > On Tue, 2 Aug 2022 at 13:05, wrote: > > I believe it should have been: > > ^adobe\.com$ > > ^.*\.adobe\.com$ > > ^\*\.adobe\.com$ > > > > But I don’t know the code to this depth. > > If I would have written the match I think it would have been something a > bit different. > >- A match for SNI >- A joker match for SAN ie *.adobe.com SAN should catch both >www.www.adobe.com > > > > But for some reason it’s not like that, I assume the browsers and the > libraries doesn’t implement it for an unknown reason. > > > > If Alex or anyone else from Factory knows the details of the ACL they can > answer more then me. > > > > Thanks, > > Eliezer > > > > > > Eliezer Croitoru > > NgTech, Tech Support > > Mobile: +972-5-28704261 > > Email: ngtech1...@gmail.com > > Web: https://ngtech.co.il/ > > My-Tube: https://tube.ngtech.co.il/ > > > > *From:* robert k Wild > *Sent:* Tuesday, 2 August 2022 14:51 > *To:* Eliezer Croitoru > *Cc:* Squid Users > *Subject:* Re: [squid-users] regex for normal websites > > > > thanks Eliezer > > > > so it should be > > > > adobe\.com > > > > not > > > > .adobe.\com or > > > > ^.*adobe.com > > > > as the ^.* could include > > > > blahadobe.com > > > > > > > > On Thu, 28 Jul 2022 at 08:14, wrote: > > Hey Robert, > > > > The docs at http://www.squid-cache.org/Doc/config/acl/ states: > > > > acl aclname ssl::server_name_regex [-i] \.foo\.com ... 
> > # regex matches server name obtained from various sources [fast] > > > > Which and I do not know exactly what it means but it will not work with a > helper in most cases. > > I have found the in the git the next sources: > > > https://github.com/squid-cache/squid/blob/bf95c10aa95bf8e56d9d8d1545cb5a3aafab0d2c/doc/release-notes/release-3.5.sgml#L414 > > > > New types ssl::server_name and ssl::server_name_regex > >to match server name from various sources (CONNECT > authority name, > >TLS SNI domain, or X.509 certificate Subject Name). > > > > Which means that there is a set of checks which the acl
Re: [squid-users] regex for normal websites
thats incorrect, as

adobe\.com$

works, but

.adobe\.com$ or
\.adobe\.com$

doesnt work, so i just want to know why

On Tue, 2 Aug 2022 at 16:32, Antony Stone wrote:
> On Tuesday 02 August 2022 at 17:23:51, robert k Wild wrote:
>
> > mmm... so i just want to know and really sorry for the dumb question, so
> >
> > adobe\.com$
> >
> > works but then again if a website was eg
> >
> > hackadobe\.com$
> >
> > that would work as well probably, so i want to do something like this
> >
> > \.adobe\.com$
> >
> > ie put a dot . in front of adobe so
> >
> > www.adobe.com or
> > account.adobe.com
> >
> > would work but then
> >
> > hackadobe\.com$
> >
> > would no longer work
>
> ... and neither would plain "adobe.com", missing the leading dot.
>
> Antony.
>
> --
> All generalisations are inaccurate.
>
> Please reply to the list; please *don't* CC me.

--
Regards,
Robert K Wild.
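The behaviour Robert describes can be reproduced from a shell with `grep -E`, which uses the same POSIX extended regex syntax as Squid's regex ACLs. `adobe\.com$` is an unanchored search, so it "works" too broadly, while `\.adobe\.com$` requires a literal dot before "adobe" and so misses the bare domain:

```shell
# Compare the two patterns against sample names. 'adobe\.com$' is an
# unanchored substring search, so hackadobe.com matches too; '\.adobe\.com$'
# demands a literal dot before "adobe", so plain adobe.com does not match.
check() {  # usage: check PATTERN HOST
    if printf '%s\n' "$2" | grep -Eq "$1"; then
        echo "pattern $1 vs $2: match"
    else
        echo "pattern $1 vs $2: no match"
    fi
}
check 'adobe\.com$'   hackadobe.com    # match (the surprise)
check 'adobe\.com$'   adobe.com        # match
check '\.adobe\.com$' adobe.com        # no match (Robert's "doesnt work")
check '\.adobe\.com$' www.adobe.com    # match
```

This is why the later replies land on `^adobe\.com$` plus `\.adobe\.com$`, or the combined `(^|\.)adobe\.com$`.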
Re: [squid-users] regex for normal websites
mmm... so i just want to know and really sorry for the dumb question, so adobe\.com$ works but then again if a website was eg hackadobe\.com$ that would work as well probably, so i want to do something like this \.adobe\.com$ ie put a dot . infront of adobe so www.adobe.com or account.adobe.com would work but then hackadobe\.com$ would no longer work On Tue, 2 Aug 2022 at 15:27, wrote: > Hey Robert, > > > > I will test this with latest squid and my Apps helper and will verify. > > > > Thanks, > > Eliezer > > > > > > Eliezer Croitoru > > NgTech, Tech Support > > Mobile: +972-5-28704261 > > Email: ngtech1...@gmail.com > > Web: https://ngtech.co.il/ > > My-Tube: https://tube.ngtech.co.il/ > > > > *From:* robert k Wild > *Sent:* Tuesday, 2 August 2022 15:15 > *To:* Eliezer Croitoru > *Cc:* Squid Users > *Subject:* Re: [squid-users] regex for normal websites > > > > ok i have tested and this works > > > > adobe\.com$ > > > > i found it weird this didnt work > > > > \.adobe\.com > > > > just curious thats all > > > > On Tue, 2 Aug 2022 at 13:05, wrote: > > I believe it should have been: > > ^adobe\.com$ > > ^.*\.adobe\.com$ > > ^\*\.adobe\.com$ > > > > But I don’t know the code to this depth. > > If I would have written the match I think it would have been something a > bit different. > >- A match for SNI >- A joker match for SAN ie *.adobe.com SAN should catch both >www.www.adobe.com > > > > But for some reason it’s not like that, I assume the browsers and the > libraries doesn’t implement it for an unknown reason. > > > > If Alex or anyone else from Factory knows the details of the ACL they can > answer more then me. 
> > > > Thanks, > > Eliezer > > > > > > Eliezer Croitoru > > NgTech, Tech Support > > Mobile: +972-5-28704261 > > Email: ngtech1...@gmail.com > > Web: https://ngtech.co.il/ > > My-Tube: https://tube.ngtech.co.il/ > > > > *From:* robert k Wild > *Sent:* Tuesday, 2 August 2022 14:51 > *To:* Eliezer Croitoru > *Cc:* Squid Users > *Subject:* Re: [squid-users] regex for normal websites > > > > thanks Eliezer > > > > so it should be > > > > adobe\.com > > > > not > > > > .adobe.\com or > > > > ^.*adobe.com > > > > as the ^.* could include > > > > blahadobe.com > > > > > > > > On Thu, 28 Jul 2022 at 08:14, wrote: > > Hey Robert, > > > > The docs at http://www.squid-cache.org/Doc/config/acl/ states: > > > > acl aclname ssl::server_name_regex [-i] \.foo\.com ... > > # regex matches server name obtained from various sources [fast] > > > > Which and I do not know exactly what it means but it will not work with a > helper in most cases. > > I have found the in the git the next sources: > > > https://github.com/squid-cache/squid/blob/bf95c10aa95bf8e56d9d8d1545cb5a3aafab0d2c/doc/release-notes/release-3.5.sgml#L414 > > > > New types ssl::server_name and ssl::server_name_regex > >to match server name from various sources (CONNECT > authority name, > >TLS SNI domain, or X.509 certificate Subject Name). > > > > Which means that there is a set of checks which the acl does and not just > a domain name. > > It’s also even possible that the domain name is not know in the CONNECT > state of the connection. > > If I remember correctly there is a possibility for browsers to use the > same exact connection for multiple domains but > I have not seen this yet in production. > > With Squid once you bump the connection to HTTP/1.x you can make 100% sure > the features of the Host header request. 
> > > > At Servername.cc ie: > > > https://github.com/squid-cache/squid/blob/aee3523a768aff4d1e6c1195c4a401b4ef5688a0/src/acl/ServerName.cc#L81 > > > > There is a specific logic of what is done and what is matched but I am not > sure what would be used in the case of: > > *.adobe.com > > > > Certificate SAN. > > > > Specifically This part of the Common Names ie SAN: > > > https://github.com/squid-cache/squid/blob/aee3523a768aff4d1e6c1195c4a401b4ef5688a0/src/acl/ServerName.cc#L105 > > > > which to my understanding points to: > > > https://github.com/squid-cache/squid/blob/d146da3bfe7083381ae7ab38640cbfd0d2542374/src/ssl/support.cc#L195 > > > > doesn’t make any sense to me.( didn’t tried that much to understan
Re: [squid-users] regex for normal websites
ok i have tested and this works adobe\.com$ i found it weird this didnt work \.adobe\.com just curious thats all On Tue, 2 Aug 2022 at 13:05, wrote: > I believe it should have been: > > ^adobe\.com$ > > ^.*\.adobe\.com$ > > ^\*\.adobe\.com$ > > > > But I don’t know the code to this depth. > > If I would have written the match I think it would have been something a > bit different. > >- A match for SNI >- A joker match for SAN ie *.adobe.com SAN should catch both >www.www.adobe.com > > > > But for some reason it’s not like that, I assume the browsers and the > libraries doesn’t implement it for an unknown reason. > > > > If Alex or anyone else from Factory knows the details of the ACL they can > answer more then me. > > > > Thanks, > > Eliezer > > > > > > Eliezer Croitoru > > NgTech, Tech Support > > Mobile: +972-5-28704261 > > Email: ngtech1...@gmail.com > > Web: https://ngtech.co.il/ > > My-Tube: https://tube.ngtech.co.il/ > > > > *From:* robert k Wild > *Sent:* Tuesday, 2 August 2022 14:51 > *To:* Eliezer Croitoru > *Cc:* Squid Users > *Subject:* Re: [squid-users] regex for normal websites > > > > thanks Eliezer > > > > so it should be > > > > adobe\.com > > > > not > > > > .adobe.\com or > > > > ^.*adobe.com > > > > as the ^.* could include > > > > blahadobe.com > > > > > > > > On Thu, 28 Jul 2022 at 08:14, wrote: > > Hey Robert, > > > > The docs at http://www.squid-cache.org/Doc/config/acl/ states: > > > > acl aclname ssl::server_name_regex [-i] \.foo\.com ... > > # regex matches server name obtained from various sources [fast] > > > > Which and I do not know exactly what it means but it will not work with a > helper in most cases. 
> > I have found the in the git the next sources: > > > https://github.com/squid-cache/squid/blob/bf95c10aa95bf8e56d9d8d1545cb5a3aafab0d2c/doc/release-notes/release-3.5.sgml#L414 > > > > New types ssl::server_name and ssl::server_name_regex > >to match server name from various sources (CONNECT > authority name, > >TLS SNI domain, or X.509 certificate Subject Name). > > > > Which means that there is a set of checks which the acl does and not just > a domain name. > > It’s also even possible that the domain name is not know in the CONNECT > state of the connection. > > If I remember correctly there is a possibility for browsers to use the > same exact connection for multiple domains but > I have not seen this yet in production. > > With Squid once you bump the connection to HTTP/1.x you can make 100% sure > the features of the Host header request. > > > > At Servername.cc ie: > > > https://github.com/squid-cache/squid/blob/aee3523a768aff4d1e6c1195c4a401b4ef5688a0/src/acl/ServerName.cc#L81 > > > > There is a specific logic of what is done and what is matched but I am not > sure what would be used in the case of: > > *.adobe.com > > > > Certificate SAN. > > > > Specifically This part of the Common Names ie SAN: > > > https://github.com/squid-cache/squid/blob/aee3523a768aff4d1e6c1195c4a401b4ef5688a0/src/acl/ServerName.cc#L105 > > > > which to my understanding points to: > > > https://github.com/squid-cache/squid/blob/d146da3bfe7083381ae7ab38640cbfd0d2542374/src/ssl/support.cc#L195 > > > > doesn’t make any sense to me.( didn’t tried that much to understand) > > > > If someone might be able to make sense of things in a synchronic fashion > it would help. 
> > (I do not see any debugs usage there or any helping comment ) > > > > Thanks, > > Eliezer > > > > > > Eliezer Croitoru > > NgTech, Tech Support > > Mobile: +972-5-28704261 > > Email: ngtech1...@gmail.com > > Web: https://ngtech.co.il/ > > My-Tube: https://tube.ngtech.co.il/ > > > > *From:* squid-users *On > Behalf Of *robert k Wild > *Sent:* Wednesday, 27 July 2022 13:52 > *To:* Squid Users > *Subject:* Re: [squid-users] regex for normal websites > > > > that's the weird thing, when i try this in "ssl::server_name_regex" > > .adobe.com > > > > it doesnt work > > > > you mean escape ie the \ character > > > > > > > > > > > > On Wed, 27 Jul 2022 at 11:05, Matus UHLAR - fantomas > wrote: > > On 27.07.22 10:54, robert k Wild wrote: > >think i got it right but just want t
Re: [squid-users] regex for normal websites
thanks Eliezer so it should be adobe\.com not .adobe.\com or ^.*adobe.com as the ^.* could include blahadobe.com On Thu, 28 Jul 2022 at 08:14, wrote: > Hey Robert, > > > > The docs at http://www.squid-cache.org/Doc/config/acl/ states: > > > > acl aclname ssl::server_name_regex [-i] \.foo\.com ... > > # regex matches server name obtained from various sources [fast] > > > > Which and I do not know exactly what it means but it will not work with a > helper in most cases. > > I have found the in the git the next sources: > > > https://github.com/squid-cache/squid/blob/bf95c10aa95bf8e56d9d8d1545cb5a3aafab0d2c/doc/release-notes/release-3.5.sgml#L414 > > > > New types ssl::server_name and ssl::server_name_regex > >to match server name from various sources (CONNECT > authority name, > >TLS SNI domain, or X.509 certificate Subject Name). > > > > Which means that there is a set of checks which the acl does and not just > a domain name. > > It’s also even possible that the domain name is not know in the CONNECT > state of the connection. > > If I remember correctly there is a possibility for browsers to use the > same exact connection for multiple domains but > I have not seen this yet in production. > > With Squid once you bump the connection to HTTP/1.x you can make 100% sure > the features of the Host header request. > > > > At Servername.cc ie: > > > https://github.com/squid-cache/squid/blob/aee3523a768aff4d1e6c1195c4a401b4ef5688a0/src/acl/ServerName.cc#L81 > > > > There is a specific logic of what is done and what is matched but I am not > sure what would be used in the case of: > > *.adobe.com > > > > Certificate SAN. 
> > > > Specifically This part of the Common Names ie SAN: > > > https://github.com/squid-cache/squid/blob/aee3523a768aff4d1e6c1195c4a401b4ef5688a0/src/acl/ServerName.cc#L105 > > > > which to my understanding points to: > > > https://github.com/squid-cache/squid/blob/d146da3bfe7083381ae7ab38640cbfd0d2542374/src/ssl/support.cc#L195 > > > > doesn’t make any sense to me.( didn’t tried that much to understand) > > > > If someone might be able to make sense of things in a synchronic fashion > it would help. > > (I do not see any debugs usage there or any helping comment ) > > > > Thanks, > > Eliezer > > > > > > Eliezer Croitoru > > NgTech, Tech Support > > Mobile: +972-5-28704261 > > Email: ngtech1...@gmail.com > > Web: https://ngtech.co.il/ > > My-Tube: https://tube.ngtech.co.il/ > > > > *From:* squid-users *On > Behalf Of *robert k Wild > *Sent:* Wednesday, 27 July 2022 13:52 > *To:* Squid Users > *Subject:* Re: [squid-users] regex for normal websites > > > > that's the weird thing, when i try this in "ssl::server_name_regex" > > .adobe.com > > > > it doesnt work > > > > you mean escape ie the \ character > > > > > > > > > > > > On Wed, 27 Jul 2022 at 11:05, Matus UHLAR - fantomas > wrote: > > On 27.07.22 10:54, robert k Wild wrote: > >think i got it right but just want to double check with you guys > > > >so in my "ssl::server_name" i had > >.adobe.com > > > >that worked but i want to mix normal website and regex websites together > so > >i just have one list for all > > didn't the above work? AFAIK it should, IIRC domain matching in squid > matches "domain.com" if you check for ".domain.com". > > >i now have this for "ssl::server_name_regex" > >^.*adobe.com$ > > > >it works, so im guessing its right > > the dot should be escaped > > > -- > Matus UHLAR - fantomas, uh...@fantomas.sk ; http://www.fantomas.sk/ > Warning: I wish NOT to receive e-mail advertising to this address. > Varovanie: na tuto adresu chcem NEDOSTAVAT akukolvek reklamnu postu. 
> BSE = Mad Cow Desease ... BSA = Mad Software Producents Desease > ___ > squid-users mailing list > squid-users@lists.squid-cache.org > http://lists.squid-cache.org/listinfo/squid-users > > > > > -- > > Regards, > > Robert K Wild. > ___ > squid-users mailing list > squid-users@lists.squid-cache.org > http://lists.squid-cache.org/listinfo/squid-users > -- Regards, Robert K Wild. ___ squid-users mailing list squid-users@lists.squid-cache.org http://lists.squid-cache.org/listinfo/squid-users
Re: [squid-users] regex for normal websites
nice one thanks Amos

i dont understand, as in regex the terms are

^ - start of line
. - any single character
* - repetition of the character before
$ - end of line

so going by this it should be ^.*adobe.com$, so how come just adobe.com$ works when it doesnt know what comes before adobe, ie http(s)://www

sorry for the stupid question

On Wed, 27 Jul 2022 at 12:28, Amos Jeffries wrote:
> On 27/07/22 21:54, robert k Wild wrote:
> > hi all,
> >
> > think i got it right but just want to double check with you guys
> >
> > so in my "ssl::server_name" i had
> > .adobe.com
> >
> > that worked but i want to mix normal website and regex websites together
>
> What do you mean "normal website" ? and "regex websites" ?
>
> > so i just have one list for all
> >
> > i now have this for "ssl::server_name_regex"
> > ^.*adobe.com$
> >
> > it works, so im guessing its right
>
> Many things "work" in regex when they are not right.
>
> As Matus said, the dot in the domain needs to be escaped:
>
>    ^.*adobe\.com$
>
> Also, the "^.*" in your pattern does nothing useful; it should be omitted,
> leaving you with the pattern:
>
>    adobe\.com$
>
> HTH
> Amos

--
Regards,
Robert K Wild.
Re: [squid-users] regex for normal websites
that's the weird thing, when i try this in "ssl::server_name_regex" .adobe.com it doesnt work you mean escape ie the \ character On Wed, 27 Jul 2022 at 11:05, Matus UHLAR - fantomas wrote: > On 27.07.22 10:54, robert k Wild wrote: > >think i got it right but just want to double check with you guys > > > >so in my "ssl::server_name" i had > >.adobe.com > > > >that worked but i want to mix normal website and regex websites together > so > >i just have one list for all > > didn't the above work? AFAIK it should, IIRC domain matching in squid > matches "domain.com" if you check for ".domain.com". > > >i now have this for "ssl::server_name_regex" > >^.*adobe.com$ > > > >it works, so im guessing its right > > the dot should be escaped > > > -- > Matus UHLAR - fantomas, uh...@fantomas.sk ; http://www.fantomas.sk/ > Warning: I wish NOT to receive e-mail advertising to this address. > Varovanie: na tuto adresu chcem NEDOSTAVAT akukolvek reklamnu postu. > BSE = Mad Cow Desease ... BSA = Mad Software Producents Desease > ___ > squid-users mailing list > squid-users@lists.squid-cache.org > http://lists.squid-cache.org/listinfo/squid-users > -- Regards, Robert K Wild. ___ squid-users mailing list squid-users@lists.squid-cache.org http://lists.squid-cache.org/listinfo/squid-users
[squid-users] regex for normal websites
hi all,

think i got it right but just want to double check with you guys

so in my "ssl::server_name" i had

.adobe.com

that worked but i want to mix normal website and regex websites together so i just have one list for all

i now have this for "ssl::server_name_regex"

^.*adobe.com$

it works, so im guessing its right

thanks,
rob

--
Regards,
Robert K Wild.
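For reference, the two ACL styles discussed in this thread look like this in squid.conf. This is a hedged sketch (the ACL names are made up), using the suffix form for plain domains and the regex form where a real pattern is wanted:

```
# Suffix match: ".adobe.com" covers adobe.com and all its subdomains
acl adobe ssl::server_name .adobe.com

# Regex equivalent (POSIX extended; note the escaped dots and anchoring)
acl adobe_re ssl::server_name_regex (^|\.)adobe\.com$
```

The `(^|\.)` alternative is the form Amos suggests elsewhere in this thread; it matches either the bare domain or any subdomain, without also matching names like hackadobe.com.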
Re: [squid-users] fool windows into thinking it has internet access
nice thanks Eliezer, i was missing few registry entries - HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\NlaSvc\Parameters\Internet\EnableActiveProbing - Key Type: DWORD - Value: Decimal 0 (False) - HKLM\Software\Policies\Microsoft\Windows\NetworkConnectivityStatusIndicator\NoActiveProbe - Key Type: DWORD - Value: Decimal 1 (True) HKLM\Software\Policies\Microsoft\Windows\NetworkConnectivityStatusIndicator\DisablePassivePolling - Key Type: DWORD - Value: Decimal 1 (True) On Fri, 22 Jul 2022 at 11:32, wrote: > Hey Robert, > > > > The internet reachability test is composed of couple parts. > > Only one of them is HTTP. > > There is also an ICMP and DNS part to it. > > You can customize it on the registry at: > > > Computer\HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\NlaSvc\Parameters\Internet > > > > The windows internet access check doesn’t affect other software, they do > not rely on any of windows checks but do this by themselves. > > > > I hope this have helped you. > I can try to test it locally but it’s better that you will first make a > tiny effort to dump with wireshark the traffic on the interface > (after you have flushed the dns cahce) to verify what traffic windows use > to test the internet connectivity. > > > > Eliezer > > > > > > Eliezer Croitoru > > NgTech, Tech Support > > Mobile: +972-5-28704261 > > Email: ngtech1...@gmail.com > > Web: https://ngtech.co.il/ > > My-Tube: https://tube.ngtech.co.il/ > > > > *From:* robert k Wild > *Sent:* Friday, 22 July 2022 13:23 > *To:* Eliezer Croitoru > *Cc:* Squid Users > *Subject:* Re: [squid-users] fool windows into thinking it has internet > access > > > > so i need to whitelist > > > > internetbeacon.msedge.net > > ? > > > > On Thu, 21 Jul 2022 at 20:53, wrote: > > Take a peek at: > > > https://docs.microsoft.com/en-us/powershell/module/nettcpip/test-netconnection?view=windowsserver2022-ps > > > > This will highlight your issue and will probably make more sense into what > you see. 
> > > > Eliezer > > > > > > Eliezer Croitoru > > NgTech, Tech Support > > Mobile: +972-5-28704261 > > Email: ngtech1...@gmail.com > > Web: https://ngtech.co.il/ > > My-Tube: https://tube.ngtech.co.il/ > > > > *From:* squid-users *On > Behalf Of *robert k Wild > *Sent:* Wednesday, 20 July 2022 18:23 > *To:* Squid Users > *Subject:* [squid-users] fool windows into thinking it has internet access > > > > hi all, > > > > trying to fool windows it has internet access and not just network access > > > > looking at this guide > > > > > https://docs.microsoft.com/en-US/troubleshoot/windows-client/networking/internet-explorer-edge-open-connect-corporate-public-network > > > > i have put in my white list > > > > .msftconnecttest.com > .msftncsi.com > > > > i have whitelisted ports 80 and 443 by default > > > > on my windows client i have enabled the proxy via settings > proxy > > manual proxy setup > > > > and also i have enabled the winhttp proxy putting this in cmd > > > > netsh winhttp set proxy ip_address:port > > > > and i get back a connection test text saying "microsoft connect test" if i > go to > > > > http://www.msftconnecttest.com/connecttest.txt > > > > but my network icon is still saying just network access only not internet > access > > > > can anyone help me out please > > > > thanks, > > rob > > > -- > > Regards, > > Robert K Wild. > > ___ > squid-users mailing list > squid-users@lists.squid-cache.org > http://lists.squid-cache.org/listinfo/squid-users > > > > -- > > Regards, > > Robert K Wild. > -- Regards, Robert K Wild. ___ squid-users mailing list squid-users@lists.squid-cache.org http://lists.squid-cache.org/listinfo/squid-users
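The three values Robert lists can be captured in a single .reg file for reuse on other clients. This is only a transcription of the keys quoted above, untested here; disabling NCSI probing changes connectivity reporting system-wide, so apply with care:

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\NlaSvc\Parameters\Internet]
"EnableActiveProbing"=dword:00000000

[HKEY_LOCAL_MACHINE\Software\Policies\Microsoft\Windows\NetworkConnectivityStatusIndicator]
"NoActiveProbe"=dword:00000001
"DisablePassivePolling"=dword:00000001
```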
Re: [squid-users] fool windows into thinking it has internet access
so i need to whitelist internetbeacon.msedge.net ? On Thu, 21 Jul 2022 at 20:53, wrote: > Take a peek at: > > > https://docs.microsoft.com/en-us/powershell/module/nettcpip/test-netconnection?view=windowsserver2022-ps > > > > This will highlight your issue and will probably make more sense into what > you see. > > > > Eliezer > > > > > > Eliezer Croitoru > > NgTech, Tech Support > > Mobile: +972-5-28704261 > > Email: ngtech1...@gmail.com > > Web: https://ngtech.co.il/ > > My-Tube: https://tube.ngtech.co.il/ > > > > *From:* squid-users *On > Behalf Of *robert k Wild > *Sent:* Wednesday, 20 July 2022 18:23 > *To:* Squid Users > *Subject:* [squid-users] fool windows into thinking it has internet access > > > > hi all, > > > > trying to fool windows it has internet access and not just network access > > > > looking at this guide > > > > > https://docs.microsoft.com/en-US/troubleshoot/windows-client/networking/internet-explorer-edge-open-connect-corporate-public-network > > > > i have put in my white list > > > > .msftconnecttest.com > .msftncsi.com > > > > i have whitelisted ports 80 and 443 by default > > > > on my windows client i have enabled the proxy via settings > proxy > > manual proxy setup > > > > and also i have enabled the winhttp proxy putting this in cmd > > > > netsh winhttp set proxy ip_address:port > > > > and i get back a connection test text saying "microsoft connect test" if i > go to > > > > http://www.msftconnecttest.com/connecttest.txt > > > > but my network icon is still saying just network access only not internet > access > > > > can anyone help me out please > > > > thanks, > > rob > > > -- > > Regards, > > Robert K Wild. > ___ > squid-users mailing list > squid-users@lists.squid-cache.org > http://lists.squid-cache.org/listinfo/squid-users > -- Regards, Robert K Wild. ___ squid-users mailing list squid-users@lists.squid-cache.org http://lists.squid-cache.org/listinfo/squid-users
Re: [squid-users] fool windows into thinking it has internet access
Good question. On my squid proxy I have whitelisted all of the Adobe URLs needed for Adobe CC. On Windows clients, even opening the Adobe CC app needs internet access, and as this PC is on a production network it can't have real internet access, so I have fooled it and now I can open the CC app.

On Wed, 20 Jul 2022, 18:44 Antony Stone wrote:

> On Wednesday 20 July 2022 at 19:19:22, robert k Wild wrote:
>
> > is there anyway i can enable ICMP/ping via the proxy so this works?
>
> No, but you could add that name, either to the machine which wants to
> contact it, or to your local DNS server, so that it resolves to something
> on your network (or localhost if you prefer).
>
> Out of interest, what is the purpose for making a Windows computer think
> it has Internet access when it doesn't? What useful difference does that
> make?
>
> Antony.
Re: [squid-users] fool windows into thinking it has internet access
ok i have realised something, my client cant resolve this address:

C:\Users\rkw>ping dns.msftncsi.com
Ping request could not find host dns.msftncsi.com. Please check the name
and try again.

is there anyway i can enable ICMP/ping via the proxy so this works?

thanks,
rob
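As Antony suggests elsewhere in the thread, the name only needs to resolve locally; no ICMP through the proxy is required. A minimal sketch (the target address is the fixed one Windows' NCSI DNS probe is documented to expect for dns.msftncsi.com; adapt to your own DNS setup):

```
# On the client (C:\Windows\System32\drivers\etc\hosts), or as an A record
# on the local DNS server. The NCSI DNS probe checks that
# dns.msftncsi.com resolves to this fixed address:
131.107.255.255    dns.msftncsi.com
```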
[squid-users] fool windows into thinking it has internet access
hi all,

trying to fool windows it has internet access and not just network access

looking at this guide:

https://docs.microsoft.com/en-US/troubleshoot/windows-client/networking/internet-explorer-edge-open-connect-corporate-public-network

i have put in my white list:

.msftconnecttest.com
.msftncsi.com

i have whitelisted ports 80 and 443 by default

on my windows client i have enabled the proxy via settings > proxy >
manual proxy setup, and also i have enabled the winhttp proxy putting this
in cmd:

netsh winhttp set proxy ip_address:port

and i get back a connection test text saying "microsoft connect test" if i
go to http://www.msftconnecttest.com/connecttest.txt

but my network icon is still saying just network access only, not internet
access

can anyone help me out please

thanks,
rob

-- 
Regards,

Robert K Wild.
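For reference, the whitelist described above maps onto squid.conf roughly like this. This is a sketch only: the acl name `ncsi` is illustrative (not from the original post), and it assumes a default-deny setup where only whitelisted domains are allowed out:

```
# squid.conf sketch -- acl name "ncsi" is illustrative.
# Assumes a default-deny config where only whitelisted domains pass.
acl ncsi dstdomain .msftconnecttest.com .msftncsi.com
http_access allow ncsi
# ... existing whitelist rules ...
# http_access deny all
```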
Re: [squid-users] Logrotate question
So what you're saying is change the -k reconfigure to -k rotate, even
though I've configured squid from source?

On Sat, 18 Jun 2022, 00:17 Amos Jeffries wrote:

> The "-k rotate" tells Squid to open and start writing to the new log
> files created by the logrotate tool.
>
> That "-k reconfigure" you have chosen is a far more complicated and slow
> operation. It is not guaranteed to change the files Squid is writing to.
>
> PS. you also need to configure "logfile_rotate 0" for self-built Squid
> to prevent it doing any log file renumbering when logrotated is used.
>
> Amos
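Putting Amos's two points together for a source-built Squid, the logrotate file would look something like this sketch. Paths are the ones used in the thread; `daily` and `rotate 92` follow the retention advice given earlier in the thread, and `logfile_rotate 0` must be set in squid.conf:

```
# /etc/logrotate.d/squid -- sketch for a source-built Squid.
# Requires "logfile_rotate 0" in squid.conf so Squid itself does no
# log renumbering; logrotate rotates, Squid just reopens the files.
/usr/local/squid/var/logs/*.log {
    daily
    rotate 92
    compress
    delaycompress
    missingok
    nocreate
    sharedscripts
    postrotate
        # tell Squid to reopen the freshly rotated logs;
        # errors silenced in case Squid is not running
        /usr/local/squid/sbin/squid -k rotate 2>/dev/null
        sleep 1
    endscript
}
```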
Re: [squid-users] Logrotate question
i understand it now

cat /etc/logrotate.d/squid

/usr/local/squid/var/logs/*.log {
    monthly
    rotate 4
    compress
    delaycompress
    missingok
    sharedscripts
    dateext
    dateformat -%d%m%Y
    su root root
    postrotate
        /usr/local/squid/sbin/squid -k reconfigure
    endscript
}

needed the sharedscripts to run the postrotate just once for all logs

didnt need the squid -k rotate as already handled by logrotate

On Fri, 17 Jun 2022 at 19:27, robert k Wild wrote:

> just a question, im just playing with logrotate
>
> why is the "squid -k rotate" listed under "postrotate" as logrotate is
> handling it
Re: [squid-users] Logrotate question
just a question, im just playing with logrotate

/usr/local/squid/var/logs/*.log {
    monthly
    rotate 4
    compress
    delaycompress
    missingok
    dateext
    dateformat -%d%m%Y
    su root root
    postrotate
        /usr/local/squid/sbin/squid -k reconfigure
    endscript
}

why is the "squid -k rotate" listed under "postrotate" as logrotate is
handling it
Re: [squid-users] Logrotate question
Thanks Eliezer, really appreciate it :)

On Thu, 16 Jun 2022, 14:03, Eliezer Croitoru wrote:

> So just create the file I sent you before or extract the file from the
> squid RPM using “rpm2cpio squid…rpm | cpio -dimv” in some tmp dir.
>
> You will just need to copy the file into the proper location, disable
> the cron you have created, and if the squid binary is in a specific
> different folder change the path of the squid binary in the squid
> logrotate file accordingly.
>
> All The Bests,
> Eliezer
Re: [squid-users] Logrotate question
No squid isn't, sorry, it is compiled from source; I forgot to add it,
sorry about that

On Thu, 16 Jun 2022, 13:19, Eliezer Croitoru wrote:

> Since this one is from yum install it’s very simple to just change the
> config files of squid and logrotate.
>
> If you need more assistance let me know.
>
> Eliezer
Re: [squid-users] Logrotate question
Self compiled from source with others, ie:

Squidclamav
Cicap
Cicap modules

And clamav, but did this one via yum install

On Thu, 16 Jun 2022, 12:27, Eliezer Croitoru wrote:

> How did you installed squid on CentOS 7?
>
> From my packages or the OS default or self compiled or another source?
>
> Eliezer
Re: [squid-users] Logrotate question
Oops sorry, you did say that; sorry I didn't see that at first

On Thu, 16 Jun 2022, 12:04 robert k Wild wrote:

> I imagine Eliezer that's what I need to put in logrotate.conf file
Re: [squid-users] Logrotate question
I imagine Eliezer that's what I need to put in logrotate.conf file

On Thu, 16 Jun 2022, 12:01, Eliezer Croitoru wrote:

> Oops,
>
> The next is the file: /etc/logrotate.d/squid
>
> ##START
> /var/log/squid/*.log {
>     weekly
>     rotate 5
>     compress
>     notifempty
>     missingok
>     nocreate
>     sharedscripts
>     postrotate
>       # Asks squid to reopen its logs. (logfile_rotate 0 is set in squid.conf)
>       # errors redirected to make it silent if squid is not running
>       /usr/sbin/squid -k rotate 2>/dev/null
>       # Wait a little to allow Squid to catch up before the logs are compressed
>       sleep 1
>     endscript
> }
> ##END
>
> So you need to change the rotate to 92+ and also change the squid number
> of logs to the same number.
>
> Let me know if it’s helpful.
>
> Eliezer
Re: [squid-users] Logrotate question
Ok sorry; normally 3 months is 90 days and I just added a 1 to it, but I
get what you mean. If months have 31 days I should really do 93, as I want
to give it extra room just in case.

On Thu, 16 Jun 2022, 11:41 Antony Stone wrote:

> On Thursday 16 June 2022 at 11:26:37, robert k Wild wrote:
>
> > Cool, so I will rotate daily and delete after 91 days, thanks guys
>
> Why did you change the recommended 92 days into 91?
>
> Consider June, July and August:
>
> June has 30 days
> July has 31 days
> August has 31 days
>
> So, on September 1st, June 1st is 92 days ago, and you can delete the
> logs for May 31st, which are older than 92 days.
>
> If you deleted older than 91 days, you would be deleting June 1st on
> September 1st, and one day this might be significant to someone.
>
> Antony.
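Antony's day counting can be sanity-checked with GNU date (assuming coreutils `date -d`; the commented `find` line is a sketch of the matching daily cleanup, with the log path from the thread and an assumed rotated-file name pattern):

```shell
# GNU date: June 1st is exactly 30+31+31 = 92 days before September 1st
date -u -d "2022-09-01 -92 days" +%F   # -> 2022-06-01

# matching daily cleanup sketch (path from the thread; pattern assumed):
# find /usr/local/squid/var/logs -name 'access.log-*' -mtime +92 -delete
```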
Re: [squid-users] Logrotate question
Cool, so I will rotate daily and delete after 91 days, thanks guys

On Thu, 16 Jun 2022, 11:14 Matus UHLAR - fantomas wrote:

> On 16.06.22 10:54, robert k Wild wrote:
> >Basically I want to keep logs for 3 months then rotate so it overwrites
> >them with another 3 months, if that makes sense
>
> in fact, it does not.
>
> I guess you are supposed to keep 3 months of logs, which means you always
> need to have 3 months of logs available.
>
> Each day, you can delete log files over 3 months old.
>
> If you rotated logs once in 3 months, you would have a single file with 3
> months of logs in it, and could remove it 3 months after rotating, when
> the first logs would be 6 months old.
>
> As we already told you, rotate daily and remove old logs after 92 days,
> and use a logrotate config.
>
> Matus UHLAR - fantomas, uh...@fantomas.sk ; http://www.fantomas.sk/
Re: [squid-users] Logrotate question
Hi Eliezer,

Basically I want to keep logs for 3 months, then rotate so it overwrites them with another 3 months, if that makes sense.

On Thu, 16 Jun 2022, 10:39, wrote:

> Hey Rob,
>
> First, there is a difference between rotation and deletion.
>
> If it's not a loaded system then 3 months is OK, but in most use cases
> it's better to rotate every day and delete after 3 months.
>
> You have the choice to compress the files or to leave them in plain text,
> but that is only a choice of resource preservation.
>
> Let me see; I will look at my CentOS 7 system and will try to find the
> right way to do it.
>
> Eliezer
>
> *From:* robert k Wild
> *Sent:* Thursday, 16 June 2022 11:28
> *To:* Eliezer Croitoru
> *Cc:* Squid Users
> *Subject:* Re: [squid-users] Logrotate question
>
> Thanks Eliezer
>
> I have CentOS 7 and I want it to rotate every 3 months, as we need to keep
> logs for 3 months.
>
> Thanks,
> Rob
>
> On Thu, 16 Jun 2022, 08:11, wrote:
>
> > Rob,
> >
> > It will be different how you implement and use logrotate manually or
> > with the logrotate tools.
> >
> > What OS are you using?
> >
> > Eliezer
> >
> > *From:* squid-users *On Behalf Of* robert k Wild
> > *Sent:* Wednesday, 15 June 2022 20:19
> > *To:* Squid Users
> > *Subject:* [squid-users] Logrotate question
> >
> > Hi all,
> >
> > ATM to clear the logs, I do this in crontab, every 3 months:
> >
> > 0 0 1 */3 * echo "" > /usr/local/squid/var/logs/access.log
> >
> > and do the same for the cache log.
> >
> > It works, but I want to really use log rotate, i.e.:
> >
> > 0 0 1 */3 * /usr/local/squid/sbin/squid -k rotate
> >
> > I hear log rotate keeps 10 files by default, so does that mean I will
> > have 10 access logs etc.? And also, will it keep the file the same,
> > i.e. won't change the size or compress it to save space?
> >
> > Thanks,
> > Rob
Re: [squid-users] Logrotate question
Cool, thanks all, I'll try the logrotate program instead of using Squid's one. Thanks guys :)

On Thu, 16 Jun 2022, 10:26 Matus UHLAR - fantomas wrote:

> On 16.06.22 10:23, robert k Wild wrote:
> > So I can use the package logrotate instead of the squid one
>
> Squid packages in Debian come configured for rotating logs with logrotate:
> - logfile_rotate is set to 0
> - the logrotate config file tells when/how to rotate
>
> Perhaps it's the same with CentOS.
>
> > On Thu, 16 Jun 2022, 10:22 Matus UHLAR - fantomas wrote:
> >
> > > On 16.06.22 09:53, robert k Wild wrote:
> > > > All I know is I need to keep a record of up to 3 months' worth of
> > > > logs, due to GDPR; how would you say I go about this?
> > >
> > > Keeping 3 months of logs is very different from rotating every 3
> > > months. Configure logrotate to rotate daily and keep 92 days' worth
> > > of logs.
> > >
> > > I believe the CentOS squid package comes with logrotate configured;
> > > it should be in /etc/logrotate.d/squid
>
> --
> Matus UHLAR - fantomas, uh...@fantomas.sk ; http://www.fantomas.sk/
> My mind is like a steel trap - rusty and illegal in 37 states.
Re: [squid-users] Logrotate question
So I can use the package logrotate instead of the squid one?

On Thu, 16 Jun 2022, 10:22 Matus UHLAR - fantomas wrote:

> On 16.06.22 09:53, robert k Wild wrote:
> > All I know is I need to keep a record of up to 3 months' worth of logs,
> > due to GDPR; how would you say I go about this?
>
> Keeping 3 months of logs is very different from rotating every 3 months.
> Configure logrotate to rotate daily and keep 92 days' worth of logs.
>
> I believe the CentOS squid package comes with logrotate configured; it
> should be in /etc/logrotate.d/squid
>
> --
> Matus UHLAR - fantomas, uh...@fantomas.sk ; http://www.fantomas.sk/
> I feel like I'm diagonally parked in a parallel universe.
Re: [squid-users] Logrotate question
Hi Antony,

All I know is I need to keep a record of up to 3 months' worth of logs, due to GDPR; how would you say I go about this?

Thanks,
Rob

On Thu, 16 Jun 2022, 09:49 Antony Stone wrote:

> On Thursday 16 June 2022 at 09:27:32, robert k Wild wrote:
>
> > Thanks Eliezer
> >
> > I have CentOS 7 and I want it to rotate every 3 months as we need to
> > keep logs for every 3 months.
>
> Do you really mean you "need to keep logs for every 3 months"?
>
> Or do you mean that you need to keep "the most recent 3 months' logs"?
>
> I would recommend that you rotate every month, and keep 4 months' logs.
>
> Firstly, there's no point in letting individual log files grow too large,
> and secondly, you then know that at all times you have the current month's
> logs, plus the previous 3 months, until the fourth one gets deleted by
> logrotate.
>
> Antony.
>
> > On Thu, 16 Jun 2022, 08:11, wrote:
> >
> > > Rob,
> > >
> > > It will be different how you implement and use logrotate manually or
> > > with the logrotate tools.
> > >
> > > What OS are you using?
> > >
> > > Eliezer
> > >
> > > *From:* squid-users *On Behalf Of* robert k Wild
> > > *Sent:* Wednesday, 15 June 2022 20:19
> > > *To:* Squid Users
> > > *Subject:* [squid-users] Logrotate question
> > >
> > > Hi all,
> > >
> > > ATM to clear the logs, I do this in crontab, every 3 months:
> > >
> > > 0 0 1 */3 * echo "" > /usr/local/squid/var/logs/access.log
> > >
> > > and do the same for the cache log.
> > >
> > > It works, but I want to really use log rotate, i.e.:
> > >
> > > 0 0 1 */3 * /usr/local/squid/sbin/squid -k rotate
> > >
> > > I hear log rotate keeps 10 files by default, so does that mean I will
> > > have 10 access logs etc.? And also, will it keep the file the same,
> > > i.e. won't change the size or compress it to save space?
> > >
> > > Thanks,
> > > Rob
>
> --
> The next sentence is untrue.
> The previous sentence is also not true.
>
> Please reply to the list; please *don't* CC me.
Re: [squid-users] Logrotate question
Thanks Eliezer

I have CentOS 7 and I want it to rotate every 3 months, as we need to keep logs for 3 months.

Thanks,
Rob

On Thu, 16 Jun 2022, 08:11, wrote:

> Rob,
>
> It will be different how you implement and use logrotate manually or with
> the logrotate tools.
>
> What OS are you using?
>
> Eliezer
>
> *From:* squid-users *On Behalf Of* robert k Wild
> *Sent:* Wednesday, 15 June 2022 20:19
> *To:* Squid Users
> *Subject:* [squid-users] Logrotate question
>
> Hi all,
>
> ATM to clear the logs, I do this in crontab, every 3 months:
>
> 0 0 1 */3 * echo "" > /usr/local/squid/var/logs/access.log
>
> and do the same for the cache log.
>
> It works, but I want to really use log rotate, i.e.:
>
> 0 0 1 */3 * /usr/local/squid/sbin/squid -k rotate
>
> I hear log rotate keeps 10 files by default, so does that mean I will have
> 10 access logs etc.? And also, will it keep the file the same, i.e. won't
> change the size or compress it to save space?
>
> Thanks,
> Rob
[squid-users] Logrotate question
Hi all,

ATM to clear the logs, I do this in crontab, every 3 months:

0 0 1 */3 * echo "" > /usr/local/squid/var/logs/access.log

and do the same for the cache log.

It works, but I want to really use log rotate, i.e.:

0 0 1 */3 * /usr/local/squid/sbin/squid -k rotate

I hear log rotate keeps 10 files by default, so does that mean I will have 10 access logs etc.? And also, will it keep the file the same, i.e. won't change the size or compress it to save space?

Thanks,
Rob
Re: [squid-users] Regex for URL to include numbers special letters
Thanks Amos,

What about if I wanted to put a normal URL in with the URL regex ones, like:

^zzz-iobuckets-io[0-9]+-[0-9a-z]+\.s3\.amazonaws\.com:[0-9]$
^google\.com$

Would that work?

On Sat, 21 May 2022, 03:45 Amos Jeffries wrote:

> Your solution may "work", but only partially.
>
> Diving back to your original request:
>
> On 20/05/22 02:25, robert k Wild wrote:
> > hi all,
> >
> > want to make the below into a regex, as after the io... could be any
> > number and letter; the - stays in the same position, but to make it
> > simple i just want to make anything a wildcard
> >
> > http://zzz-iobuckets-io50-1lnk65fe5gm7n.s3.amazonaws.com/
> >
> > something like this ive done but it doesnt work
> >
> > "^zzz-iobuckets-io.*.s3.amazonaws.com$"
>
> Please notice that your regex does **not** match any valid "URL".
>
> It explicitly only matches strings that start without a scheme. This is
> matching only a URI. Specifically it matches a URI-authority, which HTTP
> only sees in CONNECT request-targets.
>
> I think what you actually want is this:
>
>    ^zzz-iobuckets-io[0-9]+-[0-9a-z]+\.s3\.amazonaws\.com:[0-9]$
>
> That will limit the successful matches to amazonaws.com sub-domains,
> preventing things like "zzz-iobuckets-io.s3.amazonaws.com.example.com".
>
> FYI, the regex language supported by Squid is the original GNU regex.
> The operators are ^, $, +, *, ?, |, \x, [^-], and (). No character
> classes, back references, or repetition groups.
>
> HTH
> Amos
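Amos's tightened pattern can be sanity-checked outside Squid with grep -E (POSIX ERE overlaps GNU regex for these operators). A quick sketch; as an assumption for a host-only test, the trailing ':[0-9]' port part is dropped, and '^google\.com$' is treated as just another regex line in the same file:

```shell
# Test candidate hostnames against the tightened pattern
# (port part dropped for this host-only test).
re='^zzz-iobuckets-io[0-9]+-[0-9a-z]+\.s3\.amazonaws\.com$'

echo 'zzz-iobuckets-io50-1lnk65fe5gm7n.s3.amazonaws.com' | grep -Eq "$re" \
  && echo 'bucket host: match'

# The anchored $ blocks look-alike domains:
echo 'zzz-iobuckets-io.s3.amazonaws.com.example.com' | grep -Eq "$re" \
  || echo 'look-alike: no match'

# A plain domain works fine as a one-line anchored regex alongside the others:
echo 'google.com' | grep -Eq '^google\.com$' && echo 'google: match'
```

So yes, mixing an exact-domain regex like ^google\.com$ into the same regex list works, as long as each line escapes its dots and is anchored.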
Re: [squid-users] Put URLs and URL regex in one text file
Thanks Amos,

So does that mean for all my ssl::server_name ACLs, I should be using ssl_bump and not http_access?

On Sat, 21 May 2022, 06:10 Amos Jeffries wrote:

> On 20/05/22 23:26, robert k Wild wrote:
> > Sorry I'm a bit thick
>
> Don't be. These things beyond plain-text HTTP are unfortunately a bit
> complex.
>
> The key thing to remember is that Squid is dealing with *layers* of
> protocols wrapped around each other.
>
> This wiki page
> <https://wiki.squid-cache.org/Features/SslPeekAndSplice#Terminology>
> documents the process as well as we can.
>
> > So I've read ssl::server_name_regex, which uses SNI, is better than
> > dstdom_regex
> >
> > So I think I'm better off using the SNI one then?
>
> Neither is "better". They check different things.
>
> Usually checking _both_ is useful, since "HTTPS" is an HTTP request (with
> domain) wrapped inside TLS (with SNI). The two values there are usually
> supposed to be the same, but may not be.
>
> The ssl_bump access controls should check ssl::server_name* ACLs.
>
> The http_access rules should check dst* ACLs for the HTTP message URL, and
> may also check ssl::* ACLs for TLS details (including the TLS server
> name).
>
> HTH
> Amos
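Amos's "check both layers" advice could look something like the following squid.conf fragment. This is only a sketch, not his exact recommendation; the ACL names and file paths are assumptions modelled on the config posted earlier in the thread:

```conf
# TLS layer: look at the SNI during ssl_bump processing
acl step1  at_step SslBump1
acl wl_sni ssl::server_name_regex "/usr/local/squid/etc/urlregwhite.txt"

ssl_bump peek step1        # read the ClientHello / SNI first
ssl_bump bump all          # then bump (or "ssl_bump splice wl_sni" first, if preferred)

# HTTP layer: also check the request's destination domain
acl wl_dst dstdom_regex "/usr/local/squid/etc/urlregwhite.txt"
http_access allow wl_dst
http_access deny all
```

The point is that the same whitelist can back two ACL types: ssl::server_name_regex consulted by ssl_bump rules, and dstdom_regex consulted by http_access rules.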
Re: [squid-users] Put URLs and URL regex in one text file
Sorry, I'm a bit thick.

So I've read that ssl::server_name_regex, which uses SNI, is better than dstdom_regex.

So I think I'm better off using the SNI one then?

On Fri, 20 May 2022, 12:20 Matus UHLAR - fantomas wrote:

> On 20.05.22 11:21, robert k Wild wrote:
> > So for SSL inspection, for squid to look into the URL headers, what's
> > the better one:
> >
> > server name, or
> > dst domain?
>
> I thought I had explained it:
> dstdom_regex is from the request, not from the SSL data.
>
> > On Fri, 20 May 2022, 11:12 Matus UHLAR - fantomas wrote:
> >
> > > On 19.05.22 19:29, robert k Wild wrote:
> > > > Think I found it, but what's the difference between these two:
> > > >
> > > > acl aclname ssl::server_name_regex [-i] \.foo\.com ...
> > >
> > > This one is taken from the SNI option when squid looks at the SSL
> > > handshake parameters.
> > >
> > > > acl aclname dstdom_regex [-n] [-i] \.foo\.com ...
> > >
> > > This one is the one provided in the client's request, where SSL
> > > requests usually look like:
> > >
> > > CONNECT www.google.com:443 HTTP/1.0
>
> --
> Matus UHLAR - fantomas, uh...@fantomas.sk ; http://www.fantomas.sk/
> Posli tento mail 100 svojim znamim - nech vidia aky si idiot
> Send this email to 100 of your friends - let them see what an idiot you are
Re: [squid-users] Put URLs and URL regex in one text file
So for SSL inspection, for squid to look into the URL headers, what's the better one:

server name, or
dst domain?

Thanks,
Rob

On Fri, 20 May 2022, 11:12 Matus UHLAR - fantomas wrote:

> On 19.05.22 19:29, robert k Wild wrote:
> > Think I found it, but what's the difference between these two:
> >
> > acl aclname ssl::server_name_regex [-i] \.foo\.com ...
>
> This one is taken from the SNI option when squid looks at the SSL
> handshake parameters.
>
> > acl aclname dstdom_regex [-n] [-i] \.foo\.com ...
>
> This one is the one provided in the client's request, where SSL requests
> usually look like:
>
> CONNECT www.google.com:443 HTTP/1.0
>
> --
> Matus UHLAR - fantomas, uh...@fantomas.sk ; http://www.fantomas.sk/
> Chernobyl was a Windows 95 beta test site.
Re: [squid-users] Put URLs and URL regex in one text file
Think I found it, but what's the difference between these two:

acl aclname ssl::server_name_regex [-i] \.foo\.com ...

acl aclname dstdom_regex [-n] [-i] \.foo\.com ...

On Thu, 19 May 2022, 19:01 robert k Wild wrote:

> Hi all,
>
> ATM in my squid.conf I have two ACLs, one for normal whitelist URLs and
> one for whitelist regex URLs, like so:
>
> #HTTP_HTTPS whitelist websites
> acl whitelist ssl::server_name "/usr/local/squid/etc/urlwhite.txt"
> #
> #HTTP_HTTPS whitelist websites regex
> acl whitelistreg ssl::server_name_regex "/usr/local/squid/etc/urlregwhite.txt"
> #
> http_access allow activation whitelist
> http_access allow activation whitelistreg
> http_access deny all
>
> urlwhite.txt:
>
> .login.windows.net
>
> urlregwhite.txt:
>
> ^zzz-iobuckets-io.*.s3.amazonaws.com
>
> How would I be able to combine both together, to get rid of the whitelist
> and just leave the whitelistreg?
>
> Thanks,
> Rob
[squid-users] Put URLs and URL regex in one text file
Hi all,

ATM in my squid.conf I have two ACLs, one for normal whitelist URLs and one for whitelist regex URLs, like so:

#HTTP_HTTPS whitelist websites
acl whitelist ssl::server_name "/usr/local/squid/etc/urlwhite.txt"
#
#HTTP_HTTPS whitelist websites regex
acl whitelistreg ssl::server_name_regex "/usr/local/squid/etc/urlregwhite.txt"
#
http_access allow activation whitelist
http_access allow activation whitelistreg
http_access deny all

urlwhite.txt:

.login.windows.net

urlregwhite.txt:

^zzz-iobuckets-io.*.s3.amazonaws.com

How would I be able to combine both together, to get rid of the whitelist and just leave the whitelistreg?

Thanks,
Rob
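One way to merge the two files is to translate each plain ssl::server_name entry into an equivalent regex line, so only urlregwhite.txt remains. A sketch of the translation, testable with grep -E; the '(^|\.)' prefix is an assumption meant to mimic the leading-dot "domain and its subdomains" behaviour of the plain entry:

```shell
# ".login.windows.net" (plain entry) becomes an anchored regex that
# matches the domain itself and any subdomain, but nothing else.
re='(^|\.)login\.windows\.net$'

echo 'login.windows.net'      | grep -Eq "$re" && echo 'apex: match'
echo 'sts.login.windows.net'  | grep -Eq "$re" && echo 'subdomain: match'
echo 'evil-login.windows.net' | grep -Eq "$re" || echo 'look-alike: no match'
```

With every plain entry rewritten like this, the urlwhite.txt file and its "whitelist" ACL can be dropped, leaving only whitelistreg.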
Re: [squid-users] Regex for URL to include numbers special letters
Sorted it: deleted the quotes and the $ at the end, and now it works.

On Thu, 19 May 2022, 15:58 L.P.H. van Belle wrote:

> I think you can use (in that case, per example):
>
> acl SOMETHING .s3.amazonaws.com
>
> Depends a bit on what type of list you're using:
> https://wiki.squid-cache.org/SquidFaq/SquidAcl
>
> I suggest having a look there also.
>
> Greetz,
> Louis
>
> *From:* robert k Wild
> *Sent:* Thursday, 19 May 2022 16:45
> *To:* L.P.H. van Belle
> *CC:* Squid Users
> *Subject:* Re: [squid-users] Regex for URL to include numbers special letters
>
> No, I mean I just want to make the URL into a regex so it can handle any
> numbers or letters after the iobuckets-io...s3.amazonaws.com
>
> On Thu, 19 May 2022, 15:40 L.P.H. van Belle wrote:
>
> > You can't make that certificate...
> > at least, I hope, because if you can, well then the whole of Amazon has
> > a problem.
> >
> > If you want to "hide" that you're an intercepting proxy:
> >
> > You need to create a RootCA, IntermediateCA and certificate + key file
> > for the proxy, and you need to publish the RootCA and IntermediateCA to
> > your PCs (that's easily done with a GPO).
> >
> > If that is what you mean...
> >
> > I use XCA to create certificates.
> >
> > Greetz,
> > Louis
> >
> > *From:* squid-users *On Behalf Of* robert k Wild
> > *Sent:* Thursday, 19 May 2022 16:25
> > *To:* Squid Users
> > *Subject:* [squid-users] Regex for URL to include numbers special letters
> >
> > hi all,
> >
> > want to make the below into a regex, as after the io... could be any
> > number and letter; the - stays in the same position, but to make it
> > simple i just want to make anything a wildcard
> >
> > http://zzz-iobuckets-io50-1lnk65fe5gm7n.s3.amazonaws.com/
> >
> > something like this ive done but it doesnt work
> >
> > "^zzz-iobuckets-io.*.s3.amazonaws.com$"
> >
> > thanks,
> > rob
Re: [squid-users] Regex for URL to include numbers special letters
No, I mean I just want to make the URL into a regex so it can handle any numbers or letters after the iobuckets-io...s3.amazonaws.com

On Thu, 19 May 2022, 15:40 L.P.H. van Belle wrote:

> You can't make that certificate...
> at least, I hope, because if you can, well then the whole of Amazon has a
> problem.
>
> If you want to "hide" that you're an intercepting proxy:
>
> You need to create a RootCA, IntermediateCA and certificate + key file for
> the proxy, and you need to publish the RootCA and IntermediateCA to your
> PCs (that's easily done with a GPO).
>
> If that is what you mean...
>
> I use XCA to create certificates.
>
> Greetz,
> Louis
>
> *From:* squid-users *On Behalf Of* robert k Wild
> *Sent:* Thursday, 19 May 2022 16:25
> *To:* Squid Users
> *Subject:* [squid-users] Regex for URL to include numbers special letters
>
> hi all,
>
> want to make the below into a regex, as after the io... could be any
> number and letter; the - stays in the same position, but to make it simple
> i just want to make anything a wildcard
>
> http://zzz-iobuckets-io50-1lnk65fe5gm7n.s3.amazonaws.com/
>
> something like this ive done but it doesnt work
>
> "^zzz-iobuckets-io.*.s3.amazonaws.com$"
>
> thanks,
> rob
[squid-users] Regex for URL to include numbers special letters
hi all,

want to make the below into a regex, as after the io... could be any number and letter; the - stays in the same position, but to make it simple I just want to make anything a wildcard:

http://zzz-iobuckets-io50-1lnk65fe5gm7n.s3.amazonaws.com/

something like this I've done, but it doesn't work:

"^zzz-iobuckets-io.*.s3.amazonaws.com$"

thanks,
rob
Re: [squid-users] disable https inspection for licensing some apps
I worked it out: my "no HTTPS interception" was working on websites if I put the URL in that txt file, nointercept.txt.

But I needed to make a proxy.ini file as well on the host in question, for it to point to the proxy.

Once it pointed to the proxy I could then monitor the traffic and see what URL I needed to whitelist and to put in the no-SSL-interception list.

Once I did that, all good.

Thanks guys, much appreciated.

On Wed, 18 May 2022, 20:21 Eliezer Croitoru wrote:

> Hey Alex,
>
> I have started working on an external_acl helper that will probe the
> server certificate, like what ufdbguard does, but it will probably be
> written in another language than C++, i.e. scripting or GoLang or Rust.
> The idea is that there will be some cache or DB that will store
> information about an IP+port paired with an SNI.
> A storage engine like a cache would help to "know" enough about the server
> to ultimately decide if there is a risk in splicing this specific
> connection.
> It's also possible that the first time the request passes through the
> proxy it will be bumped, to probe the connection for more information when
> possible.
>
> In general, commercial products use either a CDN service or a dedicated
> service. These usually are not a risk for the proxy users and can be
> spliced.
> The main issue is if one service on a specific IP serves more than one
> domain that contains different content.
> The best example is the Google CDN network, which might serve different
> domains on the same IP and certificate and SNI (because of HTTP/2.0).
>
> Eliezer
>
> Eliezer Croitoru
> NgTech, Tech Support
> Mobile: +972-5-28704261
> Email: ngtech1...@gmail.com
>
> -----Original Message-----
> From: squid-users On Behalf Of Alex Rousskov
> Sent: Wednesday, May 18, 2022 21:39
> To: squid-users@lists.squid-cache.org
> Subject: Re: [squid-users] disable https inspection for licensing some apps
>
> On 5/18/22 12:28, robert k Wild wrote:
>
> > acl DiscoverSNIHost at_step SslBump1
> > acl NoSSLIntercept ssl::server_name "/usr/local/squid/etc/nointercept.txt"
> > ssl_bump peek DiscoverSNIHost
> > ssl_bump splice NoSSLIntercept
> > ssl_bump bump all
>
> OK, the above configuration makes the splice/bump decision based on plain
> text information provided by the TLS client.
>
> > and in the nointercept.txt
> > i have the url in there
>
> ssl::server_name needs a host/domain name, not a regular URL. No URLs are
> exchanged in plain text between TLS client and the origin server.
>
> Please note that, even after adjusting nointercept.txt to contain domain
> name(s), the above configuration may not always work in modern Squids:
> It will work when the client sends a matching domain name
>
> * in the CONNECT request headers (and sends no TLS SNI at all)
> * in the CONNECT request headers and in TLS SNI
> * in TLS SNI (the CONNECT request headers should not matter).
>
> It will also work when a CONNECT request is using an IP address that
> reverse-resolves to a matching domain name (which is not overwritten by a
> mismatching SNI).
>
> In all other cases, Squid will bump traffic even if it is ultimately going
> to the server named in nointercept.txt.
>
> There is no configuration that will address all possible cases in general.
> TLS makes that impossible (at least not without probing TLS origin
> servers, which is something Squid does not do yet).
>
> HTH,
> Alex.
>
> > , also i have it in the url whitelist so it can actually see the url
> >
> > is there something else i need to add for this to work
> >
> > or maybe some websites, i.e. license websites, just don't like going
> > through a proxy
> >
> > On Wed, 18 May 2022 at 16:57, robert k Wild wrote:
> >
> >     hi all,
> >
> >     i have squid proxy configured as ssl bump and i whitelist some
> >     websites only
> >
> >     but for some websites i dont want to inspect https traffic as it
> >     breaks the cert when i want to license some apps via the url
> >     (whitelist url)
> >
> >     how can i disable https inspection for some websites please
> >
> >     many thanks,
> >     rob
> >
> >     --
> >     Regards,
> >
> >     Robert K Wild.
Re: [squid-users] disable https inspection for licensing some apps
I'm using this:

# SSL bump rules
acl DiscoverSNIHost at_step SslBump1
acl NoSSLIntercept ssl::server_name "/usr/local/squid/etc/nointercept.txt"
ssl_bump peek DiscoverSNIHost
ssl_bump splice NoSSLIntercept
ssl_bump bump all

and in the nointercept.txt I have the URL in there; also I have it in the URL whitelist so it can actually see the URL.

Is there something else I need to add for this to work?

Or maybe some websites, i.e. license websites, just don't like going through a proxy.

On Wed, 18 May 2022 at 16:57, robert k Wild wrote:

> hi all,
>
> i have squid proxy configured as ssl bump and i whitelist some websites
> only
>
> but for some websites i dont want to inspect https traffic as it breaks
> the cert when i want to license some apps via the url (whitelist url)
>
> how can i disable https inspection for some websites please
>
> many thanks,
> rob
>
> --
> Regards,
>
> Robert K Wild.

--
Regards,

Robert K Wild.
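Alex's reply above notes that ssl::server_name wants a bare host/domain name, not a URL, so each URL should be reduced to its hostname before going into nointercept.txt. A small shell sketch of that reduction; the URL here is a made-up example, not one from the thread:

```shell
# Reduce a URL to just its hostname for ssl::server_name matching.
url='https://license.example.com/activate?key=123'
host=${url#*://}   # drop "https://"
host=${host%%/*}   # drop "/activate?key=123"
echo "$host"       # -> license.example.com
```

The resulting bare hostname is what belongs on a line of nointercept.txt.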
[squid-users] disable https inspection for licensing some apps
hi all,

i have squid proxy configured as SSL bump, and I whitelist some websites only.

but for some websites I don't want to inspect HTTPS traffic, as it breaks the cert when I want to license some apps via the URL (whitelisted URL).

how can I disable HTTPS inspection for some websites, please?

many thanks,
rob

--
Regards,

Robert K Wild.
Re: [squid-users] squid proxy really slow for web requests
No problem, glad I can be of small help to this awesome project.

On Tue, 22 Feb 2022, 09:48 Eliezer Croitoru wrote:

> Thanks Rob,
>
> A good catch. It's a very hard one to find.
>
> All The Bests,
>
> Eliezer Croitoru
> NgTech, Tech Support
> Mobile: +972-5-28704261
> Email: ngtech1...@gmail.com
>
> *From:* robert k Wild
> *Sent:* Tuesday, February 22, 2022 10:38
> *To:* Eliezer Croitoru
> *Cc:* Squid Users
> *Subject:* Re: [squid-users] squid proxy really slow for web requests
>
> Hi Eliezer,
>
> Thanks for the reply. In the end I had to restart our firewall, as our
> squid server is in the DMZ and the squid users/clients accessing the squid
> server are on the LAN, so they have to go through the firewall.
>
> Once it was restarted I could access the webpages and I didn't get the
> timeout error any more.
>
> Thanks,
> Rob
>
> On Tue, 22 Feb 2022, 05:37 Eliezer Croitoru wrote:
>
> > Hey Rob,
> >
> > I really didn't understand the situation.
> > Since we are in 2022, I believe a screen capture (video/gif) of the
> > scenario would be useful.
> > You can use the next tool to capture the scenario:
> > https://getsharex.com/
> > (if you are using Windows)
> >
> > Thanks,
> > Eliezer
> >
> > *From:* squid-users *On Behalf Of* robert k Wild
> > *Sent:* Monday, February 21, 2022 18:42
> > *To:* Squid Users
> > *Subject:* [squid-users] squid proxy really slow for web requests
> >
> > hi all,
> >
> > today my squid responding to web requests from different clients is
> > really slow.
> >
> > for example, when I go on Firefox/Chrome and open multiple tabs to
> > different websites, it normally shows the "error url page", as I've
> > denied all websites apart from some.
> >
> > and some of the websites take way too long; I get "the connection has
> > timed out".
> >
> > on my squid server I'm running htop and pinging google and both seem
> > fine.
> >
> > anything else it could be?
> >
> > thanks,
> > rob
Re: [squid-users] squid proxy really slow for web requests
Hi Eliezer,

Thanks for the reply. In the end I had to restart our firewall, as our squid server is in the DMZ and the squid users/clients accessing the squid server are on the LAN, so they have to go through the firewall.

Once it was restarted I could access the webpages and I didn't get the timeout error any more.

Thanks,
Rob

On Tue, 22 Feb 2022, 05:37 Eliezer Croitoru wrote:

> Hey Rob,
>
> I really didn't understand the situation.
> Since we are in 2022, I believe a screen capture (video/gif) of the
> scenario would be useful.
> You can use the next tool to capture the scenario:
> https://getsharex.com/
> (if you are using Windows)
>
> Thanks,
> Eliezer
>
> *From:* squid-users *On Behalf Of* robert k Wild
> *Sent:* Monday, February 21, 2022 18:42
> *To:* Squid Users
> *Subject:* [squid-users] squid proxy really slow for web requests
>
> hi all,
>
> today my squid responding to web requests from different clients is really
> slow.
>
> for example, when I go on Firefox/Chrome and open multiple tabs to
> different websites, it normally shows the "error url page", as I've denied
> all websites apart from some.
>
> and some of the websites take way too long; I get "the connection has
> timed out".
>
> on my squid server I'm running htop and pinging google and both seem fine.
>
> anything else it could be?
>
> thanks,
> rob
>
> --
> Regards,
>
> Robert K Wild.
[squid-users] squid proxy really slow for web requests
hi all,

today my squid responding to web requests from different clients is really slow.

for example, when I go on Firefox/Chrome and open multiple tabs to different websites, it normally shows the "error url page", as I've denied all websites apart from some.

and some of the websites take way too long; I get "the connection has timed out".

on my squid server I'm running htop and pinging google and both seem fine.

anything else it could be?

thanks,
rob

--
Regards,

Robert K Wild.
Re: [squid-users] Vulnerabilities with squid 4.15
Nice, I don't have any. Thanks Amos. I don't normally use parse; I normally use reconfigure and rotate.

On Sat, 12 Feb 2022 at 13:43, Amos Jeffries wrote:
> On 11/02/22 23:04, robert k Wild wrote:
> > thanks Amos and Eliezer!
> >
> > tbh I don't know if I'm using WCCP with my squid version, sorry, how do I find that out?
>
> If this produces any config lines:
>   squid -k parse 2>&1 | grep wccp
>
> Cheers
> Amos

--
Regards,

Robert K Wild.
___ squid-users mailing list squid-users@lists.squid-cache.org http://lists.squid-cache.org/listinfo/squid-users
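Amos's check can also be sketched offline against a config file. The sample squid.conf below is a made-up stand-in, not Rob's real configuration; on a live server you would use `squid -k parse 2>&1 | grep wccp`, which has the advantage of also covering any included files:

```shell
# Sketch: scan a squid.conf for WCCP directives. The config here is an
# invented example; on a real server run: squid -k parse 2>&1 | grep wccp
CONF=$(mktemp)
cat > "$CONF" <<'EOF'
http_port 3128
cache_mem 256 MB
http_access allow localhost
EOF
if grep -qi 'wccp' "$CONF"; then
  RESULT="WCCP configured"
else
  RESULT="no WCCP directives found"
fi
echo "$RESULT"
rm -f "$CONF"
```

No output from the grep means no `wccp_router`/`wccp2_router` lines, i.e. the WCCP advisory does not apply to the running configuration.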
Re: [squid-users] Vulnerabilities with squid 4.15
OK, I'm fine then. All Squid-4.x up to and including 4.16 built without --disable-wccpv2 and configured with wccp2_router in squid.conf are vulnerable. Thanks Amos for the link.

On Fri, 11 Feb 2022, 10:09 robert k Wild, wrote:
> OK, so build my Squid 4.17 with this option:
>
> --disable-wccpv2
>
> as I have no lines in my squid.conf referencing WCCP. Is that what I should do? tbh I don't even know if I do or don't need WCCP.
>
> On Fri, 11 Feb 2022 at 02:27, Amos Jeffries wrote:
>> On 11/02/22 07:55, robert k Wild wrote:
>> > Hi all,
>> >
>> > Are there any security vulnerabilities with squid 4.15? Should I update to 4.17, or is it OK to keep using it as my Squid proxy server?
>> >
>> > Sorry for the silly question
>>
>> Not silly.
>>
>> There is this one for WCCP:
>> <https://github.com/squid-cache/squid/security/advisories/GHSA-rgf3-9v3p-qp82>
>>
>> However, be aware that the patch has been found to prevent all traffic from some routers. We are working on the fix for that.
>>
>> Amos
>
> --
> Regards,
>
> Robert K Wild.
___ squid-users mailing list squid-users@lists.squid-cache.org http://lists.squid-cache.org/listinfo/squid-users
Re: [squid-users] Vulnerabilities with squid 4.15
OK, so build my Squid 4.17 with this option:

--disable-wccpv2

as I have no lines in my squid.conf referencing WCCP. Is that what I should do? tbh I don't even know if I do or don't need WCCP.

On Fri, 11 Feb 2022 at 02:27, Amos Jeffries wrote:
> On 11/02/22 07:55, robert k Wild wrote:
> > Hi all,
> >
> > Are there any security vulnerabilities with squid 4.15? Should I update to 4.17, or is it OK to keep using it as my Squid proxy server?
> >
> > Sorry for the silly question
>
> Not silly.
>
> There is this one for WCCP:
> <https://github.com/squid-cache/squid/security/advisories/GHSA-rgf3-9v3p-qp82>
>
> However, be aware that the patch has been found to prevent all traffic from some routers. We are working on the fix for that.
>
> Amos

--
Regards,

Robert K Wild.
___ squid-users mailing list squid-users@lists.squid-cache.org http://lists.squid-cache.org/listinfo/squid-users
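The build step Rob describes could look something like the sketch below. Only `--disable-wccpv2` comes from the thread; the install prefix and the idea of wrapping the configure/make sequence are assumptions, and the command is echoed rather than run:

```shell
# Sketch of a source build with WCCPv2 compiled out. Everything except
# --disable-wccpv2 (from the advisory discussion) is an assumed example.
PREFIX="/usr/local/squid"                  # assumed install prefix
OPTS="--prefix=$PREFIX --disable-wccpv2"
BUILD_CMD="./configure $OPTS && make && make install"
echo "$BUILD_CMD"   # shown rather than run in this sketch
```

With WCCPv2 compiled out, the vulnerable code path is simply absent, regardless of what squid.conf says.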
Re: [squid-users] Vulnerabilities with squid 4.15
thanks Amos and Eliezer!

tbh I don't know if I'm using WCCP with my Squid version, sorry, how do I find that out?

I am using SSL Bump, i.e. SSL interception, and for a few websites I'm doing no SSL intercept with splice/peek/bump. Hope that helps.

On Fri, 11 Feb 2022 at 05:24, Eliezer Croitoru wrote:
> Hey Robert,
>
> Don't rush the move from CentOS 7 to Ubuntu yet; CentOS 7 has good support for at least a year from now. I can try to help you by providing RPMs that have support for eCAP, which I understand you need. Alternatively, I can try to build an upgrade process for your self-compiled version.
>
> I can recommend any of these:
> - Amazon Linux 2
> - Oracle Enterprise Linux 8/7
> - openSUSE
>
> as general alternatives for which I can support the RPM builds. I have also built binaries for Ubuntu and Debian, not as a .deb package file, but they will be signed by me.
>
> As Amos mentioned, the current issue is with WCCP-based setups. Can you please elaborate on whether you are using WCCP in your setup? Also, are you using SSL-Bump by any chance? (I really don't know of a setup that doesn't require this these days.)
>
> If you would be able to share more information on your setup, so I might be able to clone such a setup, it would help a lot.
>
> Thanks,
> Eliezer
>
> Eliezer Croitoru
> NgTech, Tech Support
> Mobile: +972-5-28704261
> Email: ngtech1...@gmail.com
>
> From: robert k Wild
> Sent: Thursday, February 10, 2022 21:28
> To: NgTech LTD
> Cc: Squid Users
> Subject: Re: [squid-users] Vulnerabilities with squid 4.15
>
> I have Squid running on CentOS 7.9. I will move to Ubuntu 20.04.3, as CentOS is officially dead to me.
>
> I have compiled Squid from source (i.e. make, make install), as I'm running it with the squidclamav, c-icap and c-icap modules. All instances I have compiled from source. I did a yum install clamav.
>
> On Thu, 10 Feb 2022, 19:20 NgTech LTD, wrote:
>
> Hey Robert,
>
> First: your question is not silly. The answer will differ based on the complexity of the upgrade process. What OS are you using, and did you compile Squid from sources or install it from a specific package? Also, what is your Squid setup's purpose?
>
> Eliezer
>
> On Thu, 10 Feb 2022, 20:56, robert k Wild <robertkw...@gmail.com> wrote:
>
> Hi all,
>
> Are there any security vulnerabilities with squid 4.15? Should I update to 4.17, or is it OK to keep using it as my Squid proxy server?
>
> Sorry for the silly question
>
> Thanks,
> Rob

--
Regards,

Robert K Wild.
___ squid-users mailing list squid-users@lists.squid-cache.org http://lists.squid-cache.org/listinfo/squid-users
Re: [squid-users] Vulnerabilities with squid 4.15
I have Squid running on CentOS 7.9. I will move to Ubuntu 20.04.3, as CentOS is officially dead to me.

I have compiled Squid from source (i.e. make, make install), as I'm running it with the squidclamav, c-icap and c-icap modules. All instances I have compiled from source. I did a yum install clamav.

On Thu, 10 Feb 2022, 19:20 NgTech LTD, wrote:
> Hey Robert,
>
> First: your question is not silly. The answer will differ based on the complexity of the upgrade process. What OS are you using, and did you compile Squid from sources or install it from a specific package? Also, what is your Squid setup's purpose?
>
> Eliezer
>
> On Thu, 10 Feb 2022, 20:56, robert k Wild <robertkw...@gmail.com> wrote:
>
>> Hi all,
>>
>> Are there any security vulnerabilities with squid 4.15? Should I update to 4.17, or is it OK to keep using it as my Squid proxy server?
>>
>> Sorry for the silly question
>>
>> Thanks,
>> Rob

___ squid-users mailing list squid-users@lists.squid-cache.org http://lists.squid-cache.org/listinfo/squid-users
[squid-users] Vulnerabilities with squid 4.15
Hi all,

Are there any security vulnerabilities with squid 4.15? Should I update to 4.17, or is it OK to keep using it as my Squid proxy server?

Sorry for the silly question

Thanks,
Rob
___ squid-users mailing list squid-users@lists.squid-cache.org http://lists.squid-cache.org/listinfo/squid-users
Re: [squid-users] deny uploads acl not working
I found out I need to block application/json, but if I do that it blocks Teams as well, as that's JSON. Maybe I will try request_body_max_size, but there's another one I can use called request_header_max_size. What's the best one to use?

On Thu, 28 Oct 2021 at 17:05, robert k Wild wrote:
> hi all,
>
> trying to deny uploads but it's not working, here's my config below:
>
> #deny up MIME types
> acl upmime req_mime_type "/usr/local/squid/etc/mimedeny.txt"
>
> #allow special URL paths
> acl special_url url_regex "/usr/local/squid/etc/urlspecial.txt"
>
> #deny down MIME types
> acl downmime rep_mime_type "/usr/local/squid/etc/mimedeny.txt"
>
> http_access deny upmime
> http_reply_access allow special_url
> http_reply_access deny downmime
>
> in my mimedeny.txt:
>
> application/octet-stream
> application/x-msi
> application/zip
> application/x-7z-compressed
> application/vnd.ms-cab-compressed
> application/x-msdownload
> application/x-iso9660-image
> application/x-tar
>
> but the deny upmime isn't working, as when I try to upload an exe on the Teams website it allows me to upload it.
>
> any ideas,
>
> thanks,
> rob
>
> --
> Regards,
>
> Robert K Wild.

--
Regards,

Robert K Wild.
___ squid-users mailing list squid-users@lists.squid-cache.org http://lists.squid-cache.org/listinfo/squid-users
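Since MIME-type matching is fragile for uploads (browsers often wrap uploads in multipart or JSON request bodies, as the Teams case shows), a size cap on request bodies may be the more reliable lever of the two Rob mentions; `request_header_max_size` only bounds request headers, so it would not stop file uploads at all. A minimal squid.conf sketch, with the limit value chosen arbitrarily for illustration:

```
# Sketch: cap outgoing request bodies instead of matching MIME types.
# The 10 KB limit is an arbitrary example; tune it to what normal
# form posts in your environment need.
request_body_max_size 10 KB
```

Requests with bodies over the limit get an error response, regardless of their declared Content-Type.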
[squid-users] deny uploads acl not working
hi all,

trying to deny uploads but it's not working, here's my config below:

#deny up MIME types
acl upmime req_mime_type "/usr/local/squid/etc/mimedeny.txt"

#allow special URL paths
acl special_url url_regex "/usr/local/squid/etc/urlspecial.txt"

#deny down MIME types
acl downmime rep_mime_type "/usr/local/squid/etc/mimedeny.txt"

http_access deny upmime
http_reply_access allow special_url
http_reply_access deny downmime

in my mimedeny.txt:

application/octet-stream
application/x-msi
application/zip
application/x-7z-compressed
application/vnd.ms-cab-compressed
application/x-msdownload
application/x-iso9660-image
application/x-tar

but the deny upmime isn't working, as when I try to upload an exe on the Teams website it allows me to upload it.

any ideas,

thanks,
rob

--
Regards,

Robert K Wild.
___ squid-users mailing list squid-users@lists.squid-cache.org http://lists.squid-cache.org/listinfo/squid-users
Re: [squid-users] cicap lines in squid.conf
In the end this did work. The reason I thought it wasn't working is that I went on the EICAR test website and downloaded the txt file and could still see it; that was because it was cached. It must have been saved in memory, as "cache_dir" I have disabled. Once I rebooted the Squid server and tried to download the txt file again, I got the Squid virus page.

I did have to update my ClamAV service via yum, as when I ran "freshclam" it errored, saying I was running an old version, but once I updated all was good.

On Wed, 29 Sept 2021 at 13:46, robert k Wild wrote:
> hi all,
>
> going by this link
>
> https://wiki.squid-cache.org/ConfigExamples/ContentAdaptation/C-ICAP
>
> there are two ICAP configuration options. Which one should I use? atm I have this in my squid.conf:
>
> #ICAP
> icap_enable on
> adaptation_uses_indirect_client on
> icap_send_client_ip on
> icap_send_client_username on
> icap_client_username_header X-Authenticated-User
> icap_service service_req reqmod_precache bypass=0 icap://127.0.0.1:1344/squidclamav
> adaptation_access service_req allow all
> icap_service service_resp respmod_precache bypass=0 icap://127.0.0.1:1344/squidclamav
> adaptation_access service_resp allow all
>
> thanks,
> rob
>
> --
> Regards,
>
> Robert K Wild.

--
Regards,

Robert K Wild.
___ squid-users mailing list squid-users@lists.squid-cache.org http://lists.squid-cache.org/listinfo/squid-users
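Rob's cached-EICAR symptom (an in-memory copy served without passing through the scanner) can also be avoided by disabling caching outright while ICAP scanning is the priority. A one-line squid.conf sketch using the standard `cache` directive; whether giving up caching entirely is worth it is a local trade-off:

```
# Sketch: stop Squid storing responses at all, so nothing can be served
# from the memory cache without going through the ICAP/ClamAV chain.
cache deny all
```

This is blunter than rebooting to flush the memory cache, but it guarantees every response is fetched and scanned fresh.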
[squid-users] cicap lines in squid.conf
hi all,

going by this link

https://wiki.squid-cache.org/ConfigExamples/ContentAdaptation/C-ICAP

there are two ICAP configuration options. Which one should I use? atm I have this in my squid.conf:

#ICAP
icap_enable on
adaptation_uses_indirect_client on
icap_send_client_ip on
icap_send_client_username on
icap_client_username_header X-Authenticated-User
icap_service service_req reqmod_precache bypass=0 icap://127.0.0.1:1344/squidclamav
adaptation_access service_req allow all
icap_service service_resp respmod_precache bypass=0 icap://127.0.0.1:1344/squidclamav
adaptation_access service_resp allow all

thanks,
rob

--
Regards,

Robert K Wild.
___ squid-users mailing list squid-users@lists.squid-cache.org http://lists.squid-cache.org/listinfo/squid-users
Re: [squid-users] net err cert validity too long - chrome/safari
Thanks Alex

On Thu, 23 Sep 2021, 15:46 Alex Rousskov, wrote:
> On 9/23/21 9:49 AM, L.P.H. van Belle wrote:
> > sadly yes..
> > https://chromium.googlesource.com/chromium/src/+/HEAD/net/docs/certificate_lifetimes.md
>
> AFAICT, the above article says that Chrome only applies the 398-day restriction to certificates signed by CAs that are trusted in a _default_ installation of Chrome (i.e. the so-called "publicly trusted CAs"). Rob's custom CA is not one of those publicly trusted CAs.
>
> Evidently, either the 398-day restriction is now applied to more situations than those described in the article, OR Rob has circumvented Chrome's idea of "publicly trusted CAs".
>
> Alex.
>
> > From: squid-users [mailto:squid-users-boun...@lists.squid-cache.org] On behalf of robert k Wild
> > Sent: Thursday, 23 September 2021 14:53
> > To: squid-users@lists.squid-cache.org
> > Subject: [squid-users] net err cert validity too long - chrome/safari
> >
> > hi all,
> >
> > I get this error on Chrome and Safari. When I access the same website on Firefox I get the proxy error page, as I haven't whitelisted this site; when I whitelist it, I can get on the website in all three different browsers, and when I take it off the whitelist it's exactly the same as before.
> >
> > I have googled, and it's because my cert's validity is too long: I made it 999 days, and I find out now it shouldn't be longer than 398 days.
> >
> > is this correct?
> >
> > thanks,
> > rob
> >
> > --
> > Regards,
> >
> > Robert K Wild.
___ squid-users mailing list squid-users@lists.squid-cache.org http://lists.squid-cache.org/listinfo/squid-users
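Regenerating the SSL-bump CA with a lifetime at or under Chrome's 398-day cap is straightforward with OpenSSL. A sketch; the filenames and subject are assumptions, not Rob's actual setup, and after swapping the CA you would also need to redistribute it to clients:

```shell
# Sketch: create a new self-signed SSL-bump CA valid for 397 days
# (inside Chrome's 398-day limit). Filenames and subject are examples.
openssl req -new -newkey rsa:2048 -sha256 -days 397 -nodes -x509 \
  -subj "/CN=Example Squid CA" \
  -keyout myCA.key -out myCA.pem
# Show the resulting expiry date of the new CA certificate:
openssl x509 -in myCA.pem -noout -enddate
```

Certificates Squid mints under the new CA inherit a fresh chain, so the "validity too long" complaint from Chrome/Safari should go away for newly bumped sites.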