Re: [squid-users] What squid should do with RFC non-compliant response header?
Thanks for the response. Actually browsers ignore the header as a response header and do not show it at all (at least Firefox). Technically I would expect squid to pass it, but it might have the potential for a CVE in some cases.

Eliezer

Eliezer Croitoru
Linux System Administrator
Mobile: +972-5-28704261
Email: elie...@ngtech.co.il

-----Original Message-----
From: L A Walsh [mailto:squid-u...@tlinx.org]
Sent: Wednesday, April 5, 2017 10:19 PM
To: Eliezer Croitoru
Cc: squid-users@lists.squid-cache.org
Subject: Re: What squid should do with RFC non-compliant response header?

Eliezer Croitoru wrote:
> Hi List,
>
> I noticed that there are broken services out there which use non-RFC-compliant
> response headers, such as the case of a space, for example:
> "Content Type: hola amigos"

Hmmm... April 1?... Seriously -- what would a user's browser do? That probably depends on the browser, but browsers are notoriously accepting, and most would likely ignore a problem like that and try to use defaults to decide on content and rendering. So if you want your proxy to not look like a stick-in-the-mud for standards, I'd just pass it on. If a proxy rejected every non-compliant web page, some significant percentage of the web would be unviewable.

___
squid-users mailing list
squid-users@lists.squid-cache.org
http://lists.squid-cache.org/listinfo/squid-users
Re: [squid-users] What squid should do with RFC non-compliant response header?
Eliezer Croitoru wrote:
> Hi List,
>
> I noticed that there are broken services out there which use non-RFC-compliant
> response headers, such as the case of a space, for example:
> "Content Type: hola amigos"

Hmmm... April 1?... Seriously -- what would a user's browser do? That probably depends on the browser, but browsers are notoriously accepting, and most would likely ignore a problem like that and try to use defaults to decide on content and rendering. So if you want your proxy to not look like a stick-in-the-mud for standards, I'd just pass it on. If a proxy rejected every non-compliant web page, some significant percentage of the web would be unviewable.
[squid-users] What squid should do with RFC non-compliant response header?
Hi List,

I noticed that there are broken services out there which use non-RFC-compliant response headers, such as the case of a space, for example:

"Content Type: hola amigos"

Compared to:

"Content-Type: Hola amigos"

Leaving aside whether the content type is valid and is indeed a MIME one, and looking only at the header name: should squid pass such a header or deny it? What is expected from squid? Should squid continue to handle the request, or report an error?

Thanks,
Eliezer

Eliezer Croitoru
Linux System Administrator
Mobile: +972-5-28704261
Email: elie...@ngtech.co.il
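For what it's worth, squid's tolerance for malformed headers is configurable via the relaxed_header_parser directive in squid.conf; the directive name comes from the squid.conf documentation, and the comments below are one reading of it, not a definitive statement of squid's behaviour for this exact header:

```
# squid.conf sketch -- header-parsing strictness
# "on"   (the default) forgives minor protocol violations such as stray
#        whitespace in headers;
# "off"  enforces strict RFC-compliant parsing and rejects violations;
# "warn" behaves like "on" but logs a warning for each violation seen.
relaxed_header_parser warn
```

Running with "warn" for a while is a low-risk way to find out how often such broken services actually appear in your traffic before deciding to enforce "off".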
Re: [squid-users] Squid Authentication if URL is on a Blacklist from SquidGuard
> If you have such a thing as AD and the ability to push Group Policy to the
> users there is no need to avoid authentication.

I have a running AD on Ubuntu 16.04 with samba4.

> Perhaps the client is actually asking to get away from lots of annoying
> popups the browsers are forcing on them? If that is happening it is a strong
> sign that the authentication system needs fixing. When it works there should
> be zero popups.

The client gets asked for his username/password every time he closes & opens the browser. While surfing, there are no pop-ups, so the client can surf undisturbed. At first, my client wanted to authenticate every time someone opens & closes the browser; now he wants to authenticate ONLY if someone calls up "a bad website".

> Er, credentials are valid for 2 hours, but the "users" are jumping around
> between IPs every second?
> NP: the authenticate_ip_* stuff is irrelevant unless a maxuserip type ACL is
> being used.

Thanks, the thing with "authenticate_ip_ttl 1 second" was my mistake.

> Funky. Have you checked that it is not simply the browser "Password Manager"
> feature requesting access to their machine or AD "Domain login" details?

Browsers with a "Password Manager" feature can save the password, but they only fill in the saved username and password, so you would have to press Enter in order to continue. If you don't use this feature, you will get asked every time you close & open the browser and have to enter it yourself.

> To use SG as requested you need to make an external_acl_type helper that
> receives the same things SG needs and passes them on to it, mapping the
> result back to an OK/ERR result for Squid ACL use.
> [ IIRC Eliezer has posted a helper that does that to the list. ]
> Then you can do something like:
>
>   external_acl_type sgMapper ...
>   acl testWithSg external sgMapper
>   http_access allow testWithSg
>   http_access deny !auth
>   ...
>
> Note that this does not involve the url_rewrite_* API. You can drop that
> entirely.
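The helper being suggested above is not shown in this thread, but a minimal external_acl_type helper of that general shape could look roughly like the sketch below. It reads one URL per line from Squid (a "%URI" format token) and answers OK/ERR per the standard external ACL helper protocol; the blacklist file path, the script name, and the domain-matching logic are all assumptions for illustration, not Eliezer's posted helper:

```python
#!/usr/bin/env python3
"""Sketch of an external_acl_type helper: reads one URL per line from
Squid (format "%URI") and replies "OK" if the host is on a domain
blacklist, "ERR" otherwise. The blacklist path is a placeholder."""
import sys
from urllib.parse import urlsplit

BLACKLIST_FILE = "/etc/squid/blacklist_domains.txt"  # assumed path

def load_blacklist(path):
    """One domain per line; '#' starts a comment; leading dots ignored."""
    try:
        with open(path) as f:
            return {line.strip().lstrip(".").lower()
                    for line in f
                    if line.strip() and not line.startswith("#")}
    except OSError:
        return set()

def is_blacklisted(url, blacklist):
    # CONNECT requests arrive as "host:port"; plain requests as full URLs.
    host = (urlsplit(url).hostname or url.split(":")[0]).lower()
    # Match the host itself or any parent domain on the list.
    parts = host.split(".")
    return any(".".join(parts[i:]) in blacklist for i in range(len(parts)))

def main():
    blacklist = load_blacklist(BLACKLIST_FILE)
    for line in sys.stdin:
        sys.stdout.write("OK\n" if is_blacklisted(line.strip(), blacklist)
                         else "ERR\n")
        sys.stdout.flush()  # Squid expects one unbuffered reply per request

if __name__ == "__main__":
    main()
```

It would be wired in along the lines of the sgMapper sketch above, e.g. `external_acl_type blMapper %URI /usr/local/bin/blacklist_helper.py` (the path is hypothetical).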
> Unless you want some traffic to still be redirected/rewritten by SG. In
> which case you need url_rewrite_access to define which traffic SG applies to.

I have to excuse myself, I'm still a beginner in the world of Squid; thanks for understanding. You are right, I don't need to redirect to block pages anymore. If the user authenticates because he called up a bad URL, he should be allowed to pass.

I don't understand that solution. Why do I need to make that external_acl_type helper? Isn't it the same as my external_acl_type?

  external_acl_type webusers %LOGIN /usr/lib/squid/ext_ldap_group_acl -b "dc=,dc=local" -D testuser@.local -W /etc/squid/squid.secrets -f "(&(sAMAccountName=%v)(memberOf=cn=%a,cn=Users,dc=,dc=local))" -h 172.30.0.36
  acl ldapgroup_webusers external webusers webusers
  http_access allow ldapgroup_webusers

My helpers are working well:

  @-testproxy01:~# /usr/lib/squid/basic_ldap_auth -R -b "dc=,dc=local" -D testuser@.local -W /etc/squid/squid.secrets -f sAMAccountName=%s -h 172.30.0.36
  testuser
  OK
  @-testproxy01:~# /usr/lib/squid/ext_ldap_group_acl -b "dc=,dc=local" -D testuser@.local -W /etc/squid/squid.secrets -f "(&(sAMAccountName=%v)(memberOf=cn=%a,cn=Users,dc=,dc=local))" -h 172.30.0.36
  testuser webusers
  OK

How can I match the requested URL against the blacklists without SquidGuard? I still need to match it against the blacklist, and then it has to be decided whether he needs to authenticate or not.

Thanks for answering!

--
View this message in context: http://squid-web-proxy-cache.1019090.n4.nabble.com/Squid-Authentication-if-URL-is-on-a-Blacklist-from-SquidGuard-tp4681950p4681995.html
Sent from the Squid - Users mailing list archive at Nabble.com.
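On the last question: for plain domain blacklists, Squid can do the matching natively with a dstdomain ACL read from a file, with no SquidGuard or external helper at all. A sketch, assuming the ldapgroup_webusers ACL from the configuration quoted above and a made-up blacklist file path:

```
# squid.conf sketch -- require authentication only for blacklisted sites.
# /etc/squid/blacklist_domains.txt holds one domain per line,
# e.g. ".example.com" (the leading dot also matches subdomains).
acl blacklist dstdomain "/etc/squid/blacklist_domains.txt"

# Requests NOT on the blacklist pass without authentication.
http_access allow !blacklist

# Blacklisted sites require a valid LDAP-group login.
http_access allow blacklist ldapgroup_webusers
http_access deny all
```

Because proxy_auth-based ACLs only trigger the 407 challenge when they are evaluated, ordering the rules this way means users are prompted for credentials only when they request a blacklisted site, which is the behaviour described above.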
[squid-users] https log message formatting help
Hi squid users,

Is there any way to change the request URL log format for HTTPS messages? I am using %ru to pull out the URL. When we get HTTPS connections, we see the URL logged as

  www.microsoft.com:443

Is there any way to reformat the log message to remove the appended port? Or, to go further, to rewrite it to use https://?

Thanks in advance
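No answer appears in this excerpt, but the host:port form typically comes from CONNECT tunnels, where the proxy only ever sees the authority (host:port), never the full https:// URL, so %ru has nothing else to log. A common workaround is to rewrite the field when post-processing the log rather than in Squid itself. A sketch, assuming the default native access.log layout (URL as the 7th whitespace-separated field) and treating port 443 as https, both of which are assumptions:

```python
#!/usr/bin/env python3
"""Sketch: rewrite 'host:443' request-URL fields from a Squid access.log
into 'https://host/'. Assumes the default native log format, where the
URL is the 7th whitespace-separated field."""
import re
import sys

CONNECT_443 = re.compile(r"^([A-Za-z0-9.-]+):443$")

def normalize_url(field: str) -> str:
    """Turn 'host:443' into 'https://host/'; leave other URLs untouched."""
    m = CONNECT_443.match(field)
    return f"https://{m.group(1)}/" if m else field

def rewrite_line(line: str) -> str:
    # Note: this collapses the column-aligned whitespace of the native format.
    fields = line.split()
    if len(fields) >= 7:
        fields[6] = normalize_url(fields[6])  # URL field in the default layout
    return " ".join(fields)

if __name__ == "__main__":
    for line in sys.stdin:
        print(rewrite_line(line.rstrip("\n")))
```

Usage would be something like `python3 rewrite_log.py < access.log > access.rewritten.log`. Mapping 443 to https:// is only a heuristic; a tunnel on port 443 is not guaranteed to carry HTTPS.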