Re: [squid-users] a lot of TCP_SWAPFAIL_MISS/200
Hello Dan, on 10.08.15 you wrote regarding "a lot of TCP_SWAPFAIL_MISS/200":

> Native English speaker here -- though I'm not HackXBack, I just saw that he hasn't replied to you yet. [...] I don't know why people don't give the bug higher priority, as it is significantly reducing the hit ratio. HTH :)

Thank you so much! That longer text I can read and understand!

Best regards! Helmut

___
squid-users mailing list
squid-users@lists.squid-cache.org
http://lists.squid-cache.org/listinfo/squid-users
Re: [squid-users] a lot of TCP_SWAPFAIL_MISS/200
Hello HackXBack, on 07.08.15 you wrote:

> yea joe i dont know why ppl dnt give this bug importance while it deduce alot of hit ratio

Could you please translate this kind of pidgin English into ordinary written English, to help all the foreign readers who have only learned standard written English? Thank you! Please excuse my Gerlish.

Best regards! Helmut
Re: [squid-users] 3.5.6 compile error in Helper ServerBase
Hello Amos, on 15.07.15 you wrote:

>> I've just tried to compile squid 3.5.6 using the script from slackbuilds.org: http://slackbuilds.org/slackbuilds/14.1/network/squid.tar.gz
>> With squid 3.4.x (and older versions) this script worked well. With 3.5.6 compiling stops with:
>>
>>   In file included from Reply.cc:14:
>>   ../../src/helper.h:134: error: base `HelperServerBase' with only non-default constructor in class without a constructor
>>   ../../src/helper.h:147: error: base `HelperServerBase' with only non-default constructor in class without a constructor
>>   make[3]: *** [Reply.lo] Error 1
>> [...]
>
> What compiler version?

Thanks for that hint! Switching from one of my machines with gcc-3.4.6 (kept for older installations) to another machine with gcc-4.9.2 solved the problem. Compiling worked, and the new squid version seems to work as expected. Nice!

Best regards! Helmut
Re: [squid-users] squidGuard configuration test - echo test
Hello Marcus, on 08.06.15 you wrote:

> You can download ufdbGuard here: https://www.urlfilterdb.com/downloads/software_doc.html and here: http://sourceforge.net/projects/ufdbguard/
> ufdbGuard is, just like Squid, free Open Source Software. The trial license on www.urlfilterdb.com is only for the URL database.

Thank you - sounds good!

Best regards! Helmut
Re: [squid-users] squidGuard configuration test - echo test
Hello Amos, on 08.06.15 you wrote:

>> The URL redirector interface was changed with Squid 3.4, see also http://wiki.squid-cache.org/Features/Redirectors
>> The latest version of squidGuard is 1.5 beta from 2010, and squidGuard does not support the new interface of Squid. [...]
>
> That wiki page Marcus referred you to documents it in detail.

Sorry - I'd still like some help. Under squid 3.4 (and many earlier versions) I use

  url_rewrite_program /usr/bin/squidGuard

How must I change this line for squid 3.5? The above page mentions http://www.eu.squid-cache.org/Doc/config/url_rewrite_extras but this page doesn't exist yet. Same problem with http://www.eu.squid-cache.org/Doc/config/url_rewrite_program

Best regards! Helmut
Re: [squid-users] squidGuard configuration test - echo test
Hello Amos, on 08.06.15 you wrote:

>>> Under squid 3.4 (and many earlier versions) I use url_rewrite_program /usr/bin/squidGuard. How must I change this line for squid 3.5?
>
> You should not have to change the SG command line or configuration.

Ok!

> What's needed is a patch from http://bugs.squid-cache.org/show_bug.cgi?id=3978 to be applied to SG itself. If you are using an OS-provided SG binary, check whether they have already patched it.

It's not patched in my version, but it works under squid 3.4.10 - strange.

>>> The above page mentions http://www.eu.squid-cache.org/Doc/config/url_rewrite_extras but this page doesn't exist yet. [...]
>
> That should be: http://www.squid-cache.org/Doc/config/url_rewrite_extras/ and http://www.squid-cache.org/Doc/config/url_rewrite_program/

Ok - now I can read the pages!

Best regards! Helmut
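For readers hitting the same question: a minimal squid.conf fragment for the situation discussed above might look like this. The helper path is the one from the thread; the url_rewrite_extras line only restates Squid's documented default format, so in practice it can be omitted entirely:

```
# Helper line stays unchanged; works with Squid 3.5 once squidGuard
# carries the bug-3978 patch mentioned above.
url_rewrite_program /usr/bin/squidGuard

# Default extras format, shown here only for illustration:
url_rewrite_extras "%>a/%>A %un %>rm myip=%la myport=%lp"
```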
Re: [squid-users] squidGuard configuration test - echo test
Hello Marcus, on 07.06.15 you wrote:

> The URL redirector interface was changed with Squid 3.4, see also http://wiki.squid-cache.org/Features/Redirectors
> The latest version of squidGuard is 1.5 beta from 2010, and squidGuard does not support the new interface of Squid. ufdbGuard is also a URL redirector, and since it has regular updates, ufdbGuard is compatible with the new URL redirector interface of Squid. ufdbGuard is free software with a GPL2 license.

But then what is the meaning of https://www.urlfilterdb.com/downloads/trialregistration.html and https://www.urlfilterdb.com/pricing/licenses.html ? I don't find it stated there that using ufdbGuard without having paid a licence fee is legal. And the users of the c't/ODS-Schulserver (http://de.wikipedia.org/wiki/Arktur-Schulserver), which I now maintain, may ask this question too ...

Best regards! Helmut
Re: [squid-users] squidGuard configuration test - echo test
Hello Marcus, on 07.06.15 you wrote:

>> Hi, I have installed squidGuard 1.5 on Debian Jessie and I need a user-based filter. I made the src/dest/acl settings and then tested with:
>>
>>   echo "http://www.testsite.com 192.168.0.82/ someuserfromauth GET" | squidGuard -d
>
> The URL redirector interface was changed with Squid 3.4, see also http://wiki.squid-cache.org/Features/Redirectors
> The latest version of squidGuard is 1.5 beta from 2010, and squidGuard does not support the new interface of Squid.

Are you sure? I run squid-3.4.10 and squidGuard-1.5beta on many machines, without having changed the redirector line in /etc/squid/squid.conf.

Best regards! Helmut
Re: [squid-users] squid3 doesn't work well
Hello Dennis, on 27.04.15 you wrote:

> Our squid3 server refuses to work smoothly. I am not sure if I can post and get help here.

Some details would be helpful.

Best regards! Helmut
Re: [squid-users] WARNING: there are more than 100 regular expressions
Hello navari.lore...@gmail.com, on 27.11.14 you wrote:

> I have these warnings from squid -k parse:
>
>   2014/11/27 09:36:22| Processing: acl direct_urls dstdom_regex /etc/squid/direct_urls.txt
>   2014/11/27 09:36:22| /etc/squid/squid.conf line 86: acl direct_urls dstdom_regex /etc/squid/direct_urls.txt
>   2014/11/27 09:36:22| WARNING: there are more than 100 regular expressions. Consider using less REs or use rules without expressions like 'dstdomain'.
> [...]
> What can I do?

Consider using fewer REs ...

Best regards! Helmut
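As the warning itself suggests, most plain host names in such a list do not need a regex at all. A minimal sketch (the file names are made-up examples, not from the thread): keep only genuine patterns in the dstdom_regex file and move bare host names into a dstdomain ACL, which Squid matches far more efficiently:

```
# Plain host names, one per line; a leading dot also matches subdomains:
acl direct_domains dstdomain "/etc/squid/direct_domains.txt"

# Only the few entries that really need pattern matching stay as regex:
acl direct_urls dstdom_regex "/etc/squid/direct_urls.txt"
```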
Re: [squid-users] WARNING: there are more than 100 regular expressions
Hello navari.lore...@gmail.com, on 27.11.14 you wrote:

>> Consider using fewer REs ...
>
> That is not possible.

Then try something like squidGuard, which copes with lots of user-defined domains and URLs.

Best regards! Helmut
Re: [squid-users] Nudity Images Filter for Squid
Hello Squid, on 23.08.14 you wrote:

> Sure, we may need a real-time image filter for advanced image filtering.

But that goal is far outside squid's scope. Squid checks URLs (headers); you want to check content.

Best regards! Helmut
Re: [squid-users] Squid and unsupported request protocols
Hello m.shahverdi, on 15.01.14 you wrote:

> I want to pass all traffic through squid, not only traffic received on port 80, and handle it in some ways.

Sorry - I can't see any benefit in that desired configuration.

Best regards! Helmut
Re: [squid-users] Squid and unsupported request protocols
Hello Amos, on 14.01.14 you wrote:

> On 2014-01-14 02:33, Jan Wiegmann wrote:
>> fuck you squid cache.. [...]
> Pardon? This is a subscription-only mailing list, no profiles.

Meanwhile he has received some instructions about appropriate behaviour on mailing lists ...

Best regards! Helmut
Re: [squid-users] Maybe does Microsoft Windows Update Server accelerate Update better than Squid?
Hello Dirk, on 19.12.13 you wrote:

> One more question regarding downloads (updates and patches) for the operating systems Windows XP/Vista/7/8: would it be much faster to use WSUS to distribute updates within a LAN than to use a Squid proxy, which maybe is not allowed to cache the downloads from the Microsoft website? I was wondering if anyone can give me a hint.

Just take a look at wsusoffline: wsusoffline.net

Best regards! Helmut
Re: [squid-users] ip hiding in squid-3.3.8
Hello John, on 12.12.13 you wrote:

>> Hi, I am using Squid 3.3.8. I want to prevent the Squid server from changing the IP addresses of clients. How can I do it? How do I disable IP replacing in Squid?
>
> Squid is the one talking to the web servers... so the web servers see squid's IP.

No - they see the IP address of the host which runs (among other services) the squid service/program.

> Maybe you want to check the HTTP headers HTTP_CLIENT_IP or HTTP_X_FORWARDED_FOR...

That's the Apache part (or whatever web server may run on that host). And the web server only reads this address; it cannot change it.

Best regards! Helmut
Re: [squid-users] logrotate with SMP problem !!!
Hello Dr.x, on 17.11.13 you wrote:

> I've configured squid to have its logs rotated by a script in /etc/logrotate.d/squid, but log rotation is not working fine!

Sorry - logrotate works fine, but it works in a different way than you want it to.

> As we see, I have access.log.1, access.log.2, access.log.3 ... access.log.6, and I don't want the numbered files here.

Then you need a different program than logrotate.

Best regards! Helmut
Re: [squid-users] get information about download/upload
Hello ana, on 22.10.13 you wrote:

> Thanks, guys, for your replies. I authenticate my users by digest auth and want to limit them by checking their total bandwidth use. Is it possible to use the output of one of the log analysis tools?

As I recommended some hours ago: squish.

Best regards! Helmut
Re: [squid-users] get information about download/upload
Hello ana, on 21.10.13 you wrote:

> I have squid 3.3.9. I want to know if it can give me any statistics about the amount of downloading or uploading for each user over a period of time (1 day, week, etc.).

As Amos said: per machine it is possible. I use squish for such a job.

Best regards! Helmut
Re: [squid-users] Re: Any Way To Check If Windows Updates Are Cached?
Hello HillTopsGM, on 07.09.13 you wrote:

> I have been playing with wsusoffline, learning more about it. What I have discovered is that it will only bring a system up to a 'patched' level - critical security updates only.

Please take a look at http://www.wsusoffline.net/ - it's a better place for discussing wsusoffline.

Best regards! Helmut
Re: [squid-users] Re: Any Way To Check If Windows Updates Are Cached?
Hello Amos, on 06.09.13 you wrote:

>>> No - the thanks go to Torsten Wittrock; he made wsusoffline. [...]
>> Would anyone care to update the WindowsUpdate wiki page to write up a nice configuration example for using that tool with Squid?
> Hmmm - is there any need?

> One of the mailing list FAQs is whether there is any better way than the current wiki WindowsUpdate page method to store Windows updates. People mumble about WSUS, but nothing has been written yet to say how to integrate it - so it keeps popping up, either as a direct question or as something like the project you have just spent time on.

Hmmm - I'll take a look at the squid FAQs. But I'm sure my Gerlish is not a good language for writing an understandable answer ...

Best regards! Helmut
Re: [squid-users] Re: Any Way To Check If Windows Updates Are Cached?
Hello Amos, on 06.09.13 you wrote:

>> No - the thanks go to Torsten Wittrock; he made wsusoffline.
> One of the mailing list FAQs is whether there is any better way than the current wiki WindowsUpdate page method to store Windows updates.

Take a look at http://helmut.hullen.de/Rechnerraum/wupdate.html - it's in German, though; translating is recommended.

Best regards! Helmut
Re: [squid-users] Re: Any Way To Check If Windows Updates Are Cached?
Hello HillTopsGM, on 04.09.13 you wrote:

> Helmut, you are the *BEST!!!

No - the thanks go to Torsten Wittrock; he made wsusoffline.

> Thank you, thank you, thank you. I am completely amazed at how well *WSUS Offline Update* works.

Best regards! Helmut
Re: [squid-users] Re: Any Way To Check If Windows Updates Are Cached?
Hello Amos, on 05.09.13 you wrote:

>> [...] No - the thanks go to Torsten Wittrock; he made wsusoffline. [...]
> Lol. Would anyone care to update the WindowsUpdate wiki page to write up a nice configuration example for using that tool with Squid?

Hmmm - is there any need? I've just taken a look into one of the wsusoffline directories: 52 patch files, sizes from 1 MByte to 45 MByte, about 750 MByte in total. That's no job for squid.

Best regards! Helmut
Re: [squid-users] Any Way To Check If Windows Updates Are Cached?
Hello HillTopsGM, on 03.09.13 you wrote:

> I have basically done everything the FAQ says with regard to optimizing the settings for Windows updates. I have done a few, and I am wondering if there is any way to check what has been cached?

You should really use something like wsusoffline - it does a better job for this purpose than fiddling around with squid.

Best regards! Helmut
Re: [squid-users] Reiser vs etx4
Hello Alfredo, on 04.09.13 you wrote:

> The page recommends reiserfs. So I tried to set up a squid for a 200 Mbps load using reiserfs and squid 3.3.

For the squid cache? Strange. That's a cache, not an archive; journalling shouldn't be necessary.

Best regards! Helmut
Re: [squid-users] Re: Any Way To Check If Windows Updates Are Cached?
Hello HillTopsGM, on 04.09.13 you wrote [wsusoffline]:

> Thanks for this tip - I know you have mentioned it before, but what I am trying to avoid, on an ongoing basis (every week when there is an update), is having all 12 computers download the same file. This is a great tool if you are always doing fresh installs - that's fine - but it doesn't help me day to day.

But surely it helps! It is a database for all desired Windows versions, completed/refreshed whenever you want.

> The other thing is that it doesn't look like the tool is updated as frequently as Windows updates come out - in other words, it doesn't appear to update itself incrementally.

Updating is a cron job. Only files that don't yet exist are downloaded during such a job, and they stay in the directory as long as Windows looks for them - unlike files in the squid cache.

> Am I correct in assuming this?

No. Just give it a try for a few weeks. Microsoft has a fixed patch day.

> This is why I'd really prefer to have the proxy work properly - just set it up and forget it.

That's the dream. The squid cache deletes old files, because it's a cache. By the way: my wsusoffline directory currently contains the updates for Windows XP (32 bit) and Windows 7 (32 bit and 64 bit); that's nearly 7 GByte. I wouldn't fill a cache with that many nearly static files.

Best regards! Helmut
Re: [squid-users] Re: Any Way To Check If Windows Updates Are Cached?
Hello HillTopsGM, on 04.09.13 you wrote:

>> It is completed/refreshed whenever you want.
> I was looking into it more, so can you confirm this for me: if I run the UpdateGenerator.exe file, will that ONLY add the files I don't have right at the moment?

Certainly. wget checks whether each file already exists.

> If that is so, then any time Windows notifies me that there are updates, I'd simply have to run this UpdateGenerator.exe file to get the new ones, and then go to all the other machines and run the UpdateInstaller.exe file. Is that right?

That's right.

> Helmut Hullen wrote:
>> Updating is a cron job. Only files that don't yet exist are downloaded during such a job, and they stay in the directory as long as Windows looks for them - unlike files in the squid cache.
> ... when you say it is a cron job, are you saying that it is part of the *wsusoffline* program itself?

No - writing a cron job is the administrator's job. But it's a very simple one.

> These updates that it collects come directly from Microsoft?

Yes.

> If this is the case, this would be tremendously helpful! Oh, and what happens when http://www.wsusoffline.net/ comes up with a new 'version' - do you have to start all over again?

That depends! Windows: you are told that there is a newer version. Linux: the administrator has to watch the wsusoffline website. Installing the program: under Linux just copy it into your desired directory; it overwrites only the wsusoffline binaries/scripts. But that's a wsusoffline problem (if it is a problem at all), not a squid problem.

Best regards! Helmut
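Such a cron job could look roughly like this under Linux. Everything here is an assumption for illustration: the installation path, the schedule, and the product/language codes; check the usage message of wsusoffline's Linux download script for the codes valid in your version:

```
# /etc/cron.d/wsusoffline (sketch; adjust path, product and language codes)
# Fetch new updates every Wednesday morning, the day after
# Microsoft's patch Tuesday:
30 5 * * 3  root  cd /srv/wsusoffline/sh && ./DownloadUpdates.sh w61-x64 enu
```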
Re: [squid-users] Re: cache_dir size v.s. available RAM
Hello HillTopsGM, on 22.08.13 you wrote:

> *Question 2:* Seeing how this is significantly larger than the default 100 MB, should I consider increasing the "16 256" in the above sample line?

That depends! I know many squid installations in schools, with about 200 ... 500 clients and about 500 ... 2000 pupils using the internet (only for educational purposes - surely!). There I have reduced this part of the entry to "8 256", and "4 256" may be enough in (nearly) all cases too. The disk cache is set to about 1 ... 2 GByte. Please remember: that's a cache, not an archive.

Best regards! Helmut
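Put together, a cache_dir line along the lines discussed above might read as follows (the directory path is an assumption; the three numbers are the cache size in MB, then the first-level and second-level subdirectory counts):

```
# 2 GB ufs cache with 8 first-level and 256 second-level directories:
cache_dir ufs /var/cache/squid 2000 8 256
```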
Re: [squid-users] cache_dir size v.s. available RAM
Hello HillTopsGM, on 21.08.13 you wrote:

> *MY MAIN GOAL: Cache all Windows Updates*

What about wsusoffline? http://www.wsusoffline.net/ - documentation at http://www.wsusoffline.net/docs/ - that may be a better way than using the proxy cache. There is a Windows and a Linux version.

Best regards! Helmut
Re: [squid-users] Re: Forcing Windows Automatic Updates to us Proxy - Question about FAQ
Hello HillTopsGM,

if I may repeat back to you in my own words (and ask a question) to make sure that I understand what is being said:

> *Question 1:* I do not actually have to run these command lines UNLESS it appears that Windows is not using the proxy - is that correct?

Something like that - you can (perhaps) define the proxy configuration in every application. But sometimes it's easier for you when the application finds the configuration in HKLM.

> *Question 2:* I just ran proxycfg at the CMD prompt on a Windows 7 machine and the response I got was:
>
>   'proxycfg' is not recognized as an internal or external command, operable program or batch file.
>
> Would I be correct in assuming that Windows 7 machines are not able to take advantage of those commands, or did I do something wrong?

Maybe - I've just tried this command under W2k, and W2k didn't recognize it either. Searching for proxycfg led to http://support.microsoft.com/kb/900935/de and the information (translated from German): "For Windows Vista and newer Windows versions, the tool Netsh.exe is available instead of proxycfg.exe." Additionally: http://msdn.microsoft.com/en-us/library/windows/desktop/aa384069%28v=vs.85%29.aspx - but that's all Windows, not squid ...

Best regards! Helmut
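For the record, the netsh equivalents of the proxycfg calls discussed in this thread would be roughly as follows (a sketch only; run them in an elevated prompt, and check `netsh winhttp /?` for the exact syntax on your Windows version - the proxy host wu-proxy.lan:8080 is just the thread's example):

```
rem Show the current WinHTTP proxy settings:
netsh winhttp show proxy

rem Reset to a direct connection (like proxycfg -d):
netsh winhttp reset proxy

rem Set a fixed proxy (like proxycfg -p):
netsh winhttp set proxy proxy-server="wu-proxy.lan:8080"

rem Copy the Internet Explorer settings (like proxycfg -u):
netsh winhttp import proxy source=ie
```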
Re: [squid-users] Forcing Windows Automatic Updates to us Proxy - Question about FAQ
Hello HillTopsGM, on 20.08.13 you wrote:

> At the end of this FAQ page, http://wiki.squid-cache.org/SquidFaq/WindowsUpdate , they give some Windows command prompt commands. Do these apply to all Windows versions, like Vista, Windows 7 and now 8 as well? Or do you run them only if you have an issue? Thanks for the help.
>
> (Start of the Windows commands)
>   C:\> proxycfg                        # gives information about the current connection type.
>                                        # Note: 'Direct Connection' does not force WU to bypass the proxy
>   C:\> proxycfg -d                     # set direct connection
>   C:\> proxycfg -p wu-proxy.lan:8080   # set the proxy to use with Windows Update to wu-proxy.lan, port 8080
>   C:\> proxycfg -u                     # set the proxy to the Internet Explorer settings
> (End of the Windows commands)

If I have understood the system (I'm not sure): a new system always reports "Direct connection", proxycfg -u copies the HKCU entry to HKLM, and from then on proxycfg reports, for all users (system-wide), the entries you had defined for one particular user.

Best regards! Helmut
Re: [squid-users] kerberos keytab
Hello Carlos, on 19.08.13 you wrote:

> What is the best strategy for using a keytab file on multiple servers? For now I'm using an NFS share to export the keytab. Every day msktutil runs to update the file if necessary; the job is scheduled on one server only. Also, after the update of the keytab file, is it necessary to reload squid?

I'd prefer incron for watching the keytab. Rule (pseudo code): if the original keytab is changed, copy it to the necessary places and run squid -k reconfigure.

Best regards! Helmut
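That pseudo code could be spelled out as an incrontab entry plus a small helper script. Everything here is an assumption for illustration: the keytab path, the host names, and the helper script itself (incrontab lines have the form "path mask command"):

```
# incrontab entry: react whenever the keytab is rewritten
/etc/krb5/squid.keytab  IN_CLOSE_WRITE  /usr/local/sbin/push-keytab.sh
```

```
#!/bin/sh
# /usr/local/sbin/push-keytab.sh (hypothetical helper):
# copy the fresh keytab to each proxy and tell squid to reconfigure.
for host in proxy1 proxy2; do
    scp /etc/krb5/squid.keytab "$host":/etc/squid/squid.keytab
    ssh "$host" squid -k reconfigure
done
```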
Re: [squid-users] Best OS
Hello Amos, on 16.06.13 you wrote:

>> Which OS is better for squid? [...]
> This has been hashed over so many times... http://wiki.squid-cache.org/BestOsForSquid

There's also a link to some Slackware packages: http://wiki.squid-cache.org/KnowledgeBase/Slackware
The link to the Slackware package for Squid-3.2 should be replaced by a link simply to http://helmut.hullen.de/filebox/Linux/slackware/n/ - there you can now find a package for Squid 3.3.5.

Best regards! Helmut
Re: [squid-users] Best OS
Hello Bilal, on 15.06.13 you wrote:

> Which OS is better for squid: Debian 7 or Ubuntu 10.04?

Sorry - if you need something like squid (for many users, on a server), you need an OS which is designed for running a server. For a server I prefer Slackware, and on such a server I run squid-3.3.5, which seems to run without any problems.

Best regards! Helmut
Re: [squid-users] Squid Hardware requirements.
Hello Ricardo, on 14.06.13 you wrote:

> I think you could use a good disk controller (with 1 GB+ of cache) and make: one RAID10 for the OS with 4 disks, and two RAID10 arrays for 2 cache_dir storages for squid with 4 disks each (or even 2 RAID5 arrays with 3 disks each).

Sorry - RAID10 decreases the performance, and so does RAID5. A cache doesn't need the redundancy options of RAID. A simple JBOD configuration (or one fast disk, especially an SSD) should do the job. By the way: the squid cache is a short-term buffer, not an archive.

Best regards! Helmut
Re: [squid-users] squidguard not redirecting
Hello Amos, on 18.05.13 you wrote:

>> I have enabled squidGuard within a huge network. [...]
> What are you using squidGuard for anyway?

There are two different options/decisions:

a) using redirect/rewrite (as squidGuard and ufdbGuard do), or using the squid options acl and http_access (as squidblacklist does);

b) using a long-maintained blacklist (e.g. the Shallalist or squidguard.mesd.k12.or.us/blacklists.tgz) or a newer one (as squidblacklist does), and/or using self-made lists, and/or using lists from some other places.

Using blacklists is (especially in schools) a job with many legal implications; people who use them should at least have a good feeling about them. And using something like squidGuard gives such a good feeling - even when such a program may be technically ugly. But the teacher who uses it as a helper has to explain this helper to many parents, and sometimes he/she has to explain it to a court of justice (but never to programmers etc.). Yes - I know how to circumvent such filters as squidGuard.

Best regards! Helmut
Re: [squid-users] squidguard not redirecting
Hello Amos, on 18.05.13 you wrote:

>>> SG has numerous problems which caused it not to do what it's supposed to, including that emergency mode thing. Here are some things to consider:
>>> 1) a BIG blacklist is overhyped - when I had a good look at our requirements, there was only a small percentage of those websites we actually wanted to block; the rest were either squatting websites, non-existent, or not relevant. Squid could blacklist (e.g. ACL DENY) those websites natively with a minimum of fuss.
>> Maybe - it does a good job even with these unnecessary entries.
> If the list is that badly out of date, it will also be *missing* a great deal of entries.

Yes - maybe. But updating the list is a really simple job.

>>> 2) SG has not been updated for 4 or 5 years; if that's your latest version, you are still out of date.
>> I can't see a big need for updating. Software really doesn't need changes (updates) every month or so.
> For regular software, yes. But security software which has set itself up as enumerating badness/goodness as a control method needs constant updates.

Maybe - but squidGuard does a really simple job: it looks into a list of disallowed domains and URLs and then decides whether to allow or to deny. That job doesn't need constant updates.

> More to the point, you will not find much help now, or anyone to fix it even if you could prove it's a bug.

That depends! I know many colleagues who have used squidGuard for years; the program doesn't need much help.

> During which time a lot of things have progressed. Squid has gained a lot of ACL types, better regex handling, better memory management, and an external ACL helpers interface (which most installations of SG should really be using). Which brings me back to my question of what SG was being used for. If it is something which current Squid is capable of doing without SG, then you may gain better traffic performance simply by removing SG from the software chain. As csn233 found, it may be worth it.

The squidGuard job is working with a really big blacklist, and with some specialized ACLs. I know squid can do this job too - and I maintain a school server which uses many of these possibilities of squid. But then somebody else has to maintain the blacklist; that's no job for the administrator in the school. Better traffic performance may be a criterion, but blocking (e.g.) porn URLs is a criterion in schools too. Teachers have to look at the legal protection of children and young persons as well. Please excuse my Gerlish.

Best regards! Helmut
Re: [squid-users] squidguard not redirecting
Hello Amos, on 18.05.13 you wrote:

>> [...] The squidGuard job is working with a really big blacklist, and with some specialized ACLs.
> Which, apart from the list files, is all based on information sent to it by Squid.

>> I know squid can do this job too - and I maintain a school server which uses many of these possibilities of squid. But then somebody else has to maintain the blacklist; that's no job for the administrator in the school.
> You are the first to mention that change of job. The proposal was to: make Squid load the blacklist; remove SG from the software chain; watch response time improve. Nowhere in that sequence does it require any change of who is creating the list.

But that's one of the major problems for a user of any blacklist: who maintains it. That's not a squid job, of course.

> At most the administrator may need to run a tool to convert from some strange format to one Squid can load. (FWIW: both squidblacklists.org and Shalla provide lists which have already been converted to Squid-compatible formats.)

Hmmm - sounds interesting. [...]

> Note that we have not even got near discussing the content of those regex lists. I've seen many SquidGuard installations where the rationale for holding onto SG was that squid can't handle this many regexes.

And at least for a purpose such as a school server, that's a valid objection ... A teacher has to teach pupils, not build regular expressions for a machine.

> Listing 5 million domain names in a file, with some 1% having a /something path tacked on the end, does not make it a regex list. Split the file into domain entries and domain+path entries: suddenly you have a small file of url_regex, a small file of dstdom_regex and a long list of dstdomain ... which Squid can handle.

Yes - I know. But that sounds more like a theory than a downloadable solution. And again: who maintains this solution?

Best regards! Helmut
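The split Amos describes can be sketched in a few lines of shell. This is only a sketch: blacklist.txt and the output file names are made-up examples, and a real list needs more careful regex escaping than a single sed pass:

```shell
# Tiny made-up input: one host or host/path entry per line.
printf 'ads.example.com\nexample.org/banners/\n' > blacklist.txt

# Entries with a path component become url_regex patterns (dots escaped) ...
grep '/' blacklist.txt | sed 's/[.]/\\./g' > blocked.url_regex

# ... bare host names become dstdomain entries (a leading dot matches subdomains).
grep -v '/' blacklist.txt | sed 's/^/./' > blocked.dstdomain

cat blocked.dstdomain
```

In squid.conf the resulting files could then be loaded with, e.g., `acl blocked dstdomain "/etc/squid/blocked.dstdomain"` and `acl blockedurl url_regex "/etc/squid/blocked.url_regex"`, each followed by an http_access deny line.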
Re: [squid-users] squidguard not redirecting
Hello csn233, on 18.05.13 you wrote:

> SG has numerous problems which caused it not to do what it's supposed to, including that emergency mode thing. Here are some things to consider:
> 1) a BIG blacklist is overhyped - when I had a good look at our requirements, there was only a small percentage of those websites we actually wanted to block; the rest were either squatting websites, non-existent, or not relevant. Squid could blacklist (e.g. ACL DENY) those websites natively with a minimum of fuss.

Maybe - it does a good job even with these unnecessary entries.

> 2) SG has not been updated for 4 or 5 years; if that's your latest version, you are still out of date.

I can't see a big need for updating. Software really doesn't need changes (updates) every month or so.

> More to the point, you will not find much help now, or anyone to fix it even if you could prove it's a bug.

That depends! I know many colleagues who have used squidGuard for years; the program doesn't need much help.

Best regards! Helmut
Re: [squid-users] squidguard not redirecting
Hello csn233, on 18.05.13 you wrote:

>> I can't see a big need for updating. Software really doesn't need changes (updates) every month or so.
> No, it doesn't - until you have a problem.

>> That depends! I know many colleagues who have used squidGuard for years; the program doesn't need much help.
> Great! Why are you posting here then?

Because I can.

Best regards! Helmut
Re: [squid-users] squidguard not redirecting
Hello csn233, on 18.05.13 you wrote:

>> Because I can.
> Sorry, a more relevant question would be - do you have an answer for the original poster?

No. I don't run a huge network with the problem he has described. I only run networks with about 50 ... 200 clients, and there I have never had this problem.

Best regards! Helmut
Re: [squid-users] Content Encoding Error
Hello Cacook, on 10.05.13 you wrote:

>> If you like, we'd probably get that sorted. I'm thinking it's a permissions issue in the logs directory, or overflowing logs due to log rotation errors (ALL,3 can output a lot of data and get into a bit of trouble getting past 2 or 4 GB).
> OK. I've always gone to /var/log/squid, which is empty, but I see there is now a squid3 directory. The logs are there, although they don't seem to be getting rotated. Maybe a Debian error.

Rotating is mostly a job for logrotate, and most of its config files live in /etc/logrotate.d. What does squid -v report for sysconfdir (where squid.conf is found) and for --with-logdir? And what does "grep log <sysconfdir>/squid.conf" show for the logging directives?

Best regards! Helmut
Re: [squid-users] Content Encoding Error
Hello Cacook, on 10.05.13 you wrote:

>> What does "grep log <sysconfdir>/squid.conf" show for the logging directives?
> The only uncommented line is: logfile_rotate 2

And nothing like "access_log stdio:/var/log/squid3/access.log"? Strange.

> Hm, it appears that squid has built-in log rotation? Wouldn't the system logrotate interfere?

That depends! In my configurations logrotate runs earlier than the squid routine.

Best regards! Helmut
Re: [squid-users] Re: big log filescache.log cause squid to get down !!
Hello Ahmad, on 06.04.13 you wrote:

> But if I want to limit the size of the log files - assume I want a max size of 2 GB each for access.log, cache.log and store.log, and if a file exceeds that size I want it replaced in place - how can I do it? Is logfile_rotate alone sufficient?

What about logrotate? Many distributions use this program for the job; it may use a file like /etc/logrotate.d/squid.

Best regards! Helmut
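A /etc/logrotate.d/squid along those lines might look like this. It is only a sketch: the log path and the squid binary location vary between distributions, and logfile_rotate should then be set to 0 in squid.conf so squid's built-in rotation does not interfere with logrotate:

```
/var/log/squid/*.log {
    size 2G
    rotate 1
    missingok
    notifempty
    postrotate
        # With logfile_rotate 0, this just makes squid reopen its logs:
        /usr/sbin/squid -k rotate
    endscript
}
```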
Re: [squid-users] Upgrading SQUID from 3.1.6 to 3.1.23 - not working-
Hallo, Vernet, Du meintest am 02.04.13: So I tried to replace the Debian SQUID3 3.1.6 binary with my build of SQUID 3.1.23 and had *no luck*. The current version is 3.3.3 ... Viele Gruesse! Helmut
Re: [squid-users] problems with a site
Hallo, Luigi, Du meintest am 08.03.13: I have some problems with the following site: http://www.wasteservmalta.com/ the .NET framework on the site gives me an exception. Without squid I have no problems. Here: no problem. Squid 3.2.3 Viele Gruesse! Helmut
Re: [squid-users] Squid 3.3.2 is available
Hallo, Amos, Du meintest am 08.03.13: The Squid HTTP Proxy team is very pleased to announce the availability of the Squid-3.3.2 release! Compiling it on one of my machines stopped with depbase=`echo peer_proxy_negotiate_auth.o | sed 's|[^/]*$|.deps/|;s|\.o$||'`;\ g++ -DHAVE_CONFIG_H -DDEFAULT_CONFIG_FILE=\/etc/squid/squid.conf\ -DDEFAULT_SQUID_DATA_DIR=\/usr/share/squid\ -DDEFAULT_SQUID_CONFIG_DIR=\/etc/squid\ -I.. -I../include -I../lib -I../src -I../include -I/usr/heimdal/include [...] deprecated (declared at /usr/heimdal/include/krb5-protos.h:2284) [-Werror=deprecated-declarations] cc1plus: all warnings being treated as errors make[3]: *** [peer_proxy_negotiate_auth.o] Error 1 make[3]: Leaving directory `/tmp/SBo/squid-3.3.2/src' It may be related in some strange way to the kerberos installation; on another machine compiling worked, but when running it stopped with a kerberos-related error message. (My kerberos installations may be corrupt, but that should be another problem.) It would be best to sort that krb5 stuff out first IMO. The above warnings are about internal errors in the krb5 installation. Next try: compiling on a machine without any kerberos installation (neither MIT nor Heimdal). It stopped with libtool: link: cannot find the library `/usr/lib/libcom_err.la' or unhandled argument `/usr/lib/libcom_err.la' That file is at least part of the Heimdal package. Once you have krb5 sorted out, if it is still halting the Squid build you can use --disable-strict-error-checking to get Squid to build. The usual lack of guarantees about both operation and future build success applies if you use it regularly, though. Isn't there a switch like --without kerberos or so? Viele Gruesse! Helmut
Re: [squid-users] Squid 3.3.2 is available
Hallo, Amos, Du meintest am 08.03.13: The Squid HTTP Proxy team is very pleased to announce the availability of the Squid-3.3.2 release! Compiling it on one of my machines stopped with depbase=`echo peer_proxy_negotiate_auth.o | sed 's|[^/]*$|.deps/|;s|\.o$||'`;\ g++ -DHAVE_CONFIG_H -DDEFAULT_CONFIG_FILE=\/etc/squid/squid.conf\ -DDEFAULT_SQUID_DATA_DIR=\/usr/share/squid\ -DDEFAULT_SQUID_CONFIG_DIR=\/etc/squid\ -I.. -I../include -I../lib -I../src -I../include -I/usr/heimdal/include [...] Once you have krb5 sorted out if it is still halting Squid build you can use --disable-strict-error-checking to get Squid to build. I had to clean up the kerberos entries in many places (e.g. /etc/ld.conf, the PATH definition); squid searches in all these places. That makes compiling squid for machines without kerberos complicated when building on a machine where kerberos is (somehow) installed. Viele Gruesse! Helmut
Re: [squid-users] Squid 3.3.2 is available
Hallo, Amos, Du meintest am 02.03.13: The Squid HTTP Proxy team is very pleased to announce the availability of the Squid-3.3.2 release! Compiling it on one of my machines stopped with depbase=`echo peer_proxy_negotiate_auth.o | sed 's|[^/]*$|.deps/ |;s|\.o$||'`;\ g++ -DHAVE_CONFIG_H -DDEFAULT_CONFIG_FILE=\/etc/squid/squid.conf\ -DDEFAULT_SQUID_DATA_DIR=\/usr/share/squid\ -DDEFAULT_SQUID_CONFIG_DIR=\/etc/squid\ -I.. -I../include -I../lib -I../src -I../include -I/usr/heimdal/include -I/usr/heimdal/include -I../src -I/usr/heimdal/include -I/usr/heimdal/include -I/usr/include/libxml2 -I/usr/heimdal/include -I/usr/heimdal/include -I/usr/include/libxml2 -Wall -Wpointer-arith -Wwrite-strings -Wcomments -Werror -pipe -D_REENTRANT -m32 -D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -O2 -march=i486 -mtune=i686 -std=c++0x -MT peer_proxy_negotiate_auth.o -MD -MP -MF $depbase.Tpo -c -o peer_proxy_negotiate_auth.o peer_proxy_negotiate_auth.cc \ mv -f $depbase.Tpo $depbase.Po peer_proxy_negotiate_auth.cc: In function 'int krb5_create_cache(char*, char*)': peer_proxy_negotiate_auth.cc:259:32: error: 'char** krb5_princ_realm(krb5_context, krb5_principal)' is deprecated (declared at /usr/heimdal/include/krb5-protos.h:3251) [-Werror=deprecated-declarations] peer_proxy_negotiate_auth.cc:259:78: error: 'char** krb5_princ_realm(krb5_context, krb5_principal)' is deprecated (declared at /usr/heimdal/include/krb5-protos.h:3251) [-Werror=deprecated-declarations] peer_proxy_negotiate_auth.cc:403:13: error: 'void krb5_get_init_creds_opt_init(krb5_get_init_creds_opt*)' is deprecated (declared at /usr/heimdal/include/krb5-protos.h:2284) [-Werror=deprecated-declarations] peer_proxy_negotiate_auth.cc:403:50: error: 'void krb5_get_init_creds_opt_init(krb5_get_init_creds_opt*)' is deprecated (declared at /usr/heimdal/include/krb5-protos.h:2284) [-Werror=deprecated-declarations] cc1plus: all warnings being treated as errors make[3]: *** [peer_proxy_negotiate_auth.o] Error 1 make[3]: Leaving directory 
`/tmp/SBo/squid-3.3.2/src' -- ./configure \ --prefix=/usr \ --libdir=/usr/lib${LIBDIRSUFFIX} \ --sysconfdir=/etc/squid \ --localstatedir=/var/log/squid \ --datadir=/usr/share/squid \ --with-pidfile=/var/run/squid \ --mandir=/usr/man \ --with-logdir=/var/log/squid \ --enable-snmp \ --enable-basic-auth-helpers=NCSA,YP,MSNT-multi-domain,MSNT,SMB,getpwnam,LDAP,POP3,RADIUS \ --enable-linux-netfilter \ --enable-async-io \ --with-large-files \ --disable-option-checking \ --with-filedescriptors=65536 \ --enable-icmp \ --enable-delay-pools \ --enable-digest-auth-helpers=LDAP,file \ --enable-ntlm-auth-helpers=smb_lm \ --enable-inline \ --disable-loadable-modules \ --disable-translation \ --enable-storeio=aufs,ufs \ --enable-arp-acl \ --enable-wccp \ --enable-external-acl-helpers=ip_user,ldap_group,unix_group,wbinfo_group \ --enable-removal-policies=lru,heap \ --enable-esi \ --build=$ARCH-slackware-linux -- It may be related in some strange way to the kerberos installation; on another machine compiling worked, but when running it stopped with a kerberos-related error message. (My kerberos installations may be corrupt, but that should be another problem.) Viele Gruesse! Helmut
Re: [squid-users] Re: slow browsing in centos 6.3 with squid 3 !!
Hallo, Ahmad, Du meintest am 24.02.13: here is the log of squidguard ! squidGuard is a separate program; its problems are not squid problems. == 2013-02-24 06:25:32 [17282] Warning: Possible bypass attempt. Found multiple slashes where only one is expected: http://surprises.tango.me/ts//assets/ayol_fairy_gingerbread_surprise_2-UI_VG_SELECTOR_PACK-android.zip Nasty, but not bad. Viele Gruesse! Helmut
Re: [squid-users] problem in squid 2.7stable9 with debian 6.0.1 with some sites!!!
Hallo, Ahmad, Du meintest am 24.02.13: I have Debian 6.0.1 with kernel Linux cache1 2.6.37-1 and I'm using squid 2.7.STABLE9! That's a version which is about 3 years old. Can you use a current squid version? Viele Gruesse! Helmut
Re: [squid-users] Filter by time and white-black lists
Hallo, Artur, Du meintest am 21.01.13: I've tried many times and I cannot do it, please help :( I have 2 classrooms, in total 40 PCs + 5 manager PCs + 1 administrator. So the IP ranges are 10.77.88.1-10.77.88.41 - for classrooms, 10.77.88.42-10.77.88.46 - for managers, 10.77.88.47 - admin. Task: 1) Internet only for these 46 hosts 10.77.88.1-10.77.88.47 2) Classrooms and managers can access the internet only on workdays from 9 to 17 3) Classrooms have a blacklist file of sites for which access is denied 4) Managers can only visit whitelist sites from a file, all others blocked 5) Admin can visit any web site at any time 6) On weekends (A S) access only by authentication. I am new to squid so I have difficulty doing this. I was able to set access by days and time for one range, but how do I combine this with white/black lists and the other ranges for manager and admin, plus authentication? I don't get how http_access deny works, or in what order. Can someone provide a solution for my task? I will be very grateful. You seem to live in Germany; perhaps you should look at linux-user 2/2013, p. 16 ... 20 (Squid als Spiel- und Social-Network-Bremse). I'd try the following http_access order (untested):

acl localnet src 10.77.88.1-10.77.88.47
acl admin src 10.77.88.47
acl schueler src 10.77.88.1-10.77.88.41
acl blacklist dstdomain "/etc/squid/blacklist"

http_access deny !localnet            # all others get fired
http_access allow admin               # they are privileged
http_access allow !schueler           # managers have more rights
# you may define a special acl for managers; it's not necessary in this example
http_access allow schueler !blacklist # pupils are restricted
http_access deny all                  # all other cases

The time restrictions are not implemented; take a look at listing 1 in the above-mentioned article. Viele Gruesse! Helmut
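For the workday restriction (point 2), squid's time acl can be combined with the rules above. Another untested sketch, reusing the acl names from the example (M T W H F are squid's codes for Monday through Friday):

```
acl worktime time MTWHF 09:00-17:00
http_access allow schueler worktime !blacklist   # pupils: workdays 9-17 only
http_access deny schueler                        # outside that window: blocked
```

The same worktime term could be attached to the manager rule; the admin rule stays unconditional.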
Re: [squid-users] Filter by time and white-black lists
Hallo, Artur, Du meintest am 21.01.13: Can you please make a full filter with tests? I know I am maybe asking too much, but I have struggled with this for quite a while :( Sorry - I have enough other work to do. and what is schueler It's the German word for pupil. with a complete version of the filter I would understand how this works, exactly with time I started to get problems No - you should try to understand how squid works. Perhaps you know how a potato harvester works: it sorts by size, and it sorts out all particles which are not potatoes (a bit simplified). Small potatoes: sorted out into the basket for small ones - done. No potato: sorted out to garbage - done. Bigger potatoes: go on to the next station. And then you should study how and when acls are AND-combined or OR-combined. Viele Gruesse! Helmut
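The rule of thumb behind the potato analogy, as a hypothetical squid.conf fragment (the addresses are invented for illustration): acl names on one http_access line are AND-combined, while separate http_access lines are tried top to bottom until the first one matches - an OR of alternatives, like the sorting stations.

```
acl admin    src 10.77.88.47
acl schueler src 10.77.88.1-10.77.88.41
acl worktime time MTWHF 09:00-17:00

# one line: ALL terms must match (schueler AND worktime)
http_access allow schueler worktime

# separate lines: the first matching line wins, the rest are never reached
http_access allow admin
http_access deny all
```

A request that matches no allow line falls through to the final deny, just as anything left at the end of the harvester goes to garbage.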
Re: [squid-users] Access Deny
Hallo, Usuário, Du meintest am 20.01.13: I wonder why it replaces UTF-8 with LOCALE?? as it's shown in the output Substituting charset 'UTF-8' for LOCALE password: lang_tdb_init: /usr/lib/samba/en_US.UTF-8.msg: No such file or directory NT_STATUS_ACCESS_DENIED: Access denied (0xc022) UTF-8 is the better choice - the file has to be shown by the client's browser, not by the machine on which squid is running. Viele Gruesse! Helmut
Re: [squid-users] Access Deny
Hallo, Usuário, Du meintest am 18.01.13: Substituting charset 'UTF-8' for LOCALE password: lang_tdb_init: /usr/lib/samba/en_US.UTF-8.msg: No such file or directory NT_STATUS_ACCESS_DENIED: Access denied (0xc022) Why it's returning Access Denied for me ? Perhaps because the file /usr/lib/samba/en_US.UTF-8.msg doesn't exist (as the error message tells you). Viele Gruesse! Helmut
Re: [squid-users] How to set /etc/logrotate.d/squid to have good sarg reports?
Hallo, Bartosz, you wrote in How to set /etc/logrotate.d/squid to have good sarg reports?: How to set /etc/logrotate.d/squid to have good sarg reports? My system runs the sarg reports at the end of the day, as a separate cronjob, and logrotate runs in the very early morning, as part of cron.daily. Viele Gruesse! Helmut
Re: [squid-users] How to set /etc/logrotate.d/squid to have good sarg reports?
Hallo, Bartosz, you wrote to [squid-users] How to set /etc/logrotate.d/squid to have good sarg reports?: My system runs the sarg reports at the end of the day, as a separate cronjob, and logrotate runs in the very early morning, as part of cron.daily. Helmut So how can you create weekly and monthly reports if you create a new log file every day? I create only daily reports. For quotas etc. I use squish. And after rotating you have only one day in the log file, don't you? That's another problem; I've just seen that rotating doesn't work as expected ... Viele Gruesse! Helmut
Re: [squid-users] How to set /etc/logrotate.d/squid to have good sarg reports?
Hallo, Bartosz, bartos...@gmail.com meinte am 29.11.12 in squid zum Thema Re: [squid-users] How to set /etc/logrotate.d/squid to have good sarg reports?: My system runs the sarg reports at the end of the day, as a separate cronjob, and logrotate runs in the very early morning, as part of cron.daily. [...] And after rotating you have only one day in the log file, don't you? Yes, that may happen. I've just written a quick and dirty script which deletes all sarg directories which are older than 3 months. It's invoked from the /etc/cron.monthly directory; it could be invoked from a simple cron job instead. The only thing you should change is DocRoot.

#! /bin/bash
# deletes old sarg directories
# Helmut Hullen
DocRoot=/home/www/squid-reports
sargRef=/tmp/sarg$$
touch -d 'now - 3 months' $sargRef || exit 1
for Verz in $DocRoot/*
do
  test -d $Verz || continue
  test -s $Verz/sarg-date || continue
  test $Verz/sarg-date -nt $sargRef && continue
  rm -rf $Verz
done
rm -f $sargRef
#
# ==
# $Id: sarg-alt,v 1.1 2012-11-29 15:39:53+01 HHullen Exp $
# $Log: sarg-alt,v $
# Revision 1.1  2012-11-29 15:39:53+01  HHullen
# Start
#

This script is independent of any logrotate mechanism. Viele Gruesse! Helmut
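The age comparison the script relies on can be tried safely in a scratch directory before pointing it at real reports. The following self-contained sketch uses an invented layout under a temporary directory and a 3-day cutoff instead of 3 months, so nothing real is deleted:

```shell
#!/bin/sh
# Demonstrate the "delete directories whose marker file is older than a
# cutoff" logic on a throwaway tree (hypothetical layout).
DocRoot=$(mktemp -d)

mkdir -p "$DocRoot/old" "$DocRoot/new"
touch -d 'now - 5 days' "$DocRoot/old/sarg-date"   # older than the cutoff
touch "$DocRoot/new/sarg-date"                     # fresh report

cutoff=$(mktemp)
touch -d 'now - 3 days' "$cutoff"

for Verz in "$DocRoot"/*; do
    test -d "$Verz" || continue
    test -e "$Verz/sarg-date" || continue
    test "$Verz/sarg-date" -nt "$cutoff" && continue   # newer than cutoff: keep
    rm -rf "$Verz"
done
rm -f "$cutoff"

ls "$DocRoot"    # only "new" should remain
```

The `-nt` test (file1 newer than file2) is what lets a plain touch file stand in for "3 months ago".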
Re: [squid-users] very tricky problem
Hallo, Maurizio, Du meintest am 18.11.12: I was asked to trace (= to log using the access log) all the URLs of a site: https://www.example.com visited by the employees of a company. http://de.wikipedia.org/wiki/Example.com (and - perhaps - similar entries in other wikipedias, for most or all other languages) Or have you used example.com as a stand-in for another domain whose name you don't want to reveal? Viele Gruesse! Helmut
Re: [squid-users] [Squidguard] squidGuard stops blocking randomly after a while
Hallo, Stefan, Du meintest am 27.08.12: ps02:/var/log/squid # tail -f -n 50 cache.log-20120823 2012/08/23 13:30:07| WARNING: HTTP header contains NULL characters {Accept: */* Content-Type: application/x-www-form-urlencoded} NULL [...] {Accept: */* Content-Type: application/x-www-form-urlencoded} NULL {Accept: */* Content-Type: application/x-www-form-urlencoded 2012-08-23 13:30:07 [14636] ending emergency mode, stdin empty 2012-08-23 13:30:07 [14637] ending emergency mode, stdin empty 2012-08-23 13:30:07 [14638] ending emergency mode, stdin empty 2012-08-23 13:30:07 [14640] squidGuard stopped (1345721407.666) 2012-08-23 13:30:07 [14639] squidGuard stopped (1345721407.666) Is there any line before 13:30:07 which shows more information? Viele Gruesse! Helmut
Re: [squid-users] [Squidguard] squidGuard stops blocking randomly after a while
Hallo, Stefan, Du meintest am 27.08.12: I just tried to rebuild squidguard 1.4 with the additional 2 patches http://squidguard.org/Downloads/Patches/1.4/ just to be sure, but with the same results. Perhaps you could try http://helmut.hullen.de/filebox/Linux/slackware/n/squidGuard-1.5-beta-i486-1hln.tgz It's beta, but it has been running well for about 2 years. Ok - it's a Slackware tarball, but that shouldn't be a big problem. You can see the configuration in /usr/doc/squidGuard/Build. Viele Gruesse! Helmut
Re: [squid-users] I need a help!
Hallo, yongjun, Du meintest am 27.07.12: Hello, can you tell me what the squid offline cache is, and how it works? A long time ago squid had an option offline_mode which could be set to on or off. Maybe you mean this option - it's deprecated. Don't use it. Viele Gruesse! Helmut
Re: [squid-users] Big issue on all Squid that we have (2.5 and 3.0) on a web site with IE8/IE9
Hallo, Noc, Du meintest am 21.06.12: We have a big issue with our squid proxy. We browse this website (http://www.laroutedulait.fr) through squid 3.0. We get a blue background and nothing else (using IE8/9). This seems to be not a squid problem but a problem made by the web designer. The site tries to use a 'class=ie8' or 'class=ie9' when it finds such a browser. On my system it shows less information under Internet Explorer than under Firefox. No errors in the log. Any idea of the problem? Just ask the maker of the site (but contact doesn't work ...). Viele Gruesse! Helmut
Re: [squid-users] No response from Squid when cannot resolve DNS for host
Hallo, Vincent, Du meintest am 16.04.12: I have a strange issue with at least one URL: http://d.businessinsider.com/ The host does not exist and cannot be resolved. But instead of telling me that Squid cannot resolve the host (as it does for example for the non-existing http://zfsdfo.sdfdsfrgq.com/ ), Squid gives no response, and makes the browser wait forever. Here: Der DNS-Server meldete: Name Error: The domain name does not exist. (in English: "The DNS server reported: Name Error: The domain name does not exist.") - Viele Gruesse! Helmut
Re: [squid-users] DB Error ?
Hallo, Jarosch, Du meintest am 22.03.12: Is there any way to change the location where squidguard stores the temp DB files? In the file squidGuard.conf, set the line dbhome /var/lib/squidGuard/db or whatever you want. Viele Gruesse! Helmut
Re: [squid-users] DB Error ?
Hallo, Jarosch, Du meintest am 21.03.12: Hi all, I think I have some trouble with my Berkeley Database. When I start up my Squid I get the following error in my cache.log 2012-03-21 15:50:28 [2325] init domainlist /usr/local/squidGuard/list/BL/warez/domains 2012-03-21 15:50:28 [2325] init urllist /usr/local/squidGuard/list/BL/warez/urls temporary open: /var/tmp/BDB02320: Permission denied unable to create temporary backing file temporary open: /var/tmp/BDB02320: Permission denied My usual way of creating the squidGuard database(s): squidGuard -b -d -C all squidGuard -b -d u Do these commands run without an error message? Viele Gruesse! Helmut
Re: [squid-users] Unable to open HTTP Socket in syslog
Hallo, zozo, Du meintest am 19.03.12: userdemo@ubuntu-demo:~$ sudo cat /var/log/syslog |tail Mar 19 16:01:01 ubuntu-demo (squid-1): Unable to open HTTP Socket Mar 19 16:01:01 ubuntu-demo squid[1173]: Squid Parent: (squid-1) process 1193 exited with status 1 Please try to start squid again, wait 1 to 2 minutes and then examine the log file(s) again with sudo tail -20 /var/log/syslog or sudo grep 'squid' /var/log/syslog Maybe the last ten lines of /var/log/syslog don't show the reason. But 'Unable to open HTTP Socket' may point to the error. Viele Gruesse! Helmut
Re: [squid-users] Squid 3.1.x and detect/disable http tunneling over proxe web sites
Hallo, Josef, Du meintest am 08.03.12: is it possible to detect somehow (and disable) tunneling of regular http web traffic through proxy web sites? For example a porn web site through hidemyass.com. There are a lot of web proxies, I couldn't locate every one and disable it :). How do you solve it? I use squidGuard with its database, e.g. for porn and/or proxies. It's simple to use it with squid. Viele Gruesse! Helmut
Re: [squid-users] blacklist
Hallo, Esteban, Du meintest am 04.03.12: Currently I have 3 servers running with squid and haproxy balancing in front of them. It works perfectly. Now I want to block porn sites, viruses, external proxies, etc ... I tried dansguardian and squidguard, but they slow down my squid and I do not like that. What can I use? Every filter needs time. squidGuard looks only at URLs - that doesn't take much time. Looking for viruses takes a lot of time. That's not a special squid problem. That's a problem of the additional software. Viele Gruesse! Helmut
Re: [squid-users] Page seems to load for ever
Hallo, karj, Du meintest am 23.02.12: I have a problem with the first page of a site that's behind squid. The page of the site www.tovima.gr seems to load forever (using Chrome and Firefox). Here: no problem. Squid 3.1.14 Viele Gruesse! Helmut
Re: [squid-users] Prefetch patch test
Hallo, anita.sivakumar, Du meintest am 16.02.12: Sorry Amos. But where else do I post this? I thought I could mail it to this mail id squid-users@squid-cache.org. But if there is some other place, please let me know. [full quote deleted - don't top-post, please; don't full-quote, please] The address is ok. But when you want to ask a new question you shouldn't reply to an existing thread and just change the subject line. Your mail reader can produce a new mail too. Viele Gruesse! Helmut
Re: [squid-users] Squid 3.2.0.14: failed to select source for ...
Hallo, Amos, Du meintest am 14.02.12: my self-made squid 3.2.0.14 sometimes produces messages like Jan 30 08:56:58 Arktur squid[4263]: Failed to select source for 'http:// ivwbox.de/' Jan 30 08:56:58 Arktur squid[4263]: always_direct = 0 Jan 30 08:56:58 Arktur squid[4263]:never_direct = 0 Jan 30 08:56:58 Arktur squid[4263]:timedout = 0 [...] Restarting squid helps for some time, but maybe the real problem would have disappeared by only waiting some time too ... Like Ralf mentioned... ## host-check ivwbox.de ivwbox.de has SOA record ns.ivwbox.de. hostmaster.ivwbox.de. 2011061896 1 3600 604800 86400 It's not a problem that happens only with ivwbox.de. Just a few lines: messages.1.gz:Feb 10 10:06:30 Capella squid[11774]: Failed to select source for 'http://id.google.de/verify/EMh-Wz2X9xLGTCOymxWnaw8.gif' messages.1.gz:Feb 10 10:06:55 Capella squid[11774]: Failed to select source for 'http://id.google.de/verify/EN2Vjy0k3x-VRsw45NuiIlg.gif' messages.1.gz:Feb 10 10:06:55 Capella squid[11774]: Failed to select source for 'http://id.google.de/verify/ENJhY8KMMFdd7qtN9ZOGAR0.gif' messages.1.gz:Feb 10 10:07:16 Capella squid[11774]: Failed to select source for 'http://id.google.de/verify/ENzpClI9YEXm5qSK4htjImk.gif' messages.1.gz:Feb 10 10:07:16 Capella squid[11774]: Failed to select source for 'http://id.google.de/verify/ENzpClI9YEXm5qSK4htjImk.gif' messages.1.gz:Feb 10 10:08:23 Capella squid[11774]: Failed to select source for 'http://ipv6-count.gmx.net/p6.gif?ts=1328864900621' messages.1.gz:Feb 10 10:10:05 Capella squid[11774]: Failed to select source for '[null_entry]' messages.1.gz:Feb 10 10:10:09 Capella squid[11774]: Failed to select source for '[null_entry]' messages.1.gz:Feb 10 10:10:10 Capella squid[11774]: Failed to select source for '[null_entry]' messages.1.gz:Feb 10 10:10:28 Capella squid[11774]: Failed to select source for 'http://www.ureader.de/msg/15515065.aspx' messages.1.gz:Feb 10 10:10:28 Capella squid[11774]: Failed to select source for 
'http://www.ureader.de/favicon.ico' messages.1.gz:Feb 10 10:10:42 Capella squid[11774]: Failed to select source for '[null_entry]' messages.1.gz:Feb 10 10:16:40 Capella squid[11774]: Failed to select source for 'http://m-wissen.de/' messages.1.gz:Feb 10 10:16:40 Capella squid[11774]: Failed to select source for 'http://m-wissen.de/favicon.ico' messages.1.gz:Feb 10 10:16:43 Capella squid[11774]: Failed to select source for 'http://m-wissen.de/favicon.ico' Going back to squid 3.1.14: no such message. Viele Gruesse! Helmut
Re: [squid-users] Squid 3.2.0.14: failed to select source for ...
Hallo, Henrik, Du meintest am 12.02.12: Jan 30 08:56:58 Arktur squid[4263]: Failed to select source for 'http:// ivwbox.de/' Jan 30 08:56:58 Arktur squid[4263]: always_direct = 0 Jan 30 08:56:58 Arktur squid[4263]:never_direct = 0 Jan 30 08:56:58 Arktur squid[4263]:timedout = 0 Reverse proxy with no matching cache_peer? No, a classic proxy. Which part of squid.conf might you need? Viele Gruesse! Helmut
Re: [squid-users] Squid 3.2.0.14: failed to select source for ...
Hallo, Amos, Du meintest am 11.02.12: my self-made squid 3.2.0.14 sometimes produces messages like Jan 30 08:56:58 Arktur squid[4263]: Failed to select source for 'http:// ivwbox.de/' Jan 30 08:56:58 Arktur squid[4263]: always_direct = 0 Jan 30 08:56:58 Arktur squid[4263]:never_direct = 0 Jan 30 08:56:58 Arktur squid[4263]:timedout = 0 [...] Where's the problem? Is that message an information, a warning or an error message? Direct access is permitted, but DNS produced no usable results. Why (as far as you can be clairvoyant)? It doesn't happen always; it only happens sometimes. That makes testing difficult - I don't dare run the new squid version on a productive system, and the last time I saw this message on my local system was some weeks ago. As I mentioned some days ago, both machines where I've seen these messages run dnsmasq as DNS, not ISC bind. Restarting squid helps for some time, but maybe the real problem would have disappeared just by waiting some time too ... Viele Gruesse! Helmut
[squid-users] Squid 3.2.0.14: failed to select source for ...
Hallo, squid-users, my self-made squid 3.2.0.14 sometimes produces messages like Jan 30 08:56:58 Arktur squid[4263]: Failed to select source for 'http:// ivwbox.de/' Jan 30 08:56:58 Arktur squid[4263]: always_direct = 0 Jan 30 08:56:58 Arktur squid[4263]:never_direct = 0 Jan 30 08:56:58 Arktur squid[4263]:timedout = 0 I've searched Google for these messages; they seem to be another kind of error(?). Especially: http://wiki.squid-cache.org/KnowledgeBase/FailedToSelectSource and https://bugzilla.redhat.com/show_bug.cgi?id=186561 tell nothing about the '... = 0' lines. pgrep -l squid shows that squid is running; the log files also show that squid runs and works hard. Where's the problem? Is that message an information, a warning or an error message? Viele Gruesse! Helmut
Re: [squid-users] Squid 3.2.0.14: disk size
Hallo, Amos, Du meintest am 23.01.12: I'm running squid 3.2.0.14 (self-made). It always tells Jan 21 14:37:00 Arktur squid[13670]: WARNING: Disk space over limit: 11131162772328824.00 KB 512000 KB This is http://bugs.squid-cache.org/show_bug.cgi?id=3441 Thank you - rm -f /path/to/swap.state* followed by /path/to/bin/squid -s -S seems to do the job. Viele Gruesse! Helmut
Re: [squid-users] Changes related to /etc/resolv.conf and squid
Hallo, Amos, Du meintest am 25.01.12: If I changed the entry in /etc/resolv.conf, is there then a need to reload/reconfigure squid? Yes. Squid does not monitor the configuration files for changes. It's on the feature wishlist, but nobody has had enough interest to code up a small background ticker event to do the checks. I use incron from Lukas Aiken, http://inotify.aiken.cz with the file /etc/incron.d/squid with the contents /etc/resolv.conf IN_MODIFY squid -k reconfigure That means: if /etc/resolv.conf is modified, then run squid -k reconfigure. There are some other programs which also work with inotify - maybe it's hard for a squid developer to rely on such platform-specific routines on every target machine. incron hasn't changed in the last two years - I don't know if Lukas still works on it. But the program doesn't seem to need changes ... Viele Gruesse! Helmut
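On systems running systemd, a path unit is a hypothetical alternative to incron for the same job; the unit names below are invented for this sketch, and the squid path is an assumption.

```
# /etc/systemd/system/squid-resolv.path
[Unit]
Description=Watch /etc/resolv.conf for squid

[Path]
PathModified=/etc/resolv.conf

[Install]
WantedBy=multi-user.target

# /etc/systemd/system/squid-resolv.service
[Unit]
Description=Reconfigure squid after resolv.conf change

[Service]
Type=oneshot
ExecStart=/usr/sbin/squid -k reconfigure
```

Enabling the .path unit is enough; the .service of the same name is triggered automatically whenever the file changes.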
[squid-users] Squid 3.2.0.14: disk size
Hallo, squid-users, I'm running squid 3.2.0.14 (self-made). It always tells Jan 21 14:37:00 Arktur squid[13670]: WARNING: Disk space over limit: 11131162772328824.00 KB 512000 KB Jan 21 14:37:33 Arktur last message repeated 3 times Jan 21 14:38:39 Arktur last message repeated 6 times That disk size is completely wrong - why? My configuration: cache_mem 254 MB maximum_object_size 4096 KB ipcache_size 4096 fqdncache_size 4096 cache_dir ufs /var/proxy/cache 500 8 256 That configuration has worked for years without any (such) problem. The cache is in/on a 10 GByte partition. The message seems to be only a (nonsense) warning; squid works well. Viele Gruesse! Helmut
Re: [squid-users] Problem Compiling Squid 1.1.8 (noob?)
Hallo, someone, Du meintest am 30.12.11: Problem Compiling Squid 1.1.8 deviant:/home/devadmin/source/squid-3.1.18# ./configure -O2' --with-squid=/build/buildd-squid3_3.1.6-1.2+squeeze1-i386-_y3HlV /squid3-3.1.6 Just for curiosity: which squid version do you really mean? Viele Gruesse! Helmut
Re: [squid-users] Squid 3.2.0.14 beta is available
Hallo, I wrote am 29.12.11: The Squid HTTP Proxy team is very pleased to announce the availability of the Squid-3.2.0.14 beta release! I had to re-install the former version; 3.2.0.14 had stopped working. Seems to have been a false alarm - sorry. The same breakdown just happened with an older squid version which had run without problems for many months. Seems to be a problem specific to my installation. Viele Gruesse! Helmut
Re: [squid-users] Squid 3.2.0.14 beta is available
Hallo, Amos, Du meintest am 13.12.11: The Squid HTTP Proxy team is very pleased to announce the availability of the Squid-3.2.0.14 beta release! I had to re-install the former version; 3.2.0.14 had stopped working. Messages in /var/log/warn: Dec 29 07:05:51 Arktur squid[4479]: Starting Squid Cache version 3.2.0.14 for i486-slackware-linux-gnu... Dec 29 07:06:16 Arktur squid[4479]: BUG: Orphan Comm::Connection: local=[::]:3130 remote=[::] FD 20 flags=9 Dec 29 07:06:16 Arktur squid[4479]: NOTE: 1 Orphans since last started. Dec 29 07:06:17 Arktur squid[4479]: BUG: Orphan Comm::Connection: local=[::]:3130 remote=[::] FD 15 flags=9 Dec 29 07:06:17 Arktur squid[4479]: NOTE: 2 Orphans since last started. Dec 29 13:26:59 Arktur squid[4479]: BUG: Orphan Comm::Connection: local=[::]:3130 remote=[::] FD 15 flags=9 Dec 29 13:26:59 Arktur squid[4479]: NOTE: 3 Orphans since last started. Dec 29 13:38:18 Arktur squid[4477]: Exiting due to unexpected forced shutdown and no client got a connection to an external web site. Maybe the orphan messages are not related to this no-connection event; in the last days squid reported 2 to 5 orphans, but caused no other problems (the machine is re-started every morning). Which additional information do you need? Viele Gruesse! Helmut
Re: [squid-users] Error 502 - Bad Gateway - www.allplanlernen.de
Hallo, Mario, Du meintest am 28.12.11: I am running Squid 3.1.0.14 and when I try to access www.allplanlernen.de I get a 502 error. Same here (squid 3.2.0.14): 502 Bad Gateway nginx/0.7.67 It works without squid. Same here, too. Does anyone know why? Seems to be a malformed web site. I've tested (without squid) lynx www.allplanlernen.de/themen/impressum.html and got the information; looking at the site with a browser (behind squid) gets Internal Error (check logs) and that's an error message for the website administrator, not for me. Viele Gruesse! Helmut
Re: [squid-users] Squid 3.2.0.14 beta is available
Hallo, Amos, Du meintest am 13.12.11: The Squid HTTP Proxy team is very pleased to announce the availability of the Squid-3.2.0.14 beta release! Slackware binary: http://helmut.hullen.de/filebox/Linux/slackware/n/squid-3.2.0.14-i486-1hln.tgz Viele Gruesse! Helmut
Re: [squid-users] Non-transparent port works, transparent doesn't
Hallo, Pieter, Du meintest am 18.10.11: [TOFU] I understand you being upset with this, but this is a text-based client and I have limited time in which I can reply to certain issues. I thought I would give a quick insight into an error that I might have spotted. It's quite hard to select lines in this client. No - it isn't. Even your mail program (pine/alpine) can delete lines and paragraphs. Please stop these nasty full quotes. And please look at the Reply-To line: I had asked you by e-mail, away from the mailing list. Viele Gruesse! Helmut
Re: [squid-users] squid-3.1.16: squid -k shutdown causes crash
Hallo, Ralf, Du meintest am 18.10.11: squid-3.1.16: squid -k shutdown causes a crash, below is the result of: Here: not reproducible. Slackware 13.37 (self-made) squid-3.1.16 (self-made) http://arktur.shuttle.de/CD/Testpakete/squid-3.1.16-i486-1_hln.tgz Viele Gruesse! Helmut
Re: [squid-users] facebook upload DOS's squid?
Hallo, hunter, Du meintest am 14.10.11: when uploading a 540m video to facebook, it seems like squid ends up caching(maybe not the right word) the whole thing. Does squid control uploading? Viele Gruesse! Helmut
Re: [squid-users] WARNING: You should probably remove 'www.somewebsite.com' from the ACL named 'blacklist'
Hallo, devadmin, Du meintest am 12.10.11: I have a blacklist of about 1 million domains when I reload squid I get about, a million of these error messages. Should I do something to correct this error message? Yes: include squidGuard for managing such a large blacklist. Viele Gruesse! Helmut
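To make the squidGuard suggestion concrete: a minimal setup for a big domain blacklist, as an untested sketch with assumed paths. The point is that the plain-text lists are compiled once into Berkeley DB files, so lookups stay fast even with a million entries and squid's reload no longer touches them.

```
# squidGuard.conf (hypothetical paths)
dbhome /var/lib/squidGuard/db
logdir /var/log/squidGuard

dest blacklist {
        domainlist blacklist/domains
}

acl {
        default {
                pass !blacklist all
                redirect http://www.example.com/blocked.html
        }
}
```

Compile the lists with squidGuard -C all, then point squid at the helper with url_rewrite_program /usr/bin/squidGuard in squid.conf.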
Re: [squid-users] ACL's by Specific Date and Time
Hallo, Jenny, Du meintest am 10.10.11: What my thoughts are is when they are on a holiday, to disable my normal rules. So when they are out of school the proxy doesn't stop their access, but if it's a non school day, it will allow them out. Very easy to do. See acl time: http://wiki.squid-cache.org/SquidFaq/SquidAcl?highlight=%28time%29#Ho w_can_I_allow_some_clients_to_use_the_cache_at_specific_times.3F You can add weekends to your rules to allow access to your kids. You can also download official public holiday list and create rules for these days. Sorry - that doesn't help. Such an ACL doesn't know school holidays, it doesn't know public holidays. Ok - that's the problem of nearly every simple calendar ... Viele Gruesse! Helmut
Re: [squid-users] How to rotate Cache.log
Hello, Amos, you wrote on 30.09.11: My cache.log is getting too big (250 MB), so it becomes really hard to view the log file. [...] The question, though, is why your cache.log is getting so big in the first place. It should only contain rare messages about serious problems. That can happen. I've seen log files with more than 2 GByte too. To be more precise: squid rotates them once they are bigger than 2 GByte. And then the next 2 GByte were filled, but the partition wasn't big enough. All of that happened within less than 24 hours. But I've seen this nasty behaviour only once in all the past years. Best regards! Helmut
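For routine rotation (rather than waiting for the 2 GByte emergency rotation described above), a minimal sketch; the cron schedule and squid path are assumptions:

```
# squid.conf -- keep ten rotated copies of each log file
logfile_rotate 10

# crontab entry -- rotate the logs once a day at 03:00
0 3 * * * /usr/sbin/squid -k rotate
```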
Re: [squid-users] how to use the user auth parameters
Hello, Eliezer, you wrote on 24.09.11: I have used the info on: http://www.cyberciti.biz/tips/linux-unix-squid-proxy-server-authentication.html There I get: FATAL: ERROR: Invalid ACL: acl ncsa_users proxy_auth REQUIRED
I use:
# --- auth.conf ---
auth_param basic program /usr/libexec/ncsa_auth /etc/squid/.htpasswd
auth_param basic children 20
auth_param basic realm Surf-Anmeldung
auth_param basic credentialsttl 60 minutes
acl Anmeldung proxy_auth REQUIRED
http_access deny !Anmeldung
# Configuration: Squid Cache: Version 3.2.0.10
configure options: '--prefix=/usr' '--libdir=/usr/lib' '--sysconfdir=/etc/squid' '--localstatedir=/var/log/squid' '--datadir=/usr/share/squid' '--with-pidfile=/var/run/squid' '--mandir=/usr/man' '--with-logdir=/var/log/squid' '--enable-snmp' '--enable-basic-auth-helpers=NCSA,YP,MSNT-multi-domain,MSNT,SMB,getpwnam,LDAP,POP3,RADIUS' '--enable-linux-netfilter' '--enable-async-io' '--with-large-files' '--disable-option-checking' '--with-filedescriptors=65536' '--enable-icmp' '--enable-delay-pools' '--enable-digest-auth-helpers=LDAP,file' '--enable-ntlm-auth-helpers=smb_lm' '--enable-negotiate-auth-helpers=kerberos' '--enable-inline' '--disable-loadable-modules' '--disable-translation' '--enable-storeio=aufs,ufs' '--enable-arp-acl' '--enable-wccp' '--enable-external-acl-helpers=ip_user,ldap_group,unix_group,wbinfo_group' '--enable-removal-policies=lru,heap' '--enable-esi' '--enable-ssl' '--build=i486-slackware-linux' 'build_alias=i486-slackware-linux' 'CFLAGS=-O2 -march=i486 -mtune=i686' 'CXXFLAGS=-O2 -march=i486 -mtune=i686'
Best regards! Helmut
Re: [squid-users] Problem with deny_info
Hello, Amos, you wrote on 22.09.11: 3.0 and 3.1 only accept a % parameter to deny_info. The extended dynamic format is a new 3.2 feature. Where are these parameters explained? Best regards! Helmut
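For illustration, a hedged sketch of the older single-parameter form. The URL and ACL name below are invented; %s expanding to the denied request is the classic pre-3.2 behaviour, so check the release notes of your exact version.

```
# squid.conf -- show a custom page when the "blocked" ACL denies a request.
# In 3.0/3.1 only a single % macro is accepted; %s is assumed here to
# expand to the denied URL.
acl blocked dstdomain .example.com
deny_info http://intranet.example/denied.html?url=%s blocked
http_access deny blocked
```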
Re: [squid-users] [3.2.0.12] ErrorDetailManager.cc(222) parse: WARNING! invalid error detail name:
Hello, david, you wrote on 22.09.11: I receive this error in cache.log just after compiling version 3.2.0.12: ErrorDetailManager.cc(222) parse: WARNING! invalid error detail name: P?.?P?.?09_V_ERR_DOMAIN_MISMATCH 2011/09/22 15:15:23 kid1| errorpage.cc(352) loadFromFile: parse error while reading template file: /usr/share/squid3/errors/templates/error-details.txt Maybe I've seen the same error, with squid-3.2.0.10. Compiling squid without --enable-ssl cured that problem (but that's no real solution). Best regards! Helmut
Re: [squid-users] deep analysis of some request
Hello, alexus, you wrote on 09.09.11: 66.55.138.70 - - [08/Sep/2011:18:59:26 +] GET http://ecs.amazonaws.com/onca/xml? HTTP/1.1 200 135861 - Mozilla/4.1 TCP_MISS:DIRECT That line shows that the machine 66.55.138.70 requests a document from http://ecs.amazonaws.com/onca/xml, and it gets the document (about 130 kByte). 66.55.138.70 is actually my own IP; in this log you can't really see the real remote IP, but that's not the issue. I just have a lot of requests like this, so I want to do some sort of capture to see what's going on there ... Sorry: your client machine 66.55.138.70 requests a document from ecs.amazonaws.com. Ask the user of that machine why he or she wants this document; squid only manages the transfer. Best regards! Helmut
Re: [squid-users] deep analysis of some request
Hello, alexus, you wrote on 08.09.11: Is there a way to analyze somehow more deeply what's going on with this? tss# grep 'http://ecs.amazonaws.com/onca/xml?' access.log | tail -1 66.55.138.70 - - [08/Sep/2011:18:59:26 +] GET http://ecs.amazonaws.com/onca/xml? HTTP/1.1 200 135861 - Mozilla/4.1 TCP_MISS:DIRECT That line shows that the machine 66.55.138.70 requests a document from http://ecs.amazonaws.com/onca/xml, and it gets the document (about 130 kByte). 66.55.138.70 belongs to Alexusbiz Corp, so it seems to be one of your own IP addresses. It seems this request is not a squid problem. Best regards! Helmut
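The grep above can be extended into a small reusable sketch that summarises which clients request a given URL. It assumes the httpd-style log layout shown in the thread, with the client IP in the first field.

```shell
# count_clients PREFIX LOGFILE
# Print "count clientIP" lines, busiest client first, for all log
# entries whose line contains PREFIX (e.g. a destination URL).
count_clients() {
    grep -F "$1" "$2" | awk '{ print $1 }' | sort | uniq -c | sort -rn
}
```

Example invocation: count_clients 'http://ecs.amazonaws.com/onca/xml' /var/log/squid/access.log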
[squid-users] ACL for authorized users
Hello, squid-users, is it possible to define ACLs for specific users (authentication via NCSA works)? Using squidGuard is possible (and there I can define ACLs per user or user list), but I'd prefer a solution with pure squid. Searching the wiki didn't help; maybe I haven't found the right questions. squid version 3.2.0.10 Best regards! Helmut
Re: [squid-users] ACL for authorized users
Hello, Amos, you wrote on 29.08.11: is it possible to define ACLs for specific users (authentication via NCSA works)? proxy_auth ACL. Like so: acl users proxy_auth bob knuth Nice, thank you! (I should have studied the squid examples ...) Best regards! Helmut
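Putting the pieces of this thread together, a minimal sketch of NCSA basic auth plus a per-user ACL; the helper and password file paths are assumptions:

```
# squid.conf -- authenticate against an htpasswd file,
# then allow only the named users.
auth_param basic program /usr/libexec/ncsa_auth /etc/squid/.htpasswd
auth_param basic realm Proxy login
acl users proxy_auth bob knuth
http_access allow users
http_access deny all
```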
Re: [squid-users] blacklist to block adults sites
Hello, alexus, you wrote on 27.08.11: is there a blacklist of URLs/IPs that contains, say, adult sites? I need to be able to feed that into my squid; people are abusing it by surfing porn! What about squidGuard? Or at least the blacklist for squidGuard: http://squidguard.mesd.k12.or.us/blacklists.tgz Best regards! Helmut
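Once squidGuard and a blacklist are installed, hooking it into squid is a small squid.conf addition; the helper and config paths below are assumptions, adjust to your installation:

```
# squid.conf -- filter every request through squidGuard
url_rewrite_program /usr/bin/squidGuard -c /etc/squidguard/squidGuard.conf
url_rewrite_children 8
```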
Re: [squid-users] squid redirecting attempted downloads
Hello, Dave, you wrote on 22.08.11: We are having an issue where users try to download a file (an email attachment, setup file, etc.) and are redirected to a page on our intranet that says something about file downloads not being allowed. The person I took over from says that it may be something configured in the squid.conf file. I found the file but have no idea how to disable or modify this setting. What is the name of the file? Can you find that name somewhere in squid.conf? Or does squid invoke some other program, such as squidGuard? Best regards! Helmut
Re: [squid-users] squid redirecting attempted downloads
Hello, Dave, you wrote on 22.08.11: We are having an issue where users try to download a file (an email attachment, setup file, etc.) and are redirected to a page on our intranet that says something about file downloads not being allowed. The person I took over from says that it may be something configured in the squid.conf file. I meant I found the squid.conf file. Then please show us your squid.conf. Best regards! Helmut