Re: [squid-users] Squid 3.0.STABLE17 is available
Herbert Faleiros wrote:
> On Friday 31 July 2009 06:01:07 you wrote:
> [cut]
>> Okay. This gets rid of the assert and adds some debug instead. The
>> reason for sending eof=1 when not at true EOF is not yet clear, so use
>> carefully, but additional debugs are added when the flag is set.
>> debug_options ... 11,9 for these.
>> Amos
>
> Hi Amos,
>
> http.cc: In member function 'void HttpStateData::processReplyHeader()':
> http.cc:742: error: request for member 'size' in '((HttpStateData*)this)->HttpStateData::readBuf', which is of non-class type 'MemBuf*'
> http.cc: In member function 'void HttpStateData::readReply(size_t, comm_err_t, int)':
> http.cc:1013: error: expected primary-expression before '' token
> make[3]: *** [http.o] Error 1
> make[3]: Leaving directory `/usr/src/squid/squid-3.0.STABLE17/src'
> make[2]: *** [install-recursive] Error 1
> make[2]: Leaving directory `/usr/src/squid/squid-3.0.STABLE17/src'
> make[1]: *** [install] Error 2
> make[1]: Leaving directory `/usr/src/squid/squid-3.0.STABLE17/src'
> make: *** [install-recursive] Error 1
>
> Which release do I have to apply this patch against? (a daily snapshot?)
> I have applied the other patch (the fix for the previous bug) plus this
> one (eof_debugs.patch) now.
>
> --
> Herbert

Yes, it was written against the daily snapshot.

Amos
--
Please be using
Current Stable Squid 2.7.STABLE6 or 3.0.STABLE17
Current Beta Squid 3.1.0.12
Re: [squid-users] Squid + MySQL ?
Chris Robertson wrote:
> Marcello Romani wrote:
>> On Friday 31 July 2009 17:29:51 Maxime Gaudreault wrote:
>>> After all log entries are in the database I need to calculate how much
>>> bandwidth has been saved by Squid. Which HTTP code tells me that the
>>> object came from the cache? Everything with HIT in it? TCP_HIT,
>>> TCP_MEM_HIT, TCP_NEGATIVE_HIT?
>> I don't know about NEGATIVE_HIT, but I would add TCP_DENIED to the list.
> While it is served from the cache, there is no way to calculate
> bandwidth savings associated with a TCP_DENIED. By the same metric,
> TCP_NEGATIVE_HIT is served from the cache and has a size associated with
> it, and seems like a valid bandwidth-savings metric to me.
>> Also, don't forget to use the logformat tag %st instead of the default
>> %st. That will catch the true bandwidth used for PUT, POST, file
>> uploads and AJAX stuff. Just my 2 (euro)cents.
> Ditto. Except US currency.

Ditto. In NZc.

Amos
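Once the log rows are in a database, the HIT-vs-total arithmetic comes down to a couple of aggregate queries. A sketch using an in-memory SQLite table (the table and column names here are made up, adjust to your own schema; TCP_DENIED is excluded because, per Chris's point, it saves no upstream bandwidth):

```python
import sqlite3

# Toy schema standing in for whatever your real log table looks like.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE access_log (code TEXT, bytes INTEGER)")
conn.executemany("INSERT INTO access_log VALUES (?, ?)", [
    ("TCP_HIT/200", 5000),
    ("TCP_MEM_HIT/200", 2000),
    ("TCP_NEGATIVE_HIT/404", 300),
    ("TCP_MISS/200", 10000),
    ("TCP_DENIED/403", 1500),   # served locally, but saves nothing upstream
])

# Bytes served from cache: every result code containing HIT.
# TCP_DENIED does not match, which is what we want here.
saved = conn.execute(
    "SELECT COALESCE(SUM(bytes), 0) FROM access_log "
    "WHERE code LIKE '%HIT%'").fetchone()[0]
total = conn.execute(
    "SELECT SUM(bytes) FROM access_log").fetchone()[0]
print(saved, total)   # 7300 18800
```

The same two queries run unchanged against a MySQL table with the same columns.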
[squid-users] Blocking video streaming using squid
Hi all,

How do I block video streaming using Squid?

Regards
Priscilla
Re: [squid-users] Blocking video streaming using squid
Priscilla wrote:
> Hi all,
> How do I block video streaming using Squid?

http://wiki.squid-cache.org/ConfigExamples#head-1eafe03bd78214582693acb9b039a65b68172c31

Amos
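Beyond the wiki examples, one common approach is to deny by reply MIME type and by URL pattern. A hypothetical squid.conf sketch (the type and extension lists are illustrative, not exhaustive; verify against your version's ACL documentation):

```
# Block replies whose Content-Type identifies them as video
acl video_types rep_mime_type -i ^video/
http_reply_access deny video_types

# Block requests for common video file extensions up front
acl video_urls urlpath_regex -i \.(flv|mp4|avi|wmv|mov)(\?.*)?$
http_access deny video_urls
```

The rep_mime_type check catches streams served from extensionless URLs, at the cost of the connection already having been made to the origin server.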
Re: [squid-users] New Accel Reverse Proxy Cache is not caching everything... how to force?
Waitman Gobble wrote:
> Amos Jeffries wrote:
>> Andres Salazar wrote:
>>> Hello, I've set up my first reverse proxy to accelerate a site. I've
>>> used wget to spider the entire site several times and noticed that
>>> even after running it, some files never get cached, like HTML files!
>>> I presume it is because the HTML files don't have the correct cache
>>> headers. It didn't even want to cache .swf files, but then I added
>>> this line and it helped a lot, though not completely:
>>>
>>> refresh_pattern . 0 20% 4320 ignore-reload
>>>
>>> I am thinking the best approach is to make Squid cache EVERYTHING,
>>> and then manually give it specific exceptions for dynamic content
>>> (like .php and some .html with embedded PHP scripts). I don't want to
>>> start editing files because I want to test the performance increase
>>> before adding headers one by one.
>> No, it's not the best approach. The best approach is to take what
>> really exists right now and see how the variations you plan change
>> things.
>
> Hi Andres, one question: you DID increase the defaults in squid.conf,
> right? I *think* the default maximum_object_size is 8 KB and cache_mem
> is 32 MB, and most examples I've seen on the net set the cache_dir to
> 100 or 256 MB (but please check the docs for the version you're
> running, as these values change). If you run like that it won't cache
> much of anything, at least from what I've noticed. I cranked up the
> numbers and the hit rate increased significantly within a few hours
> (18% to 49% at this moment). I'm not an expert with regard to 'sane
> and appropriate' settings, but my current experiment is:
>
> cache_dir ufs /cache 307200 16 256

Try aufs (best on Linux) or diskd (best on BSD).

> maximum_object_size 65536 KB

aka, 64 MB right? ;) If you are after multimedia bits, the current traffic documented is up to 250 MB object size.

> cache_mem 3008 MB
>
> Seems to be working properly at the moment. Squid is kind of a living
> thing though; I'll see what happens over the next few days.
>
> :-)
>
> Maximum Size: 314572800 KB
> Current Size: 1182386 KB
> Percent Used: 0.38%
> Filemap bits in use: 32249 of 32768 (98%)
> Filesystem Space in use: 3098264/465070216 KB (1%)
> Filesystem Inodes in use: 117389/60128254 (0%)
> HTTP: 256056 Requests, 124443 Hits ( 49%)
>
> I'm slightly concerned about the filemap bits, but I've read it's
> supposed to automagically adjust itself. I'm still waiting, patiently,
> for that to happen.
>
> waitman

Amos
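Pulling the advice in this thread together, a squid.conf sketch (the values are the ones discussed above, not tuned recommendations; check them against your hardware and version):

```
# aufs (Linux) or diskd (BSD) rather than ufs; ~300 GB on-disk cache
cache_dir aufs /cache 307200 16 256

# large enough for the ~250 MB media objects mentioned above
maximum_object_size 262144 KB

# generous in-memory cache for hot objects
cache_mem 3008 MB
```

Raising maximum_object_size and cache_mem together is what moves the hit rate; a large cache_dir alone does nothing if objects are rejected for being over the size limit.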
Re: [squid-users] Zero Sized Reply with some websites
Dayo Adewunmi wrote:
> Hi
>
> I get this Zero Sized Reply sometimes behind the proxy. But when I
> access the same site without the proxy, it works fine. What gives?

Let's read it:

> ERROR

... Squid had a problem.

> The requested URL could not be retrieved

... somebody asked Squid to fetch something from the WWW. Squid could not.

> While trying to retrieve the URL:

... that something was apparently an empty piece of text.

> The following error was encountered:
> * Zero Sized Reply

... somehow a domain was found (huh?) and the HTTP request was passed on. But nothing came back before the TCP link was closed by the other end. Squid's read timeout is 15 minutes. Server connection-unused limits may be smaller.

> Squid did not receive any data for this request.

... the (rather spartan in this case) explanation of what the error message actually means: Squid did not get anything from the web server to fulfill the request.

> (squid/2.6.STABLE18)

... the version of Squid doing all the above. A bit old now, but never mind; the problem was not with Squid.

Amos
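For reference, the read timeout Amos mentions is a squid.conf directive, shown here at what I believe is its default (a sketch; confirm the default in your version's documentation):

```
# How long Squid waits on a server connection with no data arriving
# before giving up on the request
read_timeout 15 minutes
```

Lowering it makes Squid fail faster on dead servers; it does not cause Zero Sized Reply, which is the server closing the connection without sending anything.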
Re: [squid-users] Squid shuts down often
Dylan Palmboom wrote:
> Hi
>
> We have Squid running fine, but every now and then the network
> connection to Squid goes down. Then if you try to ping the host, it
> fails. After the server is restarted, it works again until the next
> time. I have included a section from cache.log from when Squid stopped
> working. I see near the bottom of the log it says it's shutting down...
>
> CPU Usage: 11.009 seconds = 5.812 user + 5.196 sys
> Maximum Resident Size: 0 KB
> Page faults with physical i/o: 3
> Memory usage for squid via mallinfo():
>         total space in arena:   14872 KB
>         Ordinary blocks:        14684 KB    285 blks
>         Small blocks:               0 KB      0 blks
>         Holding blocks:          1108 KB      1 blks
>         Free Small blocks:          0 KB
>         Free Ordinary blocks:     187 KB
>         Total in use:           15792 KB 99%
>         Total free:               187 KB 1%
> 2009/07/31 10:20:18| logfileClose: closing log /var/log/squid/store.log
> 2009/07/31 10:20:18| logfileClose: closing log /var/log/squid/access.log
> 2009/07/31 10:20:18| Squid Cache (Version 2.7.STABLE5): Exiting normally.
> 2009/07/31 10:20:21| Starting Squid Cache version 2.7.STABLE5 for i686-pc-linux-gnu...
> 2009/07/31 10:20:21| Process ID 14362
> 2009/07/31 10:20:21| With 4096 file descriptors available
> 2009/07/31 10:20:21| Using epoll for the IO loop
> 2009/07/31 10:20:21| DNS Socket created at 0.0.0.0, port 44027, FD 6
> 2009/07/31 10:20:21| Adding domain eva.dbn from /etc/resolv.conf
> 2009/07/31 10:20:21| Adding nameserver 192.168.1.10 from /etc/resolv.conf
> 2009/07/31 10:20:21| User-Agent logging is disabled.
> 2009/07/31 10:20:21| Referer logging is disabled.
> 2009/07/31 10:20:21| logfileOpen: opening log /var/log/squid/access.log
> 2009/07/31 10:20:21| Unlinkd pipe opened on FD 11
> 2009/07/31 10:20:21| Swap maxSize 102400 + 8192 KB, estimated 0 objects
> 2009/07/31 10:20:21| Target number of buckets: 425
> 2009/07/31 10:20:21| Using 8192 Store buckets
> 2009/07/31 10:20:21| Max Mem size: 8192 KB
> 2009/07/31 10:20:21| Max Swap size: 102400 KB
> 2009/07/31 10:20:21| Local cache digest enabled; rebuild/rewrite every 3600/3600 sec
> 2009/07/31 10:20:21| logfileOpen: opening log /var/log/squid/store.log
> 2009/07/31 10:20:21| Rebuilding storage in /var/cache/squid (CLEAN)
> 2009/07/31 10:20:21| Using Least Load store dir selection
> 2009/07/31 10:20:21| Set Current Directory to /var/cache/squid
> 2009/07/31 10:20:21| Loaded Icons.
> 2009/07/31 10:20:22| Accepting transparently proxied HTTP connections at 192.168.1.11, port 3128, FD 13.
> 2009/07/31 10:20:22| Accepting ICP messages at 0.0.0.0, port 3130, FD 14.
> 2009/07/31 10:20:22| HTCP Disabled.
> 2009/07/31 10:20:22| Accepting SNMP messages on port 3401, FD 15.
> 2009/07/31 10:20:22| WCCP Disabled.
> 2009/07/31 10:20:22| Pinger socket opened on FD 16
> 2009/07/31 10:20:22| Ready to serve requests.
> 2009/07/31 10:20:22| Store rebuilding is 57.4% complete
> 2009/07/31 10:20:22| Done reading /var/cache/squid swaplog (7138 entries)
> 2009/07/31 10:20:22| Finished rebuilding storage from disk.
> 2009/07/31 10:20:22|  7138 Entries scanned
> 2009/07/31 10:20:22|  0 Invalid entries.
> 2009/07/31 10:20:22|  0 With invalid flags.
> 2009/07/31 10:20:22|  7138 Objects loaded.
> 2009/07/31 10:20:22|  0 Objects expired.
> 2009/07/31 10:20:22|  0 Objects cancelled.
> 2009/07/31 10:20:22|  0 Duplicate URLs purged.
> 2009/07/31 10:20:22|  0 Swapfile clashes avoided.
> 2009/07/31 10:20:22|  Took 0.3 seconds (22618.5 objects/sec).
> 2009/07/31 10:20:22| Beginning Validation Procedure
> 2009/07/31 10:20:22| Completed Validation Procedure
> 2009/07/31 10:20:22| Validated 7138 Entries
> 2009/07/31 10:20:22| store_swap_size = 92156k
> 2009/07/31 10:20:22| storeLateRelease: released 0 objects
> 2009/07/31 10:21:33| Preparing for shutdown after 0 requests
> 2009/07/31 10:21:33| Waiting 30 seconds for active connections to finish
> 2009/07/31 10:21:33| FD 13 Closing HTTP connection
> 2009/07/31 10:21:33| Closing Pinger socket on FD 16
> 2009/07/31 10:21:33| Shutting down...
> 2009/07/31 10:21:33| FD 14 Closing ICP connection
> 2009/07/31 10:21:33| FD 15 Closing SNMP socket
> 2009/07/31 10:21:33| Closing unlinkd pipe on FD 11
> 2009/07/31 10:21:33| storeDirWriteCleanLogs: Starting...
> 2009/07/31 10:21:33|   Finished. Wrote 7138 entries.
> 2009/07/31 10:21:33|   Took 0.0 seconds (4790604.0 entries/sec).
>
> Please could someone tell me if they know what is causing this.

Someone has told Squid to restart. Running squid -k restart will do that. So will sending the process ID the restart signal. So will passing the CacheMgr control system a request for the restart object.

A manual restart is the only way I know of that will cause that "Exiting normally." message.

Amos
Re: [squid-users] Squid 3 does not stop
Soporte Técnico @lemNet wrote:
> I recently installed Squid 3.0 on a FreeBSD 7.2 box. When I type (over
> ssh, with su):
>
> squid -k shutdown
> or
> /usr/local/sbin/squid -k shutdown
>
> this error appears and Squid does not shut down:
>
> squid: ERROR: Could not send signal 15 to process 1001: (1) Operation not permitted
>
> Any idea?

There is some security setting in your operating system that prevents you, and the programs you run, from sending kill signals to other programs.

Try running that when logged in as an admin or through sudo.

Amos
[squid-users] videocache
Does anyone know how to set up squid.conf to cache video? And some online game patchers, e.g. Ragnarok Online, Rohan Online, etc.?
Re: [squid-users] [new] videocache question
- Original Message -
From: mirz...@gmail.com
To: Squid Users <squid-users@squid-cache.org>
Date: 01.08.2009 15:40
Subject: [squid-users] [new] videocache question

> anyone know how to set squid.conf to cache video ?

An example configuration for YouTube:
http://wiki.squid-cache.org/ConfigExamples/DynamicContent/YouTube
Re: [squid-users] how much RAM for squid proxy
Originally Posted by linuxlover.chaitanya:
> I have no idea why people are pushing huge RAM for Squid. If you are
> not running a graphical environment then 2 GB ought to be enough for
> you. But it will also depend on how many clients you are going to
> serve. Earlier I had Squid running on an old P3 machine with 512 MB of
> RAM, serving about 40 clients without any issues or bandwidth or speed
> lag. Now I have upgraded the machine to a Pentium dual core with 1 GB
> of RAM. And obviously I do not run GNOME or KDE, but it keeps more than
> 50% of RAM free.

I suppose it depends on how Squid is used. If Squid is used as a simple proxy, just relaying connections without storing much of the data for caching, does it need much disk space? I mean, if you need:

> Thus, a system with 512 MB of RAM can support a 16-GB disk cache. Your
> mileage may vary, of course.

then, with most servers in the region of 160 GB+ of disk, that would be an extraordinary amount of RAM needed.

Can Squid limit bandwidth for each user? The line I have can support up to 100 Mbit/s, so that's a fair whack. Each user might only need 1 Mbit/s.

--
View this message in context: http://www.nabble.com/how-much-RAM-for-squid-proxy-tp24698323p24770942.html
Sent from the Squid - Users mailing list archive at Nabble.com.
[squid-users] Re: [new] videocache question
On 01.08.2009, Jeff Pang wrote:
> An example configuration for YouTube:
> http://wiki.squid-cache.org/ConfigExamples/DynamicContent/YouTube

I tried this some time ago with squid-3; it didn't work.
Re: [squid-users] how much RAM for squid proxy
On 01.08.09 10:17, qwertyjjj wrote:
>> I have no idea why people are pushing huge RAM for Squid. If you are
>> not running a graphical environment then 2 GB ought to be enough for
>> you.

If you are running a graphical environment on a machine with Squid, something's broken there. And as for huge RAM: you can nearly never have too much of it. You need RAM for Squid's memory cache, buffers, disk cache, etc.

> I suppose it depends on how squid is used.

Precisely.

> Can squid limit bandwidth for each user?

There are delay_pools options available.

--
Matus UHLAR - fantomas, uh...@fantomas.sk ; http://www.fantomas.sk/
Warning: I wish NOT to receive e-mail advertising to this address.
Varovanie: na tuto adresu chcem NEDOSTAVAT akukolvek reklamnu postu.
Fighting for peace is like fucking for virginity...
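The delay_pools mechanism Matus mentions can express a per-client cap. A minimal squid.conf sketch limiting each client IP to roughly 1 Mbit/s (128 KB/s) on an otherwise unrestricted line (values illustrative; check the delay_pools documentation for your version):

```
# One pool, class 2: an aggregate bucket plus one bucket per client IP
delay_pools 1
delay_class 1 2

# Aggregate unlimited (-1/-1); each host refills at 131072 bytes/sec
delay_parameters 1 -1/-1 131072/131072
delay_access 1 allow all
```

Class 2 pools bucket by client IP address, which is usually what "per user" means on a LAN where each user has their own machine.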
Re: [squid-users] Re: [new] videocache question
I'm using 2.x (latest). Can anyone help?

On Sun, Aug 2, 2009 at 3:30 AM, Heinz Diehl h...@fancy-poultry.org wrote:
> On 01.08.2009, Jeff Pang wrote:
>> An example configuration for YouTube:
>> http://wiki.squid-cache.org/ConfigExamples/DynamicContent/YouTube
> I tried this some time ago with squid-3, it didn't work.
Re: [squid-users] how much RAM for squid proxy
What's wrong with a GUI like CentOS?

Can user logons be used with Squid? E.g. give someone a logon like their email address? Can the delay_pools be used with email addresses?
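User logons are done with proxy authentication, and the usernames can be any string, including email-style ones. A hypothetical squid.conf sketch using the NCSA basic-auth helper (the helper path and password file location are assumptions for a typical install):

```
# Basic authentication against an htpasswd-style password file
auth_param basic program /usr/lib/squid/ncsa_auth /etc/squid/passwd
auth_param basic realm Squid proxy
auth_param basic credentialsttl 2 hours

# Only authenticated users may browse
acl authed proxy_auth REQUIRED
http_access allow authed
http_access deny all
```

Note that the common class-2 delay pools bucket by client IP, not by username; whether your Squid version supports per-user pools is something to verify in its delay_pools documentation before planning per-email-address limits.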
Re: Aw: [squid-users] Running two squid3 process, Why?
Jeff Pang wrote:
> - Original Message -
> From: Sachin Malave <sachinmal...@gmail.com>
> To: squid-users@squid-cache.org
> Date: 01.08.2009 14:37
> Subject: [squid-users] Running two squid3 process, Why?
>
>> I am using squid3. After starting it, Squid creates two processes:
>>
>> $ sudo squid3
>> $ ps -A
>>
>> It displays something like:
>>
>> 6325 ?  00:00:00 squid3
>> 6327 ?  00:00:25 squid3
>> 6328 ?  00:00:00 unlinkd
>> 6367 ?  ...
>>
>> Can anybody tell me why it has created two processes?
>
> Well, from my look at the sources years ago: one process is the parent,
> which does nothing but monitor the child; the other is the child, which
> does the actual HTTP request handling. When the child dies due to some
> exception, the parent will re-launch a new child.
>
> Jeff.

Correct. And the third one (unlinkd) handles the disk deletions.

Amos
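Jeff's description is the classic fork-and-supervise pattern. A minimal Python sketch of the idea (Squid itself does this in C; this is only an illustration of the pattern, not Squid's code):

```python
import os

def supervise(run_child, max_restarts=3):
    """Fork a worker; relaunch it if it dies abnormally, up to max_restarts."""
    restarts = 0
    while restarts <= max_restarts:
        pid = os.fork()
        if pid == 0:
            # Child: do the actual work (for Squid, serve HTTP requests).
            run_child()
            os._exit(0)                    # clean exit: supervision ends
        # Parent: do nothing but wait on the child.
        _, status = os.waitpid(pid, 0)
        if os.WIFEXITED(status) and os.WEXITSTATUS(status) == 0:
            return restarts                # child shut down normally
        restarts += 1                      # child died: re-launch it
    return restarts
```

A well-behaved worker gives supervise(worker) == 0; one that keeps crashing is re-forked until the restart budget runs out, which is why ps shows two squid3 processes during normal operation.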
Re: [squid-users] Re: [new] videocache question
On Sun, Aug 2, 2009 at 3:30 AM, Heinz Diehl h...@fancy-poultry.org wrote:
> On 01.08.2009, Jeff Pang wrote:
>> An example configuration for YouTube:
>> http://wiki.squid-cache.org/ConfigExamples/DynamicContent/YouTube
> I tried this some time ago with squid-3, it didn't work.

Yes, Squid-3 does not have the needed store-URL feature. We are still waiting on someone to be interested enough to port it over.

░▒▓ ɹɐzǝupɐɥʞ ɐzɹıɯ ▓▒░ wrote:
> im using 2.x ( latest ) anyone can help ?

Are we to assume that by "latest 2.x" you mean 2.7, or merely the latest available in your unknown operating system? Hint: 'latest 2.x' for RedHat and several others is 2.5. Obsolete many years ago.

If you did mean 2.7, then the example there and in the related Discussion page is the best you are going to get right now. They even provide a useful helper script to do the URL mapping.

Amos
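For Squid 2.7, the store-URL feature Amos refers to is wired up with directives along these lines (a sketch; the helper script path and the ACL pattern are assumptions, see the wiki example for a working helper):

```
# Send matching YouTube-style requests through a store-URL rewriter
acl store_rewrite_list urlpath_regex \/(get_video\?|videodownload\?)
storeurl_access allow store_rewrite_list
storeurl_access deny all

# Helper that maps varying CDN URLs onto one canonical cache key
storeurl_rewrite_program /etc/squid/storeurl.pl
storeurl_rewrite_children 5
```

The helper collapses the many per-server URLs a video is fetched from into one stored key, which is why the feature is essential for caching this kind of content and why it doesn't work on Squid-3.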
[squid-users] Does squid support multithreading ?
I have a multicore processor here, and I want to run squid3 on this platform. Does Squid support multithreading? Will it improve the performance?

--
Regards
Sachin Malave
Re: [squid-users] Donate section not update
The donations were always few and far between. I'm not sure if there have been any real active donations in the last twelve months; I think only Duane knows.

Adrian

2009/8/2 Juan C. Crespo R. jcre...@ifxnw.com.ve:
> Guys
>
> Checking the site I found there are no donations since December 2008.
> Or is it an error on the page? Because no donations makes it look like
> no one cares about this project, and that cannot be right, because I
> see a lot of people complaining and asking for features and error
> resolutions; I include myself in this group.
>
> Regards.