Re: [squid-users] Rewrite html
I've heard that some people use Apache for this sort of thing, e.g. for adding advertisements to the head of pages on free web-hosting services. I guess it could be done using mod_proxy in combination with another module. Hope this helps. On 5/19/07, Armin ranjbar <[EMAIL PROTECTED]> wrote: On Sat, 19 May 2007 17:06:09 +0800 Adrian Chadd <[EMAIL PROTECTED]> wrote: > On Sat, May 19, 2007, Armin ranjbar wrote: > > dear all, > > > > I'm looking for some kind of tool for Squid to rewrite the HTML pages that Squid retrieves in transparent mode, say that I need to add between . > > > > thanks! :) > > Squid can't do it natively yet. The more testers we get for Squid-3, the quicker we can get it released and the quicker you'll be able to do the kind of HTML page rewriting everyone wants. > > Adrian > > Do you know any addon, wrapper or redirector that can do it? Something like SquidGuard, etc.? And finally, do you know any other stable proxy software that is able to do this kind of operation? -- Tomorrow, you can be anywhere. -- Mehdi Sarmadi
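As a sketch of the Apache route suggested above: mod_proxy together with the third-party mod_proxy_html module can rewrite HTML as it is proxied. The hostnames below are placeholders, and the exact directive set should be checked against the mod_proxy_html version you install:

```apache
# reverse-proxy the backend and rewrite links in the HTML it returns
# (hostnames are examples)
ProxyPass        /  http://backend.example.com/
ProxyPassReverse /  http://backend.example.com/
SetOutputFilter  proxy-html
ProxyHTMLURLMap  http://backend.example.com  http://www.example.com
```

Note that mod_proxy_html rewrites URLs inside markup; inserting arbitrary HTML (such as an advertisement block) is closer to what mod_substitute or a custom output filter does.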
Re: [squid-users] Bandwidth by file extension
As a reference, check: http://www.visolve.com/squid/squid30/delaypools.php#delay_parameters So just change the pool #2 delay_parameters line to the 1 Mbps values. On 4/5/07, Eduardo Luís <[EMAIL PROTECTED]> wrote: Thanks very much for your example. But I would like to understand those numbers: > delay_parameters 3 25000/25000 25000/25000 this line sets a 25 KByte/s individual and a 25 KByte/s aggregate limit > delay_parameters 2 260000/260000 260000/260000 this line sets a 260 KByte/s individual and a 260 KByte/s aggregate limit (meaning no effective limit on a 2 Mbps link: 260 KByte/s * 8 = 2080 kbit/s, roughly 2 Mbps). For 1 Mbps (128 KByte/s) change that line to: delay_parameters 2 128000/128000 128000/128000 > delay_parameters 1 35000/35000 35000/35000 this line sets a 35 KByte/s individual and a 35 KByte/s aggregate limit. My link is 1 Mbps. I want to set a rule like yours, but for my 1 Mbps link and not your 2 Mbps link. Thanks Eduardo Luís 2007/4/5, Mehdi Sarmadi <[EMAIL PROTECTED]>: > Hi > > This is an example that works. > Facts: I have 2 Mbps of bandwidth in total. I want compressed and binary extensions limited to 25 KByte/s in total, multimedia files to at most 35 KByte/s, and no limit on other content.
> > Hope this helps > Regards > > === squid.conf > > > # Source ACLs > acl all src 0.0.0.0/0.0.0.0 > acl mylan src 192.168.7.0/24 > > > # Multimedia file extensions, -i : case insensitive > acl mm_ext urlpath_regex -i \.mp3$ \.avi$ \.mov$ \.mpeg$ \.mpg$ > \.divx$ \.mp4$ \.xvid$ \.axf$ \.3gp$ \.img2$ \.wma$ \.wmv$ > acl compressed urlpath_regex -i \.rar$ \.zip$ > acl executablebinary urlpath_regex \.exe$ \.msi$ \.bin$ \.iso$ > > > # My LAN BW limitation > delay_pools 3 > delay_class 1 2 > delay_class 2 2 > delay_class 3 2 > delay_parameters 3 25000/25000 25000/25000 > delay_parameters 2 260000/260000 260000/260000 > delay_parameters 1 35000/35000 35000/35000 > delay_initial_bucket_level 50 > delay_access 3 allow mylan executablebinary > delay_access 3 allow mylan compressed > delay_access 3 deny all > delay_access 2 allow mylan !mm_ext !executablebinary !compressed > delay_access 2 deny all > delay_access 1 allow mylan mm_ext > delay_access 1 deny all > > > === > > On 4/5/07, Eduardo Luís <[EMAIL PROTECTED]> wrote: > > Hi, > > > > Is there any way I can limit bandwidth for users that are saturating > > our internet access with many porn movie downloads? > > I don't want to block them, just give little bandwidth to those > > downloads like AVI, ASF, WMV, etc... > > > > Thanks. > > > -- > Mehdi Sarmadi -- Mehdi Sarmadi
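The unit conversion behind those numbers, as a quick sketch: delay_parameters values are bytes per second, so a link speed in kbit/s is divided by 8.

```shell
# delay_parameters values are bytes/second, not bits:
# multiply the link's kbit/s by 1000 and divide by 8
kbps_to_bytes() { echo $(( $1 * 1000 / 8 )); }
kbps_to_bytes 1000   # 1 Mbps (1000 kbit/s) -> 125000 bytes/s
kbps_to_bytes 1024   # 1 Mbps counted as 1024 kbit/s -> 128000, the value used in this thread
```

The thread's 128000 figure comes from counting 1 Mbps as 1024 kbit/s; either value is close enough for a delay pool.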
Re: [squid-users] Bandwidth by file extension
Hi. This is an example that works. Facts: I have 2 Mbps of bandwidth in total. I want compressed and binary extensions limited to 25 KByte/s in total, multimedia files to at most 35 KByte/s, and no limit on other content. Hope this helps. Regards === squid.conf # Source ACLs acl all src 0.0.0.0/0.0.0.0 acl mylan src 192.168.7.0/24 # Multimedia file extensions, -i : case insensitive acl mm_ext urlpath_regex -i \.mp3$ \.avi$ \.mov$ \.mpeg$ \.mpg$ \.divx$ \.mp4$ \.xvid$ \.axf$ \.3gp$ \.img2$ \.wma$ \.wmv$ acl compressed urlpath_regex -i \.rar$ \.zip$ acl executablebinary urlpath_regex \.exe$ \.msi$ \.bin$ \.iso$ # My LAN BW limitation delay_pools 3 delay_class 1 2 delay_class 2 2 delay_class 3 2 delay_parameters 3 25000/25000 25000/25000 delay_parameters 2 260000/260000 260000/260000 delay_parameters 1 35000/35000 35000/35000 delay_initial_bucket_level 50 delay_access 3 allow mylan executablebinary delay_access 3 allow mylan compressed delay_access 3 deny all delay_access 2 allow mylan !mm_ext !executablebinary !compressed delay_access 2 deny all delay_access 1 allow mylan mm_ext delay_access 1 deny all === On 4/5/07, Eduardo Luís <[EMAIL PROTECTED]> wrote: Hi, Is there any way I can limit bandwidth for users that are saturating our internet access with many porn movie downloads? I don't want to block them, just give little bandwidth to those downloads like AVI, ASF, WMV, etc... Thanks. -- Mehdi Sarmadi
Re: [squid-users] ANY WAY TO SEE IP ADDRESS WITH PROXY USER NAME
Take a look at access.log with log_mime_hdrs enabled. In fact, each standard access.log entry already records both the client IP address (field 3) and the authenticated username (field 8), so the default log may be enough. On 3/25/07, Abhishek Chavan <[EMAIL PROTECTED]> wrote: I am using Squid as an HTTP proxy to give internet access. People get access by using a username and password given to them. Is there any way I can trace which username/password is being used from which IP address? -- Mehdi Sarmadi
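For example, the user/IP pairing can be pulled straight out of the native log format with awk; the sample line below is taken from another thread in this archive:

```shell
# one native access.log line; fields 3 and 8 are the client IP
# and the authenticated username
line='1148533570.809 20145 192.168.7.54 TCP_MISS/206 346834 GET http://y.mrbass.org/ubcd34-full.zip arpc DIRECT/70.84.90.170 application/zip'
printf '%s\n' "$line" | awk '{print $3, $8}'
```

Run over the whole file (`awk '{print $3, $8}' access.log | sort -u`) this lists every IP/username pairing seen.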
Re: [squid-users] Centralizing Squid
But I guess the whole configuration need not be the same on all the proxy servers; Nadeem only wants to update the ACLs. On 2/25/07, Adrian Chadd <[EMAIL PROTECTED]> wrote: On Sat, Feb 24, 2007, Nadeem Semaan wrote: > I have 10 proxy servers on a WAN, and when I want to, for example, block a site, it's a waste of time for me to go to each one and add the name of that site. I have everything in external lists. Is there a way to make a change on one of my proxy servers (and call it the central server) and have it upload that new file to all the other servers? You could schedule an rsync of the configuration file/directory, add in your local modifications, and then squid -k reconfigure to load the configuration changes. It's not included as part of the core Squid distribution, but you can build it yourself using freely available UNIX utilities. (And I've done just that on more than one occasion.) Adrian -- Mehdi Sarmadi
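A sketch of Adrian's rsync-and-reconfigure idea. The host names and the ACL directory are assumptions, and the `echo` makes this a dry run that only prints the commands; remove the echos to actually push:

```shell
# dry-run push of ACL files from the central server to each proxy
# (hostnames and path are hypothetical; drop the echos to run for real)
HOSTS="proxy1 proxy2 proxy3"
ACL_DIR=/etc/squid/acls
for h in $HOSTS; do
    echo rsync -az "$ACL_DIR/" "$h:$ACL_DIR/"
    echo ssh "$h" squid -k reconfigure
done
```

Scheduled from cron on the central box, this keeps all ten proxies in step with one edit.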
Re: [squid-users] squid throughput limits?
I thought Squid shouldn't be very CPU-intensive unless there are many ACLs or filtering rules in the configuration. On 11/23/06, Wojciech Puchar <[EMAIL PROTECTED]> wrote: I recently installed Squid on a P4 machine with FreeBSD, serving about 1000 users with diskd, 3 spool dirs, etc., and it uses 30% CPU at peak hours. Disk load is less than 30%. Is it possible to run Squid for a caching network of >3000 people (it would be >90% CPU)? Squid can't divide work across many processors/cores. And why so much CPU power? -- Mehdi Sarmadi
Re: [squid-users] Cache only to RAM
In that case, how many data objects would get cached there? I mean, how much data will remain in memory until it gets replaced (without considering expiry age)? I hope I have expressed myself clearly. On 9/22/06, Adrian Chadd <[EMAIL PROTECTED]> wrote: On Fri, Sep 22, 2006, Gavin White wrote: > Hello, > > I would like to set up a squid box which caches only to RAM. I do not > want it to cache to disk at all. > > Does anyone know how I can configure this? I have tried setting > cache_dir to none, and also setting it to zero size, but squid will > not start. cache_dir null / :) (And compile in the null storedir type.) Adrian > > > Thanks, > > Gavin -- Mehdi Sarmadi
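A minimal memory-only squid.conf sketch along Adrian's lines, assuming Squid was built with the null store (--enable-storeio=null); the sizes are examples to tune:

```
# no disk cache at all; hold objects in RAM only
cache_dir null /tmp
cache_mem 256 MB
maximum_object_size_in_memory 512 KB
```

With this, the answer to the question above is roughly: objects accumulate until cache_mem is full, after which the memory replacement policy starts evicting them.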
Re: [squid-users] does squid write to access.log after a user leaves a webpage, or
Squid can only log requested URLs. Some log analyzers treat the first visit to a site as the entry page and, if no further log entry starting with that site's address is seen within a time window, treat it as the exit page; Webalizer does this, for example. Hope this helps. On 9/15/06, Hung Ng <[EMAIL PROTECTED]> wrote: when the user first visits the webpage. Thank you. -- Mehdi Sarmadi http://msarmadi.googlepages.com
Re: [squid-users] acl rep_header for Content_Length
Dear, this combination worked for me (in testing): acl lessthan3MB rep_header Content-Length ^[12]?[0-9]{0,6}$ http_reply_access allow lessthan3MB http_reply_access deny all I guess the rep_header acl type is only effective when used with http_reply_access and not with the others. Hope it helps. Regards -- Mehdi Sarmadi On 8/31/06, lopl <[EMAIL PROTECTED]> wrote: Dear, I found an acl (rep_header) useful to prevent sending big files, for example 3 MB, to the icap_server. This config line was added: acl test rep_header Content-Length ^(1|2)?+[0-9]{0-5}$ icap_access s1 allow test icap_access s1 deny all but unfortunately all downloaded files were denied. Where is the error? Thanks for your help -- Mehdi Sarmadi
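A quick check of that Content-Length pattern with grep -E. Here the `(1|2)?+` is written as a plain `[12]?` (possessive quantifiers are not POSIX), and the interval uses a comma, `{0,6}`, which is the fix over the original `{0-5}`; the result matches any decimal value up to 2,999,999 bytes, i.e. below ~3 MB:

```shell
# exercise the acl regex against sample Content-Length values
re='^[12]?[0-9]{0,6}$'
echo 2999999 | grep -Eq "$re" && echo "2999999: matched (below 3 MB)"
echo 3000000 | grep -Eq "$re" || echo "3000000: not matched"
```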
Re: [squid-users] acl for file size
No. Actually there is no straightforward way to tell Squid to prevent some IPs from downloading files bigger than X bytes. Squid cannot know how big a reply will be in advance; you can only learn the size from the server's reply headers, using a rep_header ACL on "Content-Length" or perhaps "Range" (please correct me if I'm wrong; the reply_body_max_size directive can also cap reply sizes, though its exact behaviour depends on your version). For preventing people from downloading BIG files, I prefer to put requests for known multimedia, disk-image, executable/binary, and compressed-archive extensions into a fixed-rate delay_pool during office hours. That has worked well for me. Any other clue? On 8/26/06, lopl <[EMAIL PROTECTED]> wrote: Hi, is there any acl to prevent downloading files bigger than X bytes? Best, Pezhman -- Mehdi Sarmadi
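For completeness, a hedged sketch of the reply_body_max_size route. The directive exists in Squid 2.x, but its exact syntax (plain bytes vs. units, how the ACL list is attached) varies across versions, so verify against your release's squid.conf.default before using it:

```
# sketch: cap reply bodies at 3 MB (3145728 bytes) for one client group
acl sizecapped src 192.168.7.0/24
reply_body_max_size 3145728 allow sizecapped
```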
Re: [squid-users] logging delay_pools
About logging I'm not sure, but there is something you could use to watch requests in real time: Alex Samorukov's SqStat, which I've modified to show the delay pools. Take it from: http://msarmadi.googlepages.com/sqstat.php.zip On 7/15/06, Leonardo Rodrigues Magalhães <[EMAIL PROTECTED]> wrote: Hello guys, is it possible to have some kind of logging of the delay_pools actions? Let me explain... I have an internet connection that is not very good; it has some slow moments, and I found some of those slowdowns were being caused by users downloading stuff (videos, movie trailers, music, etc.). For that, I have set up some delay_pools, which I can see are OK through cachemgr.cgi. I would like to have some delay_pools logging so I could try to identify users that are being slowed down by the delay_pools, and maybe some legitimate accesses that are slow because of my internet connection problems instead. Is that possible? -- Sincerely, Leonardo Rodrigues Solutti Tecnologia http://www.solutti.com.br My SPAM trap, do NOT email it: [EMAIL PROTECTED] -- Mehdi Sarmadi
Re: [squid-users] delay access to cached objects
I guess it could be done by chaining two Squids: one cache-only, and the other delay_pools-only (with caching disabled). On 6/25/06, Santosh Rani <[EMAIL PROTECTED]> wrote: Hello, is it possible? I want objects from the cache not to be served instantly. Your help is needed, please. Regards -- Mehdi Sarmadi
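A sketch of that chain's front instance; the port and options are assumptions (cache_peer is the directive that forwards everything to the caching parent):

```
# front instance: delay_pools only, no cache of its own,
# forwards every request to the caching instance on port 3129
cache_dir null /tmp
cache_peer 127.0.0.1 parent 3129 0 no-query default
never_direct allow all
```

Cache hits then traverse the front instance's delay pools on their way back to the client, which is what delays the cached objects.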
Re: [squid-users] HTTPS and delay_pools
The delay_pool and ACL directives that I've used are: === # Multimedia file extensions, -i : case insensitive acl mm_ext urlpath_regex -i \.mp3$ \.avi$ \.mov$ \.mpeg$ \.mpg$ \.divx$ \.mp4$ \.xvid$ \.axf$ \.3gp$ \.img2$ \.wma$ \.wmv$ acl compressed urlpath_regex -i \.rar$ \.zip$ acl executablebinary urlpath_regex \.exe$ \.msi$ \.bin$ \.iso$ #delay_pools delay_pools 3 delay_class 1 2 delay_class 2 2 delay_class 3 2 delay_parameters 3 25000/25000 25000/25000 delay_parameters 2 260000/260000 260000/260000 delay_parameters 1 35000/35000 35000/35000 delay_initial_bucket_level 50 delay_access 3 allow executablebinary delay_access 3 allow compressed delay_access 3 deny all delay_access 2 allow needbw delay_access 2 allow !mm_ext !executablebinary !compressed delay_access 2 deny all delay_access 1 allow mm_ext delay_access 1 deny all === I hoped that URLs with multimedia, binary, etc. extensions would fall into delay pools 1 and 3, but all of them fall into delay pool number 2. I guess the problem is that urlpath_regex cannot match anything in an HTTPS session, isn't it? If yes, any clue? On 6/25/06, Henrik Nordstrom <[EMAIL PROTECTED]> wrote: On Sun 2006-06-25 at 01:41 +0330, Mehdi Sarmadi wrote: > Could https requests/replies fall in delay_pools? Yes. > If yes, How to? They should by default, unless you use some incompatible ACLs in your delay_access... As with all the other protocols, the delay pools only apply to internet->client traffic, not client->internet traffic. Regards Henrik -- Mehdi Sarmadi
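The urlpath_regex suspicion above can be illustrated directly: for HTTPS, Squid only sees the CONNECT request-URI, which is the host:port form with no path and no file extension, so extension patterns can never match it.

```shell
# the mm_ext-style pattern against a CONNECT URI and a plain-HTTP path
mm='\.mp3$|\.avi$|\.wmv$'
echo 'media.example.com:443' | grep -Eiq "$mm" || echo 'CONNECT URI: no match'
echo '/files/clip.avi'       | grep -Eiq "$mm" && echo 'plain-HTTP path: match'
```

So the HTTPS traffic falls through `!mm_ext !executablebinary !compressed` into pool 2; only ACLs that work on the CONNECT form (source address, destination host/port) can sort HTTPS into different pools.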
[squid-users] HTTPS and delay_pools
Could HTTPS requests/replies fall into delay_pools? If yes, how? -- Mehdi Sarmadi
Re: [squid-users] Best non-graphical Squid log analysis tool for Linux shell?
Have you tried Calamaris? It can also email each analysis result in plain text. On 6/21/06, Strandell, Ralf <[EMAIL PROTECTED]> wrote: Hi, what Squid logfile analysis tool would you recommend that 1) runs from the Linux command line 2) produces clear, human-readable text reports (no HTML, no graphics) 3) can list the most used domains (causing the most traffic, measured in MB/day) 4) can list the most used single objects (causing the most traffic, measured in MB/day) 5) can produce traffic statistics (history) (MB by hour or MB by weekday)? Recommendations for cache optimization tools and helpers would be welcome, too. Thanks -- Mehdi Sarmadi http://msarmadi.googlepages.com
Re: [squid-users] Is there a way to tell squid on certain domains to not proxy for. For example in transparent mode ho
Let's look at it together: you have configured a redirect in your firewall (e.g. using Netfilter/iptables) or in a router, which sends traffic with port 80 and a host outside the LAN as its destination to the Squid process somewhere in your LAN. So what should happen if we want some website not to be proxied by Squid? When a client requests that website, the HTTP request still reaches Squid, and Squid cannot then say "I don't want this one, hand it to hotmail.com directly"; once the traffic has been redirected to it, proxying the request is all Squid can do. What you describe therefore has to be done in the firewall (Netfilter/iptables), or whatever it is that redirects the HTTP traffic to Squid: exempt those destinations from the redirect rule. Please correct me if I'm wrong. On 6/14/06, Keith Owen <[EMAIL PROTECTED]> wrote: Is there a way to tell Squid not to proxy certain domains? For example, in transparent mode hotmail.com will hang at login; instead Squid should just log the packet and send it untouched on its way. Is that possible? -- Mehdi Sarmadi http://msarmadi.googlepages.com
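In iptables terms, the exemption is just an ACCEPT rule inserted before the REDIRECT in the nat table. This is a configuration sketch: the destination network below is a placeholder, since in practice you would list the site's real address ranges:

```
# exempt a destination BEFORE the redirect rule (addresses are examples)
iptables -t nat -A PREROUTING -p tcp --dport 80 -d 203.0.113.0/24 -j ACCEPT
# everything else on port 80 goes to Squid
iptables -t nat -A PREROUTING -p tcp --dport 80 -j REDIRECT --to-port 3128
```

Rule order matters: nat PREROUTING rules are evaluated top to bottom, so the ACCEPT must come first.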
Re: [squid-users] blocking/logging based on reply headers
I guess Squid can't do it, but I'm not sure. You'd better filter your logfile externally, e.g. when it rotates, post-process it to extract what you want using sed, awk and/or grep. Another solution would be a log analyzer that can do it for you, maybe sarg or calamaris, etc. (Newer Squid can attach ACLs to access_log lines, which would give exactly this kind of selective logging; check whether your version supports it.) On 6/13/06, Leonardo Rodrigues Magalhães <[EMAIL PROTECTED]> wrote: Mehdi Sarmadi wrote: > A well-known example would be blocking wmf reply content > > acl blocked_contdisp rep_header Content-Disposition -i \.wmf > http_reply_access deny blocked_contdisp > http_reply_access allow all > Yeah, that's what I was looking for :) Do you know if it's possible to have some ACL and have some logging based on it? I don't want to allow or deny; I just want logging. I used to do that on the Postfix MTA for finding the impact of a new rule: I can ask a rule to just LOG... Is that possible with Squid? I would like to have an acl based on the Content-Disposition reply header and have LOGS of matches only. -- Sincerely, Leonardo Rodrigues Solutti Tecnologia http://www.solutti.com.br My SPAM trap, do NOT email it: [EMAIL PROTECTED] -- Mehdi Sarmadi
Re: [squid-users] blocking based on reply headers
A well-known example would be blocking wmf reply content: acl blocked_contdisp rep_header Content-Disposition -i \.wmf http_reply_access deny blocked_contdisp http_reply_access allow all On 6/12/06, Leonardo Rodrigues Magalhães <[EMAIL PROTECTED]> wrote: Hello guys, is it possible to have ACLs based on arbitrary response headers? In my case, I would like to do some blocking on the Content-Disposition: reply header... -- Sincerely, Leonardo Rodrigues Solutti Tecnologia http://www.solutti.com.br My SPAM trap, do NOT email it: [EMAIL PROTECTED] -- Mehdi Sarmadi
Re: [squid-users] Mangling redirects in accelerator mode
I guess a simple deny page with a redirect meta tag in it could do it. On 6/11/06, Damian Birchler <[EMAIL PROTECTED]> wrote: Hi there. I have installed Squid as an accelerator for httpd. It blocks requests to internal.example.com unless they were received over SSL/TLS. Squid talks to httpd in plain text, and therefore httpd's redirect messages tell clients to connect to internal.example.com using plain HTTP, which they are not allowed to do. Is there a way to rewrite (the scheme of) the redirects that httpd generates? Thanks very much, Damian Birchler -- Mehdi Sarmadi
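The deny-page idea as a minimal fragment (the hostname is taken from the question): a custom error page whose meta tag bounces the client back to the HTTPS scheme.

```html
<!-- custom deny page: send the client back over HTTPS -->
<meta http-equiv="refresh" content="0; url=https://internal.example.com/">
```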
Re: [squid-users] Broken Upload
The link I'm using for the web service is an internet satellite link with 560 ms latency, 1 Mbps receive and 256 kbps send, plus a point-to-point duplex satellite link with 520 ms latency and 256 kbps of bandwidth. Concurrent web usage is at most 75 simultaneous sessions, 45 on average. I guess 5 to 8 minutes would be about right for a ~10 MB upload, wouldn't it? On 6/8/06, Matus UHLAR - fantomas <[EMAIL PROTECTED]> wrote: On 07.06.06 20:46, Mehdi Sarmadi wrote: > It could be right, but that value was there for the fast responses I wanted. Do you mean you want to receive a fast error message instead? > I hope the problem was this timeout value. > But, you know, this problem appeared only recently, and the > configuration you see has not changed for months. > Any other clue?! It depends on your clients' connections. If you have dial-up clients who post a lot of data (e.g. sending huge attachments through webmail), you should count how long sending 10 MB of data through a 28800 bps connection takes... 10485760/(28800/10) (10 bits per byte on an async connection) = 3640 seconds, which means an hour in the best case. -- Matus UHLAR - fantomas, [EMAIL PROTECTED] ; http://www.fantomas.sk/ Warning: I wish NOT to receive e-mail advertising to this address. M$ Win's are shit, do not use it ! -- Mehdi Sarmadi
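A sanity check of that estimate, in the same style as Matus's dial-up calculation: 10 MB over a 256 kbit/s uplink takes about five and a half minutes with zero contention, so 5 to 8 minutes under load sounds plausible.

```shell
# transfer time for a 10 MB upload over a 256 kbit/s uplink,
# ignoring latency, contention and protocol overhead
awk 'BEGIN { printf "%.0f seconds\n", 10 * 1024 * 1024 * 8 / (256 * 1000) }'
```

With 45 to 75 sessions sharing the link, the real time stretches well beyond that floor.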
Re: [squid-users] Broken Upload
For the read timeout, what is your opinion? Is 5 min. suitable? On 6/7/06, Mehdi Sarmadi <[EMAIL PROTECTED]> wrote: Dear Henrik, It could be right, but that value was there for the fast responses I wanted. I hope the problem was this timeout value. But, you know, this problem appeared only recently, and the configuration you see has not changed for months. Any other clue?! Anyway, thank you very much. P.S. "Oops! ;)" On 6/7/06, Henrik Nordstrom <[EMAIL PROTECTED]> wrote: > On Wed 2006-06-07 at 20:31 +0330, Mehdi Sarmadi wrote: > > > # timeouts > > read_timeout 1 minutes > > Are you sure you really want this? This could be causing your problem if > the server accepting the upload takes some time to process the upload.. > > Regards > Henrik -- Mehdi Sarmadi -- Mehdi Sarmadi
Re: [squid-users] Broken Upload
Dear Henrik, It could be right, but that value was there for the fast responses I wanted. I hope the problem was this timeout value. But, you know, this problem appeared only recently, and the configuration you see has not changed for months. Any other clue?! Anyway, thank you very much. P.S. "Oops! ;)" On 6/7/06, Henrik Nordstrom <[EMAIL PROTECTED]> wrote: On Wed 2006-06-07 at 20:31 +0330, Mehdi Sarmadi wrote: > # timeouts > read_timeout 1 minutes Are you sure you really want this? This could be causing your problem if the server accepting the upload takes some time to process the upload.. Regards Henrik -- Mehdi Sarmadi
Re: [squid-users] Broken Upload
Squid is (squid/2.5.STABLE13), and about the delay_pools: # ACLs # Multimedia file extensions, -i : case insensitive acl mm_ext urlpath_regex -i \.mp3$ \.avi$ \.mov$ \.mpeg$ \.mpg$ \.divx$ \.mp4$ \.xvid$ \.axf$ \.3gp$ \.img2$ \.wma$ \.wmv$ acl compressed urlpath_regex -i \.rar$ \.zip$ acl executablebinary urlpath_regex \.exe$ \.msi$ \.bin$ \.iso$ # Delay Pools delay_pools 3 delay_class 1 2 delay_class 2 2 delay_class 3 2 delay_parameters 3 3/3 3/3 delay_parameters 2 5/5 5/5 delay_parameters 1 26/26 26/26 delay_initial_bucket_level 50 delay_access 3 allow isaserver executablebinary delay_access 3 allow isaserver compressed delay_access 3 deny all delay_access 2 allow isaserver mm_ext delay_access 2 deny all delay_access 1 allow needbw delay_access 1 allow isaserver !mm_ext !executablebinary !compressed delay_access 1 deny all And about the timeout directives: # timeouts read_timeout 1 minutes client_lifetime 60 minutes # 0 for unlimited request_body_max_size 0 P.S. "Dear Henrik, oops! Sorry for replying directly to your email address." On 6/7/06, Henrik Nordstrom <[EMAIL PROTECTED]> wrote: On Wed 2006-06-07 at 11:12 +0330, Mehdi Sarmadi wrote: > I have a problem with uploads: uploads of more than 1 MB often break. Which Squid version? Check the request_body_max_size parameter in your squid.conf. Old versions of Squid default to 1MB. Regards Henrik -- Mehdi Sarmadi http://msarmadi.googlepages.com
[squid-users] Broken Upload
Dears, I have a problem with uploads: uploads of more than 1 MB often break. What could affect such usage, a configuration directive or the system hardware? Looking forward to your reply. TIA -- Mehdi Sarmadi
Re: [squid-users] Content-Length and range header
Any clue? Anyone with the same experience? TIA, Best Regards -- Mehdi Sarmadi http://msarmadi.googlepages.com/ On 5/25/06, Mehdi Sarmadi <[EMAIL PROTECTED]> wrote: My purpose is to write an acl that matches multipart downloads and also big download contents, e.g. more than 3 MB. In my opinion, these headers are what I should try to match with an acl: --- In the request header: Range: bytes=77455844-\r\n --- In the reply header: HTTP/1.1 206 Partial Content\r\n Accept-Ranges: bytes\r\n Content-Length: 88635807\r\n Content-Range: bytes 77455844-166091650/166091651\r\n --- The original access.log entries: 1148533570.809 20145 192.168.7.54 TCP_MISS/206 346834 GET http://y.mrbass.org/ubcd34-full.zip arpc DIRECT/70.84.90.170 application/zip [Host: y.mrbass.org\r\nAccept: */*\r\nReferer: http://www.mrbass.org/ubcd/\r\nUser-Agent: Mozilla/4.0 (compatible; MSIE 5.00; Windows 98)\r\nRange: bytes=77455844-\r\nPragma: no-cache\r\nCache-Control: no-cache\r\nProxy-Authorization: Basic YXJwYzoxMjM0\r\nConnection: close\r\n] [HTTP/1.1 206 Partial Content\r\nDate: Tue, 23 May 2006 19:35:23 GMT\r\nServer: Apache/1.3.33 (Debian GNU/Linux)\r\nLast-Modified: Wed, 15 Feb 2006 00:55:42 GMT\r\nETag: "218015-9e65b83-43f27c0e"\r\nAccept-Ranges: bytes\r\nContent-Length: 88635807\r\nContent-Range: bytes 77455844-166091650/166091651\r\nKeep-Alive: timeout=15, max=100\r\nConnection: Keep-Alive\r\nContent-Type: application/zip\r\n\r] 1148533571.001 20338 192.168.7.54 TCP_MISS/206 299050 GET http://y.mrbass.org/ubcd34-full.zip arpc DIRECT/70.84.90.170 application/zip [Host: y.mrbass.org\r\nAccept: */*\r\nReferer: http://www.mrbass.org/ubcd/\r\nUser-Agent: Mozilla/4.0 (compatible; MSIE 5.00; Windows 98)\r\nRange: bytes=161378158-\r\nPragma: no-cache\r\nCache-Control: no-cache\r\nProxy-Authorization: Basic YXJwYzoxMjM0\r\nConnection: close\r\n] [HTTP/1.1 206 Partial Content\r\nDate: Tue, 23 May 2006 19:35:23 GMT\r\nServer: Apache/1.3.33 (Debian GNU/Linux)\r\nLast-Modified: Wed, 15 Feb 2006 00:55:42 GMT\r\nETag: "218015-9e65b83-43f27c0e"\r\nAccept-Ranges: bytes\r\nContent-Length: 4713493\r\nContent-Range: bytes 161378158-166091650/166091651\r\nKeep-Alive: timeout=15, max=100\r\nConnection: Keep-Alive\r\nContent-Type: application/zip\r\n\r] End. I guess I need something like this: acl lessthan3MB rep_header Content-Length ^[12]?[0-9]{0,6}$ acl multiparts req_header Range ^bytes=[0-9]+ (note the logged header value is lowercase "bytes", so match it in lowercase or case-insensitively) On 5/25/06, Henrik Nordstrom <[EMAIL PROTECTED]> wrote: > On Wed 2006-05-24 at 20:46 +0330, Mehdi Sarmadi wrote: > > Any clue or same experience? > > Send a few examples of what you want to match with "log_mime_hdrs on" > and it will be easier to discuss. > > Regards > Henrik -- Mehdi Sarmadi -- Mehdi Sarmadi
Re: [squid-users] Best Caching Engine
How about DataReactor? http://www.imimic.com/ I've heard of its awards. On 5/27/06, [EMAIL PROTECTED] <[EMAIL PROTECTED]> wrote: Thanks Aaron, this is really useful info. Any idea how much traffic these boxes can handle? For example one NetCache C2300 box, what kind of load can it handle? Thanks - Lokesh -----Original Message----- From: Aaron Chu [mailto:[EMAIL PROTECTED] Sent: Friday, May 26, 2006 11:41 PM To: Lokesh Khanna Cc: squid-users@squid-cache.org Subject: Re: [squid-users] Best Caching Engine I've also been looking at commercial caching products. What I've found are: squid - of course; netcache - appliances - they have a lot of large customers (Yahoo, MySpace, etc.) and their C2300 unit is priced at $20k; bluecoat - appliances - not too clear on their product line, but I spoke with them and it seems their product is a player in the market; stratacache - servers with their own tuned OS and caching engine - they have very large systems, but they seem like a very brute-force approach (something like 48 disks in one chassis); jaguar3000 - software - an off-shore company with limited info online. There are also a number of companies offering memory-based caching products, which are limited to a few gigs of cache size. Caching is also bundled into a lot of load balancers/traffic managers, application servers, etc. Aaron Chu On May 26, 2006, at 3:07 PM, <[EMAIL PROTECTED]> <[EMAIL PROTECTED]> wrote: > Hi > > Does anyone know which is the best (commercial or freeware) caching > engine for a large ISP? Is there any comparison sheet between different > cache engines? > > Thanks - LK > Disclaimer > ** > ** > The information contained in this e-mail, any attached files, and > response threads are confidential and > may be legally privileged. It is intended solely for the use of > individual(s) or entity to which it is addressed > and others authorised to receive it.
If you are not the intended > recipient, kindly notify the sender by return > mail and delete this message and any attachment(s) immediately. > > Save as expressly permitted by the author, any disclosure, copying, > distribution or taking action in reliance > on the contents of the information contained in this e-mail is > strictly prohibited and may be unlawful. > > Unless otherwise clearly stated, and related to the official > business of Accelon Nigeria Limited, opinions, > conclusions, and views expressed in this message are solely > personal to the author. > > Accelon Nigeria Limited accepts no liability whatsoever for any > loss, be it direct, indirect or consequential, > arising from information made available in this e-mail and actions > resulting there from. > > For more information about Accelon Nigeria Limited, please see our > website at > http://www.accelonafrica.com > ** -- Mehdi Sarmadi
RE: [squid-users] Content-Length and range header
Any clue, or the same experience? -- Mehdi Sarmadi
[squid-users] Content-Length and range header
Everyone, how could I define an acl for multipart file downloads or big-filesize downloads? At first look, it seems I need something like this: acl lessthan3MB rep_header Content-Length ^(1|2)?+[0-9]{0,6}?$ acl multiparts rep_header range [0-9]+ Could I use them like this? Am I on the right track? What are your opinions? TIA Best Regards -- Mehdi Sarmadi
Re: [squid-users] Digest Authentication and Brute Force Attack
Dear Alberto, I think the right place to add such a notification capability is the "external authenticator" itself: it sees the userid of every lookup, so it can log or alert on repeated failures. On 5/18/06, [EMAIL PROTECTED] <[EMAIL PROTECTED]> wrote: Hello, I'm using Digest Authentication, and the H1 hash data ( H1=hash("userid":"realm":"password") ) is on an LDAP server. My external authenticator reads the userid and realm from stdin, makes an LDAP search against the LDAP server, and then returns the H1 hash to Squid on stdout. Can Squid notify me if the current user's authentication goes wrong? In fact, I think my Squid 2.5.STABLE10 system is open to a brute-force password attack. In this situation I see "TCP_DENIED/407" error messages in access.log, but I don't know which user is under attack. I'd like to know the userid under attack so I can suspend it at the LDAP level. Thank you for your attention. Alberto. -- Mehdi Sarmadi
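While the helper is where the userid is visible, the attacking source address can at least be pulled from access.log, since every denied attempt is logged with the client IP. A toy sketch over a fabricated sample (the log lines below are invented for illustration):

```shell
# toy access.log sample; field 3 is the client IP, and denied auth
# attempts show up as TCP_DENIED/407
printf '%s\n' \
  '1147900000.1 5 192.168.7.54 TCP_DENIED/407 1700 GET http://a/ - NONE/- text/html' \
  '1147900001.2 5 192.168.7.54 TCP_DENIED/407 1700 GET http://a/ - NONE/- text/html' \
  '1147900002.3 9 192.168.7.99 TCP_MISS/200 900 GET http://b/ bob DIRECT/1.2.3.4 text/html' \
  > sample.log
# count denied attempts per source address, busiest first
grep 'TCP_DENIED/407' sample.log | awk '{print $3}' | sort | uniq -c | sort -rn
```

The username field is "-" on 407 lines (the user never authenticated), which is exactly why the helper, not access.log, has to log the attempted userid.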
[squid-users] Acceleratiion for a NTLM Auth Required Host
Dears, I tried to use Squid as an accelerator for a website that needs to authenticate its clients using NTLM authentication. The problem is that when I use it to access that website, it shows me the IIS authentication-failure page. Why doesn't it forward the authentication request coming from the webserver/IIS to my IE client? Any clue? TIA Best Regards -- Mehdi Sarmadi
Re: [squid-users] Squid for Windows
But you'd better look at the known limitations. Squid features not operational: DISKD - needs to be ported, volunteers are welcome; WCCP - cannot work because GRE support on Windows is missing, volunteers are welcome; transparent proxy - missing a Windows non-commercial interception driver. Also: some code sections can make blocking calls, some external helpers may not work, and the number of file descriptors is hard-limited to 2048. On 5/2/06, Shawn Owens <[EMAIL PROTECTED]> wrote: Anyone know of a stable version of Squid for Windows and where I can dl it? Thanks. -- The information contained in this message may be confidential and is intended for the addressees only. If you have received this message in error or there are any problems please notify the sender immediately. The unauthorised use, disclosure, copying or alteration of this message is strictly prohibited by law without express permission of the original sender. BGC Contracting Pty Ltd will not be liable for direct, special, indirect or consequential damages arising from any appropriation, application or alteration of the contents of this message by a third party or as a result of any virus being passed on. BGC Contracting Pty Ltd reserves the right to monitor and record e-mail messages sent to and from this address for the purposes of investigating or detecting any unauthorised usage of its system and ensuring its effective operations. To unsubscribe from future communication please reply to the sender of this email or forward this email to [EMAIL PROTECTED] with the words "Unsubscribe" in the subject line. ------ -- Mehdi Sarmadi
Re: [squid-users] Squid for Windows
http://www.squid-cache.org/Doc/FAQ/FAQ-1.html#ss1.8 http://www.acmeconsulting.it/SquidNT/ On 5/2/06, Shawn Owens <[EMAIL PROTECTED]> wrote: Anyone know of a stable version of Squid for Windows and where I can dl it? Thanks. -- Mehdi Sarmadi
Re: [squid-users] force the internet access only for my proxy server
Dear Rodrigo, sorry for calling you Roberto. :) -- Mehdi Sarmadi
Re: [squid-users] force the internet access only for my proxy server
Dear Roberto, There are two possible solutions in my opinion. One is to configure the workstations so that users cannot change the proxy settings; on Microsoft Windows this can be done via Group Policy (using the Group Policy Editor). The other is to use your firewall to block or redirect requests that did not pass through your proxy. Since you mentioned that users don't need to set a proxy server in Internet Explorer, it seems you have Squid operating in transparent mode. The machine Squid runs on is therefore most likely the internet gateway, and you can enforce this there. If no other firewall is running on that machine, use the lovely, powerful iptables (better called Netfilter). Hope it helps. Best Regards On 4/28/06, Rodrigo Brito <[EMAIL PROTECTED]> wrote: i have a squid proxy in my company and i don't need to configure the proxy and port in Internet Explorer. i denied a lot of sites, but the users find other proxy servers on the internet and configure them in Internet Explorer to access the denied sites. is there any way to block internet access through another proxy? i'd like to force internet access only through my proxy server -- Mehdi Sarmadi
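The firewall approach above can be sketched as a couple of iptables rules on the gateway. The interface name, Squid port, and list of alternate-proxy ports are assumptions; adjust them for your network:

```conf
# On the gateway: transparently redirect LAN web traffic to Squid,
# assuming eth0 is the LAN-facing interface and Squid listens on 3128.
iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 80 \
    -j REDIRECT --to-port 3128

# Drop forwarded connections to ports commonly used by open proxies,
# so users cannot point their browser at an external proxy server.
iptables -A FORWARD -i eth0 -p tcp -m multiport \
    --dports 3128,8080,8000,1080 -j DROP
```

This only covers the usual ports; a determined user can still find a proxy on port 80 or 443, so the Group Policy lockdown is a useful complement.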
Re: [squid-users] DelayPool + AUTH and ACL
Henrik, As you can see in the last item of this page http://squid-docs.sourceforge.net/latest/html/x1982.html it seems it would be possible. quote: You can combine username/password access-lists and speed-limits. You can, for example, allow users that have not logged into the cache access to the Internet, but at a much slower speed than users who have logged in. Users that are logged in get access to dedicated bandwidth, but are charged for their downloads. from: http://squid-docs.sourceforge.net/latest/html/x1982.html That passage is the only thing that made me look for such functionality. Now what do you think: did the author mean what I'm thinking of, or am I wrong? On 4/28/06, Henrik Nordstrom <[EMAIL PROTECTED]> wrote: fre 2006-04-28 klockan 11:11 +0330 skrev Mehdi Sarmadi: > But as I configured it, access with a username/password is OK and gets bandwidth, but I cannot use Squid without providing a username/password; it does not let me through and gives an access-denied page. It's not so easy. You either use authentication or you don't. It's not something the user can select whether to do or not. Regards Henrik -- Mehdi Sarmadi
[squid-users] DelayPool + AUTH and ACL
Dear Bro's, Something is ambiguous for me. I use these ACL definitions: acl LAN src 192.168.0.0/24 acl password proxy_auth user1 and these access rules: http_access allow password http_access allow LAN With proxy_auth access lists, I want Squid to do this for me: -- some people can use Squid for internet access with a username/password -- other people can use Squid for internet access without entering any username/password. I want to combine this with delay_pools so that people who supply their username/password get privileged bandwidth, while the others get a regular, limited bandwidth. But as configured, access with a username/password is OK and gets bandwidth, whereas without a username/password Squid does not let me through and gives an access-denied page. The sentences that made me look for this were: quote: You can combine username/password access-lists and speed-limits. You can, for example, allow users that have not logged into the cache access to the Internet, but at a much slower speed than users who have logged in. Users that are logged in get access to dedicated bandwidth, but are charged for their downloads. from: http://squid-docs.sourceforge.net/latest/html/x1982.html Please let me know, what did I do wrong, and how should I do what is said in that quote? TIA Best Regards -- Mehdi Sarmadi
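Since (as Henrik notes elsewhere in the thread) authentication cannot be optional per request, one workable variant is to require authentication for everyone on the LAN and then split bandwidth by username with delay_pools. A sketch, assuming Basic/NTLM auth is already configured and the usernames are placeholders:

```conf
# Everyone on the LAN must authenticate; "premium" users then fall
# into a separate, unrestricted delay pool.
acl LAN src 192.168.0.0/24
acl authed proxy_auth REQUIRED
acl premium proxy_auth user1
http_access allow LAN authed
http_access deny all

delay_pools 2
delay_class 1 1
delay_class 2 1
# Pool 1: premium users, no limit (-1 disables the bucket).
delay_access 1 allow premium
delay_access 1 deny all
delay_parameters 1 -1/-1
# Pool 2: everyone else on the LAN, ~16 KB/s aggregate.
delay_access 2 allow LAN
delay_access 2 deny all
delay_parameters 2 16000/16000
```

This does not give unauthenticated access as the quoted documentation suggests; it trades that away so that delay_access can reliably see a username. Treat the exact ACL ordering and parameters as a starting point to test against your Squid version.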