Re: [squid-users] Problem of Patching progress
> > I failed to apply "squid-2.5.STABLE7-httpd_accel_no_pmtu_disc.patch" with > > message "Hunk #1 FAILED at 606 1 out of 1 hunk FAILED -- saving rejects to > > file src/structs.h.rej". I applied all patches of Squid2.5S7. > > > > Need your advise how to fix it. Your answer is very appreciated > > This patch unfortunately overlaps with another not yet published patch. > > You can grab the nightly snapshot which includes all (including the not > yet published patch), or add the failing lines by hand. > > Another user also kindly uploaded an adjusted patch in the bugzilla > report. > > Regards > Henrik Thanks for your info Henrik. Thx & Rgds, Awie
[squid-users] diskd
Hi all,

How can I implement and test diskd? I am using the squid integrated into Fedora Core 3.

Cheers,

Daniel Navarro
Maracay, Venezuela
www.csaragua.com/ecodiver
Re: [squid-users] Logfile analyzing
On Sat, 22 Jan 2005, [EMAIL PROTECTED] wrote:

> I'd like to see this in the logfile:
>
> http://www.microsoft.com
> http://www.microsoft.com/products
> http://www.microsoft.com/products/visual_studio
>
> This is a theoretical example, as if those were the actual URL locations
> typed into the address bar or clicked via hyperlink.
>
> I don't see how the access.log can be used to provide this kind of report.
>
> For example, if I simply type microsoft.com in my address bar and click on
> "office" in the left pane, then check my access.log, I see 35 entries have
> been added just by clicking the "office" link once. I understand that
> there is a separate entry for each HTTP GET that the webpage calls for,
> but the access.log doesn't seem to differentiate between what the user
> clicked and what the webpage requested to display the whole page
> correctly.
>
> More specifically, the first 3 entries say:
>
> 127.0.0.1 - - [22/Jan/2005:15:56:31 -0500] "GET http://g.microsoft.com/mh_mshp/2 HTTP/1.1" 301 538 TCP_MISS:DIRECT
> 127.0.0.1 - - [22/Jan/2005:15:56:32 -0500] "GET http://office.microsoft.com/home/default.aspx HTTP/1.1" 301 467 TCP_MISS:DIRECT
> 127.0.0.1 - - [22/Jan/2005:15:56:32 -0500] "GET http://office.microsoft.com/en-us/default.aspx HTTP/1.1" 200 52134 TCP_MISS:DIRECT
>
> How is ANY logfile analyzer going to tell the difference between the first
> entry (which the user clicked on) and the second/third entries (which were
> requested by the html from the first entry)?

The few analysers that I've used pay attention to the status codes. In the above example, the page displayed in the browser was the last one, where a 200 status was returned. The 301 status redirects the browser to the new location of the requested page.

They also allowed me to select what I was interested in having reported. If I'm not interested in the graphical content on the page, I can tell it to suppress images.
> Is there a squid configuration parameter that will allow the logs to be
> filtered appropriately?

Why would you want to do this? If you have users complaining that a page doesn't display correctly, how would you identify the cause of the problem if you don't record what happened in the log?

Merton Campbell Crockett
General Dynamics Advanced Information Systems; Intelligence and Exploitation Systems
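The status-code approach described above can be done as post-processing outside Squid. The following is a rough sketch, not a Squid feature: it keeps only status-200 responses and drops common page-support objects, so each remaining line approximates one user-visible page. The field positions assume the httpd-emulated log format shown in the example, and the list of "support object" extensions is a made-up heuristic.

```python
# Sketch: reduce an httpd-emulated access.log to the pages the user likely
# saw. Field positions follow the sample log lines quoted above; adjust
# them for your own log format.
import re

# Heuristic list of page-support objects (an assumption, extend as needed).
SUPPORT_OBJECT = re.compile(r"\.(gif|jpe?g|png|css|js|ico)(\?|$)", re.IGNORECASE)

def user_visible_pages(log_lines):
    pages = []
    for line in log_lines:
        fields = line.split()
        if len(fields) < 9:
            continue
        url, status = fields[6], fields[8]  # URL and HTTP status
        if status == "200" and not SUPPORT_OBJECT.search(url):
            pages.append(url)
    return pages

if __name__ == "__main__":
    sample = [
        '127.0.0.1 - - [22/Jan/2005:15:56:31 -0500] "GET http://g.microsoft.com/mh_mshp/2 HTTP/1.1" 301 538 TCP_MISS:DIRECT',
        '127.0.0.1 - - [22/Jan/2005:15:56:32 -0500] "GET http://office.microsoft.com/en-us/default.aspx HTTP/1.1" 200 52134 TCP_MISS:DIRECT',
    ]
    for url in user_visible_pages(sample):
        print(url)
```

This cannot fully distinguish a user click from an embedded frame that also returns HTML with status 200; it only approximates the filtering the analysers above do.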
[squid-users] squid performance
What is the squid performance parameter that shows me how efficient it is? What is the squid parameter that shows me how much bandwidth has been saved? I refer to calamaris reports.

Yours,

Daniel Navarro
Maracay, Venezuela
www.csaragua.com/ecodiver
RE: [squid-users] Using proxy address as outbound address for multiple ips
> -----Original Message-----
> From: Henrik Nordstrom [mailto:[EMAIL PROTECTED]
> Sent: January 22, 2005 5:48 PM
> To: Mike Wesson
> Cc: squid-users@squid-cache.org; [EMAIL PROTECTED]
> Subject: RE: [squid-users] Using proxy address as outbound
> address for multiple ips
>
> On Sat, 22 Jan 2005, Mike Wesson wrote:
>
> > I have looked at tcp_outgoing_address, but that does not completely
> > solve the problem unfortunately. I can't base the acl on the client's
> > IP as it is static, and it would be connecting to the various IPs on
> > the server. So client IP 10.1.1.x should be able to connect to squid
> > on any of the IPs on the server, and have that IP as the outbound.
> > Hope that made sense. :)
>
> There are many other acl types you can use... myip comes to mind here.
>
> Regards
> Henrik

Ahhh, thank you Henrik, that is working wonderfully. I didn't think to search the configuration file for additional acls, as I was using the online documentation.

---
Mike Wesson ([EMAIL PROTECTED])
Phatservers Professional Hosting
http://www.phatservers.net
RE: [squid-users] Using proxy address as outbound address for multiple ips
On Sat, 22 Jan 2005, Mike Wesson wrote:

> I have looked at tcp_outgoing_address, but that does not completely solve
> the problem unfortunately. I can't base the acl on the client's IP as it
> is static, and it would be connecting to the various IPs on the server.
> So client IP 10.1.1.x should be able to connect to squid on any of the
> IPs on the server, and have that IP as the outbound. Hope that made
> sense. :)

There are many other acl types you can use... myip comes to mind here.

Regards
Henrik
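A minimal squid.conf sketch of the myip approach Henrik suggests: match on the local IP address the client connected to, then use that same address as the outgoing address. The addresses below are placeholders for illustration, not values from the thread.

```
# "myip" matches the local IP address the client connected to.
acl to_ip1 myip 192.0.2.1
acl to_ip2 myip 192.0.2.2

# Use the same address for the outbound connection.
tcp_outgoing_address 192.0.2.1 to_ip1
tcp_outgoing_address 192.0.2.2 to_ip2
```

One pair of acl + tcp_outgoing_address lines is needed per server address.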
Re: [squid-users] Problem of Patching progress
On Sat, 22 Jan 2005, Awie wrote:

> I failed to apply "squid-2.5.STABLE7-httpd_accel_no_pmtu_disc.patch" with
> the message "Hunk #1 FAILED at 606. 1 out of 1 hunk FAILED -- saving
> rejects to file src/structs.h.rej". I applied all patches of Squid 2.5S7.
>
> Need your advice on how to fix it. Your answer is very appreciated.

This patch unfortunately overlaps with another not yet published patch.

You can grab the nightly snapshot which includes all of them (including the not yet published patch), or add the failing lines by hand.

Another user also kindly uploaded an adjusted patch in the bugzilla report.

Regards
Henrik
Re: [squid-users] user/error keywords in external_acl conf
On Fri, 21 Jan 2005, Marcos Machado wrote:

> [quote squid.conf]
> The helper receives lines per the above format specification, and returns
> lines starting with OK or ERR indicating the validity of the request and
> optionally followed by additional keywords with more details.
>
> General result syntax:
>
>   OK/ERR keyword=value ...
>
> Defined keywords:
>
>   user=   The user's name (login)
>   error=  Error description (only defined for ERR results)
> [/quote squid.conf]
>
> Does anyone know what its purpose is?

It was intended that this should be available for use in error pages, but this part didn't get finished for the 2.5 release. It will be available in the upcoming 3.0 release.

Regards
Henrik
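For readers unfamiliar with the protocol being quoted: an external_acl helper reads one request per line on stdin and answers one OK/ERR line per request. The sketch below only illustrates the line protocol; the hard-coded allow list is a made-up stand-in for a real lookup, not anything from the thread.

```python
# Sketch of an external_acl_type helper: one line in, one "OK"/"ERR" line
# out, optionally with keyword=value details as in the squid.conf excerpt
# above. The ALLOWED set is a placeholder for a real backend (LDAP, DB, ...).
import sys

ALLOWED = {"alice", "bob"}  # hypothetical allow list

def check(line):
    fields = line.strip().split()
    user = fields[0] if fields else ""
    if user in ALLOWED:
        return "OK user=%s" % user
    return "ERR error=denied"

def main():
    # Helpers must answer one line per request and flush, or Squid stalls.
    for line in sys.stdin:
        sys.stdout.write(check(line) + "\n")
        sys.stdout.flush()

if __name__ == "__main__":
    main()
```

The exact fields the helper receives depend on the %FORMAT given to external_acl_type in squid.conf; this sketch assumes the login name comes first.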
Re: [squid-users] www.ovejero.org
On Fri, 21 Jan 2005, Gregori Andres wrote:

> I've a very good squid 2.4.STABLE7 and it works very well! But today,
> squid failed when I tried to access the url: http://www.ovejero.org/

Quite a broken/misconfigured server:

- Gives me "Access denied"
- But works on a forced reload (shift+reload in Mozilla)
- But none of the links work, as they send the wrong content type (text/plain on HTML web pages...)

Regards
Henrik
Re: [squid-users] Show a message -- then redirect
On Fri, 21 Jan 2005, Paul Kölle wrote:

> python. Now I have a problem with transparent proxying. My proxy doesn't
> receive the full path of the request (just the part after the hostname)
> when the request is REDIRECTed with iptables (if I configure the proxy in
> the client it works well). What I'd like to know:
>
> 1. Can I do the above with squid?

Not easily. The problem is a state question. HTTP is stateless, so it is somewhat hard for the proxy to know that the user has seen the message.

> 2. What is special about transparent proxying I have to implement in my
> own proxy?

Interception proxying requires you to act as if you were the web server, so you must reconstruct the full URL from the pieces found in the request and connection.

In principle the data you have are:

* Real destination IP address from the connection
* URL-path from the request line
* Requested hostname from the Host header, if there is one.

Regards
Henrik
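The three data points Henrik lists can be combined roughly as below. This is a sketch only: it assumes plain HTTP on the default port and skips the validation a real interception proxy must do (e.g. checking that the Host header actually resolves to the destination IP).

```python
# Sketch: rebuild the full URL for an intercepted request from
#  (a) the real destination IP of the TCP connection,
#  (b) the URL-path from the request line, and
#  (c) the Host header, preferred over the raw IP when present.
def rebuild_url(dest_ip, url_path, host_header=None):
    host = host_header or dest_ip  # fall back to the destination IP
    if not url_path.startswith("/"):
        url_path = "/" + url_path
    return "http://" + host + url_path
```

For example, a request intercepted on its way to 66.102.11.99 with "Host: www.google.com" and path "/search" reconstructs as http://www.google.com/search.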
Re: [squid-users] squid2.5.stable7 dying
On Fri, 21 Jan 2005, Lucia Di Occhi wrote:

> Once a day I get a:
>
> fqdncacheParse: No PTR record
> FATAL: Received Segment Violation...dying.

Not good. Please file a bug report on this issue. See the Squid FAQ on how to report bugs for details.

> I cannot see any other errors in the DG or Squid logs besides
> /var/log/messages:
>
> Jan 21 08:34:59 bad dansguardian: Error connecting to proxy
> Jan 21 08:35:03 bad last message repeated 79 times
> Jan 21 08:35:04 bad squid[3345]: Squid Parent: child process 3347 exited due to signal 6

Your Squid is aborting due to a fatal error. See cache.log for details.

Regards
Henrik
Re: [squid-users] negative-ttl
On Thu, 20 Jan 2005, jzhao wrote:

> Setting negative_ttl to 0 minutes would turn off caching for failed
> requests, but is there any way to turn off caching only for failed
> requests with RC 500, while still caching the other failed requests,
> like RC 404, 503, etc.?

No.

Regards
Henrik
Re: [squid-users] centralised logging
On Thu, 20 Jan 2005, Rolf wrote:

> what do people use for centralised logging for multiple squid boxes?

Most people I know periodically (usually nightly) collect the logs from the proxy servers to the central log server, where they then run statistics etc.

> About to put in another server; didn't see any relevant directives in
> squid.conf for redirecting log output (but I could have missed them).

There are none.

> Is it best to set up a method to transfer the logs to a master machine
> and modify the reporting/analysing tools to deal with those logfiles?

scp is a good tool for transferring logs, easily scheduled by cron.

> What happens in the case of proxy A receiving a request and asking proxy
> B for it (in the case where they are configured to ask each other)? The
> respective logs would both record the client request where in fact there
> was only one?

Yes, but in one of them the client IP is the other proxy, so it isn't that hard to filter out if not wanted.

Regards
Henrik
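The scp-from-cron setup Henrik mentions can be as small as two crontab lines per proxy. The host name and paths below are placeholders, and this assumes ssh keys are set up for unattended use.

```
# Rotate Squid's logs shortly after midnight, then ship the freshly
# rotated access.log.0 to the central log host (names/paths are examples).
5 0 * * *   /usr/sbin/squid -k rotate
15 0 * * *  scp /var/log/squid/access.log.0 loghost:/logs/proxy-a/
```

On the log host, the nightly statistics run can then read one directory per proxy.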
Re: [squid-users] squid + squirm
On Wed, 19 Jan 2005, Max Černý wrote:

> But they have discovered that if they don't put www.google.com, but
> http://66.102.11.99, the webpage will display. Is there any chance to
> block requests which do not have DNS names, but only IP addresses?

Not easily. This would require one to build a database of all the IP addresses of the sites you want to block, as most have not registered the reverse lookup.

In cases like google it's even worse, as this requires building a database of all the google search servers around the globe. A standard DNS lookup of www.google.com only gives you some of them.

Regards
Henrik
[squid-users] Logfile analyzing
I checked out the squid log analyzer programs, but haven't found one that can provide a sample output like what I need to see on the report.

Say, for example, I go to microsoft.com, click on "products", then click on "visual studio .NET". I'd like to see this in the logfile:

http://www.microsoft.com
http://www.microsoft.com/products
http://www.microsoft.com/products/visual_studio

This is a theoretical example, as if those were the actual URL locations typed into the address bar or clicked via hyperlink.

I don't see how the access.log can be used to provide this kind of report.

For example, if I simply type microsoft.com in my address bar and click on "office" in the left pane, then check my access.log, I see 35 entries have been added just by clicking the "office" link once. I understand that there is a separate entry for each HTTP GET that the webpage calls for, but the access.log doesn't seem to differentiate between what the user clicked and what the webpage requested to display the whole page correctly.

More specifically, the first 3 entries say:

127.0.0.1 - - [22/Jan/2005:15:56:31 -0500] "GET http://g.microsoft.com/mh_mshp/2 HTTP/1.1" 301 538 TCP_MISS:DIRECT
127.0.0.1 - - [22/Jan/2005:15:56:32 -0500] "GET http://office.microsoft.com/home/default.aspx HTTP/1.1" 301 467 TCP_MISS:DIRECT
127.0.0.1 - - [22/Jan/2005:15:56:32 -0500] "GET http://office.microsoft.com/en-us/default.aspx HTTP/1.1" 200 52134 TCP_MISS:DIRECT

How is ANY logfile analyzer going to tell the difference between the first entry (which the user clicked on) and the second/third entries (which were requested by the html from the first entry)?

Is there a squid configuration parameter that will allow the logs to be filtered appropriately?
Re: [squid-users] squid - dns server
On Sat, 2005-01-22 at 15:38 -0600, Daniel Navarro wrote:

> is squid a dns server by itself?

No. But you can tell squid to use DNS servers different from those you specify system-wide.

Kinkie
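The directive Kinkie is referring to is dns_nameservers in squid.conf; the resolver addresses below are examples only.

```
# Use these resolvers for Squid's own lookups instead of the
# system-wide ones from /etc/resolv.conf.
dns_nameservers 192.0.2.53 198.51.100.53
```

Note this only changes which resolvers Squid queries; Squid still does not serve DNS to anything else, so dnsmasq would still be needed for other clients.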
[squid-users] squid - dns server
Is squid a dns server by itself? So can I turn dnsmasq off?

Regards,

Daniel Navarro
Maracay, Venezuela.
www.csaragua.com/ecodiver
[squid-users] Stopping Squid
This is the extract of cache.log while restarting squid; it seems to be ok:

2005/01/22 15:07:19| NETDB state saved; 776 entries, 25 msec
2005/01/22 15:13:01| Detected REVIVED Parent: w3cache.tpnet.pl/8080/7
2005/01/22 15:35:32| Preparing for shutdown after 318470 requests
2005/01/22 15:35:32| Waiting 30 seconds for active connections to finish
2005/01/22 15:35:32| FD 11 Closing HTTP connection
2005/01/22 15:35:32| Closing Pinger socket on FD 17
2005/01/22 15:35:35| Pinger exiting.
2005/01/22 15:35:41| Starting Squid Cache version 2.5.STABLE7 for i686-pld-linux-gnu...

but the message on my console reads:

Stopping Squid.[PROBLEMS]
Waiting to shut down...[DONE]
Starting squid.[DONE]

I've got this line very often in the log:

2005/01/22 15:07:19| NETDB state saved; 776 entries, 25 msec

but it shouldn't interfere.

What happens is that the swap usage grows by 20% or more. I thought I had a problem related to memory, but the ratio [Number of HTTP requests received / Page faults with physical i/o, from the FAQ] is 0.0007.

Can anybody say where I should start to search? The FAQ says nothing.

--
Rafal
Re: [squid-users] log filtering
Hi,

I had a similar need. I have downloaded calamaris but not yet installed it. I browsed the site and downloaded shrimp.pl, as it looked promising for a start. I ran it on the access log for a particular ip and the output is just fantastic. And it's Perl-based, so I at last have a tool to analyze the log files the way I want to.

Thank you for the pointers. BTW - my first post.

alfred,

Scott Phalen wrote:

> > Can anyone suggest the best logfile analyzer for this, or squid
> > configuration options that will make access.log more appropriate for
> > this?
>
> A free solution would be Calamaris
> (http://cord.de/tools/squid/calamaris/Welcome.html.en)
>
> If your organization wishes to pay, there is an excellent program called
> Cyfin (http://www.wavecrest.net/)
>
> Regards,
> Scott

--
Perl - Making simple things easy, without making hard things impossible;
Re: [squid-users] log filtering
Webalizer and sarg are good analyzers. Webmin lets you filter some log parameters. I haven't used it, but calamaris is well mentioned.

Regards,

Daniel Navarro
Maracay, Venezuela
www.csaragua.com/ecodiver

--- "[EMAIL PROTECTED]" <[EMAIL PROTECTED]> wrote:

> I've got squid running and am asked to produce a report showing requests
> by the USER, not the WEBPAGE.
>
> i.e. only the URLs entered or clicked by the user, filtering out all the
> extra info about the supporting files that the webpage requested (GIFs,
> popups, banners, etc).
>
> I've not found any appropriate logfile analyzers for this, and access.log
> doesn't seem to differentiate between urls requested by the USER or the
> WEBPAGE.
>
> Can anyone suggest the best logfile analyzer for this, or squid
> configuration options that will make access.log more appropriate for
> this?
RE: [squid-users] log filtering
> Can anyone suggest the best logfile analyzer for this, or squid
> configuration options that will make access.log more appropriate for
> this?

A free solution would be Calamaris (http://cord.de/tools/squid/calamaris/Welcome.html.en)

If your organization wishes to pay, there is an excellent program called Cyfin (http://www.wavecrest.net/)

Regards,
Scott
[squid-users] log filtering
I've got squid running and am asked to produce a report showing requests by the USER, not the WEBPAGE.

i.e. only the URLs entered or clicked by the user, filtering out all the extra info about the supporting files that the webpage requested (GIFs, popups, banners, etc).

I've not found any appropriate logfile analyzers for this, and access.log doesn't seem to differentiate between urls requested by the USER or the WEBPAGE.

Can anyone suggest the best logfile analyzer for this, or squid configuration options that will make access.log more appropriate for this?
Re: [squid-users] active directory authentication - not working
Hi,

At 12.10 21/01/2005, [EMAIL PROTECTED] wrote:

> hello, please can you help me with the subject? i have the last version,
> i read about win32_check_group.exe, i have created a group in AD2003...
> but it doesn't work. when i try to run win32_check_group.exe -G nothing
> happens... no response, anything...

This is correct: win32_check_group.exe is only a helper program, so to verify that it's working from the command prompt you must manually type something into it (in the following example debug is enabled):

C:\squid\libexec>win32_check_group -G -d
win32_check_group[692]: Member of Domain ACMECONSULTING
win32_check_group[692]: External ACL win32 group helper build Jan 7 2005, 11:02:22 starting up...
win32_check_group[692]: Domain Global group mode enabled.
acmeconsulting\\guido.serassio Staff
win32_check_group[692]: Got 'acmeconsulting\\guido.serassio Staff' from Squid (length: 36).
win32_check_group[692]: Valid_Global_Groups: checking group membership of 'acmeconsulting\guido.serassio'.
win32_check_group[692]: Using '\\HERA' as DC for 'acmeconsulting' local domain.
win32_check_group[692]: Using '\\HERA' as DC for 'acmeconsulting' user's domain.
win32_check_group[692]: Windows group: Staff, Squid group: Staff
OK

Regards

Guido

-
Guido Serassio
Acme Consulting S.r.l. - Microsoft Certified Partner
Via Gorizia, 69    10136 - Torino - ITALY
Tel. : +39.011.3249426    Fax. : +39.011.3293665
Email: [EMAIL PROTECTED]
WWW: http://www.acmeconsulting.it/
Re: [squid-users] 3 questions...
On Sat, 2005-01-22 at 03:37 -0600, Daniel Navarro wrote:

> Hi all,
>
> 1. I've found my squid port 8080 is listed as a safe port and denied. I
> do use port 8080. Is it wrong? How should it be configured?

Safe port means that a client can request pages hosted on servers that answer on that port. Is that what you're trying to do?

> 2. How do I know if my squid server is using DDISK? How can I activate
> it?

Maybe you meant diskd. You should see the child processes when doing a "ps" ("ps axf" is very handy on Linux systems if you want to look at those things). Also, there should be some message in cache.log about the cache_dirs being configured.

> 3. Since I use ssh and webmin, should I state ports 22 and 1 as safe?

If you want to access webmin through the proxy, you should add port 1 to the safe_ports acl (but make sure that the ACL is effectively used in http_access clauses). SSH is not HTTP, so you wouldn't be using squid anyway to access it.

Kinkie
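For the archives, the answers above map to squid.conf roughly as follows. The cache_dir path and sizes are examples, and diskd requires a Squid built with a --enable-storeio list that includes diskd.

```
# 1. Let clients request servers answering on port 8080; the stock
#    squid.conf already has several Safe_ports lines, so add to them and
#    keep "http_access deny !Safe_ports" after these acl lines.
acl Safe_ports port 8080

# 2. Use the diskd storage module instead of the default ufs:
#    cache_dir diskd <path> <MB> <L1> <L2>
cache_dir diskd /var/spool/squid 1000 16 256

# 3. For webmin through the proxy on port 1, as in the question above.
acl Safe_ports port 1
```

With diskd configured, "ps axf" should then show one diskd child per cache_dir, which is the test mentioned above.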
[squid-users] 3 questions...
Hi all,

1. I've found my squid port 8080 is listed as a safe port and denied. I do use port 8080. Is it wrong? How should it be configured?

2. How do I know if my squid server is using DDISK? How can I activate it?

3. Since I use ssh and webmin, should I state ports 22 and 1 as safe?

Regards,

Daniel Navarro
Maracay, Venezuela
www.csaragua.com/ecodiver
[squid-users] Help about delay pool
Hi Henrik,

Can you tell me about the concept of "Dynamic Delay Pools"? Is it possible, using dynamic delay pools, to assign the unused bandwidth of one delay pool to a heavily loaded pool? Can you give me some description of how to get this, and how to configure squid to implement a Dynamic Delay Pool?

Thanks

Imtiyaz

--
Netcore's New Website http://www.netcore.co.in
--
RE: [squid-users] Need help regarding Additional network card for my Squid Server
> -----Original Message-----
> From: Sagun, Paul L. [mailto:[EMAIL PROTECTED]
> Sent: Saturday, January 22, 2005 12:39 AM
> To: squid-users@squid-cache.org
> Subject: [squid-users] Need help regarding Additional network card for
> my Squid Server
>
> Currently I have Squid version 2.5 Stable 3 sitting in Linux 7.2
> (2.4.7-10smp).
>
> I have 2 network cards, 1 is for the internet connection and the other
> is for client connections.
>
> I then added another interface (3rd network card) for client
> connections, and used squid -k reconfigure to make my running squid
> detect the 3rd network card, but my desktop cannot surf the Internet
> using the 3rd interface.
>
> I tried restarting my Linux box and it was able to detect the 3rd
> network card without error, but squid still cannot access the Internet
> using this interface; please advise me what I need to do.

Squid is not in the business of detecting network cards. It binds to a network interface, and it listens on an IP address+port (or more).

If you run ifconfig, does the third network interface's address show up?

If you run netstat -a -n -p, can you see squid listening on its proxy port (default: 3128, but many people use 8080 instead, or even something else) on the correct IP address(es)?

You mention that you use the third interface for client connections, and in the third paragraph you talk about outbound connections. This is confusing. An ASCII drawing of the network topology with the used IP addresses would help clarify your intentions.

Can you ping from the client to both network interfaces on the squid server?

Are the "acl"'s correct so that they "allow"?

--
Vik
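The bind-to-address behaviour Vik describes is controlled by the http_port directive in squid.conf. By default Squid listens on all interfaces; to tie it to a specific address (the one below is an example, substitute the third card's address), something like this would be used:

```
# Listen only on this local address and port instead of all interfaces.
http_port 192.0.2.3:8080
```

Multiple http_port lines can be given, one per address Squid should accept client connections on, which netstat should then confirm.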