On 09/03/11 01:20, Osmany wrote:
On Tue, 2011-03-08 at 12:21 +1300, Amos Jeffries wrote:
On Tue, 08 Mar 2011 11:58:57 +1300, Amos Jeffries wrote:
On Mon, 07 Mar 2011 16:59:07 -0500, Osmany wrote:
Greetings everyone,

So I'm having trouble with my squid proxy-cache server. I recently added
a redirect program because I had to make users go to my kaspersky admin
kit and my WSUS services to get their updates. It works fine, but I
constantly get a warning and squid just collapses after a few minutes of
run time. This is what I get in my cache.log:

2011/03/07 15:54:17| WARNING: All url_rewriter processes are busy.
2011/03/07 15:54:17| WARNING: up to 465 pending requests queued
2011/03/07 15:54:17| storeDirWriteCleanLogs: Starting...
2011/03/07 15:54:17| WARNING: Closing open FD 1455
2011/03/07 15:54:17| commSetEvents: epoll_ctl(EPOLL_CTL_DEL): failed on fd=1455: (1) Operation not permitted
2011/03/07 15:54:17|     65536 entries written so far.
2011/03/07 15:54:17|    131072 entries written so far.
2011/03/07 15:54:17| WARNING: Closing open FD 1456
2011/03/07 15:54:17| commSetEvents: epoll_ctl(EPOLL_CTL_DEL): failed on fd=1456: (1) Operation not permitted
2011/03/07 15:54:17|   Finished.  Wrote 139965 entries.
2011/03/07 15:54:17|   Took 0.1 seconds (1288729.1 entries/sec).
FATAL: Too many queued url_rewriter requests (465 on 228)
Squid Cache (Version 2.7.STABLE7): Terminated abnormally.

This is what I have in the squid.conf

#  TAG: url_rewrite_program
url_rewrite_program /etc/squid/redirect

#  TAG: url_rewrite_children
url_rewrite_children 100

#  TAG: url_rewrite_concurrency
url_rewrite_concurrency 50

#  TAG: url_rewrite_access
url_rewrite_access allow redirect
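
Note: "redirect" on the url_rewrite_access line refers to an acl defined
elsewhere in squid.conf and not shown here. A minimal sketch of such an
acl, with purely illustrative domains, might look like:

#  acl naming the update traffic that should be sent to the rewriter
acl redirect dstdomain .kaspersky.com .windowsupdate.com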

And this is what I have in my redirector script

#!/usr/bin/perl
BEGIN {$|=1}
while (<>) {
      @X = split;
      $url = $X[0];
      if ($url =~ /^http:\/\/dnl(.*)kaspersky(.*)com(.*)/) {
           print "301:ftp:\/\/dnl-kaspersky\.quimefa\.cu\:2122\/Updates";
      }
      elsif ($url =~ /^http:\/\/(.*)windowsupdate(.*)/) {
           print "301:http:\/\/windowsupdate\.quimefa\.cu\:8530";
      }
}

Can you please help me to solve this?

Your script does not support concurrency. When that is configured in
squid there will be 2 space-delimited fields to handle, the first one
being the ID of the request channel, not the URL.
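
For example, with concurrency enabled the helper receives lines roughly
like this (the trailing fields are illustrative only and vary by Squid
version; the important part is the leading channel ID):

  0 http://dnl-01.geo.kaspersky.com/bases/index.xml 10.0.0.5/- - GET -

and the reply must start with that same ID:

  0 301:ftp://dnl-kaspersky.quimefa.cu:2122/Updates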


  Oops, I missed a few other things too:
   * 'else' case is needed to print the no-change result back to Squid
   * newlines need to be printed in Perl


    $url = $X[1];
    if ($url =~ /^http:\/\/dnl(.*)kaspersky(.*)com(.*)/) {
     print $X[0]."
  301:ftp:\/\/dnl-kaspersky\.quimefa\.cu\:2122\/Updates\n";
    }
    elsif ($url =~ /^http:\/\/(.*)windowsupdate(.*)/) {
     print $X[0]." 301:http:\/\/windowsupdate\.quimefa\.cu\:8530\n";
    }
    else {
     print $X[0]."\n";
    }

  Amos

So this is what I have now but it doesn't work. I've tried it manually:

#!/usr/bin/perl
BEGIN {$|=1}
while (<>) {
      @X = split;
      $url = $X[1];
    if ($url =~ /^http:\/\/dnl(.*)kaspersky(.*)com(.*)/) {
     print $X[0]."
  301:ftp:\/\/dnl-kaspersky\.quimefa\.cu\:2122\/Updates\n";
    }
    elsif ($url =~ /^http:\/\/(.*)windowsupdate(.*)/) {
     print $X[0]." 301:http:\/\/windowsupdate\.quimefa\.cu\:8530\n";
    }
    else {
     print $X[0]."\n";
    }
}

It just keeps on returning the same URL that I enter. Help please?


Did you add the concurrency channel ID before the URL on each manually entered line?
eg  $id $url $garbage
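
For example, a manual test session should look something like this (the
URLs here are only illustrative):

  0 http://dnl-03.geo.kaspersky.com/bases/av/index.xml 10.0.0.5/- - GET
  0 301:ftp://dnl-kaspersky.quimefa.cu:2122/Updates
  1 http://www.example.com/index.html 10.0.0.5/- - GET
  1

The first and third lines are typed input; the second and fourth are the
script's replies. A reply of just the channel ID is the no-change result,
telling Squid to leave the URL as it is.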

Amos
--
Please be using
  Current Stable Squid 2.7.STABLE9 or 3.1.11
  Beta testers wanted for 3.2.0.5
