I use awstats to report on my server logs.

I missed a couple of days and wanted to go back and regenerate the full
history in chronological order (an AWStats requirement).  I keep a local
copy of all my log files in compressed .gz format, and because I use
rsync with cron to copy them off my live site, I do not want to
uncompress them all locally.  Besides, AWStats can read from gzip if
invoked properly.
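One helpful detail about the chronological-order requirement (the
/tmp path and date-stamped file names below are hypothetical examples,
not my real logs): bash expands a glob in lexical order, so names
stamped YYYYMMDD already come out chronologically without any extra
sorting:

```shell
# Sketch: date-stamped log names sort lexically, which for a
# YYYYMMDD stamp is also chronological order.
mkdir -p /tmp/logdemo
touch /tmp/logdemo/www-20240103.gz \
      /tmp/logdemo/www-20240101.gz \
      /tmp/logdemo/www-20240102.gz

# The glob expands sorted, so the loop visits 01, then 02, then 03.
for n in /tmp/logdemo/www-*.gz; do
  echo "$n"
done
```

If the real file names are not date-stamped, they would need an
explicit sort by mtime or by an embedded date before the loop.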

I wrote a short shell script to try to run awstats in chronological
order (see below).

I found out that even when you supply the -logfile='' parameter on the
command line, the config file must still contain a valid LogFile value,
so I set that manually to the first log file I have.  On subsequent
loops, that data should be ignored as 'old records'.
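For the record, the line I set in the config file looks like this (the
path and file name are illustrative, and the pipe syntax is what the
AWStats documentation describes for compressed logs):

```
# In awstats.www.buzgate.org.conf -- AWStats runs the quoted command
# and reads the decompressed log from the pipe.
LogFile="gzip -d </var/log/httpd/www-first.log.gz |"
```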

I also found that even though backticks work on the command line to
gunzip the log, they didn't work in my bash script.  So I tried using
another variable assignment to hold the contents of the uncompressed
log file.  That is probably a bad idea, because it would read the whole
file into memory, so I guess I need to modify the script to write a
temporary file and then remove it, or otherwise figure out how to get
the pipe to work.
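In case the pipe approach doesn't pan out, the temporary-file fallback
I have in mind would look roughly like this (a sketch only: the awstats
invocation is replaced with echo, and the /tmp paths and sample log are
made up for illustration):

```shell
# Fallback sketch: decompress each log to a temp file, process it,
# then remove the temp file so disk usage stays bounded.
mkdir -p /tmp/tfdemo
echo "sample log line" | gzip > /tmp/tfdemo/sample.gz

for n in /tmp/tfdemo/*.gz; do
  tmp=$(mktemp)                # temporary uncompressed copy
  gzip -dc "$n" > "$tmp"       # decompress; leaves the .gz untouched
  # Real call would be: awstats.pl -config=... -update -logfile="$tmp"
  echo "would process $tmp (from $n)"
  rm -f "$tmp"                 # clean up immediately
done
```

This never holds the whole log in a shell variable, so memory use stays
flat regardless of log size.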

Since I was not able to get the script to work, I would appreciate any
comments on how to do this properly.


#!/bin/bash

for n in /var/log/httpd/www*.gz
do
  echo "processing $n"
  # Pass the decompression pipe to AWStats as a quoted string; AWStats
  # runs the command itself.  Backticks would instead substitute the
  # uncompressed log contents onto the command line, which is what broke.
  /path/to/my/cgi-bin/awstats.pl -config=www.buzgate.org -update \
    -logfile="gzip -d <$n |"
  sleep 3m
done

echo "All log files processed successfully."


-- 
Greg Rundlett
Sr. Internet Systems Architect
Knowledge Institute
creators of the Business Utility Zone Gateway
at www.buzgate.org
(603) 642-4720
[EMAIL PROTECTED]
