How to write Backup / Restore Program- Reg

2003-10-28 Thread NIC-Email



Sir,

I saw your mail on the web through a search engine. I want to 
write a backup program myself. Please give me possible ideas and logic for writing a 
backup program.

pls. don't reply.

Thanking you


N.Jai Sankar.
p[EMAIL PROTECTED]



Exim config

2003-10-28 Thread Craig
Hi Guys

Does anyone happen to know how I could have Exim parse a text file
containing a list of users, so that mail for users in the file is sent to an
Exchange server, and mail for everyone else is delivered to the local mailbox?
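One router-based sketch (this assumes Exim 4 syntax; the user file, the domain list name, and the Exchange hostname are all illustrative, and the router would need to sit before the normal local-delivery routers):

```
# Hypothetical router: users listed in /etc/exim4/exchange_users are routed
# to the Exchange host; anyone not in the file falls through to the next
# router (normal local delivery).
to_exchange:
  driver = manualroute
  domains = +local_domains
  condition = ${lookup{$local_part}lsearch{/etc/exim4/exchange_users}{yes}{no}}
  route_list = * exchange.example.com
  transport = remote_smtp
```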

Any suggestions would be welcomed.

Thanks
Craig


-- 
To UNSUBSCRIBE, email to [EMAIL PROTECTED]
with a subject of unsubscribe. Trouble? Contact [EMAIL PROTECTED]



Resource consumption.

2003-10-28 Thread Rudi Starcevic
Hi,

I'm pretty sure I have a cron job analysing Apache logs which is
consuming too much of the system's resources.
So much time is spent on Webalizer and Awstats that the web server stops
answering requests.

The output of `uptime` showed a load average of about 2.2 before I manually
killed the script, after which all was OK again.

What can I do about this ?

Here is my simple bash script:

# do webazolver
for i in /var/log/apache/access_tmp/*-access_log; do
    webazolver -N 20 -D /var/log/webazolver/dns_cache.db $i
done

# do webalizer
for i in /var/log/apache/access_tmp/*-access_log; do
    site=`echo $i | sed 's/\/var\/log\/apache\/access_tmp\///'`
    site=`echo $site | sed 's/-access_log//'`
    if [ -e /etc/webalizer/$site.webalizer.conf ]
    then
        webalizer -D /var/log/webazolver/dns_cache.db -c \
            /etc/webalizer/$site.webalizer.conf
    fi
done

It just loops through the Apache logs and analyzes them.
I even use 'webazolver' to try and help, but it still grinds down the machine.
I currently have this script fire every 4 hours, so the logs are not
too big.

I'm thinking of maybe adding a `sleep 300` or something to the script.
Maybe it's better to check whether an instance of Webalizer is already 
running, and if so sleep and try again.
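Rather than sleeping, the usual trick is a lock file, so a new cron run exits immediately if the previous one is still busy. A minimal sketch (the lock path is illustrative):

```shell
#!/bin/sh
# Skip this run entirely if an earlier one is still working through the logs.
LOCK=/tmp/webalizer-cron.lock

if [ -e "$LOCK" ]; then
    echo "previous run still active, skipping"
    exit 0
fi

touch "$LOCK"
trap "rm -f $LOCK" EXIT    # remove the lock even if the script dies

echo "starting log run"
# ... the webazolver / webalizer loops would go here ...
```

(`mkdir`-based locking is the traditional race-free variant, since `test -e` followed by `touch` is not atomic.)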

Any suggestions?
I have about 20 virtual sites on this box and 400 on another.

Many thanks
Regards
Rudi.









Re: Resource consumption.

2003-10-28 Thread Rudi Starcevic
Hi,

OK sorry I found the answer.
Next time I'll try harder before I bother you.

I found out about the `wait` command in Bash scripting.
I'll try something like:

# do webalizer
for i in /var/log/apache/access_tmp/*-access_log; do
    site=`echo $i | sed 's/\/var\/log\/apache\/access_tmp\///'`
    site=`echo $site | sed 's/-access_log//'`
    if [ -e /etc/webalizer/$site.webalizer.conf ]
    then
        webalizer -D /var/log/webazolver/dns_cache.db -c \
            /etc/webalizer/$site.webalizer.conf &

        WEB_PID=$!
        wait $WEB_PID
    fi
done
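For reference, `$!` and `wait` only apply to jobs started in the background with `&`; a foreground command already blocks until it finishes. A self-contained sketch of the pattern:

```shell
#!/bin/sh
# wait/$! only make sense for background jobs started with &.
for n in 1 2 3; do
    sleep 0 &          # background job stands in for webalizer here
    WEB_PID=$!         # PID of the most recent background job
    wait "$WEB_PID"    # block until that job finishes, serializing the loop
    echo "job $n done"
done
```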
 
Cheers
Rudi.







Configuring mod_ssl

2003-10-28 Thread adawes
Hi,

I'm trying to get mod_ssl configured on my server but it isn't coming
together and was hoping I could get some help from the field. Below is a
description of my setup and what I'm trying to do. Any guesses where I'm
going wrong?

I've got a server with multiple virtual hosts. For the most part, I really
only need https to work for my Squirrelmail webmail pages, but I may also at
some point have to put in an ecommerce thing on a site to accept credit
cards. It seems to me that, for simplicity's sake, I'd ideally like to just
have all my sites be accessible identically via http and https. I'll just
put in a redirect for the http version of Squirrelmail to go to port 443
instead of 80. Is there any good reason why I shouldn't have my docs
available under both http and https?

Environment (debian packages)
---
apache 1.3.27.0-2
apache-common 1.3.27.0-2
libapache-mod-ssl 2.8.14-3
openssl 0.9.7b-2
libssl0.9.6j-1

# apache -l
Compiled-in modules:
  http_core.c
  mod_so.c
  mod_macro.c
suexec: disabled; invalid wrapper /usr/lib/apache/suexec

What I've done
--
Initially, I planned to use apache-ssl to do the https, but then figured
if I could configure apache 1.3 with mod_ssl, I'd have a cleaner and
easier to maintain system. So, my attempt to do that resulted in the
following changes to my httpd.conf. These are in the main section and not
duplicated in the Virtual hosts sections.

LoadModule ssl_module /usr/lib/apache/1.3/mod_ssl.so

and

SSLVerifyClient 0
SSLVerifyDepth 10
# generated below with openssl
SSLCertificateKeyFile /etc/ssl/demoCA/certs/server.key
SSLCertificateFile /etc/ssl/demoCA/certs/server.crt
SSLCACertificateFile /etc/ssl/demoCA/certs/cacert.pem

What happens
------------

When I try to hit my home page via https, I get the following in my
access_log:

10.0.0.16  - - [20/Oct/2003:23:02:07 -0700] \x80g\x01\x03\x01 501 -

And my Safari browser gives an immediate error message:
Could not open the page 10.0.0.22 because Safari could not establish a
secure connection to the server 10.0.0.22.

Previously, I was getting the following error when I tried
restarting apache:
Cannot load /usr/lib/apache/1.3/libssl.so into server:
/usr/lib/apache/1.3/libssl.so: undefined symbol: ap_conn_timeout

This happened when I tried to load the apache_ssl_module in my httpd.conf
file. I _think_ I'm not supposed to do that. If I'm correct, that module
isn't necessary to run mod_ssl and is only used for apache-ssl. True?
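For what it's worth, request bytes like `\x80g\x01\x03\x01` in access_log are an SSL ClientHello being answered as plain HTTP, which usually means no SSL-enabled vhost is actually listening on port 443. A minimal sketch, reusing the certificate paths from above (the Listen/vhost layout is an assumption about the rest of the config):

```
Listen 80
Listen 443
<VirtualHost _default_:443>
    SSLEngine on
    SSLCertificateFile    /etc/ssl/demoCA/certs/server.crt
    SSLCertificateKeyFile /etc/ssl/demoCA/certs/server.key
</VirtualHost>
```

Note also that mod_ssl's SSLVerifyClient takes keyword values (none, optional, require, optional_no_ca) rather than the numeric 0-3 levels used by apache-ssl.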







Modifying user privileges with

2003-10-28 Thread Mark Henry
Hi All,
I'm new to Linux administration and I'm trying to extend system log file 
read access to users via the linuxconf tool. How does linuxconf know which 
files are in this set? I am specifically trying to extend access to the ISC 
dhcp server logs in /var. I really don't want to change root's umask. Any 
ideas?
Linux conr-dhcp1 2.4.17-xfs
Linuxconf 1.26
ISC dhcp server V3.0.1rc9
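Independently of linuxconf, the usual approach is a dedicated group rather than touching umask. A sketch run as root (the group name and log path are illustrative):

```
# addgroup logread
# chgrp logread /var/log/dhcpd.log
# chmod 640 /var/log/dhcpd.log      (owner rw, group read, others nothing)
# adduser mark logread
```

If syslog or logrotate recreates the file, it would also need to be told to recreate it with that group and mode.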

Thanks,
Mark




Re: Resource consumption.

2003-10-28 Thread Rudi Starcevic
Hi,

Thanks Russell,

  I'm pretty sure I have a cron job analysing apache logs which is
  consuming too much of the system's resources.
  So much is spent on Webalizer and Awstats that the web server stops
  answering requests.

 CPU time or IO bandwidth?

CPU time is what I meant. Sorry, I should have been clearer.

  The output of `uptime` was something like 2.2 before I manually killed the
  script and all was OK again.

 2.2 should not be a great problem.  A machine that has a single CPU and a
 single hard disk probably won't be giving good performance when its load
 average exceeds 2.0, but it should still work.

I thought that if the load average went above 1.0 that was bad and
meant you needed to do something to help bring the load under 1.0.

Even one process of Awstats uses heaps of CPU - over 90%.
Maybe I need to create a user account for processing Apache logs and limit
CPU consumption with 'ulimit' or something?
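Both knobs exist: `nice` lowers scheduling priority, and `ulimit -t` caps total CPU seconds (the process is killed if it exceeds the cap). A sketch with illustrative values:

```shell
#!/bin/sh
# Run a command at the lowest priority with a 600-CPU-second cap.
# ulimit applies inside the sub-shell, so the wrapper shell is unaffected.
nice -n 19 sh -c 'ulimit -t 600; echo "analysis would run here"'
```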
Cheers
Rudi.




Re: Resource consumption.

2003-10-28 Thread Chris Foote
On Wed, 29 Oct 2003, Rudi Starcevic wrote:

 I'm pretty sure I have a cron job analysing apache logs which is
 consuming too much of the system's resources.
 So much is spent on Webalizer and Awstats that the web server stops
 answering requests.
 
 CPU time or IO bandwidth?

 CPU time is what I meant. Sorry I should be more clear

 The output of `uptime` was something like 2.2 before I manually kill the
 script and all is OK again.
 
 2.2 should not be a great problem.  A machine that has a single CPU and a
 single hard disk probably won't be giving good performance when its load
 average exceeds 2.0, but it should still work.
 
 I thought that if the load average went above 1.0 that was bad and
 meant you needed to do something to help bring the load under 1.0.

 Even one process of Awstats uses heaps of CPU - over 90%.
 Maybe I need to create a user account for processing Apache logs and limit
 CPU consumption with 'ulimit' or something ??

I think you might be overlooking the value of the 'nice' shell
builtin - try:

# do webalizer
for i in /var/log/apache/access_tmp/*-access_log; do
    site=`echo $i | sed 's/\/var\/log\/apache\/access_tmp\///'`
    site=`echo $site | sed 's/-access_log//'`
    if [ -e /etc/webalizer/$site.webalizer.conf ]
    then
        nice webalizer -D /var/log/webazolver/dns_cache.db -c \
            /etc/webalizer/$site.webalizer.conf &
        WEB_PID=$!
        wait $WEB_PID
    fi
done


Cheers,
Chris

Linux.Conf.Au Adelaide Jan 12-17 2004
Australia's Premier Linux Conference
http://lca2004.linux.org.au





Re: Resource consumption.

2003-10-28 Thread Rudi Starcevic
Hi Chris,

I think you might be overlooking the value of the 'nice' shell builtin - try:
 

Indeed.
Thanks.
Regards
Rudi.


command logging

2003-10-28 Thread Dan MacNeil

For a box that will have limited shell access, I'm looking for something
that will log all commands. The sudo log is nice but not everything is run
through sudo.

There won't be many privacy issues as most users won't have shell.

The goal is to review a daily report for anything unexpected: stuff like:

tar -xzf rootkit.tar.gz








Re: command logging

2003-10-28 Thread John Keimel
On Tue, Oct 28, 2003 at 10:56:53PM -0500, Dan MacNeil wrote:
 
 For a box that will have limited shell access, I'm looking for something
 that will log all commands. The sudo log is nice but not everything is run
 through sudo.
 
 There won't be many privacy issues as most users won't have shell.
 
 The goal is to review a daily report for anything unexpected: stuff like:
 
 tar -xzf rootkit.tar.gz

For several servers I maintain we took the bash code and hacked it to
log all commands, with usernames, to a log file. Yes, it's nosy. It's
actually called 'nosy bash' by us. It's not been sent to the bash
maintainers at all yet, but I could see if my coder can make a diff of
it. 

It's come in quite handy at times. Quite handy.

"I didn't do that!"
"Well, yes, you did. At 1:43:00 you typed 'rm -rf /'."
"No I didn't."
"Yes, see, it's in the logs."
"Oh.. ummm..."
*disable account*
"Bu-bye."

I regularly grep the log for keywords, or sometimes tail it if I'm
suspicious of someone. But for the most part, I don't ogle it
constantly. Who has time for that? 
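A daily report along those lines can be little more than a grep over the log. A sketch, assuming a hypothetical "user: command" line format (the inlined `printf` lines stand in for whatever the logging shell actually writes):

```shell
#!/bin/sh
# Scan a command log for suspicious patterns and print the matching entries.
printf 'alice: ls -l\nbob: tar -xzf rootkit.tar.gz\n' |
    grep -E 'rootkit|rm -rf /'
# prints: bob: tar -xzf rootkit.tar.gz
```

In real use the `printf` would be the log file, and the output could be piped to `mail -s "shell command report" root` from cron.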

I'm also running the grsec patches as well. Grsec didn't do the nosy-bash
logging the way I wanted, so I'm keeping the nosy bash. 

j

-- 

==
+ It's simply not   | John Keimel+
+ RFC1149 compliant!| [EMAIL PROTECTED]+
+   | http://www.keimel.com  +
==





Re: Resource consumption.

2003-10-28 Thread Rudi Starcevic
Hi,

Me again ..

I guess what I want to do is have this script execute webalizer
one at a time, waiting until webalizer is finished before starting
again.
Instead, the script fires off many webalizers at once.
Sorry, I guess my simple bash skills are not up to scratch.
I'll head over to tldp.org to see if I can't find the answer.

 # do webalizer
 for i in /var/log/apache/access_tmp/*-access_log; do
 site=`echo $i | sed 's/\/var\/log\/apache\/access_tmp\///'`
 site=`echo $site | sed 's/-access_log//'`
 if [ -e /etc/webalizer/$site.webalizer.conf ];
 then
 webalizer -D /var/log/webazolver/dns_cache.db -c \
 /etc/webalizer/$site.webalizer.conf;
 fi
 done
 

Cheers
Rudi.




Re: Resource consumption.

2003-10-28 Thread Russell Coker
On Tue, 28 Oct 2003 23:03, Rudi Starcevic wrote:
 I'm pretty sure I have a cron job analysing apache logs which is
 consuming too much of the system's resources.
 So much is spent on Webalizer and Awstats that the web server stops
 answering requests.

CPU time or IO bandwidth?

 The output of `uptime` was something like 2.2 before I manually kill the
 script and all is OK again.

2.2 should not be a great problem.  A machine that has a single CPU and a 
single hard disk probably won't be giving good performance when its load 
average exceeds 2.0, but it should still work.

But if your processing takes longer than the cron interval then you will have 
serious problems.  Changing the cron interval from 4 hours to 24 may reduce 
the chance of getting two cron jobs running at the same time.

Also you may want to consider adding more RAM.  Webalizer can get a bit memory 
hungry at times, and it seems to have fairly linear data access patterns so 
when it starts paging it thrashes.

-- 
http://www.coker.com.au/selinux/   My NSA Security Enhanced Linux packages
http://www.coker.com.au/bonnie++/  Bonnie++ hard drive benchmark
http://www.coker.com.au/postal/Postal SMTP/POP benchmark
http://www.coker.com.au/~russell/  My home page




Re: command logging

2003-10-28 Thread Steve Suehring

A couple of ideas spring to mind.  The first and easiest to implement is 
process accounting.  It can be turned on in the kernel ('BSD Process 
Accounting' under General Setup).  The drawback there is that you don't get 
command-line arguments.
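Concretely, that is one kernel option plus the userland tools. A sketch (Debian's 'acct' package and the pacct path are assumptions):

```
CONFIG_BSD_PROCESS_ACCT=y          # kernel: General setup -> BSD Process Accounting

# accton /var/log/account/pacct    # start writing accounting records
# lastcomm mark                    # list commands run by user 'mark'
# sa -u                            # per-user command summaries
```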

Another option would be the logging that comes with something like the  
GrSecurity kernel patch.  http://www.grsecurity.net/  If you're going to 
be allowing shell access you'll probably want something like grsec 
anyway, among other things.

Hope that helps.

Steve

On Tue, Oct 28, 2003 at 10:56:53PM -0500, Dan MacNeil wrote:
 
 For a box that will have limited shell access, I'm looking for something
 that will log all commands. The sudo log is nice but not everything is run
 through sudo.
 
 There won't be many privacy issues as most users won't have shell.
 
 The goal is to review a daily report for anything unexpected: stuff like:
 
 tar -xzf rootkit.tar.gz



