Yeah, that might be a good idea :-) Well, I'm doing great; I answer my
email about once a week!
Well, I've thought about that occasionally - restoring, that is. I guess
I'd have to just reload the system and then copy the target files back to
their original places.
I'll have to play with that on
[EMAIL PROTECTED] wrote:
Hey thanks Mark! Sorry to take so long responding; I forgot to renew my
domain name and wasn't getting any mail - please don't ask :}
thanks again! mike
No problem Mike. Glad I could help. I have yet to need to restore from
any of the backups I've been making with it, (
> On Wed, 2003-06-18 at 09:32, [EMAIL PROTECTED] wrote:
>> Hey thanks Mark! Sorry to take so long responding; I forgot to renew my
>> domain name and wasn't getting any mail - please don't ask :}
>> thanks again! mike
>
> Mike
>
> You blew it ... you could have blamed it on the problems with s
On Wed, 2003-06-18 at 09:32, [EMAIL PROTECTED] wrote:
> Hey thanks Mark! Sorry to take so long responding; I forgot to renew my
> domain name and wasn't getting any mail - please don't ask :}
> thanks again! mike
Mike
You blew it ... you could have blamed it on the problems with sympa
and h
Hey thanks Mark! Sorry to take so long responding; I forgot to renew my
domain name and wasn't getting any mail - please don't ask :}
thanks again! mike
> Michael Holt wrote:
>> Hey,
>> I'm looking for some help on my little backup script for my server. I'm
>> not a real savvy scripter so pleas
Michael Holt wrote:
Hey,
I'm looking for some help on my little backup script for my server. I'm
not a real savvy scripter so please bear with me :)
I googled around a while back and put together a backup script that has served
my purpose well until now. It's just a few lines as you will see and I
use
Thanks Jack and Michael for the pointers! Sorry to take so long to get
back. I've used the --exclude option as Michael suggested for now. Also
thanks for pointing out the -z option for tar; believe it or not I thought
I was actually using that already, I guess my biggest problem is not
paying at
The man page for tar will show you the several options that are available;
one of them is --exclude.
You might also want to look at --compress and --gzip, which compress
the data as it is being tarred.
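As an illustrative sketch of those options (all paths here are made up for the demo):

```shell
#!/bin/sh
# Sketch: tar up a directory while skipping a subdirectory with
# --exclude, and gzip the archive with -z (short for --gzip).
set -e

src=$(mktemp -d)                       # stand-in for the real source tree
mkdir -p "$src/data" "$src/cache"
echo "keep me" > "$src/data/file.txt"
echo "skip me" > "$src/cache/junk.tmp"

backup=/tmp/backup-demo.tar.gz
tar -czf "$backup" --exclude='cache' -C "$src" .

# Listing the archive shows data/file.txt but nothing under cache/.
tar -tzf "$backup"
```

(-z gzips on the fly; --compress uses the older compress(1) format, so gzip is usually the better choice.)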
Mike
On Sun, 2003-06-08 at 21:51, Michael Holt wrote:
> Hey,
> I'm looking for some help on my little b
Why reinvent the wheel? Go to freshmeat.net and type "cd backup" into
the search. cdbkup is the one I picked, and it seems to work. I only use
it for my MP3 repository, though; everything else is just rsynced to
multiple systems.
On Sun, 2003-06-08 at 21:51, Michael Holt wrote:
> Hey,
> I'm looking for s
Hey,
I'm looking for some help on my little backup script for my server. I'm
not a real savvy scripter so please bear with me :)
I googled around a while back and put together a backup script that has served
my purpose well until now. It's just a few lines as you will see and I
use it with cron to mak
On Monday 02 Jun 2003 7:29 am, Ray Warren wrote:
> On Sun, Jun 01, 2003 at 08:29:26PM +0100, Anne Wilson wrote:
> > I want to set up an additional task in the daily cron. The
> > script I wrote is
> >
> > mv -fu /Data/Backup/GnomeCard.vcf /Data/Backup/GnomeCard.vcf.bak
>
> after this step /Data/Ba
On Sun, Jun 01, 2003 at 08:29:26PM +0100, Anne Wilson wrote:
> I want to set up an additional task in the daily cron. The script I
> wrote is
>
> mv -fu /Data/Backup/GnomeCard.vcf /Data/Backup/GnomeCard.vcf.bak
after this step /Data/Backup/GnomeCard.vcf no longer exists if it has
been modified
So if I understand it correctly, there is no .bak file in the beginning?
mv and cp don't see this file, so they assume the source file is newer than the nonexistent destination file.
Steven
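A minimal demonstration of that point - with no .bak present, -u has nothing to compare against, so the mv always happens and the source file is gone afterwards (the directory path here is a stand-in):

```shell
#!/bin/sh
# Show that mv -fu moves unconditionally when the destination is missing.
set -e
dir=/tmp/mv-u-demo
rm -rf "$dir" && mkdir -p "$dir"
echo "contacts" > "$dir/GnomeCard.vcf"

# No GnomeCard.vcf.bak exists yet, so -u cannot find a newer destination:
mv -fu "$dir/GnomeCard.vcf" "$dir/GnomeCard.vcf.bak"

ls "$dir"    # only GnomeCard.vcf.bak is left
```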
On Sun, 2003-06-01 at 21:56, Anne Wilson wrote:
On Sunday 01 Jun 2003 8:44 pm, you wrote:
> (I'm ma
I want to set up an additional task in the daily cron. The script I
wrote is
mv -fu /Data/Backup/GnomeCard.vcf /Data/Backup/GnomeCard.vcf.bak
cp -fpu /home/anne/GnomeCard.vcf /Data/Backup/GnomeCard.vcf
mv -fu /Data/Backup/Anne.ics /Data/Backup/Anne.ics.bak
cp -fpu /home/anne/KOrganizer/Anne.ics
> "Craig" == Craig Sprout <[EMAIL PROTECTED]> writes:
Craig> At 12:35 AM 7/5/2001 -0700, faisal gillani wrote:
>> i am stuck in how can i give the answers to the
>> command automatically ... is
>> it possible ?
Craig> As long as the answer is 'y', there is a command calle
From the fsck man page:
Currently, standardized file system-specific options are somewhat in flux.
Although not guaranteed, the following options are supported by most file
system checkers:
-a Automatically repair the file system without any questions (use
this option with caution).
Hi,
If you want the same answer to all questions for a particular command,
there is an amazing little utility by the name of "yes"
By default it echoes an endless stream of y characters. If you pass it
anything as an argument, then that is what is echoed endlessly instead.
You can almost cert
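For instance (piped through head so the endless stream terminates):

```shell
#!/bin/sh
# Default output is an endless stream of "y"; an argument replaces it.
yes | head -n 3      # prints three lines of "y"
yes no | head -n 2   # prints two lines of "no"
```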
At 12:35 AM 7/5/2001 -0700, faisal gillani wrote:
>i am stuck in how can i give the answers to the
>command automatically ... is
>it possible ?
As long as the answer is 'y', there is a command called 'yes' that will
output nothing but y's. You can pipe that output to fsck (which, I assume
is t
faisal gillani wrote:
> can any one help me in making a small script all i
> want is that my system
> runs itself in single user mode , then run fsck on / ,
> then return to normal
> mode...
>
> i know the commands
>
> 1, init 1
> 2, fsck /
> 3, init 6
>
> i am stuck in how can i give the a
hello there
Can anyone help me make a small script? All I want is for my system to
run itself in single-user mode, then run fsck on /, and then return to
normal mode...
I know the commands:
1. init 1
2. fsck /
3. init 6
I am stuck on how to give the answers to the command automatically
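One common approach for the fsck step is to pipe `yes` into the prompting command - shown here as a hedged sketch, with a shell function standing in for fsck, since actually running fsck on a mounted root is unsafe (most checkers also accept -a or -y style options, as the man page excerpt elsewhere in the thread notes):

```shell
#!/bin/sh
# fake_fsck stands in for an interactive checker that asks three
# yes/no questions on stdin (real fsck is not safe to run here).
fake_fsck() {
  n=0
  while [ "$n" -lt 3 ] && read -r answer; do
    n=$((n + 1))
    echo "question $n answered: $answer"
  done
}

# An endless stream of "y" answers however many questions are asked;
# when the reader stops, yes exits on the broken pipe.
yes | fake_fsck > /tmp/fake-fsck-out.txt
cat /tmp/fake-fsck-out.txt
```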
I have a little log-analysis perl script that I am trying to run. When
starting from a terminal, it works fine with the following in an executable script:
#! /bin/bash
echo "Mail Stats Log Analysis"
echo ""
su - root -c "DISPLAY=$DISPLAY; export DISPLAY; aterm -bg black -fg yellow -e perl
/usr/bi