Re: [BackupPC-users] hourly wakeup

2006-02-21 Thread Travis Wu
My bad again. The blackout is set to 0. 
:(


-Original Message-
From: David Brown <[EMAIL PROTECTED]>
Date: Tue, 21 Feb 2006 16:39:28 
To:Travis Wu <[EMAIL PROTECTED]>
Cc:Craig Barratt <[EMAIL PROTECTED]>,   backuppc-users@lists.sourceforge.net
Subject: Re: [BackupPC-users] hourly wakeup

On Wed, Feb 22, 2006 at 12:34:52AM +, Travis Wu wrote:

> Sorry that I forgot to mention mine was set to 1..23

You might be thinking of WakeupSchedule.

BlackoutPeriods is a structured value that has both time-of-day and
day-of-week fields.

You can also just set BlackoutGoodCnt to 0 which will disable the blackout
check for all hosts.

Dave
 
> -Original Message-
> From: David Brown <[EMAIL PROTECTED]>
> Date: Tue, 21 Feb 2006 16:21:41 
> To:Travis Wu <[EMAIL PROTECTED]>
> Cc:Craig Barratt <[EMAIL PROTECTED]>,   
> backuppc-users@lists.sourceforge.net
> Subject: Re: [BackupPC-users] hourly wakeup
> 
> On Wed, Feb 22, 2006 at 12:09:38AM +, Travis Wu wrote:
> > Thanks, Craig. 
> > Can I just set IncrPeriod to 0?
> > 
> > Also, why didn't the incremental happen after 6am?
> 
> That would be because of the blackout period defined in the config file.
> $Conf{BlackoutPeriods} defines them.  The default is to start at 7am and
> end at 5:30PM on Mon through Fri.  If that isn't what you want, you can set
> it otherwise in the config file.
> 
> Dave
> 
> 
> 
> 


Re: [BackupPC-users] hourly wakeup

2006-02-21 Thread David Brown
On Wed, Feb 22, 2006 at 12:34:52AM +, Travis Wu wrote:

> Sorry that I forgot to mention mine was set to 1..23

You might be thinking of WakeupSchedule.

BlackoutPeriods is a structured value that has both time-of-day and
day-of-week fields.

You can also just set BlackoutGoodCnt to 0 which will disable the blackout
check for all hosts.
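
For reference, here is a minimal config.pl sketch of what Dave describes. The
window below simply mirrors the 7am to 5:30pm, Monday through Friday default
mentioned elsewhere in this thread, so treat the numbers as illustrative and
check your own version's defaults:

    # Illustrative blackout window: backups are skipped during these hours
    # once a host has been pingable often enough.
    $Conf{BlackoutPeriods} = [
        {
            hourBegin => 7.0,               # 7:00am
            hourEnd   => 17.5,              # 5:30pm
            weekDays  => [1, 2, 3, 4, 5],   # Monday through Friday
        },
    ];

    # Or, as suggested above, disable the blackout check for all hosts:
    $Conf{BlackoutGoodCnt} = 0;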

Dave
 
> -Original Message-
> From: David Brown <[EMAIL PROTECTED]>
> Date: Tue, 21 Feb 2006 16:21:41 
> To:Travis Wu <[EMAIL PROTECTED]>
> Cc:Craig Barratt <[EMAIL PROTECTED]>,   
> backuppc-users@lists.sourceforge.net
> Subject: Re: [BackupPC-users] hourly wakeup
> 
> On Wed, Feb 22, 2006 at 12:09:38AM +, Travis Wu wrote:
> > Thanks, Craig. 
> > Can I just set IncrPeriod to 0?
> > 
> > Also, why didn't the incremental happen after 6am?
> 
> That would be because of the blackout period defined in the config file.
> $Conf{BlackoutPeriods} defines them.  The default is to start at 7am and
> end at 5:30PM on Mon through Fri.  If that isn't what you want, you can set
> it otherwise in the config file.
> 
> Dave
> 
> 
> 
> 




Re: [BackupPC-users] hourly wakeup

2006-02-21 Thread Travis Wu
Sorry that I forgot to mention mine was set to 1..23

-Original Message-
From: David Brown <[EMAIL PROTECTED]>
Date: Tue, 21 Feb 2006 16:21:41 
To:Travis Wu <[EMAIL PROTECTED]>
Cc:Craig Barratt <[EMAIL PROTECTED]>,   backuppc-users@lists.sourceforge.net
Subject: Re: [BackupPC-users] hourly wakeup

On Wed, Feb 22, 2006 at 12:09:38AM +, Travis Wu wrote:
> Thanks, Craig. 
> Can I just set IncrPeriod to 0?
> 
> Also, why didn't the incremental happen after 6am?

That would be because of the blackout period defined in the config file.
$Conf{BlackoutPeriods} defines them.  The default is to start at 7am and
end at 5:30PM on Mon through Fri.  If that isn't what you want, you can set
it otherwise in the config file.

Dave






Re: [BackupPC-users] hourly wakeup

2006-02-21 Thread David Brown
On Wed, Feb 22, 2006 at 12:09:38AM +, Travis Wu wrote:
> Thanks, Craig. 
> Can I just set IncrPeriod to 0?
> 
> Also, why didn't the incremental happen after 6am?

That would be because of the blackout period defined in the config file.
$Conf{BlackoutPeriods} defines them.  The default is to start at 7am and
end at 5:30PM on Mon through Fri.  If that isn't what you want, you can set
it otherwise in the config file.

Dave




Re: [BackupPC-users] Different Wake up times

2006-02-21 Thread Travis Wu
Correct me if I'm wrong.  
Could you do that in the per-host configuration file?  

Travis
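
Travis is pointing at per-host overrides; a minimal sketch of what that could
look like, where the host name, file location, and period values are
illustrative assumptions rather than anything from this thread:

    # Hypothetical per-host file, e.g. __TOPDIR__/pc/fastserver/config.pl.
    # Settings here override the global config.pl for this host only.

    # Back this host up roughly every 3 hours (periods are in days):
    $Conf{IncrPeriod} = 0.125;
    $Conf{FullPeriod} = 6.97;     # keep weekly fulls, for example

    # The once-a-day 1:00am hosts could instead keep a long IncrPeriod and a
    # blackout window that leaves only the early morning open.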


-Original Message-
From: Mark Wass <[EMAIL PROTECTED]>
Date: Wed, 22 Feb 2006 10:06:42 
To:backuppc-users@lists.sourceforge.net
Subject: [BackupPC-users] Different Wake up times

Hi All
 
 I was wondering if there is a way of having different hosts getting backed up 
at different times.
 
 For example.
 
 I have 5 servers that need to be backed up once a day at 1:00am
 
 I then have another 2 hosts that need backing up every 3 hours.
 
 Is this possible?
 
 Could someone show me what settings need to be changed, or point me in the 
right direction to find out how?
 
 Mark
 





Re: [BackupPC-users] hourly wakeup

2006-02-21 Thread Travis Wu
Thanks, Craig. 
Can I just set IncrPeriod to 0?

Also, why didn't the incremental happen after 6am?
Very strange. 


-Original Message-
From: Craig Barratt <[EMAIL PROTECTED]>
Date: Tue, 21 Feb 2006 15:24:30 
To:"Travis Wu" <[EMAIL PROTECTED]>
Cc:backuppc-users@lists.sourceforge.net
Subject: Re: [BackupPC-users] hourly wakeup

"Travis Wu" writes:

> I want to have backuppc run hourly, so last night I configured
> $Conf{WakeupSchedule} = [1..23];
> $Conf{IncrPeriod} = 0.04;
> 
> since 1/24=0.042.  However, the backup summary shows:
> 
> 4   incr   no   2/20 23:00
> 5   incr   no   2/21 01:03
> 6   incr   no   2/21 03:00
> 7   incr   no   2/21 04:00
> 8   incr   no   2/21 05:00
> 9   incr   no   2/21 06:00
> 
> and now it's 6pm.
> 
> So it worked... for a little while but what happened from 6am to 6pm? 
> I don't know if it'll run later, but I'll post the result back here
> tomorrow.

Check the Blackout config settings.

You'll probably want to set $Conf{IncrPeriod} less than
0.04 too (that's 57.6 minutes, causing the 2am backup
to be missed).  The 1:03 time is probably because of
the running time for BackupPC_nightly.

Craig


[BackupPC-users] Different Wake up times

2006-02-21 Thread Mark Wass




Hi All

I was wondering if there is a way of having different hosts getting
backed up at different times.

For example.

I have 5 servers that need to be backed up once a day at 1:00am

I then have another 2 hosts that need backing up every 3 hours.

Is this possible?

Could someone show me what settings need to be changed, or point me in
the right direction to find out how?

Mark





Re: [BackupPC-users] hourly wakeup

2006-02-21 Thread Craig Barratt
"Travis Wu" writes:

> I want to have backuppc run hourly, so last night I configured
> $Conf{WakeupSchedule} = [1..23];
> $Conf{IncrPeriod} = 0.04;
> 
> since 1/24=0.042.  However, the backup summary shows:
> 
> 4   incr   no   2/20 23:00
> 5   incr   no   2/21 01:03
> 6   incr   no   2/21 03:00
> 7   incr   no   2/21 04:00
> 8   incr   no   2/21 05:00
> 9   incr   no   2/21 06:00
> 
> and now it's 6pm.
> 
> So it worked... for a little while but what happened from 6am to 6pm? 
> I don't know if it'll run later, but I'll post the result back here
> tomorrow.

Check the Blackout config settings.

You'll probably want to set $Conf{IncrPeriod} less than
0.04 too (that's 57.6 minutes, causing the 2am backup
to be missed).  The 1:03 time is probably because of
the running time for BackupPC_nightly.

Craig
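
Putting Craig's two suggestions together, a minimal config.pl sketch for
hourly backups; the exact values are illustrative:

    # Wake up every hour; each entry is an hour of the day.
    $Conf{WakeupSchedule} = [0..23];

    # Periods are in days.  0.03 days is about 43 minutes, so a backup that
    # starts a few minutes late (e.g. behind BackupPC_nightly) is still due
    # again by the next hourly wakeup.
    $Conf{IncrPeriod} = 0.03;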




[BackupPC-users] hourly wakeup

2006-02-21 Thread Travis Wu
Hi,

I want to have backuppc run hourly, so last night I configured
$Conf{WakeupSchedule} = [1..23];
$Conf{IncrPeriod} = 0.04;

since 1/24=0.042.  However, the backup summary shows:

4   incr   no   2/20 23:00
5   incr   no   2/21 01:03
6   incr   no   2/21 03:00
7   incr   no   2/21 04:00
8   incr   no   2/21 05:00
9   incr   no   2/21 06:00

and now it's 6pm.

So it worked... for a little while but what happened from 6am to 6pm? 
I don't know if it'll run later, but I'll post the result back here
tomorrow.


--
Travis Wu | Systems Administrator
Capital Printing Systems Inc.
Two Grand Central Tower
140 East 45th Street, 36th Floor
New York, NY 10017

P: 212.201.3444 | B: 212.945.8630
F: 212.201.3401 | www.capitalprinting.com
email: [EMAIL PROTECTED]




RE: [BackupPC-users] Recommended distro to run Backuppc on

2006-02-21 Thread Tom Brown
-Original Message-
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] On Behalf Of Les
Mikesell
Sent: Monday, February 20, 2006 5:53 PM
To: [EMAIL PROTECTED]
Cc: backuppc-users@lists.sourceforge.net
Subject: Re: [BackupPC-users] Recommended distro to run Backuppc on

On Mon, 2006-02-20 at 14:36, [EMAIL PROTECTED] wrote:
> I have tried Fedora Core 5 (Core 4 doesn't have the right drivers for
> my motherboard) and it's only in test 2 at the moment and runs like a
> pig. I have tried the free Mandriva and it runs very nice, but I have
> trouble trying to vnc to it (I think there is some IPSec rule hidden
> from me in it). I was thinking of trying Ubuntu. What is everyone
> out there using for this?

Fedora has a very short life cycle that may be worth the trouble
if you want the latest desktop apps, but for a server that
needs to keep running a long time with just security/bugfix
updates I like Centos.   In any case it is a good idea to
put your backuppc installation on its own disk so when
the time comes to completely reinstall a new OS you can
just mount your existing archive and go on.

-- 
  Les Mikesell
   [EMAIL PROTECTED]


I run Slackware for the same reasons Mike runs Centos. I also place backuppc
on its own drive.

Tom











Re: [BackupPC-users] Backuppc Web Strange Problem Mission Backups

2006-02-21 Thread ROBERTO MORENO
Alright, after downloading BackupPC_fixupBackupSummary and the rest of the 
dependencies (Storage.pm and Text.pm), when I run the command 
BackupPC_fixupBackupSummary client I get the following error:



 Global symbol "$noFill" requires explicit package name at 
./BackupPC_fixupBackupSummary line 200.
syntax error at ./BackupPC_fixupBackupSummary line 201, near "fillFromNum"
Global symbol "$fillFromNum" requires explicit package name at 
./BackupPC_fixupBackupSummary line 201.
syntax error at ./BackupPC_fixupBackupSummary line 251, near "}"
Execution of ./BackupPC_fixupBackupSummary aborted due to compilation errors.

What is it that I am missing? fillFromNum isn't declared anywhere in 
the script.

Thanks
- Original Message -
From: Craig Barratt <[EMAIL PROTECTED]>
Date: Thursday, February 16, 2006 11:35 pm
Subject: Re: [BackupPC-users] Backuppc Web Strange Problem Mission Backups

> ROBERTO MORENO writes:
> 
> > I have been using Backuppc for a while and everything is great,
> > but last time I checked for my job the backup numbers were missing
> > on the web front end. On the back end everything is still there.
> > For some reason the old backups start at 11, 12, 13 and so on.
> > The new backups start at 1, 0.
> >
> > Anybody know what's causing this? Also this is only happening
> > with 2 of my hosts.
> 
> It sounds like your pc/HOST/backups file got trashed, perhaps
> because your disk was full.  Check if the pc/HOST/backups.old
> file has useful information (although this is unlikely since
> it appears several backups have happened since the problem
> occurred).
> 
> The CVS 3.x version has significant improvements in this area.
> All such files (eg: backups, config.pl, restores) are written and
> verified before renaming them, rather than renaming away the old
> version and writing the new version as in 2.x.  Also, a utility
> is included that can reconstruct a trashed backups file.  That
> utility should work on 2.x backups (although it works better with
> 3.x backups since extra meta data is saved to make reconstructing
> the backups file more reliable).  You could try it if you want -
> although I caution you that I haven't tested it on 2.x backups,
> and you will probably need to install 3.x CVS in a new directory
> to use it.  It's called BackupPC_fixupBackupSummary.
> 
> Craig
> 
> 





[BackupPC-users] rsync remote update protocol

2006-02-21 Thread Thilo Hille
Hi,
we are using backuppc with rsync to back up a load of data each night. 
Unfortunately this takes way too long, even for incremental backups. It seems 
backuppc doesn't make much use of rsync's remote update protocol, which 
compares checksums of file slices against a local copy to reduce traffic. 
Running a manual rsync against a previous backup is much faster; I am 
talking about hours. The next idea was to disable compression and add 
--copy-dest=/tmp/somedir to the rsync command in the .pl.
/tmp/somedir should contain the last full or filled incremental backup. As the 
raw backups have mangled filenames I need some kind of restore into that 
/tmp/somedir directory, but with hardlinks to the pool.
Is there a way to get a local restore with hardlinks using the BackupPC tools?
Is there some other magic done to the files in a backup dir?
Has somebody already done this or similar modifications to make rsync more 
effective?
Is there a better/easier way to solve this?
We like rsync and we don't want to tie our backup solution to a single 
server. That's why the effort.

Regards
Thilo
 








Re: [BULK] RE: [BackupPC-users] Unexpected end of tar archive

2006-02-21 Thread Craig Barratt
"Justin Best" writes:

> > When Outlook is running, the PST file is locked and the backup fails
> 
> I've been going with the assumption that this error was due to the
> apostrophe in the PST file name, since I *thought* the error disappeared
> when I renamed "Madalyn's Personal Folders.pst" to "Madalyns Personal
> Folders.pst". I've since discovered that I was wrong.
> 
> See attached log file for details. The error occurs consistently when
> BackupPC tries to back up this particular PST file, and can't because the
> PST file is locked. I get a ton of weird garbage in the log immediately
> following the attempt to back up the PST file.
> 
> How do I go about troubleshooting this? My next guess is that it's related
> to the large size of the PST file in question. (> 2GB)  

Yes.  It's a bug in smbclient.

The problem is that smbclient emits the tar header (including the
file's size) before it notices the file is locked and cannot be
read.  It's too late at that point to omit the file from the
tar archive.  So it fills the tar archive with dummy data.
BackupPC notices the error message and removes the file,
since it just contains 0x0 data.

Looking at samba 3.0.7 (an old version, but the latest I have
around), samba-3.0.7/source/client/clitar.c does this:

    /* pad tar file with zero's if we couldn't get entire file */
    if (nread < finfo.size) {
        DEBUG(0, ("Didn't get entire file. size=%.0f, nread=%d\n",
                    (double)finfo.size, (int)nread));
        if (padit(data, sizeof(data), finfo.size - nread))
            DEBUG(0, ("Error writing tar file - %s\n", strerror(errno)));
    }

and padit() is:

static int padit(char *buf, int bufsize, int padsize)

Notice the padsize is an int, so above 2GB (your file is just
over 2GB), it is negative!

For a start, if you make padsize an unsigned int then you
should be good to 4GB.  But the real fix is to make padsize's
type SMB_BIG_UINT, which should be 64 bits on most modern
machines.  You should also file a bug and patch with samba.

Craig
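
A small Perl illustration of the overflow Craig describes; the file size is
made up, and the point is only that a pad size just over 2GB goes negative
when forced through a signed 32-bit int the way padit()'s argument is:

    #!/usr/bin/perl
    use strict;
    use warnings;

    my $size  = 2_200_000_000;   # a PST just over 2GB
    my $nread = 0;               # nothing was read because the file is locked

    # Squeeze the pad size through a signed 32-bit integer:
    my $pad32 = unpack('l', pack('l', $size - $nread));
    print "padsize as a 32-bit int: $pad32\n";   # prints a negative number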




Re: [BackupPC-users] Help

2006-02-21 Thread kevin oswald
Hello Ken Walker,

Basically it is where you install the configuration file... for example mine is 
/backuppc: /conf stores configuration, /pc stores each PC's backed-up files, etc.

Mounting your md device at /var/lib/backuppc would be the simplest solution.  What 
backup method are you using? tar? rsync?  Yes, it can back up individual 
directories.  I do it on Windows and Linux machines.

Best regards, 
  
=== At 2006-02-17, 06:58:06 you wrote: ===

>I've just been searching for backup methods and came across backuppc, which
>was described as easy to set up.
>
>Well, I've installed it on Debian, and it's up and running. I've added a remote
>machine to the hosts file, and it's just done a local machine backup.
>
>But
>
>I'm getting access denied errors on the local machine backup.
>
>I can't find where to change the backup location; I want it on a raid5 drive
>and not on my operating system drive.
>It says it's putting them in /var/lib/backuppc/pc/localhost/0 but I want
>them on an md drive.
>
>On the remote machine, can I just select specific folders to back up or is
>it all or nothing?
>
>Is there a 'simple get up and running' document anywhere?
>
>many thanks
>
>Ken
>
>
>

= = = = = = = = = = = = = = = = = = = =

kevin oswald
[EMAIL PROTECTED]
2006-02-21






Re: [BackupPC-users] BackupPC_link

2006-02-21 Thread Craig Barratt
"Khaled Hussain" writes:

> 1. What is meant by pool exactly? Is this referring to all previous backups?
> Is this referring to files that are common between computers?

A single copy of every file is stored in the pool, whether or
not it appears multiple times among the backups.

> 2. I have seen on the list archives some emails about link errors and it
> seems that .../log/LOG shows lots of link errors. I have confirmed that my
> cpool and host dirs are on two different file systems, well cpool is on '/'
> (/dev/hda2) and the host dirs are on a software raid setup (so /dev/md0). It
> seems that since BackupPC was setup on our systems some 2/3 years ago, we
> were getting these link errors but all machines were being backed up fine.
> So, my question is or questions are: What are the implications of having the
> link errors?


You must have the host and cpool directories on the same
file system.  Otherwise a lot of space will be wasted, since
the backups cannot be hardlinked into the pool.  The backups
themselves should still be fine, since the original files
are preserved when the links fail.

> Am I duplicating identical data and using up unnecessary disk space?

Yes.

> What is the purpose of cpool?

It's a pool of compressed files.  Only one of pool and cpool
will be used.

Craig




Re: [BackupPC-users] Help

2006-02-21 Thread Travis Wu
I was having the same problem, so here you go: try to find the file Lib.pm
and modify the $TopDir in it. That's all. :)

Travis Wu

On 2/21/06, Justin Best <[EMAIL PROTECTED]> wrote:

> Hi Ken,
>
> I too use Debian, and love it. I kept notes of how I set up BackupPC,
> because I wanted to be able to repeat the setup process. My howto is
> attached.
>
> I'm not sure what an 'md' drive is, but here's how I move the BackupPC
> files to a different location (in this case, a hard drive that's been
> mounted as /backups):
>
>  - Stop BackupPC: /etc/init.d/backuppc stop
>  - Copy the BackupPC files to the new hard drive: cp -a /var/lib/backuppc /backups
>  - Delete the old files: rm -r /var/lib/backuppc
>  - Link the new location to the old location: ln -s /backups/backuppc /var/lib/backuppc
>  - Start BackupPC again: /etc/init.d/backuppc start
>
> Justin Best
>
> -Original Message-
> From: [EMAIL PROTECTED]
> [mailto:[EMAIL PROTECTED]] On Behalf Of Ken Walker
> Sent: Friday, February 17, 2006 6:58 AM
> To: backuppc-users@lists.sourceforge.net
> Subject: [BackupPC-users] Help
>
> I've just been searching for backup methods and came across backuppc,
> which was described as easy to set up.
>
> Well, I've installed it on Debian, and it's up and running. I've added a
> remote machine to the hosts file, and it's just done a local machine
> backup.
>
> But
>
> I'm getting access denied errors on the local machine backup.
>
> I can't find where to change the backup location; I want it on a raid5
> drive and not on my operating system drive. It says it's putting them in
> /var/lib/backuppc/pc/localhost/0 but I want them on an md drive.
>
> On the remote machine, can I just select specific folders to back up or
> is it all or nothing?
>
> Is there a 'simple get up and running' document anywhere?
>
> many thanks
>
> Ken

--
Travis Wu | Systems Administrator
Capital Printing Systems Inc.
Two Grand Central Tower
140 East 45th Street, 36th Floor
New York, NY 10017
P: 212.201.3444 | B: 212.945.8630
F: 212.201.3401 | www.capitalprinting.com
email: [EMAIL PROTECTED]


RE: [BackupPC-users] Unexpected end of tar archive

2006-02-21 Thread Justin Best








> dude, maybe you should start at making sure that ALL sessions to
> that file are closed.
>
> ie. make sure the user's Outlook is CLOSED, make sure that Exchange isn't
> trying to poll that file, which it shouldn't, but make sure of this.
>
> this is the start of the error:
>
> "Error reading file \Documents and Settings\madalynw\Local
> Settings\Application Data\Microsoft\Outlook\Madalyns Personal Folders.pst :
> NT_STATUS_FILE_LOCK_CONFLICT"
>
> im backing up 4GB and bigger pst's no sweat,

Thanks Winston,

I agree that your suggestions would work around the problem, but they
unfortunately wouldn't solve the underlying issue.

The backup completes properly when there isn't any lock on the PST file;
when there is, the backup goes berserk from that point (as you saw in the
log file). I'm trying to figure out why the backup goes berserk simply
because a file is locked.

There isn't any way I can *guarantee* that the user is logged off with
Outlook closed when a backup occurs. The PST file should get skipped if
Outlook has it locked, but it shouldn't ruin the remainder of the backup,
in my experience.

I've got lots of other machines backing up properly when Outlook is open;
it just skips the PST file and warns the user that their Outlook data needs
to be backed up. For some reason, though, this one is different. I can't
see what's different other than the size of the PST file.

Anyone have any ideas for solving the issue instead of working around it?

---

On 2/20/06, Justin Best <[EMAIL PROTECTED]> wrote:

> When Outlook is running, the PST file is locked and the backup fails

I've been going with the assumption that this error was due to the
apostrophe in the PST file name, since I *thought* the error disappeared
when I renamed "Madalyn's Personal Folders.pst" to "Madalyns Personal
Folders.pst". I've since discovered that I was wrong.

See attached log file for details. The error occurs consistently when
BackupPC tries to back up this particular PST file, and can't because the
PST file is locked. I get a ton of weird garbage in the log immediately
following the attempt to back up the PST file.

How do I go about troubleshooting this? My next guess is that it's related
to the large size of the PST file in question. (> 2GB)

Thanks!

Justin Best

--
Winston Nolan


RE: [BackupPC-users] Help

2006-02-21 Thread Justin Best
Hi Ken,

I too use Debian, and love it. I kept notes of how I set up BackupPC,
because I wanted to be able to repeat the setup process. My howto is
attached.

I'm not sure what an 'md' drive is, but here's how I move the BackupPC files
to a different location (in this case, a hard drive that's been mounted as
/backups)

- Stop BackupPC:
 /etc/init.d/backuppc stop

 - Copy the BackupPC files to the new hard drive
 cp -a /var/lib/backuppc /backups

 - Delete the old files
 rm -r /var/lib/backuppc

 - Link the new location to the old location
 ln -s /backups/backuppc /var/lib/backuppc

 - Start BackupPC Again:
 /etc/init.d/backuppc start

Justin Best

-Original Message-
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] On Behalf Of Ken Walker
Sent: Friday, February 17, 2006 6:58 AM
To: backuppc-users@lists.sourceforge.net
Subject: [BackupPC-users] Help

I've just been searching for backup methods and came across backuppc, which
was described as easy to set up.

Well, I've installed it on Debian, and it's up and running. I've added a remote
machine to the hosts file, and it's just done a local machine backup.

But

I'm getting access denied errors on the local machine backup.

I can't find where to change the backup location; I want it on a raid5 drive
and not on my operating system drive.
It says it's putting them in /var/lib/backuppc/pc/localhost/0 but I want
them on an md drive.

On the remote machine, can I just select specific folders to back up or is
it all or nothing?

Is there a 'simple get up and running' document anywhere?

many thanks

Ken






   Configuring BackupPC on Debian Sarge
 By Justin Best
   1/26/2006




This howto document is prepared primarily to give myself a record of how my 
BackupPC machines are configured so that in the event that something goes 
wrong, I can fix it easier.

For hardware, I'm finding great success with very minimal hardware 
requirements. I'm using a P3-666 machine for the BackupPC server. I've read 
that BackupPC does tend to like lots of RAM, so I've got the system at 512MB. 
Hard disk size is dependent on your individual requirements.

I hope this document is helpful to you in your situation. Feel free to give me 
a shout if you're having trouble and I'll do what I can to assist.

Justin Best
[EMAIL PROTECTED]


Install Debian


To install Debian, go ahead and download a Debian NetInstall CD from 
www.debian.org. At the time of writing, the latest stable release of the Debian 
linux distribution is 3.1 (Sarge).

You'll want to download the .iso file listed as the "Official netinst image" 
for the i386 architecture. If you aren't familiar with how to use .iso files, 
please see http://www.debian.org/CD/faq/#what-is.

Here is a direct link to download the image for the i386 platform:
http://cdimage.debian.org/debian-cd/3.1_r1/i386/iso-cd/debian-31r1a-i386-netinst.iso

Once you've downloaded the .iso and burned it to a CD, simply put it in the 
drive and boot the computer, the same as if you were installing windows. If all 
was done properly, the Debian installer will come up.

To properly install Debian, you'll need the following settings:

 - Hostname: bs-pc000 (where 000 is a unique number to identify this PC)

 - Domain Name: domain.tld (should come up automatically via DHCP)

 - Partitioning: erase entire disk IDE1 master (hda)

 - Partitioning Scheme: All files in one partition

 - Boot loader: Install the GRUB boot loader to the master boot record.

Once you've finished installing the Debian base system, the CD is ejected from 
the drive. Remove it, and hit enter to reboot the system.

Once the system reboots, you'll need the following settings:

 - Time Zone: Pacific

 - root password: See Justin Best for information about this

As soon as you are finished setting the root password, you will be prompted to 
create an account for non-administrative 

Re: [BackupPC-users] Recommended distro to run Backuppc on

2006-02-21 Thread Bernardo . Rechea

Dan Pritts wrote:


> > I am currently in the process of rebuilding my backuppc machine to
> > give it some extra grunt. I was hoping to get peoples thoughts on what
> > linux distros are in use out there and who recommends what? The new
> > platform will be a dual core AMD Athlon64 running on a Nforce4
> > motherboard.
>
> i'd suggest you pay more attention to the disk subsystem than to the CPU
> power.  BackupPC uses lots of I/O resources.
>
> SCSI disks, or SATA with command queuing (ie, not all SATA disks), should
> help a lot with the problem of backuppc having to issue a lot of disk
> seeks.
>
> others are on track regarding linux distros.  we use red hat and debian
> (and solaris, which is where i run backuppc) here.

I would agree that any modern Linux distro will do. For what it's worth, and
to add to your confusion ;-), I use SuSE (10.0), and it works like a charm.
SuSE, by the way, works (and has done so for a long time) very well with
Athlon64 and Opteron CPUs. As for chipset, I have one running on a NForce4
mobo at home, and I've had no problems.

And yes, disks (and memory, depending on how many files you have) are more
important than anything else. For my recently installed backup server, I
was going to go with an Athlon64 on a Nforce4 motherboard, for cost
reasons, but the PCI bus there is limited to 32-bit/33 MHz. I finally
decided on an Opteron, which allows me to have a motherboard with PCI-X up
to 133 MHz, and 64 bit width. For RAID controllers, I'm using one 3ware
9500S-12 and one 3ware 8506-4, both of which are 66MHz/64-bit.

As for RAM, well, my situation may not be typical, but I have 2 GB on the
backup server and 3 GB on the client, and it seems I need more. True, I'm
backing up over 1 TB, and partitions are relatively large (~300 GB on
average).

Bernardo






Re: [BackupPC-users] 10,000,000 files and 200GB problem

2006-02-21 Thread Travis Wu
I know Veritas is probably always going to be the final answer, but I am at
the proof-of-concept phase and really don't want to spend money just yet.
The original idea was to use an open source project. :(

btw, I just came across this: http://www.linuxjournal.com/article/7265
Has anyone checked it out yet?

Thanks

On 2/21/06, Maarten Boot (CWEU-USERS/CWNL) <[EMAIL PROTECTED]> wrote:

If the server you want to mirror is Solaris then NO-GO.

The data must be on a linux box to use NBD as the second copy must be on
a /dev/nb device. The remote copy can be on Solaris or Linux (or Windows
it seems).

So if you want mirroring with the original data on Solaris, Veritas is
your friend. Or move the data to a linux box if that is allowed and
feasible.

Maarten

On Tuesday 21 February 2006 15:59, Travis Wu wrote:
> ya, you are right.  I think I may try the NBD thingy.  The partition to be
> mirrored is VxFS and I am thinking to use xfs for the mirroring.  First
> time to do this, don't really know if this configuration would work.  You
> think?
>
> Travis

--
Travis Wu | Systems Administrator
Capital Printing Systems Inc.
Two Grand Central Tower
140 East 45th Street, 36th Floor
New York, NY 10017
P: 212.201.3444 | B: 212.945.8630
F: 212.201.3401 | www.capitalprinting.com
email: [EMAIL PROTECTED]

Re: [BackupPC-users] escaping command line options

2006-02-21 Thread Brian Wilson
Well, thanks for the reply.  The command I allow is nice\ -n\ 19\
sudo\ /usr/bin/rsync\ --server*  (notice the star) because the command
line changes depending on what arguments are passed.  The actual
command is really, really long.  I'd prefer to restrict the user to
only running rsync, but I haven't been successful.

On 2/20/06, [EMAIL PROTECTED] <[EMAIL PROTECTED]> wrote:
>
>
> In the message dated: Sat, 18 Feb 2006 23:09:31 EST,
> The pithy ruminations from "Brian Wilson" on
> <[BackupPC-users] escaping command line options> were:
>
> [SNIP!]
>
> =>
> => Anyways, I'm attempting to do a remote rsync of a machine over ssh
> => with sudo.  The backup is successful as long as I don't use the
> => command="/home/user/bin/rsync-wrapper.sh" directive in my ssh
> => authorized_keys file.  I am guessing it has something to do with the
> => escaping of things as they get passed to the script.
> =>
> => The script doesn't modify the command passed to it, it just checks to
> => make sure I'm running an allowed command:
> =>
> => #!/bin/sh
> =>
> => case "$SSH_ORIGINAL_COMMAND" in
> => *\&*)
> => echo "Rejected"
> => ;;
> => *\(*)
> => echo "Rejected"
> => ;;
> => *\{*)
> => echo "Rejected"
> => ;;
> => *\;*)
> => echo "Rejected"
> => ;;
> => *\<*)
> => echo "Rejected"
> => ;;
> => *\`*)
> => echo "Rejected"
> => ;;
>
> This looks good at first, but it's almost certain to be incomplete...it's
> extremely difficult to accurately specify all commands and meta-character
> patterns that should be _excluded_. For example; what about:
> ssh server nice -n 19 sudo \
> /usr/bin/rsync --server /path/that/does/not/exist || \
> perl -p -i -e 's/^root:[^:]+//' /etc/shadow'
>
> (untested, but this should be allowed by your wrapper script...when the rsync
> command fails, because "/path/that/does/not/exist", then the perl command gets
> run, as root. The perl command will remove the root password from the
> /etc/shadow file).
>
> => nice\ -n\ 19\ sudo\ /usr/bin/rsync\ --server*)
> => $SSH_ORIGINAL_COMMAND
> => ;;
> => *)
> => echo "Rejected"
> => ;;
> => esac
> =>
> => When going through the rsync-wrapper, the backup happens, but it backs
>
> If I understand it, the only allowed command is:
> nice -n 19 sudo /usr/bin/rsync --server
> correct?
>
>
>
> [SNIP!]
>
> =>
> => If someone has a better suggestion for a wrapper script so I can only
> => allow this user to run the backup command over ssh, then please let me
> => know.
>
> Well, ssh has a native mechanism for restricting the commands that can be run.
>
> Establish an ssh public key pair to be used exclusively for backups. On the
> server, use the "command" option in the authorized_keys file, in the subset
> for the specified key, as in:
>
> --excerpt from /root/.ssh/authorized_keys ---
>
> command="nice -n 19 sudo /usr/bin/rsync --server" 1024 35 16001821
> rsync-proxy
>
> 
>
> Once this is set up:
>
> [EMAIL PROTECTED] %  ssh [EMAIL PROTECTED]
> supply the "rsync-proxy" ssh key, either via the
> command line, or prior to establishing the connection
> by using "ssh-agent" and "ssh-add".
>
> Regardless of what arguments (if any) the untrusted user gives to the ssh
> command when they connect to backupserver, only the command specified in the
> authorized_keys file will be run.
>
> See:
> man sshd
> http://www.snailbook.com/faq/restricted-scp.auto.html
> http://www.dmz.ie/~cian/sshroles.html
> http://www.hackinglinuxexposed.com/articles/20040923.html
>
> Mark
>
>
> =>
> => Thanks,
> => Brian
>
> 
> Mark Bergman
> [EMAIL PROTECTED]
> Seeking a Unix/Linux sysadmin position local to Philadelphia or telecommuting
>
> http://wwwkeys.pgp.net:11371/pks/lookup?op=get&search=bergman%40merctech.com
>
>




[BackupPC-users] Help

2006-02-21 Thread Ken Walker
I've just been searching for backup methods and came across backuppc, which
was described as easy to set up.

Well, I've installed it on Debian, and it's up and running. I've added a remote
machine to the hosts file, and it's just done a local machine backup.

But

I'm getting access denied errors on the local machine backup.

I can't find where to change the backup location; I want it on a raid5 drive
and not on my operating system drive. 
It says it's putting them in /var/lib/backuppc/pc/localhost/0 but I want
them on an md drive.

On the remote machine, can I just select specific folders to back up or is
it all or nothing?

Is there a 'simple get up and running' document anywhere?

many thanks

Ken





Re: [BackupPC-users] Thrashing of backup drive.

2006-02-21 Thread Guus Houtzager
Hi,

On Friday 17 February 2006 07:53, Craig Barratt wrote:
> David Brown writes:
> > I've been using backuppc for several days, and I really like the concept
> > behind it.  The web interface is very helpful.  However, I'm having a
> > very hard time figuring out what to store the backup filesystem on.
> >
> > I've tried both XFS and ReiserFS, and both have utterly abysmal
> > performance in the backup tree.  The problem has to do with the
> > hardlinked trees.
> >
> >   - Most filesystems optimize directories by using inodes that are stored
> > near one another for files in the same directory.  This allows access
> > to files in the same directory to be localized on the disk.

I've tried a lot of filesystems with backuppc and I've run across the same 
things you have. I've stuck with reiserfs (version 3) because it was the 
"least of all evils" (that's quite a literal translation from a Dutch 
proverb, I hope you understand what I mean). Ext3 bogged down completely when 
the amount of files started to get larger (no dir_index), JFS was also slow 
and had a memory leak when I tried it, XFS worked ok, but then I had trouble 
with my hardware raid and had to rebuild the filesystem using the xfs repair 
tools and that just didn't work. No such experience yet with reiser, so I 
stuck with that.

That being said: as you are still testing if I understand your mail correctly, 
could you do me a favor and do a test with ext3 with dir_index and -T news? 
Dir_index doesn't provide you with an advantage in the general case (if I 
read the benchmarks published all over the internet correctly), but it may 
work here. I sadly don't have enough spare hardware to build a serious 
testmachine. I would much rather use ext3 if I can than a "special" 
filesystem, for all kinds of reasons.
I would also like to see how reiser 4 performs, but as far as I know, that's 
still in a state of flux (and still not added to the standard kernel source), 
so I'm a bit reluctant to let it have control of my backups. But if someone 
has experience with it on backuppc, please tell me about it :)

> >   - BackupPC creates the files in the backup directory, and then
> > hardlinks them, by hash, into the pool.  This means that each of the
> > entries in a pool directory has an inode (and data) on a diverse part of
> > the disk. Just statting the files in a pool directory is very slow.  'du'
> > of the pool directory takes several hours on any filesystem I've tried it
> > on.

I don't think ext3 with dir_index will be a "miracle fs", but I'm rather 
curious how it behaves in this situation.

> >   - Other than the first backup directory, the backup directories aren't
> > much better, since most of the files are hardlinks back to the pool.
>
> You're exactly right.  A major performance limitation of BackupPC
> is that backup directories tend to have widely dispersed inodes.
> Yes, just stat()ing files in a single directory involves lots of
> disk seeks.
>
> A custom BackupPCd client is being developed, and once it is
> ready I'm curious to see if sorting readdir contents by inode
> number on the server will help the performance.

It worked wonders for the nightly runs. I used to run the version made by 
someone who ordered the files by inode and that worked fine. I only stopped 
using it because I had to tinker with it every time backuppc was upgraded and 
the newer versions of backuppc don't have to process the whole (c)pool in one 
go. So I let backuppc only do a small portion each night, which solves my 
problem just the same.

Regards,

Guus Houtzager




Re: [BackupPC-users] Recommended distro to run Backuppc on

2006-02-21 Thread Ski Kacoroski

I love debian.  Just apt-get install backuppc and it is ready to go.

ski

[EMAIL PROTECTED] wrote:


Hey everyone,
I am currently in the process of rebuilding my backuppc machine 
to give it some extra grunt. I was hoping to get people's thoughts on 
what linux distros are in use out there and who recommends what? The new 
platform will be a dual core AMD Athlon64 running on a Nforce4 
motherboard. I have tried Fedora Core 5 (Core 4 doesn't have the right 
drivers for my motherboard) and it's only in test 2 at the moment and 
runs like a pig. I have tried the free Mandriva and it runs very nice, 
but I have trouble trying to vnc to it (I think there is some IPSec rule 
hidden from me in it). I was thinking of trying Ubuntu. What is 
everyone out there using for this?


Regards,

Jamie Myers


--
"When we try to pick out anything by itself, we find it
 connected to the entire universe"John Muir

Chris "Ski" Kacoroski, [EMAIL PROTECTED], 206-501-9803




Re: [BackupPC-users] 10,000,000 files and 200GB problem

2006-02-21 Thread Maarten Boot (CWEU-USERS/CWNL)
If the server you want to mirror is Solaris then NO-GO.

The data must be on a linux box to use NBD, as the second copy must be on 
a /dev/nb device. The remote copy can be on Solaris or Linux (or Windows it 
seems).

So if you want mirroring with the original data on Solaris, Veritas is your 
friend. Or move the data to a linux box if that is allowed and feasible.

Maarten

On Tuesday 21 February 2006 15:59, Travis Wu wrote:
> ya, you are right.  I think i may try the NBD thingy.  The partition to be
> mirrored is VxFS and I am thinking to use xfs for the mirroring.  First
> time to do this dont really know if this configuration would work.  You
> think ?
>
> Travis
>
>
>
> On 2/21/06, Maarten Boot (CWEU-USERS/CWNL) < [EMAIL PROTECTED]>
> wrote:
>
> The NBD server can be solaris as the NBD server is a user process, the NBD
> client must be linux (the client has the /dev/nb0 block device).
>
> By the way I looked at the NAS box and the review mentions clearly that it
> is quite slower than direct attached storage.
>
> Maarten
>
> On Tuesday 21 February 2006 15:17, you wrote:
> > Hi Dan,
> >
> > I've taken a look at NBD and I guess I could map the drive (at client
> > side) and create a mirror RAID 1 by using it with the drive on the
> > server.  Does that sound right?
> >
> > I do want to try this but I am backing up a Solaris box and the file
> > system is vxfs/ufs.  Does it matter?
> >
> > Thanks.
> >
> > Travis
> >
> >
> > On 2/20/06, Dan Pritts < [EMAIL PROTECTED]> wrote:
> >
> > I presume you meant to send this reply to the list rather than
> > to me individually.  You might post to the list so that others
> > can see your answer.  I'm a relative newbie with backuppc,
> > so i don't know enough to know if there's some improvement you
> > might make within backuppc.
> >
> > Given what you're trying to do, however, I suspect not.
> >
> > Regarding realtime mirroring, i'd suggest looking into the linux nbd
> > (network block device) or a commercial solution like veritas volume
> > manager.  Sounds like you're running a production system so buying a
> > commercial product would probably be a reasonable solution.
> >
> > On Mon, Feb 20, 2006 at 10:42:47AM -0500, Travis Wu wrote:
> > > The HD on the backuppc server.
> > > On the client side everything is fine.  20 mins to receive the file
> > > list is fine. I am ok with that.
> > > but on the server side the seeking through each file on HD every other
> > > hour isn't very efficient.
> > > ( the plan was to do incremental every other hour. )
> > >
> > > Thanks.
> > > btw, I remember seeing some post about realtime sync/mirroring but
> > > couldn't find it anymore. Can someone give me a pointer?
> > >
> > > On 2/20/06, Dan Pritts < [EMAIL PROTECTED]> wrote:
> > > > On Mon, Feb 20, 2006 at 09:14:29AM -0500, Travis Wu wrote:
> > > > > I am not sure if anyone has tried using backuppc on a file system
> > > > > like this.
> > > > >
> > > > > I have 200GB of data roughly 10,000,000 files.  Backup has no problem,
> > > > > but it's taking too long.
> > > > > It takes about 20 mins to just receive the file list, after that the
> > > > > HD goes crazy for about 90 mins
> > > > > regardless how many files are changed.
> > > >
> > > > which HD?  the backup server, or the backup client?
> > > >
> > > > Regardless, if it's taking 20 minutes to receive the file list, that
> > > > is a good indication that you're trying to back up a LOT of
> > > > individual files
> > > > and having it take 90 minutes isn't a surprise.
> > > >
> > > > danno
> > > > --
> > > > dan pritts - systems administrator - internet2
> > > > 734/352-4953 office734/834-7224 mobile
> > >
> > > --
> > > Travis Wu | Systems Administrator
> > > Capital Printing Systems Inc.
> > > Two Grand Central Tower
> > > 140 East 45th Street, 36th Floor
> > > New York, NY 10017
> > >
> > > P: 212.201.3444 | B: 212.945.8630
> > > F: 212.201.3401 | www.capitalprinting.com
> > > email:   [EMAIL PROTECTED]
> >
> > danno
> > --
> > dan pritts - systems administrator - internet2
> > 734/352-4953 office734/834-7224 mobile
>
> --
> Maarten Boot,
> Compuware Europe B.V .
> Hoogoorddreef 5
> 1101 BA Amsterdam
> Tel: +31 20 312 6511

-- 
Maarten Boot, 
Compuware Europe B.V.
Hoogoorddreef 5
1101 BA Amsterdam
Tel: +31 20 312 6511





Re: [BackupPC-users] 10,000,000 files and 200GB problem

2006-02-21 Thread Travis Wu
Ya, you are right. I think I may try the NBD approach. The partition to be mirrored is VxFS, and I am thinking of using XFS for the mirror. This is my first time doing this, so I don't really know whether the configuration would work. What do you think?

Travis

On 2/21/06, Maarten Boot (CWEU-USERS/CWNL) <[EMAIL PROTECTED]> wrote:
> The NBD server can be Solaris, since the NBD server is a user process; the NBD
> client must be Linux (the client has the /dev/nb0 block device).
>
> By the way, I looked at the NAS box, and the review clearly mentions that it is
> quite a bit slower than direct-attached storage.
>
> Maarten
>
> On Tuesday 21 February 2006 15:17, you wrote:
> > Hi Dan,
> >
> > I've taken a look at NBD, and I guess I could map the drive (on the client side)
> > and create a RAID 1 mirror using it together with the drive on the server. Does
> > that sound right?
> >
> > I do want to try this, but I am backing up a Solaris box and the file system
> > is vxfs/ufs. Does it matter?
> >
> > Thanks.
> >
> > Travis
>
> --
> Maarten Boot,
> Compuware Europe B.V.
> Hoogoorddreef 5
> 1101 BA Amsterdam
> Tel: +31 20 312 6511

--
Travis Wu | Systems Administrator
Capital Printing Systems Inc.
Two Grand Central Tower
140 East 45th Street, 36th Floor
New York, NY 10017
P: 212.201.3444 | B: 212.945.8630
F: 212.201.3401 | www.capitalprinting.com
email: [EMAIL PROTECTED]
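
For anyone wanting to experiment with the arrangement Maarten describes, a rough sketch follows. It is only an illustration under several assumptions: an nbd-server build that actually runs on Solaris, the classic port-plus-device nbd-server invocation, the old /dev/nb0 device name on the Linux client (newer kernels use /dev/nbd0), and placeholder device names, port number, and mount point throughout. Check the man pages for your versions before trusting any of it.

    # On the Solaris box (hypothetical device name): export the raw partition
    # over the network with nbd-server, listening on TCP port 2000.
    nbd-server 2000 /dev/dsk/c0t1d0s6

    # On the Linux box: attach the remote partition as a local block device.
    nbd-client solarisbox 2000 /dev/nb0

    # Build a RAID 1 mirror from a local disk and the NBD device, then put
    # XFS on top of the mirror and mount it.
    mdadm --create /dev/md0 --level=1 --raid-devices=2 /dev/sdb1 /dev/nb0
    mkfs.xfs /dev/md0
    mount /dev/md0 /mnt/mirror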



Re: [BackupPC-users] Recommended distro to run Backuppc on

2006-02-21 Thread Dan Pritts
On Tue, Feb 21, 2006 at 07:36:40AM +1100, [EMAIL PROTECTED] wrote:
> Hey everyone,
> I am currently in the process of rebuilding my backuppc machine to 
> give it some extra grunt. I was hoping to get people's thoughts on what 
> linux distros are in use out there and who recommends what? The new 
> platform will be a dual core AMD Athlon64 running on a Nforce4 
> motherboard. 

I'd suggest you pay more attention to the disk subsystem than to the CPU
power.  BackupPC uses lots of I/O resources.

SCSI disks, or SATA with command queuing (i.e., not all SATA disks), should
help a lot with the problem of BackupPC having to issue a lot of disk seeks.

Others are on the right track regarding Linux distros.  We use Red Hat and
Debian (and Solaris, which is where I run BackupPC) here.

danno
--
dan pritts - systems administrator - internet2
734/352-4953 office    734/834-7224 mobile
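
On the command-queuing point: if the backup server runs Linux, one rough (kernel- and driver-dependent) way to see whether NCQ/TCQ is in play is the device's queue depth in sysfs; a depth of 1 usually means no queuing. The device name below is only an example.

    # Example only: show the queue depth for the first SCSI/SATA disk.
    # A value greater than 1 suggests command queuing (NCQ/TCQ) is active.
    cat /sys/block/sda/device/queue_depth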




Re: [BackupPC-users] 10,000,000 files and 200GB problem

2006-02-21 Thread Dan Pritts
On Tue, Feb 21, 2006 at 09:48:26AM -0500, Travis Wu wrote:
> Thanks, Dan.
> Veritas is definitely good, and we are using their product at the moment for
> backup. However, we also want to build an off-site system so that if
> something happens to this system, we'll still have the production server working.
> 
> Anyway, according to http://www.fi.muni.cz/~kripac/orac-nbd/ , it seems
> NBD works with Solaris.

My error - perhaps you can make NBD work. But given how much you're
already in bed with Veritas, you might consider just using their remote
block-device product.

danno




Re: [BackupPC-users] 10,000,000 files and 200GB problem

2006-02-21 Thread Travis Wu
Thanks, Dan. Veritas is definitely good, and we are using their product at the moment for backup. However, we also want to build an off-site system, so that if something happens to this system, we'll still have the production server working.

Anyway, according to http://www.fi.muni.cz/~kripac/orac-nbd/ , it seems NBD works with Solaris. Oh well, you are probably right. Thanks for your time.

Travis

On 2/21/06, Dan Pritts <[EMAIL PROTECTED]> wrote:
> On Tue, Feb 21, 2006 at 09:17:31AM -0500, Travis Wu wrote:
> > Hi Dan,
> >
> > I've taken a look at NBD, and I guess I could map the drive (on the client side)
> > and create a RAID 1 mirror using it together with the drive on the server. Does
> > that sound right?
>
> That was what I was thinking of, although as someone else mentioned,
> you need to understand the difference between a RAID and a backup.
>
> > I do want to try this, but I am backing up a Solaris box and the file system
> > is vxfs/ufs. Does it matter?
>
> Yes, it matters.  NBD is a Linux thing.  Call your Veritas rep about their
> network mirroring product if mirroring meets your needs.
>
> danno

--
Travis Wu | Systems Administrator
Capital Printing Systems Inc.
Two Grand Central Tower
140 East 45th Street, 36th Floor
New York, NY 10017
P: 212.201.3444 | B: 212.945.8630
F: 212.201.3401 | www.capitalprinting.com
email: [EMAIL PROTECTED]


Re: [BackupPC-users] 10,000,000 files and 200GB problem

2006-02-21 Thread Dan Pritts
On Tue, Feb 21, 2006 at 09:17:31AM -0500, Travis Wu wrote:
> Hi Dan,
> 
> I've taken a look at NBD, and I guess I could map the drive (on the client side)
> and create a RAID 1 mirror using it together with the drive on the server. Does
> that sound right?

That was what I was thinking of, although as someone else mentioned,
you need to understand the difference between a RAID and a backup.

> I do want to try this, but I am backing up a Solaris box and the file system
> is vxfs/ufs. Does it matter?

Yes, it matters.  NBD is a Linux thing.  Call your Veritas rep about their
network mirroring product if mirroring meets your needs.

danno




Re: [BackupPC-users] 10,000,000 files and 200GB problem

2006-02-21 Thread Travis Wu
Hi Dan,

I've taken a look at NBD, and I guess I could map the drive (on the client side) and create a RAID 1 mirror using it together with the drive on the server. Does that sound right?

I do want to try this, but I am backing up a Solaris box and the file system is vxfs/ufs. Does it matter?

Thanks.

Travis

On 2/20/06, Dan Pritts <[EMAIL PROTECTED]> wrote:
> I presume you meant to send this reply to the list rather than
> to me individually.  You might post to the list so that others
> can see your answer.  I'm a relative newbie with BackupPC,
> so I don't know enough to know if there's some improvement you
> might make within BackupPC.
>
> Given what you're trying to do, however, I suspect not.
>
> Regarding realtime mirroring, I'd suggest looking into the Linux nbd
> (network block device) or a commercial solution like Veritas Volume
> Manager.  It sounds like you're running a production system, so buying a
> commercial product would probably be a reasonable solution.
>
> On Mon, Feb 20, 2006 at 10:42:47AM -0500, Travis Wu wrote:
> > The HD on the BackupPC server.
> > On the client side everything is fine.  20 minutes to receive the file list
> > is fine; I am OK with that.
> > But on the server side, seeking through each file on the HD every other hour
> > isn't very efficient.
> > (The plan was to do an incremental every other hour.)
> >
> > Thanks.
> > By the way, I remember seeing some posts about realtime sync/mirroring but
> > couldn't find them anymore. Can someone give me a pointer?
> >
> > On 2/20/06, Dan Pritts <[EMAIL PROTECTED]> wrote:
> > > On Mon, Feb 20, 2006 at 09:14:29AM -0500, Travis Wu wrote:
> > > > I am not sure if anyone has tried using BackupPC on a file system like
> > > > this.
> > > >
> > > > I have 200GB of data, roughly 10,000,000 files.  Backup has no problem,
> > > > but it's taking too long.
> > > > It takes about 20 minutes just to receive the file list; after that the HD
> > > > goes crazy for about 90 minutes,
> > > > regardless of how many files are changed.
> > >
> > > Which HD?  The backup server, or the backup client?
> > >
> > > Regardless, if it's taking 20 minutes to receive the file list, that
> > > is a good indication that you're trying to back up a LOT of individual files,
> > > and having it take 90 minutes isn't a surprise.
> > >
> > > danno
>
> danno
> --
> dan pritts - systems administrator - internet2
> 734/352-4953 office    734/834-7224 mobile

--
Travis Wu | Systems Administrator
Capital Printing Systems Inc.
Two Grand Central Tower
140 East 45th Street, 36th Floor
New York, NY 10017
P: 212.201.3444 | B: 212.945.8630
F: 212.201.3401 | www.capitalprinting.com
email: [EMAIL PROTECTED]


Re: [BackupPC-users] Recommended distro to run Backuppc on

2006-02-21 Thread Travis Wu
The fact is that any Linux distro will do just fine for your purpose. They all use the same kernel, the same Perl, rsync, gcc, etc. The only difference is how they package them and what comes with them. Of course, software installation can differ quite a bit from one distro to another.

Personally, I've tried tons of distros and I love Gentoo. :)

On 2/21/06, [EMAIL PROTECTED] <[EMAIL PROTECTED]> wrote:
> We're running BackupPC on a Mandriva 2006.0 free edition (and previously on a
> 2005LE) and it works like a charm.
>
> We are using the XFS filesystem, and we export it using xfscopy (which is a dd
> that only copies used blocks to another filesystem or file). This xfscopy tool
> has the advantage of not handling inode renumbering (a common problem with all
> dump utilities).
>
> I've created RPM packages for 2.1.2pl1, and I've also created a specific
> package for a skin for my company intranet.
> An upgrade is done in 20 minutes (format + auto_install). All pool and data is
> stored on a 1.6TB DELL DS220 drive bay.
>
> Wonderful config, and wonderful software.
>
> Looking forward to BackupPCd, the new client-daemon-based transfer method that
> should back up open Windows files (and more).
>
> Olivier.

--
Travis Wu | Systems Administrator
Capital Printing Systems Inc.
Two Grand Central Tower
140 East 45th Street, 36th Floor
New York, NY 10017
P: 212.201.3444 | B: 212.945.8630
F: 212.201.3401 | www.capitalprinting.com
email: [EMAIL PROTECTED]


RE: [BackupPC-users] Recommended distro to run Backuppc on

2006-02-21 Thread lahaye



We're running BackupPC on a Mandriva 2006.0 free edition (and previously on a
2005LE) and it works like a charm.

We are using the XFS filesystem, and we export it using xfscopy (which is a dd
that only copies used blocks to another filesystem or file). This xfscopy tool
has the advantage of not handling inode renumbering (a common problem with all
dump utilities).

I've created RPM packages for 2.1.2pl1, and I've also created a specific
package for a skin for my company intranet.
An upgrade is done in 20 minutes (format + auto_install). All pool and data is
stored on a 1.6TB DELL DS220 drive bay.

Wonderful config, and wonderful software.

Looking forward to BackupPCd, the new client-daemon-based transfer method that
should back up open Windows files (and more).

Olivier.

  From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]] On Behalf Of [EMAIL PROTECTED]
  Sent: Monday, 20 February 2006 21:37
  To: backuppc-users@lists.sourceforge.net
  Subject: [BackupPC-users] Recommended distro to run Backuppc on

  Hey everyone,

  I am currently in the process of rebuilding my backuppc machine to give it
  some extra grunt. I was hoping to get people's thoughts on what Linux distros
  are in use out there and who recommends what? The new platform will be a dual
  core AMD Athlon64 running on an Nforce4 motherboard. I have tried Fedora Core 5
  (Core 4 doesn't have the right drivers for my motherboard), and it's only in
  test 2 at the moment and runs like a pig. I have tried the free Mandriva and it
  runs very nicely, but I have trouble trying to VNC to it (I think there is some
  IPSec rule hidden from me in it). I was thinking of trying Ubuntu. What is
  everyone out there using for this?

  Regards,
  Jamie Myers
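
As a concrete sketch of the export step Olivier describes: assuming his "xfscopy" is the xfs_copy tool shipped with xfsprogs (which likewise copies only the blocks in use), the pool filesystem could be duplicated roughly as follows. Device and file names are placeholders, and the source filesystem should be unmounted, mounted read-only, or frozen while it is copied.

    # Duplicate the XFS pool filesystem to another block device ...
    xfs_copy /dev/sdb1 /dev/sdc1

    # ... or to an image file on some other filesystem.
    xfs_copy /dev/sdb1 /exports/backuppc-pool.img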