I'm running BackupPC on a Proxmox virtual machine.
The VM's disk storage is on a QNAP NAS share (mounted via NFS).
I want to set up a remote backup in the cloud (I'm considering Amazon
Glacier) for disaster recovery purposes.
I'm evaluating the pros and cons between backing up:
1) original data
2)
On 5/25/2011 5:02 AM, samuel_w...@t-online.de wrote:
It's not really easy to back up a BackupPC pool; I read on the mailing list that rsync is
not a good idea because there are a lot of hardlinks.
What's the best way to create offline backups?
First, if you are changing topics, please create a new thread rather
Matthias Meyer wrote at about 11:43:23 +0200 on Monday, April 25, 2011:
Hi Jeffrey,
Thanks for sending your perl script.
Hmmm... not sure I even remember which of my scripts you are talking about...
Unfortunately I can't answer you because:
- The following addresses had
Hi Jeffrey,
Thanks for sending your perl script.
Unfortunately I can't answer you because:
- The following addresses had permanent fatal errors -
j...@kosowsky.org
- Transcript of session follows -
... while talking to smtp.secureserver.net.:
550 5.7.1 SPF unauthorized
Jeffrey J. Kosowsky wrote:
I think there is a 3rd camp:
3. Scripts that understand the special structure of the pool and pc
trees and efficiently create lists of all hard links in the pc
directory.
a] BackupPC_tarPCCOPY
Included in standard BackupPC installations. It uses a perl
i've been having good success with a script that selects only the most
recent full and most recent incremental for each backup in the pc
directory, as well as the set of backups last successfully
transferred, and rsync's that set offsite, with -H. for me, this
still deduplicates, and keeps a
@ Tim
I did *briefly* search before making this email thread. As I found it
similar to looking for the right kind of pants in a clothing store, I made
the thread with particular keywords in the hope that Google would surface
this email rather than the others that don't so *briefly* get to the
On 2/22/2011 9:17 PM, Dennis Blewett wrote:
13,849 items, totalling 3.8 GB
It would appear that I have a feasible number of files. I'm not sure how
many more files I will have by the end of April, though.
I've read that rsync -H would be a practical command to use on
the backuppc
Dennis Blewett dennis.blew...@gmail.com wrote on 02/24/2011 04:41:00 PM:
@ Tim
I did *briefly* search before making this email thread. As I found
it similar to looking for the right kind of pants in a clothing
store, I made the thread with particular keywords with the hope that
Google
Les Mikesell lesmikes...@gmail.com wrote on 02/24/2011 05:21:27 PM:
On 2/22/2011 9:17 PM, Dennis Blewett wrote:
What I'm also curious about is whether I should be rsyncing any other
files,
thus allowing me to restore from the offsite backup in the case I lose
everything and rebuild a backuppc
Timothy J Massey wrote at about 23:13:52 -0500 on Tuesday, February 22, 2011:
Dennis Blewett dennis.blew...@gmail.com wrote on 02/22/2011 10:17:29 PM:
13,849 items, totalling 3.8 GB
It would appear that I have a feasible number of files. I'm not sure
how many more files I will
On 2/22/2011 11:13 PM, Timothy J Massey wrote:
Dennis Blewett dennis.blew...@gmail.com wrote on 02/22/2011 10:17:29 PM:
13,849 items, totalling 3.8 GB
It would appear that I have a feasible number of files. I'm not sure
how many more files I will have by the end of April, though.
Carl Wilhelm Soderstrom chrome at real-time.com writes:
On 02/21 11:00 , Dennis Blewett wrote:
Will I come across many problems in later restoring the pool's data if I
just rsync /var/lib/backuppc to the server?
Are there other files and folders I should be rsync'ing to the server?
John Goerzen wrote at about 14:26:33 + on Wednesday, February 23, 2011:
Carl Wilhelm Soderstrom chrome at real-time.com writes:
On 02/21 11:00 , Dennis Blewett wrote:
Will I come across many problems in later restoring the pool's data if I
just rsync /var/lib/backuppc to
rsync'ing the BackupPC data pool is generally recommended against. The
number of hardlinks causes an explosive growth in memory consumption by
rsync, and while you may be able to get away with it if you have 20GB of
data (depending on how much memory you have), you will likely run out of
gregwm backuppc-us...@whitleymott.net wrote on 02/22/2011 11:26:51 AM:
this issue sure comes up a lot, and perhaps i should just keep quiet
since i personally am in no position to do it or even go off looking
for an rsync forum, nor do i have any knowledge of just how convoluted
the rsync
gregwm wrote at about 10:26:51 -0600 on Tuesday, February 22, 2011:
rsync'ing the BackupPC data pool is generally recommended against. The
number of hardlinks causes an explosive growth in memory consumption by
rsync, and while you may be able to get away with it if you have 20GB of
13,849 items, totalling 3.8 GB
It would appear that I have a feasible number of files. I'm not sure how
many more files I will have by the end of April, though.
I've read that rsync -H would be a practical command to use on the
backuppc folder.
What I'm also curious about is whether I should
Dennis Blewett dennis.blew...@gmail.com wrote on 02/22/2011 10:17:29 PM:
13,849 items, totalling 3.8 GB
It would appear that I have a feasible number of files. I'm not sure
how many more files I will have by the end of April, though.
I've read that rsync -H would be a practical
Hey, folks.
I've come back into the arena of using BackupPC.
At the moment, one of my main goals is to rsync the pool and whatever else
to an offsite storage area.
I'm a student with access to a server for storing about 20GB of stuff on. I
can use the intranet and rsync to move stuff onto it.
Farmol SPA wrote:
Hi list.
I would like to ask which is the simplest yet effective way to dump
backuppc stuff (mainly __TOPDIR__), e.g. to a removable hard disk that will
be used in a disaster recovery scenario where the plant was destroyed
and I need to restore data from this surviving device.
On 8/31/2010 12:09 PM, Josh Malone wrote:
Farmol SPA wrote:
Hi list.
I would like to ask which is the simplest yet effective way to dump
backuppc stuff (mainly __TOPDIR__), e.g. to a removable hard disk that will
be used in a disaster recovery scenario where the plant was destroyed
and I need
Original Message
Subject: Re: [BackupPC-users] Backup backuppc
From: Carl Wilhelm Soderstrom chr...@real-time.com
To: General list for user discussion, questions and support
backuppc-users@lists.sourceforge.net
Date: Mon Aug 23 2010 17:42:43 GMT+0200 (ora Legale Europa
On 08/20 11:02 , Mirco Piccin wrote:
i usually use an LVM volume (easy to extend) as TOPDIR.
In this case a FAST solution could be an LVM volume snapshot.
Of course, you also need to save the BackupPC conf files.
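For concreteness, the snapshot route suggested here can be sketched as follows. This is a hedged outline only, assuming __TOPDIR__ sits on an LV named backuppc in volume group vg0 with free extents available; every name, size, and mount point below is an invented placeholder, not taken from the thread:

```shell
# Quiesce BackupPC so the pool is consistent at the instant of the snapshot.
/etc/init.d/backuppc stop

# Snapshot the LV holding __TOPDIR__; size the snapshot for the writes
# expected during the copy (it fills up as the origin LV changes).
lvcreate --snapshot --size 5G --name backuppc-snap /dev/vg0/backuppc

# BackupPC can resume immediately; the copy reads from the frozen snapshot.
/etc/init.d/backuppc start

# Either image the snapshot device wholesale (this sidesteps the
# hardlink problem entirely) ...
dd if=/dev/vg0/backuppc-snap of=/media/usb/backuppc.img bs=4M

# ... or mount it read-only and copy files, remembering the conf dir too:
#   mount -o ro /dev/vg0/backuppc-snap /mnt/snap

# Drop the snapshot when done so it stops consuming extents.
lvremove -f /dev/vg0/backuppc-snap
```

As later replies in the thread note, copying the snapshot file-by-file with rsync still runs into the hardlink memory problem, which is why imaging the device with dd (or tarring it) can be the faster path.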
I've tried this in the past. Taking a snapshot of a live backuppc instance
and then
On Monday 23 Aug 2010 16:42:43 Carl Wilhelm Soderstrom wrote:
I've tried this in the past. Taking a snapshot of a live backuppc instance
and then backing up the snapshot (especially to tape, like I tried) turns
out to be murderously slow due to all the disk updates.
At least to tape (where a
On 08/23 05:04 , Tyler J. Wagner wrote:
What was your disk configuration? Any RAID involved?
It was RAIDed, don't remember the details. Possibly RAID 5 (yes, I know,
poor performance -- but low price, and acceptably fast for most needs).
--
Carl Soderstrom
Systems Administrator
Real-Time
On Monday 23 Aug 2010 18:27:59 Carl Wilhelm Soderstrom wrote:
On 08/23 05:04 , Tyler J. Wagner wrote:
What was your disk configuration? Any RAID involved?
It was RAIDed, don't remember the details. Possibly RAID 5 (yes, I know,
poor performance -- but low price, and acceptably fast for most
Original Message
Subject: Re: [BackupPC-users] Backup backuppc
From: Mirco Piccin pic...@gmail.com
To: General list for user discussion, questions and support
backuppc-users@lists.sourceforge.net
Date: Wed Aug 18 2010 14:25:38 GMT+0200 (Western Europe Daylight Time)
Hi
Hi
I would like to ask which is the simplest yet effective way to dump
backuppc stuff (mainly __TOPDIR__), e.g. to a removable hard disk that will
be used in a disaster recovery scenario where the plant was destroyed
and I need to restore data from this surviving device. Is rsync -aH
enough?
Original Message
Subject: Re: [BackupPC-users] Backup backuppc
From: Mirco Piccin pic...@gmail.com
To: General list for user discussion, questions and support
backuppc-users@lists.sourceforge.net
Date: Fri Aug 20 2010 11:02:53 GMT+0200 (Western Europe Daylight Time)
i
On Friday 20 Aug 2010 10:23:39 Farmol SPA wrote:
TOPDIR resides on an LVM volume. I know about snapshots but I've never
tried them in practice.
Is it difficult? How could I take a snapshot of a volume and put it on
another disk (external USB device not LVM)?
It's easy. I've done it on two
Original Message
Subject: Re: [BackupPC-users] Backup backuppc
From: Tyler J. Wagner ty...@tolaris.com
To: backuppc-users@lists.sourceforge.net
Cc: Farmol SPA farmol...@gmail.com
Date: Fri Aug 20 2010 12:48:15 GMT+0200 (Western Europe Daylight Time)
On Friday 20 Aug 2010 10
Farmol SPA wrote:
Original Message
Subject: Re: [BackupPC-users] Backup backuppc
From: Mirco Piccin pic...@gmail.com
To: General list for user discussion, questions and support
backuppc-users@lists.sourceforge.net
Date: Wed Aug 18 2010 14:25:38 GMT+0200 (ora Legale
On Friday 20 Aug 2010 12:32:10 Farmol SPA wrote:
A question: the source logical volume and the snapshot one must reside
in the same volume group for this feature to work?
I believe a snapshot of a logical volume must necessarily reside in the same
volume group, yes. So if you have an LV which
Original Message
Subject: Re: [BackupPC-users] Backup backuppc
From: Tyler J. Wagner ty...@tolaris.com
To: backuppc-users@lists.sourceforge.net
Cc: Farmol SPA farmol...@gmail.com
Date: Fri Aug 20 2010 15:00:37 GMT+0200 (Western Europe Daylight Time)
On Friday 20 Aug 2010 12
Hi
So, whenever I create the snapshot I have a static copy of the source
LV that I can copy with any method (e.g. rsync or netcat). At this point,
please forgive me, but I don't see the advantage of LVM snapshots over
running rsync directly on the live LV, provided BackupPC is
sleeping
On Friday 20 Aug 2010 15:09:02 Farmol SPA wrote:
So, whenever I create the snapshot I have a static copy of the source
LV that I can copy with any method (e.g. rsync or netcat). At this point,
please forgive me, but I don't see the advantage of LVM snapshots over
running rsync directly on
Original Message
Subject: Re: [BackupPC-users] Backup backuppc
From: Mirco Piccin pic...@gmail.com
To: General list for user discussion, questions and support
backuppc-users@lists.sourceforge.net
Date: Fri Aug 20 2010 16:28:05 GMT+0200 (Western Europe Daylight Time)
the point
On 8/20/2010 9:46 AM, Farmol SPA wrote:
Original Message
Subject: Re: [BackupPC-users] Backup backuppc
From: Mirco Piccin pic...@gmail.com
To: General list for user discussion, questions and support
backuppc-users@lists.sourceforge.net
Date: Fri Aug 20 2010 16:28:05 GMT
Les Mikesell wrote at about 07:51:56 -0500 on Friday, August 20, 2010:
Farmol SPA wrote:
Original Message
Subject: Re: [BackupPC-users] Backup backuppc
From: Mirco Piccin pic...@gmail.com
To: General list for user discussion, questions and support
backuppc
Original Message
Subject: Re: [BackupPC-users] Backup backuppc
From: Les Mikesell lesmikes...@gmail.com
To: General list for user discussion, questions and support
backuppc-users@lists.sourceforge.net
Date: Wed Aug 18 2010 14:31:46 GMT+0200 (Western Europe Daylight Time
Hi,
On Wed, Aug 18, 2010 at 2:00 PM, Farmol SPA farmol...@gmail.com wrote:
Hi list.
I would like to ask which is the simplest yet effective way to dump
backuppc stuff (mainly __TOPDIR__), e.g. to a removable hard disk that will
be used in a disaster recovery scenario where the plant was
Farmol SPA wrote:
Hi list.
I would like to ask which is the simplest yet effective way to dump
backuppc stuff (mainly __TOPDIR__), e.g. to a removable hard disk that will
be used in a disaster recovery scenario where the plant was destroyed
and I need to restore data from this surviving
Hi list.
I would like to ask which is the simplest yet effective way to dump
backuppc stuff (mainly __TOPDIR__), e.g. to a removable hard disk that will
be used in a disaster recovery scenario where the plant was destroyed
and I need to restore data from this surviving device. Is rsync -aH
On Wed, Oct 28, 2009 at 09:25:31PM -0600, dan wrote:
You can tar up the whole pool directory and put it on an external drive
pretty easily.
Serious question: of what value is a backup of (only) the pool directory?
danno
--
Dan Pritts, Sr. Systems Engineer
Internet2
office: +1-734-352-4953 |
I have a setup with /var/lib/backuppc mounted to a 1 TB Firewire 800
drive. From a standpoint local to BackupPC, it's transparent. With USB
you'll be limited to 480 Mbps minus overhead, but that's about the
only real consideration I'm aware of--and that's still 60 MB/s
theoretical, which you
On 10/28/09, Chris Owen chriso...@eigersecurities.com wrote:
Would like some advice on the best way to back up BackupPC. I have a
1TB USB drive that I would like to copy all of our backups to. Has
anyone done this? How easy is it to roll back from USB?
Hi Chris - I have a backup program in
I am currently backing up BackupPC using rsync 3.0.6 to a USB2 drive
with no problems so far.
The total amount of data being backed up is about 300 GB with 5M files.
The total time to sync all the BackupPC directories to the USB disk is
about 1 hour.
It's important to notice that the
Hey
Thanks for the posts back, I am going to have a look at the ideas
posted here and will let you know how I get on and how I am going to
do this.
Thanks for your help.
Chris Owen
Sent from my iPhone
On 29 Oct 2009, at 03:47, Shawn Perry redmo...@comcast.net wrote:
If you are using
Hey
Would like some advice on the best way to back up BackupPC. I have a
1TB USB drive that I would like to copy all of our backups to. Has
anyone done this? How easy is it to roll back from USB?
Many Thanks
Chris Owen
Sent from my iPhone
You can tar up the whole pool directory and put it on an external drive
pretty easily. Just make sure that backuppc is not running when you do this
-OR- do an LVM snapshot and then back up the snapshot.
I have been using rsync to sync two servers for a long time but have
recently started
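The tar suggestion above works because tar records second and later links to an inode as hardlink entries, so a tarred-and-extracted pool keeps its deduplication. A small runnable sketch, with scratch directories and invented file names standing in for the real pool:

```shell
#!/bin/sh
# Sketch: tar stores hardlinks as link entries in the archive, so the
# pool's deduplication survives a tar round trip. In real use, run this
# with BackupPC stopped or against an LVM snapshot.
set -e
POOL=$(mktemp -d)
RESTORE=$(mktemp -d)
TARBALL=$(mktemp)

mkdir -p "$POOL/cpool" "$POOL/pc"
printf 'data\n' > "$POOL/cpool/0a1b2c"
ln "$POOL/cpool/0a1b2c" "$POOL/pc/somefile"   # hardlink: one inode, two names

tar -C "$POOL" -cf "$TARBALL" .
tar -C "$RESTORE" -xf "$TARBALL"

# The extracted copies still share one inode (link count 2)
stat -c %h "$RESTORE/cpool/0a1b2c"   # → 2
```

Unlike rsync -H, tar does not need to hold the whole hardlink table in memory at once, which is one reason it scales better on large pools.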
If you are using LVM, just use pvmove and vgsplit with only a little
downtime.
On Wed, Oct 28, 2009 at 9:25 PM, dan danden...@gmail.com wrote:
You can tar up the whole pool directory and put it on an external drive
pretty easily. Just make sure that backuppc is not running when you do this
On Fri, 2006-11-10 at 16:49 +0100, GATOUILLAT Pierre-Damien wrote:
Perhaps with dd? Something like:
(on the old server) # dd if=/dev/old_partition | ssh new_server dd
of=/dev/new_partition (perhaps specify bs=? But with which value?)
That approach will work as long as (a) the
On 11/10 04:49 , GATOUILLAT Pierre-Damien wrote:
Perhaps with dd? Something like:
(on the old server) # dd if=/dev/old_partition | ssh new_server dd
of=/dev/new_partition (perhaps specify bs=? But with which value?)
Or with netcat and dd, like this:
(new_server) # nc -l -p 1
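The mechanics of both dd variants can be tried locally, with ordinary temp files standing in for the partitions and a plain pipe standing in for ssh or netcat (a sketch; only the /dev/old_partition and /dev/new_partition paths come from the quoted commands, everything else is invented):

```shell
#!/bin/sh
# Sketch: dd reads on one side of a pipe and dd writes on the other;
# with ssh or nc in the middle this becomes the remote block copy above.
# Ordinary temp files stand in for the block devices here.
set -e
IMG_SRC=$(mktemp)
IMG_DST=$(mktemp)

# Fabricate 256 KiB of source "partition" data
dd if=/dev/urandom of="$IMG_SRC" bs=65536 count=4 2>/dev/null

# Locally piped; over the network the middle of the pipe would be e.g.
#   dd if=/dev/old_partition bs=4M | ssh new_server dd of=/dev/new_partition bs=4M
dd if="$IMG_SRC" bs=65536 2>/dev/null | dd of="$IMG_DST" bs=65536 2>/dev/null

cmp "$IMG_SRC" "$IMG_DST" && echo identical   # prints "identical"
```

On the bs= question in the quoted message: any block size copies the same bytes; a larger bs (commonly 1M to 4M) mainly cuts per-syscall overhead, while dd's 512-byte default is merely slow.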
On Thursday, November 09, 2006 5:00 AM, daniel berteaud wrote:
You can use rsync to do that, something like
rsync -avP -H /old_location [EMAIL PROTECTED]:/new_location
should do the trick
You can also just tar up the backuppc directory and move it as one chunk. You
are going to have to
56 matches