Matthias Meyer wrote at about 11:43:23 +0200 on Monday, April 25, 2011:
Hi Jeffrey,
Thanks for sending your perl script.
Unfortunately I can't answer you because:
- The following addresses had permanent fatal errors -
j...@kosowsky.org
- Transcript of session follows -
... while talking to smtp.secureserver.net.:
550 5.7.1 SPF unauthorized
Hmmm... not sure I even remember which of my scripts you are talking about...
Jeffrey J. Kosowsky wrote:
I think there is a 3rd camp:
3. Scripts that understand the special structure of the pool and pc
trees and efficiently create lists of all hard links in the pc
directory.
a] BackupPC_tarPCCOPY
Included in standard BackupPC installations. It uses a perl
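The idea behind that approach can be sketched as follows. This is a hedged illustration of the concept, not the actual BackupPC_tarPCCOPY implementation: it assumes a flat pool directory and plain MD5-of-contents matching, both simplifications, and emits `ln` commands instead of the tar stream the real tool produces.

```python
import hashlib
import os

def link_script(pc_dir, pool_dir):
    """Sketch of the pool-aware approach: rather than making rsync track
    every hardlink, match pc-tree files to pool entries by content and
    emit 'ln' commands that recreate the links on the remote copy.
    Pool layout and hashing here are simplified assumptions."""
    pool = {}  # content digest -> pool path
    for dirpath, _, names in os.walk(pool_dir):
        for name in names:
            path = os.path.join(dirpath, name)
            with open(path, "rb") as f:
                pool[hashlib.md5(f.read()).hexdigest()] = path
    cmds = []
    for dirpath, _, names in os.walk(pc_dir):
        for name in names:
            path = os.path.join(dirpath, name)
            with open(path, "rb") as f:
                digest = hashlib.md5(f.read()).hexdigest()
            if digest in pool:
                # Re-link against the pool copy instead of transferring again.
                cmds.append(f"ln -f {pool[digest]} {path}")
    return cmds
```

The payoff is that the link list can be generated in one pass over the trees, with memory proportional to the pool rather than to every hardlink rsync would otherwise have to remember.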
I've been having good success with a script that selects only the most
recent full and most recent incremental for each backup in the pc
directory, as well as the set of backups last successfully
transferred, and rsyncs that set offsite with -H. For me, this
still deduplicates, and keeps a
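The selection step described above can be sketched like this. It is a hedged guess at the mechanics, not the poster's actual script: it assumes a per-host `backups` index whose first tab-separated field is the backup number and whose second is the type, which matches BackupPC's usual layout but should be verified against your installation.

```python
def select_latest(backups_lines):
    """Pick the newest full and newest incremental backup numbers from
    the lines of a per-host 'backups' index file. Assumes (hedged) the
    first tab-separated field is the backup number and the second is
    the type ('full' or 'incr')."""
    latest = {}
    for line in backups_lines:
        fields = line.rstrip("\n").split("\t")
        if len(fields) < 2:
            continue
        num, btype = int(fields[0]), fields[1]
        if btype in ("full", "incr") and num > latest.get(btype, -1):
            latest[btype] = num
    return latest
```

The resulting backup numbers would then drive an include list for an `rsync -H` of just those pc/<host>/<num> directories, which is what keeps the offsite copy small while still deduplicating.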
@ Tim
I did *briefly* search before making this email thread. As I found it
similar to looking for the right kind of pants in a clothing store, I made
the thread with particular keywords in the hope that Google would extract
this email rather than the others.
On 2/22/2011 9:17 PM, Dennis Blewett wrote:
13,849 items, totalling 3.8 GB
It would appear that I have a feasible number of files. I'm not sure how
many more files I will have by the end of April, though.
I've read that rsync -H would be a practical command to use on
the backuppc folder.
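For concreteness, the invocation under discussion might be assembled like this. The source and destination paths are placeholder assumptions (only /var/lib/backuppc appears in the thread), and the dry-run default is a deliberate safety choice for a sketch.

```python
def offsite_rsync(src="/var/lib/backuppc/",
                  dest="user@offsite:/backup/backuppc/",
                  dry_run=True):
    """Build the rsync command line the thread discusses. -a preserves
    permissions and times, -H preserves hardlinks (the flag in question),
    --delete mirrors removals. src/dest are placeholder assumptions."""
    cmd = ["rsync", "-aH", "--delete", src, dest]
    if dry_run:
        cmd.insert(1, "-n")  # -n: dry run, only report what would transfer
    return cmd
```

Running with dry_run=True first is a cheap way to see how large the transfer (and rsync's hardlink table) would get before committing to it.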
Dennis Blewett dennis.blew...@gmail.com wrote on 02/24/2011 04:41:00 PM:
@ Tim
I did *briefly* search before making this email thread. As I found
it similar to looking for the right kind of pants in a clothing
store, I made the thread with particular keywords in the hope that
Google
Les Mikesell lesmikes...@gmail.com wrote on 02/24/2011 05:21:27 PM:
On 2/22/2011 9:17 PM, Dennis Blewett wrote:
What I'm also curious about is whether I should be rsyncing any other
files, thus allowing me to restore from the offsite backup in case I lose
everything and rebuild a backuppc
Timothy J Massey wrote at about 23:13:52 -0500 on Tuesday, February 22, 2011:
Dennis Blewett dennis.blew...@gmail.com wrote on 02/22/2011 10:17:29 PM:
13,849 items, totalling 3.8 GB
It would appear that I have a feasible number of files. I'm not sure
how many more files I will have by the end of April, though.
On 2/22/2011 11:13 PM, Timothy J Massey wrote:
Dennis Blewett dennis.blew...@gmail.com wrote on 02/22/2011 10:17:29 PM:
13,849 items, totalling 3.8 GB
It would appear that I have a feasible number of files. I'm not sure
how many more files I will have by the end of April, though.
Carl Wilhelm Soderstrom chrome at real-time.com writes:
On 02/21 11:00 , Dennis Blewett wrote:
Will I come across many problems in later restoring the pool's data if I
just rsync /var/lib/backuppc to the server?
Are there other files and folders I should be rsync'ing to the server?
John Goerzen wrote at about 14:26:33 + on Wednesday, February 23, 2011:
Carl Wilhelm Soderstrom chrome at real-time.com writes:
On 02/21 11:00 , Dennis Blewett wrote:
Will I come across many problems in later restoring the pool's data if I
just rsync /var/lib/backuppc to
rsync'ing the BackupPC data pool is generally recommended against. The
number of hardlinks causes explosive growth in memory consumption by
rsync, and while you may be able to get away with it if you have 20GB of
data (depending on how much memory you have), you will likely run out of
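The memory growth comes from the bookkeeping that hardlink preservation requires: rsync -H must remember every multiply-linked inode it has seen for the whole run so it can pair the links up on the far side. A small sketch of that table, under the assumption that one entry per distinct inode is the dominant cost:

```python
import os

def hardlink_groups(top):
    """Group file paths by (device, inode), the table rsync -H must hold
    in memory. Every file with more than one link costs an entry that
    lives for the whole transfer, which is why a BackupPC pool with
    millions of hardlinks exhausts RAM."""
    groups = {}
    for dirpath, _, names in os.walk(top):
        for name in names:
            path = os.path.join(dirpath, name)
            st = os.lstat(path)
            if st.st_nlink > 1:
                groups.setdefault((st.st_dev, st.st_ino), []).append(path)
    return groups
```

Counting the entries this produces on a real pool gives a rough lower bound on how many inode records rsync would have to keep resident.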
gregwm backuppc-us...@whitleymott.net wrote on 02/22/2011 11:26:51 AM:
this issue sure comes up a lot, and perhaps i should just keep quiet
since i personally am in no position to do it or even go off looking
for an rsync forum, nor do i have any knowledge of just how convoluted
the rsync
gregwm wrote at about 10:26:51 -0600 on Tuesday, February 22, 2011:
rsync'ing the BackupPC data pool is generally recommended against. The
number of hardlinks causes an explosive growth in memory consumption by
rsync and while you may be able to get away with it if you have 20GB of
13,849 items, totalling 3.8 GB
It would appear that I have a feasible number of files. I'm not sure how
many more files I will have by the end of April, though.
I've read that rsync -H would be a practical command to use on the
backuppc folder.
What I'm also curious about is if I should
Dennis Blewett dennis.blew...@gmail.com wrote on 02/22/2011 10:17:29 PM:
13,849 items, totalling 3.8 GB
It would appear that I have a feasible number of files. I'm not sure
how many more files I will have by the end of April, though.
I've read that rsync -H would be a practical
Hey, folks.
I've come back into the arena of using BackupPC.
At the moment, one of my main goals is to rsync the pool and whatever else
to an offsite storage area.
I'm a student with access to a server with about 20GB of storage. I
can use the intranet and rsync to move stuff onto it.