[BackupPC-users] Huge remote directory (20GB): how will it be transferred?

2010-11-09 Thread Boniforti Flavio
Hello everybody. One of my remote servers has a single directory that grew from a couple of GB to 20GB in one day. Now backups (rsync through ssh) seem to be taking ages and time out after 72000 seconds. My question: when such a backup gets stopped, will the next task consider the already transferred

[BackupPC-users] How to stop starting new backup jobs?

2010-11-09 Thread Pavel Hofman
Hi, Is there a way to tell backuppc to finish the currently running backup jobs and not to start new ones? We mirror backuppc partitions to external drives via SW RAID and need to stop backuppc and umount the filesystem to keep the data. I do not want to interrupt the long-running backups but

Re: [BackupPC-users] How to stop starting new backup jobs?

2010-11-09 Thread Mirco Piccin
Hi, Is there a way to tell backuppc to finish the currently running backup jobs and not to start new ones? Maybe not the best way, but you could obtain that using DumpPreUserCmd and DumpPostUserCmd. You also need to set UserCmdCheckStatus = 1; In the DumpPreUserCmd, you can use a DIY script
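
A rough sketch of what Mirco describes (the script path and hold-file name below are invented; with UserCmdCheckStatus = 1, a non-zero exit from DumpPreUserCmd makes BackupPC skip the new dump, while dumps already in progress finish normally):

    # config.pl
    $Conf{DumpPreUserCmd}     = '/usr/local/bin/backuppc_dump_gate';
    $Conf{UserCmdCheckStatus} = 1;

    # /usr/local/bin/backuppc_dump_gate (mode 0755)
    #!/bin/sh
    # Exit non-zero while the hold file exists, so new dumps are refused.
    [ ! -e /var/run/backuppc.hold ]

Touch /var/run/backuppc.hold before mirroring to the external drives and remove it afterwards.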

Re: [BackupPC-users] How to stop starting new backup jobs?

2010-11-09 Thread Pavel Hofman
Mirco Piccin wrote: Hi, Is there a way to tell backuppc to finish the currently running backup jobs and not to start new ones? maybe not the better way, but you could obtain that using DumpPreUserCmd and DumpPostUserCmd You need also to set UserCmdCheckStatus = 1; In the

Re: [BackupPC-users] How to stop starting new backup jobs?

2010-11-09 Thread Pavel Hofman
Tyler J. Wagner wrote: On Tue, 2010-11-09 at 12:41 +0100, Pavel Hofman wrote: Thanks a lot for your suggestion. In fact we use the PreDumpUserCmd to lock the backed-up machines to disable shutdown while the backup is in progress. You are right, it will work. Though IMHO it is an unpretty

Re: [BackupPC-users] Huge remote directory (20GB): how will it be transferred?

2010-11-09 Thread Boniforti Flavio
Hello Pavel. for huge dirs with millions of files we got almost an order of magnitude faster runs with the tar mode instead of rsync (which eventually consumed all the memory anyways :) ) How would I be able to use tar over a remote DSL connection? Flavio Boniforti PIRAMIDE INFORMATICA

Re: [BackupPC-users] How to stop starting new backup jobs?

2010-11-09 Thread Tyler J. Wagner
On Tue, 2010-11-09 at 12:41 +0100, Pavel Hofman wrote: Thanks a lot for your suggestion. In fact we use the PreDumpUserCmd to lock the backed-up machines to disable shutdown while the backup is in progress. You are right, it will work. Though IMHO it is an unpretty workaround :-) , especially

Re: [BackupPC-users] Huge remote directory (20GB): how will it be transferred?

2010-11-09 Thread Pavel Hofman
Les Mikesell wrote: On 11/9/10 2:13 AM, Boniforti Flavio wrote: Hello everybody. One of my remote servers has grown a single directory from a couple of GB to 20GB in one day. Now backups (rsync through ssh) seem to be taking ages and time-out after 72000 seconds. My question: when such a

Re: [BackupPC-users] Huge remote directory (20GB): how will it be transferred?

2010-11-09 Thread Boniforti Flavio
Hello Les. An rsync full should be marked as a 'partial' with the completed portion merged into the previous full as the comparison base when it restarts. I think an incomplete incremental is discarded. I'd bump up the timeout and add a -C (compress) option to your ssh command if you
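
A rough config.pl sketch of Les's suggestion (the RsyncClientCmd line mirrors the usual 3.x default; check it against your own config.pl before copying, and pick a timeout that suits your link):

    $Conf{ClientTimeout}  = 259200;  # up from 72000, so the 20GB directory can finish
    # -C asks ssh to compress the stream; everything else is the stock command.
    $Conf{RsyncClientCmd} = '$sshPath -q -x -C -l root $host $rsyncPath $argList+';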

Re: [BackupPC-users] Huge remote directory (20GB): how will it be transferred?

2010-11-09 Thread Robin Lee Powell
On Tue, Nov 09, 2010 at 04:11:15PM +0100, Boniforti Flavio wrote: Hello Pavel. for huge dirs with millions of files we got almost an order of magnitude faster runs with the tar mode instead of rsync (which eventually consumed all the memory anyways :) ) How would I be able to use tar

Re: [BackupPC-users] Huge remote directory (20GB): how will it be transferred?

2010-11-09 Thread Les Mikesell
On 11/9/10 2:13 AM, Boniforti Flavio wrote: Hello everybody. One of my remote servers has grown a single directory from a couple of GB to 20GB in one day. Now backups (rsync through ssh) seem to be taking ages and time-out after 72000 seconds. My question: when such a backup gets stopped,

Re: [BackupPC-users] How to view log of running tasks

2010-11-09 Thread Les Mikesell
On 11/9/10 1:24 AM, Boniforti Flavio wrote: Hello list. How may I take a look at the log of the *actually running* processes? I feel something may be stuck, but the process still is shown as running, therefore I'd like to have a look at what's happening, or at what point it arrived. I'm not

Re: [BackupPC-users] Huge remote directory (20GB): how will it be transferred?

2010-11-09 Thread Richard Shaw
On Tue, Nov 9, 2010 at 9:27 AM, Robin Lee Powell rlpow...@digitalkingdom.org wrote: On Tue, Nov 09, 2010 at 04:11:15PM +0100, Boniforti Flavio wrote: Hello Pavel. for huge dirs with millions of files we got almost an order of magnitude faster runs with the tar mode instead of rsync (which

Re: [BackupPC-users] Huge remote directory (20GB): how will it be transferred?

2010-11-09 Thread Robin Lee Powell
On Tue, Nov 09, 2010 at 09:37:01AM -0600, Richard Shaw wrote: On Tue, Nov 9, 2010 at 9:27 AM, Robin Lee Powell rlpow...@digitalkingdom.org wrote: On Tue, Nov 09, 2010 at 04:11:15PM +0100, Boniforti Flavio wrote: Hello Pavel. for huge dirs with millions of files we got almost an order

Re: [BackupPC-users] How to view log of running tasks

2010-11-09 Thread Mirco Piccin
Hi How may I take a look at the log of the *actually running* processes? I feel something may be stuck, but the process still is shown as running, therefore I'd like to have a look at what's happening, or at what point it arrived. I'm not sure about how the logs are buffered, but you might
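
One low-tech way to peek at an in-progress transfer, assuming a Debian-style install under /var/lib/backuppc (a source install usually lives in /usr/local/BackupPC, and the exact name of the in-progress log varies by version; it is typically renamed to XferLOG.<n>.z once the backup finishes):

    # See what is being written in the host's directory right now.
    ls -lrt /var/lib/backuppc/pc/HOSTNAME/
    # Read the compressed transfer log written so far.
    /usr/share/backuppc/bin/BackupPC_zcat /var/lib/backuppc/pc/HOSTNAME/XferLOG.z | tail -50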

Re: [BackupPC-users] Huge remote directory (20GB): how will it be transferred?

2010-11-09 Thread Robin Lee Powell
On Tue, Nov 09, 2010 at 05:13:58PM +0100, Boniforti Flavio wrote: Well, by hand you'd do: ssh host 'tar -czvf - /dir' > /backups/foo.tgz But wouldn't this create *one huge tarball*??? That's not what I'd like to get... It was an example for your benefit in future; it has nothing to do
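
For the record, switching a host to tar mode is a per-host config change, not a hand-made tarball; BackupPC unpacks the tar stream into individual pooled files, so no single huge archive lands on disk. Roughly (close to the stock tar-over-ssh defaults in 3.x config.pl; verify against your installed version):

    $Conf{XferMethod}   = 'tar';
    $Conf{TarClientCmd} = '$sshPath -q -x -n -l root $host env LC_ALL=C $tarPath -c -v -f - -C $shareName+ --totals';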

Re: [BackupPC-users] Huge remote directory (20GB): how will it be transferred?

2010-11-09 Thread Boniforti Flavio
Well, by hand you'd do: ssh host 'tar -czvf - /dir' > /backups/foo.tgz But wouldn't this create *one huge tarball*??? That's not what I'd like to get... Flavio Boniforti PIRAMIDE INFORMATICA SAGL Via Ballerini 21 6600 Locarno Switzerland Phone: +41 91 751 68 81 Fax: +41 91 751 69 14 URL:

Re: [BackupPC-users] Huge remote directory (20GB): how will it be transferred?

2010-11-09 Thread Les Mikesell
On 11/9/2010 9:10 AM, Boniforti Flavio wrote: Hello Les. An rsync full should be marked as a 'partial' with the completed portion merged into the previous full as the comparison base when it restarts. I think an incomplete incremental is discarded. I'd bump up the timeout and add a -C

[BackupPC-users] Am I going about this wrong?

2010-11-09 Thread Rob Poe
I'm archiving the BackupPC backup folder (/var/lib/BackupPC) to an external disk with rsync. However, it looks like rsync is expanding the hard links? My total disk usage on the backup server is 407g, and the space used on the external drive is up to 726g. (using rsync -avh --delete --quiet
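
That growth pattern is what happens when hard links are copied out as separate files: the BackupPC pool is almost entirely hard links, and rsync duplicates them unless told to preserve them. A hedged sketch (destination path is illustrative):

    # -H preserves hard links so pooled files are not duplicated on the copy.
    # Expect high memory use and long runtimes on a pool with millions of links.
    rsync -aH --delete /var/lib/BackupPC/ /mnt/usbdisk/BackupPC/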

Re: [BackupPC-users] Am I going about this wrong?

2010-11-09 Thread Rob Poe
And I should mention, too, that this is the first rsync of a freshly formatted USB drive :) On 11/9/2010 2:44 PM, Rob Poe wrote: I'm archiving the BackupPC backup folder (/var/lib/BackupPC) folder to external disk with rsync. However, it looks like rsync is filling the links? My total disk

[BackupPC-users] Got fatal error during xfer (Total bytes written: 640049664)

2010-11-09 Thread Guillaume Filion
Hi, I've been using BackupPC for several years without problems, but since I started adding Windows 7 clients they are only getting partial backups with this reported as an error: tarExtract: Done: 0 errors, 3720 filesExist, 623786455 sizeExist, 496652191 sizeExistComp, 3755 filesTotal,

Re: [BackupPC-users] Am I going about this wrong?

2010-11-09 Thread gregwm
I'm archiving the BackupPC backup folder (/var/lib/BackupPC) folder to external disk with rsync. However, it looks like rsync is filling the links? My total disk usage on the backup server is 407g, and the space used on the external drive is up to 726g. (using rsync -avh --delete --quiet

[BackupPC-users] Perl module requirements

2010-11-09 Thread Richard Shaw
Quick question I could not find the answer to elsewhere. I currently use BackupPC on Fedora, which is stuck at 3.1. The reason given is that there are now some Perl module dependencies built into the BackupPC package which Fedora requires to be packaged separately (no bundled libraries).
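
A quick way to see whether a given module is already installed system-wide (File::RsyncP is one real BackupPC dependency; substitute whichever module you are checking):

    perl -MFile::RsyncP -e 'print "File::RsyncP $File::RsyncP::VERSION\n"'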

Re: [BackupPC-users] How to stop starting new backup jobs?

2010-11-09 Thread Luis Paulo
On Tue, Nov 9, 2010 at 10:19, Pavel Hofman pavel.hof...@ivitera.com wrote: Hi, Is there a way to tell backuppc to finish the currently running backup jobs and not to start new ones? We mirror backuppc partitions to external drives via SW RAID and need to stop backuppc and umount the

[BackupPC-users] copied over files, incremental backup didnt get them

2010-11-09 Thread maeck
Hi, we have the same problem as Chris Baker. BackupPC makes an incremental backup of some folders, but doesn't copy the modified files! For example: there is a file called concept.odt, which was modified on 2010-11-09 at 09:45. BackupPC starts an incremental backup on 2010-11-09 at 21:00 but doesn't

[BackupPC-users] Net:FTP::RetrHandle

2010-11-09 Thread Andrew Spiers
Hi, I've downloaded the Debian version of BackupPC from http://packages.debian.org/sid/backuppc and am setting about installing it on an Ubuntu Maverick system for evaluation. Following the documentation, I've installed the modules mentioned via cpan. All of them seem to have installed except

Re: [BackupPC-users] How to stop starting new backup jobs?

2010-11-09 Thread Pavel Hofman
Luis Paulo wrote: On Tue, Nov 9, 2010 at 10:19, Pavel Hofman pavel.hof...@ivitera.com wrote: Hi, Is there a way to tell backuppc to finish the currently running backup jobs and not to start new ones? We mirror backuppc partitions to external drives via SW RAID and need to stop backuppc

Re: [BackupPC-users] Net:FTP::RetrHandle

2010-11-09 Thread Oliver Dauter
Hi On Wed, Nov 10, 2010 at 05:07, Andrew Spiers aspi...@vpac.org wrote: Hi, I've downloaded the debian version of BackupPC from http://packages.debian.org/sid/backuppc What's wrong with http://packages.ubuntu.com/maverick/backuppc ?