On Wed, Apr 09, 2008 at 09:20:13AM -0400, David Birnbaum wrote:
> Greetings,
>
> I've been using BackupPC for several years now, but one problem that I've
> never come up with a good answer for is when a single large file is too
> big to transfer completely in the time the backup can run before timing
> out. For example, a 10M local datafile, backing up over a 768k up[...]
>
> Does anyone have a workaround or fix for this? Is it possible to change
> BackupPC so it doesn't remove the in-progress file, but instead copies it
> into the pool so rsync will pick up where it left off last time? There
> doesn't seem to [...]
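The timeout being hit here is presumably BackupPC's own per-client
transfer timeout rather than anything inside rsync; in BackupPC 3.x that
is $Conf{ClientTimeout} in config.pl. Raising it only stretches the
window (it does not make a killed transfer resumable, which is what the
rest of the thread is about), but for reference, a minimal sketch
assuming a stock config.pl:

    # config.pl: assumes the backup is being killed by BackupPC's own
    # client timeout. The default is 72000 seconds (20 hours).
    $Conf{ClientTimeout} = 144000;    # give slow links twice as long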
On Thu, 10 Apr 2008, Adam Goryachev wrote:
>> Is it possible to change BackupPC so it doesn't remove the in-progress
>> file, but instead copies it into the pool so rsync will pick up where
>> it left off last time?
The --partial option seems like it would do the trick.
Can anyone comment as to how hard it would be to put it into File::RsyncP?
David.
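For reference, this is what resumable transfers look like with a stock
rsync on both ends; a sketch of the option's behaviour, not of anything
BackupPC currently does, assuming rsync 2.6 or later (the --partial-dir
variant keeps the partial file from being mistaken for a complete one):

    # Keep a partially-transferred file on interruption so the next run
    # can resume it instead of starting over.
    rsync -av --partial --partial-dir=.rsync-partial \
        user@remote:/data/bigfile /backups/data/

On the BackupPC side the client's rsync flags come from $Conf{RsyncArgs}
in config.pl, but the receiving end is BackupPC's own File::RsyncP
implementation, so simply adding --partial there would not help unless
File::RsyncP also keeps the partial file, which is exactly the open
question above.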
On Wed, 9 Apr 2008, Les Mikesell wrote:
> One approach is to use a VPN connection to the remote site. Openvpn has
> an option to do keepalives on the line - and to do lzo compression on
> the data.
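The two OpenVPN directives Les is referring to look roughly like this; a
sketch of just the relevant lines, assuming OpenVPN 2.x (a real tunnel
config also needs remote, dev, certificates and so on):

    # Ping the peer every 10 seconds; assume the link is down and
    # restart it after 60 seconds of silence.
    keepalive 10 60
    # LZO compression of the tunnel payload.
    comp-lzo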
David Birnbaum replied:

> These files are often compressed already - for example, one client has
> very large MP3 files that cont[...]
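David's point generalizes: MP3, JPEG and similar payloads are already
compressed about as far as they will go, so tunnel-level LZO gains little
and just burns CPU. If the compression is done at the rsync layer instead,
newer rsync releases can exempt such files; a sketch assuming rsync 3.0 or
later on both ends (plain rsync only, not File::RsyncP):

    # Compress the stream, but skip suffixes that are already compressed.
    rsync -avz --skip-compress=mp3/ogg/jpg/gz/zip \
        user@remote:/data/ /backups/data/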