Re: [Bacula-users] FileDaemon vs manual transfer

2005-12-14 Thread Ferdinando Pasqualetti

--
Ferdinando Pasqualetti
G.T.Dati srl
Tel. 0557310862 - 3356172731 - Fax 055720143




Florian Daniel Otel [EMAIL PROTECTED]
wrote on 14/12/2005 09.28.11:

 On 12/13/05, Ferdinando Pasqualetti [EMAIL PROTECTED] wrote:
 
  I was giving that suggestion because, as far as I know, Bacula is not
  capable of resuming a job aborted by a broken connection.
 
 So, if I read you correctly, a temporary drop in the connection between
 the FileDaemon and the StorageDaemon will cause the job to be aborted.
 Any workarounds (in Bacula)?

AFAIK the only thing you can try (to avoid the connection
breaking in the first place, not to resume the job) is to set a
Heartbeat Interval = <seconds> directive, if that helps in your case.
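
A sketch of what that might look like in bacula-fd.conf (the same
directive also exists in the Storage daemon and Director resources; the
client name and the 60-second value below are only illustrative):

```
# bacula-fd.conf -- illustrative fragment, not a complete configuration
FileDaemon {
  Name = client1-fd            # hypothetical client name
  Heartbeat Interval = 60      # send a keepalive every 60 seconds to
                               # stop idle connections being dropped
}
```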

 
 
 Florian
 



---
This SF.net email is sponsored by: Splunk Inc. Do you grep through log files
for problems?  Stop!  Download the new AJAX search engine that makes
searching your log files as easy as surfing the  web.  DOWNLOAD SPLUNK!
http://ads.osdn.com/?ad_id=7637_id=16865=click
___
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users


Re: [Bacula-users] FileDaemon vs manual transfer

2005-12-12 Thread Ferdinando Pasqualetti

20 GB is a small amount of data. I would
suggest using rsync to get a local copy of the data (using compression and
its other features), and then copying the data to tape, ideally with Bacula.
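
A hedged sketch of that suggestion: pull a compressed copy of the remote
data into a local staging area, retrying over the flaky link, then let
Bacula back up the staging directory from local disk. The paths, the
demo data, and the retry delay below are illustrative, not from the
thread (a real run would use a remote source such as user@host:/data/).

```shell
SRC=/tmp/rsync-demo/src      # stands in for the remote source tree
DST=/tmp/rsync-demo/staging  # local staging dir Bacula would back up

mkdir -p "$SRC" "$DST"
echo "sample data" > "$SRC/file.txt"

# -a         archive mode (recursion, permissions, timestamps)
# -z         compress the data stream in transit
# --partial  keep partially transferred files so a retry resumes them
until rsync -az --partial "$SRC/" "$DST/"; do
    echo "link dropped, retrying in 60s" >&2
    sleep 60
done

cat "$DST/file.txt"
```

Because rsync only transfers differences, re-running the loop after a
dropped connection costs little compared to restarting a full tar copy.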

--
Ferdinando Pasqualetti
G.T.Dati srl
Tel. 0557310862 - 3356172731 - Fax 055720143




[EMAIL PROTECTED] wrote on
12/12/2005 17.01.15:

 Hello all,
 
 I have the following robustness-related question:
 
 I am deploying Bacula as a multi-site backup solution, where a central
 tape storage server receives relatively large quantities of data
 (approx. 20 GB per day) from multiple sites over network links of
 dubious reliability.
 
 I am considering two possible options of getting the data to the
 storage server:
 
 1) Install a File Daemon on each server to be backed up and let it
 pump the data over the net to the storage server
 
 2) Have the Director (running on the same machine as the storage
 server) run a script that tars the data over the network into a local
 temp directory and -- if the connection didn't break in the meantime --
 dump the result to tape.
 
 My question is: While in alternative 2) I can check for connection
 losses in the middle of the transfer and restart it, I have no idea how
 the File Daemon behaves in the same situation: Does it save any state
 locally that makes it robust to connectivity losses? Can it resume an
 interrupted job? How resilient is the storage server to such an
 interruption in the data stream from the client? Can I compress the
 data as it is sent from the File Daemon to the Storage daemon?
 
 TIA for any hints and pointers,
 
 Florian
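
[On the compression question above: Bacula does software compression on
the client (FD) side, enabled per FileSet in the Director's
configuration. A hedged sketch, with the FileSet name and path purely
illustrative:

```
# bacula-dir.conf -- illustrative FileSet fragment
FileSet {
  Name = "CompressedSet"
  Include {
    Options {
      compression = GZIP   # the FD compresses data before sending to the SD
      signature = MD5      # per-file checksum for verification
    }
    File = /data
  }
}
```
]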
 
 


