[Bacula-users] Quantum Scalar i500 slow write speed

2010-08-05 Thread ekke85
of anything I can try, please let me know. ekke85

[Bacula-users] Quantum Scalar i500 slow write speed

2010-08-05 Thread ekke85
0m28.415s. The 11TB I have to back up is on a NetApp; the NetApp is mounted via NFS on the backup host, which is getting the data from there to write to disk. ekke85
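
To separate the NFS read speed from the tape side, a raw read straight off the mount is a quick check; the path below is only a placeholder for a large file on the NetApp export:

    # read ~10GB from the NFS mount and discard it, timing only the read
    time dd if=/mnt/netapp/path/to/large-file of=/dev/null bs=1M count=10240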

[Bacula-users] 11TB backup run out of memory and dies

2010-06-17 Thread ekke85
-batch-insert). It has now been backing up for 13 hours with no problem. It is a bit slow (23 MB/sec), but that is something I'll look into later; if anyone has suggestions, please send them. Thanks for all your help, guys! ekke85
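
For anyone finding this thread later: batch inserts are a compile-time option, so this needs a rebuild along these lines (the catalog switch here assumes MySQL; adjust to the backend actually in use):

    ./configure --enable-batch-insert --with-mysql
    make && make install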

[Bacula-users] 11TB backup run out of memory and dies

2010-06-14 Thread ekke85
to back up. I do run atop and also atopsar to try to see where and when it dies, but it is really hard to pin down. The other problem that I think might cause it is that some of the files are 1.1TB on their own. ekke85
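
One low-tech way to catch it, alongside atop/atopsar, is to log the daemons' memory every minute while the job runs; this assumes the standard daemon names bacula-dir, bacula-sd and bacula-fd:

    # append a timestamped memory snapshot of the Bacula daemons every 60 seconds
    while true; do
        date >> /tmp/bacula-mem.log
        ps -o pid,rss,vsz,cmd -C bacula-dir,bacula-sd,bacula-fd >> /tmp/bacula-mem.log
        sleep 60
    done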

[Bacula-users] 11TB backup run out of memory and dies

2010-06-14 Thread ekke85
rows in the File table. ekke85
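
The row count can be checked directly against the catalog; this assumes the usual MySQL catalog database named bacula:

    mysql -u bacula -p bacula -e "SELECT COUNT(*) FROM File;"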

[Bacula-users] 11TB backup run out of memory and dies

2010-06-14 Thread ekke85
, I do not spool the 600GB. ekke85

[Bacula-users] 11TB backup run out of memory and dies

2010-06-11 Thread ekke85
the box running out of memory after doing about 600GB; if I spool data, it is also very slow. ekke85
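
For reference, the spooling knobs live in the Job and Device resources; a rough sketch, with the path and size below only examples (the spool area wants to be fast local disk, not the NFS mount):

    # bacula-dir.conf, Job resource
    Spool Data = yes

    # bacula-sd.conf, Device resource
    Spool Directory = /var/spool/bacula
    Maximum Spool Size = 300gb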

[Bacula-users] 11TB backup run out of memory and dies

2010-06-09 Thread ekke85
Hi, I have a Scalar i500 library with 5 drives and 135 slots. I have a Red Hat 5 server with a 1Gb NIC. The setup works fine for backups of most systems. The problem I have is an NFS share with 11TB of data that I need to back up. Every time I run this job it will write about 600GB of data to a

[Bacula-users] Maximum Concurrent Jobs

2010-06-04 Thread ekke85
the backup to use all 5 drives to write to tape? Or what should I check to try to make the backup run faster? Any help will be much appreciated. ekke85
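
To get jobs spread across the drives, concurrency has to be raised in several resources at once; a rough sketch, with the resource names made up but the directive names standard:

    # bacula-dir.conf
    Director { ... Maximum Concurrent Jobs = 10 ... }
    Storage  { Name = Scalar-i500; ... Maximum Concurrent Jobs = 5 ... }

    # bacula-sd.conf
    Storage { ... Maximum Concurrent Jobs = 10 ... }

Each concurrently running job also needs its own Volume to write to (separate pools, or Prefer Mounted Volumes = no in the Job resource), otherwise they all queue up behind the same drive.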

[Bacula-users] label barcodes on Quantum Scalar i500

2010-04-28 Thread ekke85
a slot, pick a drive, load a tape from that slot into that drive with mtx, then write a small bit of data using dd or tar or whatever, then rewind, eject and unload with mt and/or mtx. Regards, Alex. On Mon, 26 Apr 2010 12:27:25 -0400, ekke85 <bacula-forum at backupcentral.com> wrote:
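
For the archives, the manual test Alex describes would look roughly like this; the changer and drive device nodes (/dev/sg3, /dev/nst0) and the slot number are only examples and need to be checked with lsscsi on the actual host:

    mtx -f /dev/sg3 load 10 0                        # move the tape in slot 10 into drive 0
    dd if=/dev/zero of=/dev/nst0 bs=64k count=1000   # write a small amount of test data
    mt -f /dev/nst0 rewind
    mt -f /dev/nst0 offline                          # rewind and eject the tape in the drive
    mtx -f /dev/sg3 unload 10 0                      # move it back to slot 10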

[Bacula-users] label barcodes on Quantum Scalar i500

2010-04-26 Thread ekke85
information that you might need. ekke85

[Bacula-users] label barcodes on Quantum Scalar i500

2010-04-26 Thread ekke85
ekke85 wrote: Hi, I hope someone can help me or point me in the right direction. I have a brand new Quantum Scalar i500 with 5 drives and 125 slots. Whenever I try to do label barcodes, it fails with timeout errors. Bacula is reading the barcodes from the tape drive but is not labeling
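
For anyone hitting the same timeouts: the labeling itself is driven from bconsole, and the changer timeout can be stretched in the Device resource of bacula-sd.conf; the storage/pool names and the value below are only examples:

    # in bconsole
    label barcodes storage=Scalar-i500 pool=Default slots=1-125

    # bacula-sd.conf, Device resource (value in seconds)
    Maximum Changer Wait = 600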