I had exactly the same problem: running out of free memory with 2 GB
installed. I had to split the backups into 10 separate jobs, the biggest
of which consists of 3.8M files and 250 GB of data, and even that takes
a long time building the directory tree on my dual-Xeon system with
Fibre Channel and SCSI disks.
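
For anyone wanting to do the same, the split itself is nothing fancy:
one Job/FileSet pair per slice of the data in bacula-dir.conf, roughly
like this (all names and paths below are made up):

# bacula-dir.conf -- one of the ten slices; names and paths are examples
FileSet {
  Name = "Data-Part1"
  Include {
    Options { signature = MD5 }
    File = /data/part1
  }
}
Job {
  Name = "Backup-Data-Part1"
  Type = Backup
  Client = dataserver-fd
  FileSet = "Data-Part1"
  Storage = Tape
  Pool = Default
  Messages = Standard
}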

(And please note the problem I posted earlier today, where the File table was full.)

Roger Kvam

Ryan Sizemore wrote:
All,

We have run our first full backup of our data server, and we are in the
process of restoring the entire system to a temporary drive. Everything
is going great, although when bconsole goes to grab all of the files
(before we are able to mark the ones that we want), it takes 10-15
minutes before we can mark any files.

A full backup consists of 803.4 GB and about 10 million files.

mysql> select count(*) from File;
+----------+
| count(*) |
+----------+
|  9978625 |
+----------+
1 row in set (0.00 sec)
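
We are still on the stock schema with the default indexes; one tuning
suggestion I have seen for File tables this size is a composite index
covering the columns the tree-building query reads, along these lines
(index name arbitrary, and we have not tried it here yet):

mysql> CREATE INDEX file_jpf_idx ON File (JobId, PathId, FilenameId);

Building that index over ~10 million rows will take a while itself, but
it only has to happen once.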

Currently, with just this full backup (no incrementals yet), the size of
the database is 1.7 GB. While the query is executing, bacula-dir
consumes over 2GB of memory, then the building of the file tree takes
another 600MB. We are planning to keep a record of full backups for a
year, and with the current growth rate of the database, it will likely
get quite large indeed. The foreseeable problem is that as our full
backups grow, so will the amount of memory the queries need to run. This
will pose the biggest problem in situations where a user wants to
restore just a few files. Also, the current query takes up almost all of
the 1 GB of memory plus 2 GB of swap. The system is a dual PIII-800 with
1 GB of RAM. Are there people out there restoring a similar
size/number of files? If so, I would like to hear what hardware you are
running.

On the other hand, once the file tree is built, marking files is
relatively quick. The restore begins almost as soon as the job is
started, whereas our current backup manager (Networker) takes over 12
hours to select all of the files it is going to need, then dies because
it has lost its connection to the client. So my question to the
community is: Is there any way to speed up the building of the initial
file tree? Or is there a way of limiting the query size?
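
One thing we have not tried yet is skipping the tree build entirely by
handing the restore command an explicit file list with its file= option;
if I have the syntax right, something like this (client name and path
are made up):

* restore client=dataserver-fd current file=/data/projects/report.doc

Has anyone used that approach at this scale?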

Best Regards,

Ryan Sizemore

