Hi,

I am installing Bacula on a dedicated backup server running Gentoo. The server is a 3 GHz Pentium 4, and the tape drive is an IBM Ultrium LTO-3 (400 GB per tape uncompressed).

I am going to back up 11 servers. Most of them are straightforward, but 4 of them run large PostgreSQL databases, and I want to implement the new point-in-time recovery (PITR) feature of PostgreSQL 8.
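In case it matters, I am enabling WAL archiving the standard way, roughly like this (the archive path is just an example, not what I'd recommend):

    # postgresql.conf on each database server
    archive_command = 'cp %p /var/backup/wal_archive/%f'

and then taking the base backups between pg_start_backup() and pg_stop_backup() as the manual describes.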

So far it seems to work OK, but my full backups take too long. I enabled concurrent jobs because no single server can deliver data as fast as the streamer can write it. The problem seems to be that the backup server sits at 100% CPU usage while total throughput stays at 10-15 MB/s.
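For what it's worth, the concurrency setup is nothing exotic, roughly this in bacula-dir.conf (the resource name is a placeholder):

    # bacula-dir.conf (excerpt)
    Director {
      Name = backup-dir
      Maximum Concurrent Jobs = 4
      ...
    }

with a matching Maximum Concurrent Jobs in the Storage resources and in bacula-sd.conf.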

I tested with dd and managed to get about 73 MB/s copying a large file of random data to the tape. Ideally I would like to get as close to that figure as possible with the real backups.
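The test was just something like this (device name and block size from memory):

    dd if=/tmp/random.bin of=/dev/nst0 bs=64k

which sustained roughly 73 MB/s.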

I installed with SQLite for simplicity, and I realise that might be the problem. Also, my catalog is already about 1 GB after just a few days of testing, and I have read that SQLite will only handle a 2 GB catalog. So I am going to recompile with PostgreSQL, unless you tell me MySQL is really much better for this. I suppose I could let one of the large PostgreSQL database servers carry the catalog, but I would rather not, to keep the backup system from generating huge amounts of useless transaction logs.
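My plan for the rebuild is the documented route, i.e. something like this (I assume the Gentoo ebuild exposes it as a USE flag, but from source it would be):

    ./configure --with-postgresql
    make && make install

followed by the supplied create/make/grant scripts to set up the new catalog database.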

The network is Gigabit Ethernet to all servers.

Any suggestions for how to optimize my setup?

Thanks,

Baldur

