> I'd also like to hear success stories of implementations in different
> infrastructures. F.e. how many machines, what's being backed up, how
> offsite backups are being taken care of and such...

I have wished in the past to have an easy-to-browse collection of working
setups, so that it's easy to compare hardware, methods and performance of
the different solutions. Maybe a wiki would be more appropriate for that
kind of document... Anyway, here is my setup, for the record:

BackupPC server
--------------------------
-- Motherboard: Supermicro dual Opteron, with PCI-X 133 and PCI-X 66 slots.
-- CPU: only one AMD Opteron 248 (2.2 GHz) for now.
-- RAM: 2GB DDR 3200, ECC.
-- RAID controllers: 1 x 3ware 9500S-12 (for BackupPC's filesystem), 1 x
3ware 8506-4 (from a previous system, used for standby server functions).
-- HDs:
    *  on the 9500S-12 controller: 5 x Seagate Barracuda 500 GB SATA
(ST3500641AS) for the BackupPC filesystem, a Western Digital 80 GB for
the system partition, and 2 x Seagate 160 GB for other stuff;
    *  on the 3ware 8506-4 controller: 4 x 400 GB Hitachi Deskstar 7K400
for the standby partition.

-- OS: SuSE Linux 10.0
-- Partitions:
    *  single 1.8 TB partition, hardware RAID 5 + LVM, on the 9500S-12
controller;
    *  single 1.2 TB partition, hardware RAID 5 + LVM, on the 8506-4
controller.
-- BackupPC setup: version 2.1.2pl0, Apache 2, KeyChain

Client
--------
Only one client for now, our file server.

-- Motherboard: Intel SE7501CW2: E7501 chipset, PCI-X slots, dual Xeon
sockets.
-- CPU: single Xeon 2.8 GHz/533 MHz
-- RAM: 2 x 512 MB DDR (PC2100) + 2 x 1 GB ECC registered (3 GB total)
-- SATA backplane
-- RAID card: 3ware Escalade 8506-8 SATA
-- HDs:
    *  on the integrated IDE controller: 1 x Maxtor 52049U4, 20 GB, PATA,
7200 RPM (system);
    *  on the 8506-8 controller: 8 x Maxtor DiamondMax Plus 9, 250 GB SATA,
7200 RPM (data).
-- Tape library: ADIC FastStor2, LTO-2 (8 slots x 400 GB)

-- OS: SuSE Linux 9.1
-- Partitions: 5 partitions, hardware RAID 5 + LVM, on the 8506-8
controller, plus some free space reserved for LVM snapshots:
    1.  200 GB, 195 GB used
    2.  200 GB, 172 GB used
    3.  250 GB, 207 GB used
    4.  300 GB, 0 GB used
    5.  500 GB, 472 GB used
-- Other: KeyChain.

BackupPC configuration
-----------------------------------
-- Authentication: KeyChain on both client and server, to enable automated
logins of the server's backuppc user as root on the client. The backuppc
user needs to log in once after every reboot of the server, and root once
after every reboot of the client.
-- PreDump script: before initiating a dump, this script is called on the
client to make an LVM snapshot of the appropriate partition.
-- PostDump script: after the dump is finished, this script is called to
remove the LVM snapshot and gather some statistics on snapshot usage, in
case it looks like we need to allocate more space in the future (right
now the snapshot size is calculated as 10% of the used space on the
backed-up partition). A sketch of both scripts follows this list.
-- Backup frequencies: one weekly full, one daily incremental.
-- CGI script running suid (no dedicated Apache server); no mod_perl
either.
-- Other: backups are compressed (achieving ~50%); transfer method is
rsync.
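
For the curious, here is roughly what that PreDump/PostDump pair looks
like. This is a minimal sketch, not my production scripts: the volume
group, mount points and log path are made up, and error handling is
omitted. The scripts are wired in through $Conf{DumpPreUserCmd} and
$Conf{DumpPostUserCmd}, which ssh into the client as root:

    #!/bin/sh
    # predump.sh <lvname> -- run on the client before each dump.
    # Sizes the snapshot at ~10% of the space used on the partition.
    LV=$1                                # e.g. "home" (hypothetical)
    VG=datavg                            # hypothetical volume group
    USED_KB=`df -P /srv/$LV | awk 'NR==2 {print $3}'`
    lvcreate -s -L $(($USED_KB / 10))k -n ${LV}_snap /dev/$VG/$LV || exit 1
    mount -o ro /dev/$VG/${LV}_snap /snapshots/$LV

    #!/bin/sh
    # postdump.sh <lvname> -- run on the client after each dump.
    # Logs how full the snapshot got, then drops it.
    LV=$1
    VG=datavg
    lvdisplay /dev/$VG/${LV}_snap | grep 'Allocated to snapshot' \
        >> /var/log/snapshot-usage.log
    umount /snapshots/$LV
    lvremove -f /dev/$VG/${LV}_snap

The schedule and transfer settings amount to a few lines in config.pl
(these are the stock 2.1.x option names):

    $Conf{XferMethod}    = 'rsync';
    $Conf{CompressLevel} = 3;      # gives us the ~50% mentioned above
    $Conf{FullPeriod}    = 6.97;   # one full per week
    $Conf{IncrPeriod}    = 0.97;   # one incremental per day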

Performance
-------------------

Partition 1 (home directories, more dynamic): fulls ~12 hours;
incrementals ~3.5 hours
Partition 2: fulls 5 hours; incrementals 0.75 hours
Partition 3: fulls 8.5 hours; incrementals 2 hours
Partition 4: not being backed up yet
Partition 5: fulls 15 hours; incrementals 2.5 hours

As you can see, the backup times are very long, and I'm looking into ways
to reduce them. I think I'm network-bound for the fulls and memory-bound
for the incrementals (the server uses all 2 GB of RAM, plus another
~1.5 GB of swap). Swapping is obviously bad, so my first inclination is to
purchase 2 more GB of RAM for the server.
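
A quick sanity check on the network theory, using partition 1's full as
the example:

    195 GB in ~12 h  =  195,000 MB / 43,200 s  ~=  4.5 MB/s
    100 Mb/s Ethernet delivers ~10-11 MB/s in practice

So the wire alone isn't obviously saturated; rsync checksumming and the
swapping on the server could account for the rest, which is exactly what
better monitoring should tell me.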

As for the network, ours is 100 Mb/s over cat 5e cabling, but I might try
connecting the Gbit ports of the server and client point to point (our
Cisco switch doesn't support Gbit). Even if that doesn't reach full Gbit
speed, I bet it'll provide a good fraction of it. But first I need to do
better monitoring and correlate network and memory usage with specific
backups; Zabbix might be the ticket for that.
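
If I do try it, the link itself should need nothing more than a private
subnet on the two spare Gbit ports; something like this (interface names
and addresses are made up):

    # on the BackupPC server:
    ifconfig eth1 192.168.100.1 netmask 255.255.255.252 up
    # on the client:
    ifconfig eth1 192.168.100.2 netmask 255.255.255.252 up
    # then give the fast path a name on the server, in /etc/hosts:
    #   192.168.100.2   fileserver-gbit

If I remember right, $Conf{ClientNameAlias} in the per-PC config can then
point the existing host entry at fileserver-gbit, so nothing else in
BackupPC has to change.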

I'm also going to rebalance partitions 4 and 5 soon, moving data from 5 to
4, which should help with both memory use and individual backup times.

Offsite backups
-----------------------
As for offsiting, I'm not doing it yet. I have really tried to avoid tape
backups (the client has the ADIC library), but I can no longer postpone
getting data offsite. My initial plan was to run Bacula on the file server
(the client), but that would entail setting up a completely different
system and managing dailies, weeklies and monthlies, with
differentials/incrementals and all that stuff, which I'm quite averse to
(my primary function here is not sysadmin; this was only an accident,
quite literally, come to think of it...). So my new plan is to run fulls
of BackupPC's partition, perhaps biweekly, spanning as many tapes as
necessary, and that's it. This wouldn't need Bacula or any other tape
backup manager, and it would keep whatever history BackupPC had at each
point, which is more than can easily be achieved with conventional tape
backups.

Since the tape library is attached to a different machine than the
BackupPC server, I'd have to do the tape backups over the network, for
which the point-to-point connection might come in handy too.
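
The mechanics look simple enough to skip a backup manager entirely: GNU
tar can write to a remote tape drive through rmt, and its multi-volume
mode handles the tape spanning. A rough, untested sketch (device names
and paths are assumptions, and the file server needs rmt installed):

    # rewind the current tape in the library
    ssh fileserver mt -f /dev/nst0 rewind
    # full dump of the BackupPC partition, spanning tapes as needed;
    # -L is the per-volume size in units of 1 KiB (~200 GB native on
    # LTO-2); a --new-volume-script calling mtx could drive the ADIC
    # library's tape changes between volumes.
    tar --rsh-command=/usr/bin/ssh -c -M -L 195000000 \
        -f fileserver:/dev/nst0 --one-file-system /var/lib/backuppc

One caveat I'd watch out for: tar does preserve hard links within an
archive, but BackupPC's pool is almost nothing but hard links, so tar's
link-tracking table could eat a lot of memory on a pool this size.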

I have also looked into online backup services like LiveVault and
AmeriVault, but they are outrageously expensive: around $20-30 per GB per
month for the compressed data, which might come to around $8,000 a month.
Definitely not doable for us.

So, as you can see, there's still quite a bit to do here before I can
breathe easily, but I'm getting there.


Hope this rant is useful to somebody.
Bernardo Rechea



On 01/24/2006 05:17 AM, Andy <[EMAIL PROTECTED]> wrote to
backuppc-users@lists.sourceforge.net (cc: Andri <[EMAIL PROTECTED]>),
subject "Re: [BackupPC-users] New user, few questions":




Andri wrote:
> Hello everyone,
>
> I'm new to BackupPC and I have a few questions that I hope some of you
> might be able to answer.
>
> 1.
>
> How do I backup f.e. mySQL databases. Does anyone have a good addon
> for BackupPC to access MySQL dumps. My other option is ofcourse to
> make the dumps in crontab and make BackupPC grab it.
>
>
> 2.
>
> I have to have offsite backups and to do that I have an external USB
> drive that is exchanged daily.
>
> Why do I have to manually activate the 'Archive' and how do I restore
> from it. It's lacking all the pretty frontend in the CGI interface and
> makes me uneasy.
>
> 3.
>
> Can I activate the 'archive backup' automaticly at a given time, can't
> find much about archives in the documentation.
>
> --
>
> I'd also like to hear success stories of implementations in different
> infrastructures. F.e. how many machines, what's being backed up, how
> offsite backups are being taken care of and such...
>
> Best regards,
> Andri

Hello Andri,

I use a script run from cron daily to dump the MySQL databases. Seems
like fairly standard practice. This is the script I use:

http://worldcommunity.com/opensource/utilities/mysql_backup.html
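
If that script is more than you need, even a few lines from cron will
produce consistent dumps that BackupPC then picks up with everything else
(user, password and paths below are placeholders):

    #!/bin/sh
    # /etc/cron.daily/mysql-dump: dump all databases before the nightly run
    umask 077
    mysqldump --all-databases --single-transaction -u backupuser \
        -pPASSWORD | gzip > /var/backups/mysql/all-`date +%a`.sql.gz

The date +%a suffix keeps a rolling week of dumps on the client, so a bad
dump doesn't immediately overwrite the last good one.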

Like you, I intend to use BackupPC's archive feature to produce tar
archives suitable for off-site backup on (several) external hard disks.

There are other ways to go about off-site backup, as discussed in great
detail here frequently, but for now I prefer to keep it simple.

Although I haven't gotten around to looking at it yet, it should be
straightforward to automate the archive process from cron. Just figure
out the command you need to send to BackupPC.
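
One route that avoids the archive host machinery altogether is
BackupPC_tarCreate, which writes a tar of any stored backup to stdout and
is trivial to run from cron. A sketch, with install path and names
assumed:

    # extract the latest backup (-n -1) of myhost's /home share
    su - backuppc -c "/usr/local/BackupPC/bin/BackupPC_tarCreate \
        -h myhost -n -1 -s /home ." | gzip > /mnt/usb/myhost-home.tar.gz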

I'm sure this was discussed on the list recently.

With regard to restoring from tar, I have begun to write up (and test)
some restore procedures for my own purposes here:

http://www.besy.co.uk/projects/debian/backuppc_howto.htm

The first procedure is for a complete restore using the web interface,
but the second is for restoring from tar. Works for me on Debian.
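
For what it's worth, the tar case really does boil down to a couple of
commands (archive name follows the sketch above; -p preserves permissions
and ownership):

    # unpack under the share root on the client
    cd /home && gzip -dc /mnt/usb/myhost-home.tar.gz | tar -xpvf -

Restoring into a scratch directory first and copying files over
selectively is the safer habit when you only need a handful of them.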

I should point out that my procedures are quite probably not perfect. I
don't claim to be an expert on backup - it just seemed like a good idea
to document my restore procedures in case I need them one day!

HTH, Andy


