Re: [BackupPC-users] How to restore an 8GB archive file?
-----Original Message-----
From: Jeffrey J. Kosowsky [mailto:backu...@kosowsky.org]
Sent: Wednesday, April 13, 2011 3:33 PM
To: General list for user discussion, questions and support
Cc: sorin.s...@orgfarm.uu.se
Subject: Re: [BackupPC-users] How to restore an 8GB archive file?

> > That limit is long gone:
> > root@frances:/tmp# uname -a
> > Linux frances 2.6.32-30-generic #59-Ubuntu SMP Tue Mar 1 21:30:21 UTC 2011 i686 GNU/Linux
>
> I believe the OP was talking about 32bit Windows. Though even on WinXP or
> Win2000 I don't believe that is a limitation (unless you use FAT32 rather
> than NTFS). Perhaps the OP was talking about FAT32...

No, it was actually Linux. However, it was my misunderstanding: I thought it was a 32-bit kernel problem, when in fact it's a file system limitation, according to Google. The problem first came up on a 32-bit Linux machine running the ext3 file system; moving the 8GB archive to a machine with ext4 solved the problem. OTOH, ext3 is said to have a max file size limit from about 16GB up to some 2TB, depending on block size, so why I would have a problem with an 8GB file is anybody's guess.

--
/Sorin

------------------------------------------------------------------------------
Benefiting from Server Virtualization: Beyond Initial Workload Consolidation -- Increasing the use of server virtualization is a top priority. Virtualization can reduce costs, simplify management, and improve application availability and disaster protection. Learn more about boosting the value of server virtualization. http://p.sf.net/sfu/vmware-sfdev2dev
_______________________________________________
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:    https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:    http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/
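[Editor's note: the 16GB-to-2TB spread Sorin quotes comes from ext2/ext3's classic indirect-block layout. A quick sketch of that arithmetic (this is only the block-map limit; other kernel and filesystem caps can lower it, so treat figures for larger block sizes with care):

```shell
# Theoretical ext2/ext3 block-map limit: 12 direct blocks plus one single-,
# double- and triple-indirect tree of 4-byte block pointers. This is only
# the indirect-block arithmetic; other kernel/fs caps can lower it.
max_file_size() {
    b=$1                 # block size in bytes
    p=$((b / 4))         # block pointers per indirect block
    echo $(( (12 + p + p*p + p*p*p) * b ))
}
max_file_size 1024       # prints 17247252480, i.e. the ~16GB limit for 1KB blocks
```

Since even the worst case (1KB blocks) allows roughly 16GB, an 8GB file should fit on any ext3 filesystem, which matches Sorin's puzzlement.]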
Re: [BackupPC-users] How to restore an 8GB archive file?
-----Original Message-----
From: Les Mikesell [mailto:lesmikes...@gmail.com]
Sent: Wednesday, April 13, 2011 5:10 PM
To: backuppc-users@lists.sourceforge.net
Subject: Re: [BackupPC-users] How to restore an 8GB archive file?

> > > Why don't you just restore it back to his machine, using the typical
> > > option 1? If BackupPC archived it in the first place, it can restore
> > > it the same way.
> >
> > I've never gotten that option to work. This time I got a weird "unable
> > to read 4 bytes" error when trying a direct restore.
>
> Usually that means the restore is configured to use ssh in some way, and
> the ssh keys aren't set up correctly. Is there something different about
> the way your restore command works?

I do use passwordless login for the backups to work. The backup works fine using ssh this way; I don't get prompted for a password. I'm not sure, though, what you mean by "different" for restoring. Could you elaborate a bit? I haven't really looked into the first restore option, i.e. tweaked it in any way, as #2 and #3 have worked fine so far, until now.

--
/Sorin
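[Editor's note: BackupPC configures the direct-restore command separately from the backup command, so a working backup doesn't prove the restore path. The 3.x tar-method defaults look roughly like the following; check config.pl on your installation, as the exact flags may differ:

```perl
# Direct restores use a *different* command than backups, so a working
# $Conf{TarClientCmd} does not prove the restore path. Roughly the 3.x
# defaults (verify against your own config.pl):
$Conf{TarClientCmd} = '$sshPath -q -x -n -l root $host'
                    . ' env LC_ALL=C $tarPath -c -v -f - -C $shareName+'
                    . ' --totals';
$Conf{TarClientRestoreCmd} = '$sshPath -q -x -l root $host'
                    . ' env LC_ALL=C $tarPath -x -p --numeric-owner --same-owner'
                    . ' -v -f - -C $shareName+';
```

Note that the restore command must not pass -n to ssh, since tar reads the archive from stdin; an override that added -n to both commands would make backups work while restores fail.]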
Re: [BackupPC-users] How to restore an 8GB archive file?
-----Original Message-----
From: Holger Parplies [mailto:wb...@parplies.de]
Sent: Thursday, April 14, 2011 12:38 AM
To: sorin.s...@orgfarm.uu.se; General list for user discussion, questions and support
Subject: Re: [BackupPC-users] How to restore an 8GB archive file?

> - Which user on the target host do you need to connect as? Perhaps root?

When the backuppc user connects to a host to do a backup, it uses a passwordless login with ssh keys. The password entered the very first time I transferred the key was root's. So does this mean it's the user backuppc that does the actual restore, or root? If the former, then I can understand that the backuppc user can't write anywhere, right?

> Personally, I wouldn't use the web interface for downloading large
> amounts of data anyway. On the command line, your imagination is the
> limit to what you can do.
>
> > If it's not available as a filter yet, the BPC author would likely need
> > to implement the functionality.
>
> A generic tar2zipsplit filter would be more useful to the world than a
> specific implementation inside BackupPC, don't you think?

Dunno, I only ever use the web GUI, as it's so easy, practical and straightforward to use. Actually, it's the main reason why I stick with BPC; IMHO a backup system is only as good as its GUI and how admin-friendly it is. Personally, I don't want to jump through hoops when I need to restore stuff quickly - a few clicks in the GUI and I'm done. As I said, it's just my personal opinion and maybe not really on-topic. 8-)

--
/Sorin
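[Editor's note: the "tar2zipsplit" filter Holger mentions doesn't need anything BackupPC-specific for the splitting part - plain split(1) already does it. A toy sketch with invented paths (for a real restore you would feed the output of something like BackupPC_tarCreate into split with a chunk size such as 2000m):

```shell
#!/bin/sh
# Toy demonstration: stream a tar archive into fixed-size chunks and
# reassemble them with cat. All paths are invented for the demo.
work=$(mktemp -d)
mkdir -p "$work/data"
echo "hello" > "$work/data/file.txt"

# archive to stdout, split into 1MB pieces
tar -C "$work" -cf - data | split -b 1M - "$work/archive.tar."

# reassemble the pieces and extract one member to stdout
result=$(cat "$work"/archive.tar.* | tar -xOf - data/file.txt)
echo "$result"       # prints "hello"
rm -rf "$work"
```

The chunks reassemble with a simple cat, so they can cross any filesystem or transport that chokes on a single huge file.]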
Re: [BackupPC-users] Making errors in log stand out
-----Original Message-----
From: Holger Parplies [mailto:wb...@parplies.de]
Sent: Thursday, April 14, 2011 1:03 AM
To: sorin.s...@orgfarm.uu.se; General list for user discussion, questions and support
Subject: Re: [BackupPC-users] Making errors in log stand out

> [...] Thanks Bowie. That seems to have done the trick, but I don't see
> anything red in the logs. 8-/ I checked all the logs, the
> machine-specific ones as well as the general summary log.

The log files are conceptually ASCII, not HTML. You can't really get colour in there. You *could* make the code that displays the log file contents on the web page parse the log file and highlight anything you want (similar to the ability to extract only errors). That's definitely more complicated than adding an HTML tag somewhere, though. You could probably put HTML tags in the text log files, but I'd expect the characters to be quoted by the displaying code, so aside from looking ugly in the log files when viewed as text, it would probably look just as ugly on the web page rather than work ;-).

Ah... Gotcha. Thanks for clearing that up!

--
/Sorin
Re: [BackupPC-users] Backing up little by little or throttling the backup?
Jake Wilson napsal(a):
> Thanks for the replies, everyone. I have not tried the backup yet, so I
> don't know what to expect. I just didn't want to try it on a whim
> without researching it a bit more. The servers are not just file
> servers. They run Oracle and proprietary data modeling software that do
> a lot of data crunching throughout the day. I just needed to make sure
> that rsyncing stuff during the process was not going to hinder
> performance too badly.

Hi,

We are using the trickle library modifier (http://linux.die.net/man/1/trickle):

    # throttling 300kB/s up and 2MB/s down
    $Conf{TarClientCmd} = '/usr/bin/trickle -s -u 300 -d 2000'
                        . ' $sshPath -C -q -x -n -l root $host'
                        . ' /usr/bin/env LC_ALL=C nice $tarPath --one-file-system -c -v -f - -C $shareName+'
                        . ' --totals';

Regards,
Pavel.
[BackupPC-users] Ubuntu, RAID 1 For Top Dir Not Working
Thanks - I read the error verbatim and realised I was being silly. The error (I can't remember the exact text and can't get to it for now) mentioned that the cpool directory and pc directory did not exist and that it could not create the hard links, then went on to suggest certain commands to try. I was concentrating on the hard-link issue, tried creating the links myself, and found no problem. Then I realised it also complained about the missing directories. I resolved this by simply creating the directories under /raid/backuppc/ and chown'ing them, and BackupPC works fine.

+----------------------------------------------------------------------
|This was sent by alankan...@hotmail.com via Backup Central.
|Forward SPAM to ab...@backupcentral.com.
+----------------------------------------------------------------------
[BackupPC-users] Re-read hostname config after DumpPreUserCmd
Hi everyone,

I've been using BackupPC for several years and I find it very useful. I have a bunch of virtual servers on our platform that I wish to back up. The issue is that sometimes they're turned on and sometimes they're off. I came up with a solution: by modifying the ssh command at login time, I can pass an extra argument with the virtual machine name, so I can mount its disk, chroot into it and make a backup even if the machine is turned off.

.ssh/authorized_keys:

    from="backupservers.andago.net",command="/root/.ssh/validate-rsync" ssh-rsa B3NzaC1yc2EBIwAAAQEAsR5xaq75Dlh7w6O3RozKo9/sMZJozorPRy2aHoEkkXvPAQiOZTcK9q6OgBMZ/rsOMF4pKg8+9G6pSLjCcjpgaA5p1Dd+QpxU0jzTkX/l0oxoPJYp2P9dfLkLW3XKH6GXCS4cNOba1Sz14tBT60CVuEVlAqfRyPCEQOcFi9WleiKtjdXky2bByOG/gxNTqqQxJGtkxZ+wdDY90TTjObLhpampaVqO7sgvgFP3e9wE8duBNTN4JQm/psBl16ZsSU019c/ZPc0pWP8JcnjijPfJOeeTuB1BKdxIUyr+Yr4jhmSzWuqK41Vz8knYYgfTKtvCk7BsYMfXCeHYgB6vGN03LQ== backuppc@gondor

.ssh/validate-rsync:

    #!/bin/sh
    if echo "$SSH_ORIGINAL_COMMAND" | grep -qE "^.*rsync.*vm="; then
        # rsync called with the custom vm= option: start chroot environment
        VM=$(echo "$SSH_ORIGINAL_COMMAND" | sed -e "s/^.*vm=\(.*\)$/\1/ig")
        SSH_ORIGINAL_COMMAND=$(echo "$SSH_ORIGINAL_COMMAND" | sed -e "s/ vm=.*$//ig")
        MOUNTPOINT=$(mount | grep "$VM" | sed -e "s/^.* on \(.*\) type.*$/\1/ig")
        if [ -z "$MOUNTPOINT" ]; then
            echo "$VM disk not mounted, DumpPreUserCmd failed!"
            exit 1
        fi
        chroot "$MOUNTPOINT" $SSH_ORIGINAL_COMMAND
    elif [ ! -z "$SSH_ORIGINAL_COMMAND" ]; then
        # args passed, run the ssh command
        $SSH_ORIGINAL_COMMAND
    else
        # no args passed, interactive login
        bash -l
    fi

So here comes the issue: I have a DumpPreUserCmd that modifies the BackupPC configuration of the very host being backed up. This command modifies ClientNameAlias so BackupPC connects to the server that has access to the virtual machine's disks. It also mounts the virtual machine disk, so when DumpPreUserCmd is done running, the environment is prepared for the BackupPC rsync.
So I needed to re-read the host config after DumpPreUserCmd has run. This way, if I change $host or $hostIP, I can tweak BackupPC to connect to a server that has access to the virtual machine disk when the machine is off. I added the following at the end of the UserCommandRun method in BackupPC_dump, and it works like a charm:

    #
    # Re-read config file, so we can include the PC-specific config
    #
    print "Re-reading $client configuration\n";
    $clientURI = $bpc->uriEsc($client);
    print "[$clientURI] Change host: $host => ";
    if ( $cmdType eq 'DumpPreUserCmd' || $cmdType eq 'DumpPostUserCmd' ) {
        if ( defined(my $error = $bpc->ConfigRead($client)) ) {
            print("dump failed: Can't read PC's config file: $error\n");
            exit(1);
        }
        %Conf = $bpc->Conf();
        $host = $Conf{ClientNameAlias};
        $hostIP = $host;
        print "$host\n";
    }

I just wanted to share this with BackupPC users, as others may want to implement it. I would also like to see this modification included in BackupPC_dump, as it increases BackupPC's functionality.

Best regards.

--
Marcos Lorenzo de Santiago
System administrator
marcos.lore...@andago.com

ÁNDAGO INGENIERÍA
Tlf: +34 916 011 373
Álcalde Ángel Arroyo, 10, 1º
Mvl: +34 637 741 034
28904, Getafe, Madrid (Spain)
Fax: +34 916 011 372
www.andago.com

"We all know Linux is great... it does infinite loops in 5 seconds."
(Linus Torvalds about the superiority of Linux at the Amsterdam Linux Symposium)
Re: [BackupPC-users] Backup stalls - exiting after signal ALRM
On 04/14 12:24 , Manu Poletti wrote:
> We use BackupPC on Ubuntu Hardy to backup a number of Windows XP hosts
> using smb. It has worked faultlessly for at least 2 years; until a week
> ago, when the hosts stopped being backed up. Now none work except
> localhost, which uses the tar transfer method.

Have you tried running BackupPC_dump by hand?

    $ su -
    # su - backuppc
    $ /usr/share/lib/backuppc/bin/BackupPC_dump -f -v host.to.backup

See if there's any consistency in where it stops, or if it gives you any meaningful error. Have you checked for filesystem corruption on your BackupPC server?

In cases like this in the past, where it happened with individual hosts, I would just try excluding parts of the filesystem on the host to be backed up until the backup succeeded. However, if all hosts are having the same problem, it would seem to be an issue with the backup server itself.

Are all your clients being backed up with SMB? Try setting some up to use Cygwin rsyncd, and see if that succeeds.

--
Carl Soderstrom
Systems Administrator
Real-Time Enterprises
www.real-time.com
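[Editor's note: the exclusion bisection Carl describes can be done per host with $Conf{BackupFilesExclude}. A hypothetical fragment - the host file name, share name, and paths below are made up for illustration:

```perl
# hypothetical per-host file, e.g. /etc/backuppc/host.to.backup.pl:
# narrow down a stalling backup by excluding suspect trees from the
# smb share "wk" until the backup succeeds
$Conf{BackupFilesExclude} = {
    'wk' => [
        '/Documents and Settings/*/Local Settings/Temp',
        '/pagefile.sys',
    ],
};
```

Halving the excluded set on each run quickly isolates the file or directory the transfer hangs on.]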
[BackupPC-users] BackupPC 3.2.0 host not pingable, how to include in backup?
Hi all, and an especially big hello to Craig from an old school friend.

BackupPC 3.2.0 / Debian lenny / 90% of hosts on the network back up fine, as they can be ping'd successfully. However, 3 hosts cannot be ping'd (yes, they are there and alive; nmblookup fails, and the (external) hosts must have iptables rules that drop ICMP ping requests). Is there a way to configure BackupPC to back up these hosts regardless?

Thanks in advance for your time and assistance.

Regards,
Ed Cox
Re: [BackupPC-users] BackupPC 3.2.0 host not pingable, how to include in backup?
I simply use the per-host config to override the ping command with /bin/true in those cases...

Timothy J. Massey
Out of the Box Solutions Inc.

Sent from my iPad

On Apr 14, 2011, at 11:07 AM, edwardcox <backuppc-fo...@backupcentral.com> wrote:
> Is there a way to configure BackupPC to backup these hosts regardless?
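[Editor's note: the override Timothy describes is a one-liner in the per-host config; the file name below is a hypothetical example:

```perl
# hypothetical /etc/backuppc/somehost.pl: skip the ping check for a host
# whose firewall drops ICMP echo requests
$Conf{PingCmd} = '/bin/true';
```

With this, BackupPC always considers the host reachable, so backups start regardless of whether ping succeeds.]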
Re: [BackupPC-users] BackupPC 3.2.0 host not pingable, how to include in backup?
On 4/14/2011 1:11 AM, edwardcox wrote:
> Is there a way to configure BackupPC to backup these hosts regardless?

See $Conf{PingPath} in http://backuppc.sourceforge.net/faq/BackupPC.html

--
Les Mikesell
lesmikes...@gmail.com
Re: [BackupPC-users] Backup stalls - exiting after signal ALRM
On 4/13/2011 7:24 PM, Manu Poletti wrote:
> We use BackupPC on Ubuntu Hardy to backup a number of Windows XP hosts
> using smb. It has worked faultlessly for at least 2 years; until a week
> ago, when the hosts stopped being backed up. Now none work except
> localhost, which uses the tar transfer method.
>
> When a backup runs, it seems to stall very soon (a few seconds) after it
> starts. Some files are copied to the temporary directory at
> /var/lib/backuppc/pc/[hostname]/new/ but the pids show no CPU activity
> after this. The backup seems to stop on a random file each time, and
> using Windows Computer Management on the host I can see that the server
> has that one file open. After this, BackupPC will just sit there doing
> nothing until the ClientTimeout is exceeded (20 hours). Then the backup
> exits with an error reported in the XferLOG: "exiting after signal ALRM".
>
> I have tried running the smbclient command manually like this:
>
>     sudo -u backuppc /usr/bin/smbclient \\\\[hostName]\\wk -U [userName] -N -d 10 -c 'tarmode full' -Tc backup.tar
>
> This works, so I assume it's not the smb transport. But after that I am
> stuck. Any suggestions?

Is there an 'on-access' virus scanner on the Windows boxes? Sometimes they will block or slow access to the point that the backup times out.

--
Les Mikesell
lesmikes...@gmail.com
Re: [BackupPC-users] How to force full backups on weekends?
I did search Google and researched the topic in the wiki and documentation, and the only thing I could find that was remotely related was the cron scheduling page in the wiki. If a question is asked (and answered) a lot, I guess I would expect to see the best solutions at least listed on the wiki. But they're not. Sorry for 'wasting' your time.

Jake Wilson

On Wed, Apr 13, 2011 at 3:33 PM, Jeffrey J. Kosowsky <backu...@kosowsky.org> wrote:
> Jake Wilson wrote at about 11:54:13 -0600 on Wednesday, April 13, 2011:
> > In order to minimize CPU load on our servers at the office, I'd like to
> > make sure that the full backups only occur on the weekends. Is there a
> > straightforward way to accomplish this in the interface, or do I need
> > to go the cron job route?
>
> Does anybody bother to do a Google search or read the archives before
> WASTING our time asking the EXACT same question that was asked just a
> couple of weeks ago? This newslist gets enough traffic even without
> people asking the same questions over and over again. It really is
> getting to be quite rude... It's as if posters think the rest of us have
> all the time in the world to answer the same FAQs again because they are
> too lazy to try to find the answer themselves.
Re: [BackupPC-users] How to force full backups on weekends?
On 4/14/2011 12:04 PM, Jake Wilson wrote:
> I did search Google and researched the topic in the wiki and
> documentation and the only thing I could find remotely related was the
> cron scheduling page in the wiki. If a question is asked (and answered)
> a lot I guess I would expect to see the best solutions at least listed
> on the wiki. But it's not. Sorry for 'wasting' your time.

The reason you didn't find anything other than cron scheduling is that BackupPC does not have any features for strict scheduling of backups. The best it can do is to schedule the full backup 1 week from the last successful full backup, and this is susceptible to creep due to failed backups (Saturday's full backup fails, so it tries again on Sunday... future full backups are now done on Sundays). If you need something more structured, you have to go to cron.

--
Bowie
Re: [BackupPC-users] How to force full backups on weekends?
I have scripts run from cron that:

1. do full backups on Fridays
2. do full backups on the last day of the month
3. prune (delete) backups based on a schedule, in my case:
   a. delete all full backups that don't land on a Friday or the last day of the month
   b. keep monthly backups for 15 months
   c. keep yearly backups for 7 years
   d. keep weekly backups for 10 weeks

Works like a charm, and made accounting happy. At one point (before the scripts), the end-of-month backup was an incremental, and was deleted in BackupPC's regular fashion. Oops. They needed the month-end data again, and all I could give them was a couple of days before or after. I still let BackupPC delete incrementals whenever it wants.

Gerald

----- Original Message -----
From: Bowie Bailey <bowie_bai...@buc.com>
To: backuppc-users@lists.sourceforge.net
Sent: Thursday, April 14, 2011 11:23:51 AM
Subject: Re: [BackupPC-users] How to force full backups on weekends?

> The reason you didn't find anything other than cron scheduling is that
> BackupPC does not have any features for strict scheduling of backups.
> If you need something more structured, you have to go to cron.
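[Editor's note: the cron side of such scripts is usually a request through BackupPC's server message interface. A minimal hypothetical sketch - the host name and install path are made up, and the exact BackupPC_serverMesg argument order should be checked against your installation:

```
# hypothetical backuppc-user crontab: request a full backup of "somehost"
# every Friday at 22:00 (the trailing 1 requests a full backup)
0 22 * * 5  /usr/share/backuppc/bin/BackupPC_serverMesg backup somehost somehost backuppc 1
```

Combined with a large $Conf{FullPeriod}, this keeps BackupPC's own scheduler from ever moving the full backup off its fixed day.]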
Re: [BackupPC-users] How to force full backups on weekends?
Hi,

Jake Wilson wrote on 2011-04-14 10:04:42 -0600 [Re: [BackupPC-users] How to force full backups on weekends?]:
> I did search Google and researched the topic in the wiki and
> documentation and the only thing I could find remotely related was the
> cron scheduling page in the wiki.

In my opinion, that is exactly the right place, so if you found that, your search was good - probably even very good, because the name of the page is not good ;-). The problem is, if you *know* the answer, you'll find it on that page. It's just not obvious (and it should be, because it's intended for people who *don't* know the answer yet).

> If a question is asked (and answered) a lot I guess I would expect to
> see the best solutions at least listed on the wiki.

I agree with you there. As for me, I have, in the past, spent my energy on answering questions here rather than in the wiki. I would have loved to have a *good* wiki page to point to instead, but I never got around to putting any time into it. This time, I edited the wiki page instead. I would be greatly interested to hear whether my changes clear anything up or confuse matters further (and whether they answer your question). If they don't, keep asking. That's the only way we'll ever get answers into the wiki (answers that people other than ourselves understand). And Jeffrey, if you could give me a pointer to the previous thread, I'll add anything from there, or you could, of course, also do that yourself ;-).

Regards,
Holger
Re: [BackupPC-users] Backing up little by little or throttling the backup?
On 04/14 09:42 , Pavel Hofman wrote:
> We are using the trickle library modifier
> http://linux.die.net/man/1/trickle :

Interesting. Thanks for posting that.

--
Carl Soderstrom
Systems Administrator
Real-Time Enterprises
www.real-time.com
Re: [BackupPC-users] Re-read hostname config after DumpPreUserCmd
Hi,

Marcos Lorenzo de Santiago wrote on 2011-04-14 12:31:08 +0200 [[BackupPC-users] Re-read hostname config after DumpPreUserCmd]:

> [...] I have a bunch of virtual servers in our platform that I wish to back up. The issue here is that sometimes they're turned on and sometimes they're off. I came up with a solution: by modifying the ssh command at login time, I can pass another argument with the virtual machine name, so I can mount it, chroot to its disk and make a backup if the machine is turned off.

While I like the idea of being flexible about how to obtain the backup, I don't see that you would need to modify the configuration at runtime for that to work. Maybe I'm missing something obvious, but I would imagine either

1.) RsyncClientCmd could do all the work - simply point that to a script that sets things up and execs the appropriate 'ssh $host rsync ...' command.

2.) DumpPreUserCmd could prepare things (as you are doing now) and write into a state file whether a native or chroot backup should be done. RsyncClientCmd could be a script that reads that state file and then execs the appropriate command like above.

3.) DumpPreUserCmd *can* be Perl code, so you could probably even use it to do exactly what you are proposing *without* modifying the BackupPC code (meaning if the change is not of general interest, you don't need to patch the source to get its benefits; you can put it in the configuration) - if you really need it.

You'll have to be smart about PingCmd in any case (probably just ping the server holding the virtual disk contents?).

Aside from all that, can't you just run the backup in the chroot environment regardless of whether the virtual machine is running? What kind of virtualization are we talking about?

I'm not saying your approach doesn't work. It's just that, personally, I'd find a solution which does not modify the configuration at runtime more readable (maintainable) and more resilient against errors.
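Option 2 above can be sketched as a small wrapper script. All names here are invented for illustration (the state file path, the "vm-server" host, and the mount point are assumptions); the wrapper only builds and prints the command it would exec:

```shell
# Hypothetical RsyncClientCmd wrapper logic. BackupPC would be pointed at
# this script, e.g. (illustrative, not a verified default):
#   $Conf{RsyncClientCmd} = '/etc/backuppc/rsync-wrapper $host $rsyncPath $argList+';
# DumpPreUserCmd is assumed to have written "native" or "chroot" into a
# per-host state file before the wrapper runs.
build_cmd() {
    host="$1"; shift
    # Fall back to a native backup when no state file exists.
    state=$(cat "/var/run/backuppc/$host.state" 2>/dev/null || echo native)
    if [ "$state" = "chroot" ]; then
        # VM is down: run rsync inside the mounted guest filesystem on
        # the machine holding the disk image (mount point is assumed).
        echo "ssh -q -x -l root vm-server chroot /mnt/$host $*"
    else
        # VM is up: back it up directly over ssh, as usual.
        echo "ssh -q -x -l root $host $*"
    fi
}

# Example: no state file exists for "testvm", so we get a native backup
# command; a real wrapper would `exec` the result instead of printing it.
build_cmd testvm rsync --server --sender
```

The point of the sketch is that the decision logic lives entirely outside BackupPC, so no runtime configuration change is needed.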
> I wanted also to include this modification under BackupPC_dump as it increases BackupPC's functionality.

I disagree. As I said, I don't see that it makes anything possible that isn't already. It *does* add a failure case (if a narrow one) and unnecessary work (if not much) for the vast majority of BackupPC installations. And it could conceivably break installations, should anyone currently use DumpPreUserCmd to modify his configuration and rely on it *not* taking immediate effect (not that I'd expect that, but who knows?).

Regards,
Holger
Re: [BackupPC-users] How to force full backups on weekends?
Jake Wilson wrote at about 10:04:42 -0600 on Thursday, April 14, 2011:

> I did search Google and researched the topic in the wiki and documentation and the only thing I could find remotely related was the cron scheduling page in the wiki. If a question is asked (and answered) a lot I guess I would expect to see the best solutions at least listed on the wiki. But it's not. Sorry for 'wasting' your time.

You must not be very good at Google ;)

Googling: backuppc full weekend (those are the 3 key words, based on your own first sentence: "I'd like to make sure that full backups only occur on the weekends")

And what do you know... THE *FIRST* (non-advertisement) Google result is:

Re: [BackupPC-users] Restrict machine to do full backups Friday ... Mar 30, 2011 ... Re: [BackupPC-users] Restrict machine to do full backups Friday night ... that machine to exclude everything except for the weekend -- or ... http://www.adsm.org/lists/html/BackupPC-users/2011-03/msg00339.html

Which is *exactly* the thread from less than 2 weeks ago that I was referencing... So, really now, is googling that hard? Is looking through the last 13 days of archives that difficult?

Jake Wilson

On Wed, Apr 13, 2011 at 3:33 PM, Jeffrey J. Kosowsky backu...@kosowsky.org wrote:

> Jake Wilson wrote at about 11:54:13 -0600 on Wednesday, April 13, 2011:
> > In order to minimize cpu load on our servers at the office, I'd like to make sure that the full backups only occur on the weekends. Is there a straightforward way to accomplish this in the interface or do I need to go the cron job route?
>
> Does anybody bother to do a Google search or read the archives before WASTING our time asking the EXACT same question that was asked just a couple of weeks ago? This newslist gets enough traffic even without people asking the same questions over and over again. It really is getting to be quite rude...
It's as if posters think the rest of us have all the time in the world to answer the same FAQs again because they are too lazy to try to find out the answer themselves.
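For reference, the cron-based approach mentioned in the quoted question usually means disabling automatic fulls (a large $Conf{FullPeriod}) and queueing one from cron. A sketch of such a crontab entry for the backuppc user; the install path, host name, and the exact BackupPC_serverMesg argument order are assumptions to check against your installation's documentation:

```shell
# Queue a full backup (type 1) of host "fileserver" every Saturday at
# 01:00. Path and argument order are assumptions; verify locally.
0 1 * * 6  /usr/share/backuppc/bin/BackupPC_serverMesg backup fileserver fileserver backuppc 1
```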
Re: [BackupPC-users] How to force full backups on weekends?
Holger Parplies wrote at about 19:48:52 +0200 on Thursday, April 14, 2011:

> And Jeffrey, if you could give me a pointer to the previous thread, I'll add anything from there, or you could, of course, also do that yourself ;-).

Sure, I posted the reference in my last reply to Jake...
[BackupPC-users] [newb] ssh rsync with restricted permissions
> [...] That looks like an invocation using rsyncd, which I would avoid. The time to use rsyncd is when backing up Windows, because cygwin ssh+rsync is buggy and doesn't work. Keep in mind that BackupPC has both: $Conf{XferMethod} = 'rsyncd'; and $Conf{XferMethod} = 'rsync'; You should use 'rsync' for your XferMethod unless there's a really good reason.

Thank you Carl, Bowie and Les for your answers. The main advantage I saw in using rsyncd is that the command line can be much simplified, and the include and exclude options can reside in the rsyncd.conf file. But I finally rallied to Carl's advice (see also http://www.aboutdedupe.com/phpBB2/viewtopic.php?p=212471&sid=0612823bf08f34da225b41976ec74c1c) and it works.

tom

+--
|This was sent by sneak...@gmx.net via Backup Central.
|Forward SPAM to ab...@backupcentral.com.
+--
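For readers new to the two transfer methods being contrasted: in a per-host BackupPC config file the choice looks roughly like the sketch below. The share paths and module name are invented examples, not defaults:

```perl
# e.g. /etc/backuppc/myhost.pl -- plain rsync over ssh (the advice above):
$Conf{XferMethod}     = 'rsync';
$Conf{RsyncShareName} = ['/etc', '/home'];    # example paths

# The rsyncd alternative would instead reference a module that is
# defined (with its own include/exclude rules) in the client's
# rsyncd.conf:
# $Conf{XferMethod}     = 'rsyncd';
# $Conf{RsyncShareName} = ['backup_module'];  # hypothetical module name
```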
Re: [BackupPC-users] [newb] ssh rsync with restricted permissions
On 4/14/2011 2:26 PM, yilam wrote:

> [...] That looks like an invocation using rsyncd, which I would avoid. [...] But I finally rallied to Carl's advice (see also http://www.aboutdedupe.com/phpBB2/viewtopic.php?p=212471&sid=0612823bf08f34da225b41976ec74c1c) and it works.

I agree that limiting the backuppc user on the [backuppc-client] machine to only running rsync with certain options is good practice. I run BackupPC per the instructions at http://backuppc.sourceforge.net/faq/ssh.html#how_can_client_access_as_root_be_avoided.

Isn't it the case, however, that when you run rsync over ssh, the client machine logs into the [BackupPC-server] as root? My nightmare is that a public-facing box (i.e. web server) has root access compromised, or at the very least that the private key in [BackupPC-client]/home/backuppc-user/.ssh/id_rsa is exposed. Then the bad guy could run 'ssh -i /home/backuppc-user/.ssh/id_rsa backuppc-server.mydomain.com' and get ROOT access to the backup server. Am I missing something?

-Chris
Re: [BackupPC-users] [newb] ssh rsync with restricted permissions
On 04/14 02:46, Christopher Hunt wrote:

> Isn't it the case, however, that when you run rsync over ssh that the client machine logs into the [BackupPC-server] as root?

No, because the connections are initiated from the BackupPC server. The client does not log into the server, unless you have some arrangement that allows the client to log in (perhaps to set up a tunnel or the like). Even then, there's no reason to log in as root.

--
Carl Soderstrom
Systems Administrator
Real-Time Enterprises
www.real-time.com
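The restriction Chris refers to (and the linked FAQ page describes) is typically enforced on the *client* with a forced command in root's authorized_keys, so even a stolen server-side key can only run one pre-approved rsync invocation. The exact rsync argument string below is illustrative (it must match what your BackupPC server actually sends), and the IP is a placeholder:

```text
# /root/.ssh/authorized_keys on the backup client (one logical line):
# the key may only run the listed rsync server command, only from the
# BackupPC server's address, with all forwarding disabled.
command="/usr/bin/rsync --server --sender -vlHogDtpre.iLsf . /",from="192.0.2.10",no-port-forwarding,no-X11-forwarding,no-agent-forwarding,no-pty ssh-rsa AAAA... backuppc@backupserver
```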
Re: [BackupPC-users] How to restore an 8GB archive file?
On Thu, Apr 14, 2011 at 08:33:10AM +0200, Sorin Srbu wrote:

> OTOH, ext3 is said to have a max file size limit from about 16GB up to some 2TB, depending on block size. So why I would have a problem with an 8GB file is anybody's guess.

I don't think you had a problem with the filesystem. More likely it was a ulimit issue of the user account you were using to restore the file. Check the output of 'ulimit -a' within the user account to see if that was the case.
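A quick check along those lines might look like this; the 16777216-block threshold is simply 8 GiB expressed in the 512-byte blocks most shells use for `ulimit -f`, chosen because that is roughly this thread's file size:

```shell
# Report whether the current account's per-process file size limit
# could truncate an ~8 GB restore. `ulimit -f` prints "unlimited" or a
# count of 512-byte blocks on most shells.
check_fsize_limit() {
    limit=$(ulimit -f)
    if [ "$limit" = "unlimited" ]; then
        echo "ok: no file size limit"
    elif [ "$limit" -lt 16777216 ]; then
        echo "warning: files capped at $((limit * 512)) bytes"
    else
        echo "ok: limit allows files up to $((limit * 512)) bytes"
    fi
}
check_fsize_limit
```

Run this as the same user (and via the same login path) that performs the restore, since limits can differ between interactive and non-interactive sessions.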
[BackupPC-users] BackupPC 3.2.0 host not pingable, how to include in backup?
Thanks Timothy & Les, just what the doctor ordered.

Regards,
Edward.

+--
|This was sent by edw...@onereason.com.au via Backup Central.
|Forward SPAM to ab...@backupcentral.com.
+--
[BackupPC-users] Gzip File Grows!
Hi Guys,

I use the BackupPC_tarCreate method for creating archived tar.gz files for backup to removable media. However, I'm having trouble with some of the files actually growing in size once they are gzipped. The ones that are growing are backups of already compressed audio files. Here is the command I use to create the archive:

/usr/share/backuppc/bin/BackupPC_archiveHost /usr/share/backuppc/bin/BackupPC_tarCreate /usr/bin/split /usr/bin/par2 phone-calls -1 /bin/gzip .gz 0 /tmp 0 *

Is there any way I can prevent BackupPC_tarCreate from using gzip, i.e. just create the tar file?

Thanks,
Mark
Re: [BackupPC-users] Gzip File Grows!
I just found that if I just tar the backup, the tar file is slightly larger than the resultant gzip file. I was determining that the file was larger by executing the gzip -l command on the gzip file, and it was coming back at about -324% compressed. It must be an error with the reporting command.

Thanks,
Mark

From: Mark Wass [mailto:m...@market-analyst.com]
Sent: Friday, 15 April 2011 10:53 AM
To: backuppc-users@lists.sourceforge.net
Subject: [BackupPC-users] Gzip File Grows!

> Hi Guys, I use the BackupPC_tarCreate method for creating archived tar.gz files for backup to removable media. However I'm having trouble with some of the files actually growing in size once they are gzipped. The ones that are growing are backups of already compressed audio files. [...]
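The negative figure Mark saw is consistent with a known gzip quirk rather than an actual size increase: the gzip trailer stores the uncompressed size (ISIZE) modulo 2^32, so for archives over 4 GiB, `gzip -l` computes its ratio from a wrapped-around size. A sketch with made-up sizes (not Mark's actual files):

```python
# gzip records the uncompressed size modulo 2**32 in its trailer, so
# `gzip -l` can report a large negative ratio for archives over 4 GiB.
# The sizes below are invented for illustration.
uncompressed = 8_500_000_000   # real size of the tar, ~8.5 GB
compressed = 8_200_000_000     # already-compressed audio barely shrinks
isize = uncompressed % 2**32   # what the gzip trailer actually stores

# Ratio in the style gzip -l derives it, (saved bytes) / (uncompressed),
# but using the wrapped ISIZE in place of the true uncompressed size:
ratio = 100.0 * (isize - compressed) / isize
print(f"stored size {isize} bytes, reported ratio {ratio:.1f}%")
```

With these numbers the wrapped size (about 4.2 GB) is smaller than the compressed size, so the computed "compression" goes strongly negative even though the archive really did shrink slightly.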
Re: [BackupPC-users] How to restore an 8GB archive file?
Hi,

Sorin Srbu wrote on 2011-04-14 08:33:10 +0200 [Re: [BackupPC-users] How to restore an 8GB archive file?]:

> -Original Message- From: Jeffrey J. Kosowsky [mailto:@.org]

please don't do that. At least now I know why I'm getting spam to my backuppc-list-only email address.

> To: General list for user discussion, questions and support

Much better, though this address is probably less sensitive ...

> Cc: sorin.s...@orgfarm.uu.se

Your problem :-).

> [...] Moving the 8GB archive to a machine with ext4, solved the problem.

I agree with the other opinions. Amongst other things, you changed the file system. I doubt this was the relevant change.

> OTOH, ext3 is said to have a max file size limit from about 16GB up to some 2TB, depending on block size.

Several years ago, I worried about file sizes, too. It turned out to just work even back then. I haven't encountered such limits in years. Then again, on relevant file systems I don't tend to use ext3, because it *still* seems to have occasional problems with online resizing (admittedly on a Debian etch installation; might have gone away since). Huge files seem to go hand in hand with online resizing requirements.

Sorin Srbu wrote on 2011-04-14 08:37:54 +0200 [Re: [BackupPC-users] How to restore an 8GB archive file?]:

> [...] From: Les Mikesell, Sent: Wednesday, April 13, 2011 5:10 PM
> > Why don't you just restore it back to his machine, using the typical option 1? If BackupPC archived it in the first place, it can restore it the same way.
> I've never gotten that option to work. This time I got a weird "unable to read 4 bytes" error when trying a direct restore.
> > Usually that means the restore is configured to use ssh in some way, and the ssh keys aren't set up correctly. Is there something different about the way your restore command works?
> I do use passwordless login for the backups to work. The backup works fine using ssh this way; I don't get prompted for a password. Not sure though, how you mean "different" for restoring.
> Could you elaborate a bit?

You've got it the wrong way around. *You* need to elaborate. What are your RsyncClientCmd and RsyncClientRestoreCmd (it was rsync, wasn't it?)? If we knew those, we could see what might be misconfigured or causing problems (or what is even *involved* in backing up/restoring in your setup).

> I haven't really looked into the first restore option, i.e. tweaked it in any way, as #2 and #3 have worked fine so far, until now.

Well, then it may be set incorrectly. Or not. Depending on what you did to the backup command.

Sorin Srbu wrote on 2011-04-14 08:47:12 +0200 [Re: [BackupPC-users] How to restore an 8GB archive file?]:

> From: Holger Parplies, Sent: Thursday, April 14, 2011 12:38 AM
> > - Which user on the target host do you need to connect as? Perhaps root?
> When the backuppc user connects to a host to do a backup, it uses a passwordless login with ssh keys. The password entered the very first time I transferred the key was root's. So does this mean it's user backuppc that does the actual restore, or user root?

Well, you took away the context, so it's not obvious you misunderstood the question (which wasn't one, actually). If you use computers to do things, you need to think. There is no way around that. Even a nice shiny GUI does not have a "do the right thing now" button.

Downloading a tar file over the GUI requires you to think about where to do that, how to get the tar file to the destination computer as the right user, and where to put it. There might be a simple solution (go to the destination computer and download the tar file from a browser belonging to the user, and he'll tell you where to put it), but there might as well be many obstacles (not enough tmp space, broken browser version, no network access to the BackupPC server, slow network link, transparent proxy, user out for lunch, user needs to leave before the download is complete ...).
Some of these might even impose *arbitrary* file size limits when downloading (browsers seem to have *strange* solutions for starting downloads before they know where to put the file). You might automatically select the right option, or you might not think about it at all and just get away with it. Or hit something that looks like a file system problem, but can't really be explained.

Concerning the selection of an ssh target user: if you want a generic answer, use root; that will always work (but has the potential to do more harm if you get something wrong). For your case, if you *can* log in as the file owner (all files in the restore belong to him, right?), then do that. Maybe I should have written "select the target user that makes most sense in each respective case".

All of this has *nothing* to do with BackupPC doing backups. It's only about *you* getting the user's files back on his computer. And it's coincidentally similar to how automatic restores would work, except that they need a generic (and
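As background for the RsyncClientCmd/RsyncClientRestoreCmd question raised above: in a stock BackupPC 3.x config.pl these two settings are usually near-identical ssh-as-root command templates, along these lines (quoted from memory, so treat the exact flags as an assumption and verify against your own config.pl):

```perl
# Typical BackupPC 3.x defaults (assumed; verify locally). If the backup
# command was changed to log in as a non-root user or via a wrapper, the
# restore command generally needs the matching change -- a mismatch is a
# classic cause of ssh-related direct-restore failures.
$Conf{RsyncClientCmd}        = '$sshPath -q -x -l root $host $rsyncPath $argList+';
$Conf{RsyncClientRestoreCmd} = '$sshPath -q -x -l root $host $rsyncPath $argList+';
```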