Re: [BackupPC-users] How to restore an 8GB archive file?

2011-04-15 Thread Sorin Srbu
-Original Message-
From: Dale King [mailto:d...@daleking.org]
Sent: Friday, April 15, 2011 1:17 AM
To: backuppc-users@lists.sourceforge.net
Subject: Re: [BackupPC-users] How to restore an 8GB archive file?

 OTOH, ext3 is said to have a max file size limit from about 16GB up to some
2TB,
 depending on block size. So why I would have a problem with an 8GB file is
 anybody's guess.

I don't think you had a problem with the filesystem.  More likely it was a
ulimit issue of the user account you were using to restore the file.
Check the output of 'ulimit -a' within the user account to see if that was
the case.

I checked that. It said unlimited.

We generally don't limit things like that here (unless that is a default setting
in CentOS?), as files and folders on the machines in question can grow to
gigabytes, depending on how complicated a particular molecular modeling session
is.

-- 
/Sorin



--
Benefiting from Server Virtualization: Beyond Initial Workload 
Consolidation -- Increasing the use of server virtualization is a top
priority. Virtualization can reduce costs, simplify management, and improve 
application availability and disaster protection. Learn more about boosting 
the value of server virtualization. http://p.sf.net/sfu/vmware-sfdev2dev
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] How to restore an 8GB archive file?

2011-04-15 Thread Sorin Srbu
-Original Message-
From: Holger Parplies [mailto:wb...@parplies.de]
Sent: Friday, April 15, 2011 6:50 AM
To: sorin.s...@orgfarm.uu.se; General list for user discussion, questions
and support
Subject: Re: [BackupPC-users] How to restore an 8GB archive file?

 -Original Message-
 From: Jeffrey J. Kosowsky [mailto:@.org]

please don't do that. At least now I know why I'm getting spam to my
backuppc-list-only email address.

 To: General list for user discussion, questions and support

Much better, though this address is probably less sensitive ...

 Cc: sorin.s...@orgfarm.uu.se

Your problem :-).

I don't follow; don't do what?


 OTOH, ext3 is said to have a max file size limit from about 16GB up to
 some 2TB, depending on block size.

Several years ago, I worried about file sizes, too. It turned out to just
work even back then. I haven't encountered such limits in years. Then again,
on relevant file systems I don't tend to use ext3, because it *still* seems to
have occasional problems with online resizing (admittedly on a Debian etch 
installation; might have gone away since). Huge files seem to go hand in hand
with online resizing requirements.

I was limited to ext3 on the old backup server, as well as on the hosts. The
hosts were installed with CentOS 5.0 a few years back, at which time ext3 was
the most proven and stable file system available. We will most probably go
with ext4 when CentOS 6 is released and we do a full fresh install of
everything.


Sorin Srbu wrote on 2011-04-14 08:37:54 +0200 [Re: [BackupPC-users] How to
restore  an 8GB archive file?]:
 [...]
 From: Les Mikesell
 Sent: Wednesday, April 13, 2011 5:10 PM
  Why don't you just restore it back to his machine, using the typical
  option 1? If BackupPC archived it in the first place, it can restore it
  the same way.
 
  I've never had that option to work. This time I got a weird 'unable to
 read 4 bytes' error when trying a direct restore.
 
 Usually that means the restore is configured to use ssh in some way, and
 the ssh keys aren't set up correctly.  Is there something different
 about the way your restore command works?

 I do use passwordless login for the backups to work. The backup works fine
 using ssh this way; I don't get prompted for a password.

 Not sure though, how you mean different for restoring. Could you elaborate a
 bit?

You've got it the wrong way around. *You* need to elaborate. What are your
RsyncClientCmd and RsyncClientRestoreCmd (it was rsync, wasn't it?)? If we
knew those, we could see what might be misconfigured or causing problems (or
what is even *involved* in backing up/restoring in your setup).

Yes, rsync, I use the default settings. They seem to have worked fine, until now
at least.

RsyncClientCmd:
$sshPath -q -x -l root $host $rsyncPath $argList+

RsyncClientRestoreCmd:
$sshPath -q -x -l root $host $rsyncPath $argList+


So let's get back to that topic, if you're still interested.

I am. Would be nice to understand why things went kinda' pear-shaped at first.

Sorry if I spaced out... I find Linux, and BPC, somewhat frustrating sometimes.
8-/
-- 
/Sorin





Re: [BackupPC-users] How to restore an 8GB archive file?

2011-04-14 Thread Sorin Srbu
-Original Message-
From: Jeffrey J. Kosowsky [mailto:backu...@kosowsky.org]
Sent: Wednesday, April 13, 2011 3:33 PM
To: General list for user discussion, questions and support
Cc: sorin.s...@orgfarm.uu.se
Subject: Re: [BackupPC-users] How to restore an 8GB archive file?

  That limit is long gone:
 
  root@frances:/tmp# uname -a
  Linux frances 2.6.32-30-generic #59-Ubuntu SMP Tue Mar 1 21:30:21 UTC
  2011 i686 GNU/Linux

I believe the OP was talking about 32bit Windows. Though even on WinXP
or Win2000 I don't believe that is a limitation (unless you use FAT32
rather than NTFS). Perhaps the OP was talking about FAT32...

No, it was actually Linux. However, it was my misunderstanding: I thought it
was a 32-bit kernel problem, when in fact, according to Google, it's a file
system limitation. The problem first came up on a 32-bit Linux machine running
the ext3 file system. Moving the 8GB archive to a machine with ext4 solved the
problem.

OTOH, ext3 is said to have a max file size limit from about 16GB up to some 2TB,
depending on block size. So why I would have a problem with an 8GB file is
anybody's guess.
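For reference, which ext3 ceiling applies can be looked up from the
filesystem's block size; a minimal sketch using GNU coreutils `stat` (as
root, `tune2fs -l /dev/sdXN` reports the same "Block size" figure):

```shell
# Check the block size of the filesystem holding the current directory.
# ext3's maximum file size scales with it: roughly 16GB at 1K blocks,
# 256GB at 2K, and 2TB at 4K blocks.
stat -f -c 'block size: %S bytes' .
```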

-- 
/Sorin





Re: [BackupPC-users] How to restore an 8GB archive file?

2011-04-14 Thread Sorin Srbu
-Original Message-
From: Les Mikesell [mailto:lesmikes...@gmail.com]
Sent: Wednesday, April 13, 2011 5:10 PM
To: backuppc-users@lists.sourceforge.net
Subject: Re: [BackupPC-users] How to restore an 8GB archive file?

 Why don't you just restore it back to his machine, using the typical
 option 1? If BackupPC archived it in the first place, it can restore it
 the same way.

 I've never had that option to work. This time I got a weird 'unable to
 read 4 bytes' error when trying a direct restore.

Usually that means the restore is configured to use ssh in some way, and
the ssh keys aren't set up correctly.  Is there something different
about the way your restore command works?

I do use passwordless login for the backups to work. The backup works fine using
ssh this way; I don't get prompted for a password.

Not sure though, how you mean different for restoring. Could you elaborate a
bit?

I haven't really looked into the first restore option, i.e. tweaked it in any
way, as #2 and #3 have worked fine so far, until now.
-- 
/Sorin





Re: [BackupPC-users] How to restore an 8GB archive file?

2011-04-14 Thread Sorin Srbu
-Original Message-
From: Holger Parplies [mailto:wb...@parplies.de]
Sent: Thursday, April 14, 2011 12:38 AM
To: sorin.s...@orgfarm.uu.se; General list for user discussion, questions
and support
Subject: Re: [BackupPC-users] How to restore an 8GB archive file?

- Which user on the target host do you need to connect as? Perhaps root?

When the backuppc user connects to a host to do a backup, it uses a
passwordless login with ssh keys. The password entered the very first time I
transferred the key was root's. So does this mean it's user backuppc that
does the actual restore, or user root? If the former, then I can understand
that the user backuppc can't write just anywhere, right?
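For what it's worth, the trust setup can be sketched like this (paths are
assumptions for a Debian-style install; the gist is that the *backuppc*
user's public key lives in *root's* authorized_keys on the client, so the
remote end of a direct restore runs as root):

```shell
# Hypothetical layout: the backuppc user's keypair on the server ...
BPC_HOME=/var/lib/backuppc               # assumption: backuppc's home dir
pubkey="$BPC_HOME/.ssh/id_rsa.pub"       # generated once with ssh-keygen
# ... must be appended, on each client, to root's authorized_keys:
#   cat id_rsa.pub >> /root/.ssh/authorized_keys
# backuppc then connects AS root, so root (not backuppc) writes the
# restored files on the client.
echo "public key expected at: $pubkey"
```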


Personally, I wouldn't use the web interface for downloading large amounts of
data anyway. On the command line, your imagination is the limit to what you
can do. If it's not available as a filter yet, the BPC-author would likely
need to implement the functionality. A generic tar2zipsplit filter would be
more useful to the world than a specific implementation inside BackupPC,
don't you think?

Dunno', I only ever use the web GUI, as it's so easy, practical and
straightforward to use. Actually it's the main reason why I stick with BPC;
IMHO a backup system is only as good as its GUI and how admin-friendly it is.
Personally I don't want to jump through hoops when I need to restore stuff
quickly - a few clicks in the GUI and I'm done. As I said, that's my personal
opinion and maybe not really on-topic. 8-)
-- 
/Sorin






Re: [BackupPC-users] How to restore an 8GB archive file?

2011-04-14 Thread Dale King
On Thu, Apr 14, 2011 at 08:33:10AM +0200, Sorin Srbu wrote:
 -Original Message-
 
 OTOH, ext3 is said to have a max file size limit from about 16GB up to some
 2TB, depending on block size. So why I would have a problem with an 8GB file
 is anybody's guess.

I don't think you had a problem with the filesystem.  More likely it was a
ulimit issue of the user account you were using to restore the file.
Check the output of 'ulimit -a' within the user account to see if that was
the case.
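A quick way to check, run as the restoring user (sketch; the `-f` limit is
the one that matters for big files, reported in 512-byte blocks when it
isn't `unlimited`):

```shell
# Show all per-process limits for this account; 'file size' is the one
# that would truncate an 8GB restore if set.
ulimit -a
# Or just the file-size limit on its own:
ulimit -f        # prints 'unlimited' or a count of 512-byte blocks
```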




Re: [BackupPC-users] How to restore an 8GB archive file?

2011-04-14 Thread Holger Parplies
Hi,

Sorin Srbu wrote on 2011-04-14 08:33:10 +0200 [Re: [BackupPC-users] How to 
restore  an 8GB archive file?]:
 -Original Message-
 From: Jeffrey J. Kosowsky [mailto:@.org]

please don't do that. At least now I know why I'm getting spam to my
backuppc-list-only email address.

 To: General list for user discussion, questions and support

Much better, though this address is probably less sensitive ...

 Cc: sorin.s...@orgfarm.uu.se

Your problem :-).

 [...]
 Moving the 8GB archive to a machine with ext4, solved the problem. 

I agree with the other opinions. Amongst other things, you changed the file
system. I doubt this was the relevant change.

 OTOH, ext3 is said to have a max file size limit from about 16GB up to
 some 2TB, depending on block size.

Several years ago, I worried about file sizes, too. It turned out to just
work even back then. I haven't encountered such limits in years. Then again, 
on relevant file systems I don't tend to use ext3, because it *still* seems to
have occasional problems with online resizing (admittedly on a Debian etch
installation; might have gone away since). Huge files seem to go hand in hand
with online resizing requirements.

Sorin Srbu wrote on 2011-04-14 08:37:54 +0200 [Re: [BackupPC-users] How to 
restore  an 8GB archive file?]:
 [...]
 From: Les Mikesell
 Sent: Wednesday, April 13, 2011 5:10 PM
  Why don't you just restore it back to his machine, using the typical
  option 1? If BackupPC archived it in the first place, it can restore it
  the same way.
 
  I've never had that option to work. This time I got a weird 'unable to
 read 4 bytes' error when trying a direct restore.
 
 Usually that means the restore is configured to use ssh in some way, and
 the ssh keys aren't set up correctly.  Is there something different
 about the way your restore command works?
 
 I do use passwordless login for the backups to work. The backup works fine
 using ssh this way; I don't get prompted for a password.
 
 Not sure though, how you mean different for restoring. Could you elaborate a
 bit?

You've got it the wrong way around. *You* need to elaborate. What are your
RsyncClientCmd and RsyncClientRestoreCmd (it was rsync, wasn't it?)? If we
knew those, we could see what might be misconfigured or causing problems (or
what is even *involved* in backing up/restoring in your setup).

 I haven't really looked into the first restore option, i.e. tweaked it in
 any way, as #2 and #3 have worked fine so far, until now.

Well, then it may be set incorrectly. Or not. Depending on what you did to the
backup command.

Sorin Srbu wrote on 2011-04-14 08:47:12 +0200 [Re: [BackupPC-users] How to 
restore  an 8GB archive file?]:
 From: Holger Parplies
 Sent: Thursday, April 14, 2011 12:38 AM
 
 - Which user on the target host do you need to connect as? Perhaps root?
 
 When the backuppc user connects to a host to do a backup, it uses a
 passwordless login with ssh keys. The password entered the very first time I
 transferred the key, was root's. So does this mean it's user backuppc that
 does the actual restore or user root?

Well, you took away the context, so it's not obvious you misunderstood the
question (which wasn't one, actually).

If you use computers to do things, you need to think. There is no way around
that. Even a nice shiny GUI does not have a "do the right thing, now" button.
Downloading a tar file over the GUI requires you to think about where to do
that and how to get the tar file to the destination computer, as the right
user, and where to put it. There might be a simple solution (go to the
destination computer and download the tar file from a browser belonging to
the user, and he'll tell you where to put it), but there might as well be
many obstacles (not enough tmp space, broken browser version, no network
access to the BackupPC server, slow network link, transparent proxy, user
out for lunch, user needs to leave before the download is complete ...).
Some of these might even impose *arbitrary* file size limits when downloading
(browsers seem to have *strange* solutions for starting downloads before they
know where to put the file).
You might automatically select the right option, or you might not think
about it at all and just get away with it. Or hit something that looks like a
file system problem, but can't really be explained.

Concerning the selection of an ssh target user: if you want a generic answer,
use root, which will always work (but has the potential to do more harm if you
get something wrong). For your case, if you *can* log in as the file owner
(all files in the restore belong to him, right?), then do that. Maybe I should
have written "select the target user that makes most sense in each respective
case".

All of this has *nothing* to do with BackupPC doing backups. It's only about
*you* getting the user's files back on his computer. And it's coincidentally
similar to how automatic restores would work, except that they need a generic
(and non

[BackupPC-users] How to restore an 8GB archive file?

2011-04-13 Thread Sorin Srbu
Hi all,

A user came to me this morning asking me to restore a folder which turns out to
be some 8.5GB.

Not initially knowing how big it actually was, using BPC I downloaded it in a
zipped but uncompressed format to my Windows machine, then transferred the zip
to her Linux machine running 32-bit CentOS 5.6. Unpacking it failed.
Googling a bit, I found out that 32-bit Linux is limited to 2GB per file on
account of the file system. 8GB is a lot more than 2GB, so this would explain
the unpacking failure.

So, my question is a two-parter:

1. How would I best deal with really big archives when restoring from BPC and
32-bit Linux is involved?

2. Wouldn't a zip-split function be a nice thing to have in BPC when restoring
data? This is a hint to the BPC-author. 8-)
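On the zip-split point: a generic workaround that needs no BackupPC changes
at all is `split`/`cat`, sketched here on a small stand-in file (file names
are made up; use something like `-b 1G` for real archives):

```shell
# Create a small dummy archive standing in for the real 8GB zip.
dd if=/dev/zero of=/tmp/archive.zip bs=1024 count=64 2>/dev/null
# Chop it into fixed-size pieces any filesystem can hold ...
split -b 16k /tmp/archive.zip /tmp/archive.zip.part-
# ... transfer the pieces, then reassemble on the target and unpack.
cat /tmp/archive.zip.part-* > /tmp/archive.rebuilt.zip
cmp /tmp/archive.zip /tmp/archive.rebuilt.zip && echo "pieces reassemble cleanly"
```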

Thanks.
-- 
BW,
Sorin
---
# Sorin Srbu[Sysadmin, Systems Engineer]
# Dept of Medicinal Chemistry,  Phone: +46 (0)18-4714482 3 rings GSM
# Div of Org Pharm Chem,Mobile: +46 (0)701-718023
# Box 574, Uppsala University,  Fax: +46 (0)18-4714482
# SE-751 23 Uppsala, Sweden Visit: BMC, Husargatan 3, D5:512b
#   Web: http://www.orgfarm.uu.se
---
# ()  ASCII ribbon campaign - Against html E-mail 
# /\  http://www.asciiribbon.org
#
# MotD follows:
# The essence of motorcycling: Four wheels moves your body. Two wheels moves
your soul.




--
Forrester Wave Report - Recovery time is now measured in hours and minutes
not days. Key insights are discussed in the 2010 Forrester Wave Report as
part of an in-depth evaluation of disaster recovery service providers.
Forrester found the best-in-class provider in terms of services and vision.
Read this report now!  http://p.sf.net/sfu/ibm-webcastpromo
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] How to restore an 8GB archive file?

2011-04-13 Thread Tyler J. Wagner
On Wed, 2011-04-13 at 14:38 +0200, Sorin Srbu wrote:
   Googling a bit I found out 32b linux is limited to 2GB per file on
 account of the file system. 8GB is a lot more than 2GB, so this would explain
 the uncompression failing.

That limit is long gone:

root@frances:/tmp# uname -a
Linux frances 2.6.32-30-generic #59-Ubuntu SMP Tue Mar 1 21:30:21 UTC
2011 i686 GNU/Linux
root@frances:/tmp# dd if=/dev/zero of=/tmp/test bs=1048576 count=3072
3072+0 records in
3072+0 records out
3221225472 bytes (3.2 GB) copied, 57.5292 s, 56.0 MB/s
root@frances:/tmp# ls -lah test 
-rw-r--r-- 1 root root 3.0G 2011-04-13 13:58 test
root@frances:/tmp# du -sh test
3.1G    test

There *WAS* a 2GB limit, under kernel 2.4 and ext2. Anything you
installed in the last 4 years does not have this issue.
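A quicker sanity check than writing real data, for what it's worth, is a
sparse file: it allocates no blocks but exercises the same large-file
(off_t) limits (sketch, GNU `truncate`):

```shell
# Create a 3GB sparse file; on any large-file-capable kernel/filesystem
# this succeeds instantly, and stat reports the full logical size.
truncate -s 3G /tmp/bigtest
stat -c '%s bytes' /tmp/bigtest   # 3221225472 bytes
rm /tmp/bigtest
```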

Regards,
Tyler

-- 
Freedom of thought is best promoted by the gradual illumination of
men's minds, which follows from the advance of science.
   -- Charles Darwin




Re: [BackupPC-users] How to restore an 8GB archive file?

2011-04-13 Thread Tyler J. Wagner
On Wed, 2011-04-13 at 14:38 +0200, Sorin Srbu wrote:
 A user came to me this morning asking me to restore a folder which turns out
 to be some 8.5GB.

Why don't you just restore it back to his machine, using the typical
option 1? If BackupPC archived it in the first place, it can restore it
the same way.

Regards,
Tyler

-- 
... that your voice is amplified to the degree where it reaches from
one end of the country to the other does not confer upon you greater
wisdom or understanding than you possessed when your voice reached only
from one end of the bar to the other.
   -- Edward R. Murrow




Re: [BackupPC-users] How to restore an 8GB archive file?

2011-04-13 Thread Jeffrey J. Kosowsky
Tyler J. Wagner wrote at about 14:02:26 +0100 on Wednesday, April 13, 2011:
  On Wed, 2011-04-13 at 14:38 +0200, Sorin Srbu wrote:
   Googling a bit I found out 32b linux is limited to 2GB per file on
   account of the file system. 8GB is a lot more than 2GB, so this would
   explain the uncompression failing.
  
  That limit is long gone:
  
  root@frances:/tmp# uname -a
  Linux frances 2.6.32-30-generic #59-Ubuntu SMP Tue Mar 1 21:30:21 UTC
  2011 i686 GNU/Linux

I believe the OP was talking about 32bit Windows. Though even on WinXP
or Win2000 I don't believe that is a limitation (unless you use FAT32
rather than NTFS). Perhaps the OP was talking about FAT32...



Re: [BackupPC-users] How to restore an 8GB archive file?

2011-04-13 Thread Sorin Srbu
-Original Message-
From: Tyler J. Wagner [mailto:ty...@tolaris.com]
Sent: Wednesday, April 13, 2011 3:03 PM
To: sorin.s...@orgfarm.uu.se; General list for user discussion, questions
and support
Subject: Re: [BackupPC-users] How to restore an 8GB archive file?

On Wed, 2011-04-13 at 14:38 +0200, Sorin Srbu wrote:
 A user came to me this morning asking me to restore a folder which turns out
 to be some 8.5GB.

Why don't you just restore it back to his machine, using the typical
option 1? If BackupPC archived it in the first place, it can restore it
the same way.

I've never had that option to work. This time I got a weird 'unable to read 4
bytes' error when trying a direct restore.
-- 
/Sorin





Re: [BackupPC-users] How to restore an 8GB archive file?

2011-04-13 Thread Carl Wilhelm Soderstrom
On 04/13 02:38 , Sorin Srbu wrote:
 1. How would I best deal with really big archives when restoring from BPC and
 32b linux is involved?

Use tar when recovering a file for Unix, zip when recovering for Windows.
(Though .zip may be buggy and you may need to use tar anyway.)

-- 
Carl Soderstrom
Systems Administrator
Real-Time Enterprises
www.real-time.com



Re: [BackupPC-users] How to restore an 8GB archive file?

2011-04-13 Thread Sorin Srbu
-Original Message-
From: Carl Wilhelm Soderstrom [mailto:chr...@real-time.com]
Sent: Wednesday, April 13, 2011 4:01 PM
To: sorin.s...@orgfarm.uu.se; General list for user discussion, questions
and support
Subject: Re: [BackupPC-users] How to restore an 8GB archive file?

On 04/13 02:38 , Sorin Srbu wrote:
 1. How would I best deal with really big archives when restoring from BPC
and
 32b linux is involved?

Use tar when recovering a file for Unix, zip when recovering for Windows.
(Though .zip may be buggy and you may need to use tar anyway.)

Yupp, just what I ended up doing in the end. Great minds think alike, right?
Thanks. ;-)

-- 
/Sorin





Re: [BackupPC-users] How to restore an 8GB archive file?

2011-04-13 Thread Les Mikesell
On 4/13/2011 8:34 AM, Sorin Srbu wrote:

 A user came to me this morning asking me to restore a folder which turns
 out to be some 8.5GB.

 Why don't you just restore it back to his machine, using the typical
 option 1? If BackupPC archived it in the first place, it can restore it
 the same way.

 I've never had that option to work. This time I got a weird 'unable to
 read 4 bytes' error when trying a direct restore.

Usually that means the restore is configured to use ssh in some way, and 
the ssh keys aren't set up correctly.  Is there something different 
about the way your restore command works?

-- 
   Les Mikesell
 lesmikes...@gmail.com




Re: [BackupPC-users] How to restore an 8GB archive file?

2011-04-13 Thread Holger Parplies
Hi,

just to add an option for the archives ...

Sorin Srbu wrote on 2011-04-13 14:38:29 +0200 [[BackupPC-users] How to restore  
an 8GB archive file?]:
 [...]
 Not initially knowing how big it actually was, using BPC I downloaded it in a
 zipped and uncompressed format to my Windows machine, then transferred the zip
 to her linux machine running 32b CentOS v5.6. [...]
 
 1. How would I best deal with really big archives when restoring from BPC and
 32b linux is involved?

with really big archives you want to avoid unnecessary network transfers and
intermediate storage of the files. Try something along the lines of ...

backuppc-server$ sudo -u backuppc /usr/share/backuppc/bin/BackupPC_tarCreate \
    -h host -n dumpNum -s shareName /path/to/data/relative/to/share \
    | ssh -l user target-host tar xvpf - -C /share/path/on/target

(of course, great minds prefer netcat ;-). You'll have to play around a bit
with that (practise with small amounts of data and piping BackupPC_tarCreate
into 'tar tvf -' instead of ssh to get a feeling for what files you are
selecting and what the paths in the tar stream look like).
Some things to consider:
- Which user on the target host do you need to connect as? Perhaps root?
- Are you restoring in-place or do you need to change paths? Consider using
  the '-r' and '-p' options to BackupPC_tarCreate or restore to a temporary
  location - preferably on the correct partition - and move the target
  directory into the correct place manually. Check permissions before moving
  so 'mv' does not, in fact, start copying things.
- Does sudo -u backuppc work for you or do you need to become the backuppc
  user in a different way?
- Where is your BackupPC_tarCreate? I've used the Debian package path, but
  that's not the standard ...
- I just added a tar 'v' option, because you should probably see what you are
  doing until it has become routine, and perhaps even then ...
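The "practise first" step above can be sketched like this, with a plain
local `tar cf -` standing in for BackupPC_tarCreate (all paths made up):

```shell
# Build a tiny stand-in share and inspect the tar stream instead of
# piping it to ssh; 'tar tvf -' lists each path, mode and owner that
# the stream would create on the target.
mkdir -p /tmp/bpc-demo/share
echo hello > /tmp/bpc-demo/share/file.txt
tar cf - -C /tmp/bpc-demo share | tar tvf -
```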

 2. Wouldn't a zip-split function be a nice thing to have in BPC when restoring
 data? This is a hint to the BPC-author. 8-)

Personally, I wouldn't use the web interface for downloading large amounts of
data anyway. On the command line, your imagination is the limit to what you
can do. If it's not available as a filter yet, the BPC-author would likely
need to implement the functionality. A generic tar2zipsplit filter would be
more useful to the world than a specific implementation inside BackupPC, don't
you think?

Regards,
Holger
