[BackupPC-users] Tar exited...
-BEGIN PGP SIGNED MESSAGE- Hash: SHA1

Hello together, I'm no Perl or tar expert, so an error that popped up leaves me with this question: what does this log entry mean? (see end of mail) Hopefully someone has an answer... Thanks in advance and best regards, Stefan Jurisch

Running: /bin/tar -c -v -f - -C /mnt/vcb/vm/fichtenmopped --totals .
full backup started for directory /mnt/vcb/vm/fichtenmopped
Xfer PIDs are now 13446,13445
tarExtract: *** glibc detected *** /usr/bin/perl: corrupted double-linked list: 0x087ee100 ***
tarExtract: === Backtrace: ===
tarExtract: /lib/libc.so.6[0xb7e9f654]
tarExtract: /lib/libc.so.6[0xb7ea0d33]
tarExtract: /lib/libc.so.6(cfree+0x9c)[0xb7ea0f3c]
tarExtract: /usr/bin/perl(Perl_sv_setsv_flags+0x1e3c)[0x810279c]
tarExtract: /usr/bin/perl(Perl_pp_sassign+0x75)[0x80eecf5]
tarExtract: /usr/bin/perl(Perl_runops_debug+0x12f)[0x80a614f]
tarExtract: /usr/bin/perl(perl_run+0x491)[0x80dbca1]
tarExtract: /usr/bin/perl(main+0xed)[0x806403d]
tarExtract: /lib/libc.so.6(__libc_start_main+0xe5)[0xb7e49705]
tarExtract: /usr/bin/perl[0x8063eb1]
tarExtract: === Memory map: ===
[memory map garbled and truncated in the archive; the legible mappings cover /usr/bin/perl, the heap, several anonymous regions, libgcc_s.so.1, the libnss_* NSS libraries, libz.so.1.2.3, and the Perl 5.10.0 XS modules Compress::Raw::Zlib, File::Glob, List::Util, IO and Digest::MD5]
[BackupPC-users] experiences with very large pools?
Hi, I'm faced with growing storage demands in my department. In the near future we will need several hundred TB, mostly large files. At the moment we already have 80 TB of data which gets backed up to tape. Providing the primary storage is not the big problem; my biggest concern is backing up the data. One option would be a NetApp system with snapshots. On the other hand, that is a very expensive solution, and the data will be written once but then only read again. In short: it should be a cheap solution, but the data must be backed up, and it would be nice if we could abandon tape backups... My idea is to use some big RAID 6 arrays for the primary data and create LUNs in slices of at most 10 TB with XFS filesystems. BackupPC would be ideal for the backup because of its pool feature (we already use BackupPC for a smaller amount of data). Does anyone have experience with BackupPC and a pool size of 50 TB? I'm not sure how well this will work; I see that BackupPC needs 45 h to back up 3.2 TB of data right now, mostly small files. I don't like very large filesystems, but I don't see how this will scale with either multiple BackupPC servers and smaller filesystems (more than one server will be needed anyway, but I don't want to run 20 or more servers...) or, if possible, with multiple BackupPC instances on the same server, each with its own pool filesystem. So, is anyone using BackupPC in such an environment? Ralf -- SOLARIS 10 is the OS for Data Centers - provides features such as DTrace, Predictive Self Healing and Award Winning ZFS. Get Solaris 10 NOW http://p.sf.net/sfu/solaris-dev2dev ___ BackupPC-users mailing list BackupPC-users@lists.sourceforge.net List: https://lists.sourceforge.net/lists/listinfo/backuppc-users Wiki: http://backuppc.wiki.sourceforge.net Project: http://backuppc.sourceforge.net/
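[Editor's note] The figures above (3.2 TB in 45 h) imply an effective transfer rate that can be extrapolated, very roughly, to a 50 TB pool. A back-of-the-envelope sketch; it assumes the rate stays constant, which a much larger pool almost certainly would not:

```shell
# Effective rate for 3.2 TB in 45 h, in MB/s (using 1 TB = 1024*1024 MB)
rate=$(awk 'BEGIN { printf "%.1f", 3.2*1024*1024/(45*3600) }')
echo "effective rate: ${rate} MB/s"          # about 20.7 MB/s

# Time for one full pass over 50 TB at that same rate, in days
days=$(awk -v r="$rate" 'BEGIN { printf "%.0f", 50*1024*1024/(r*3600*24) }')
echo "full pass over 50 TB: ~${days} days"   # about 29 days
```

At roughly a month per full pass, incremental-heavy schedules or splitting the pool across servers (as discussed above) would be unavoidable.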
Re: [BackupPC-users] Restoring files without backuppc
I'm a little confused by the usage of BackupPC_zipCreate. The directory I need a zip of is: /backup/data/pc/fshac1/530/f%2fhome/fhefinw The command I'm trying (unsuccessfully) to run (as user backuppc) is: /usr/local/BackupPC/bin/BackupPC_zipCreate -h localhost -n 530 -s fshac1 fhefinw /backup/hefin.zip The error is: bad backup number 530 for host localhost I guess I missed something here! TIA Huw -- Huw Wyn Jones Systems Administrator Coleg Meirion Dwyfor huw.jo...@meirion-dwyfor.ac.uk - Original Message - From: Les Mikesell lesmikes...@gmail.com To: backuppc-users@lists.sourceforge.net Sent: Friday, 12 February, 2010 15:57:38 GMT +00:00 GMT Britain, Ireland, Portugal Subject: Re: [BackupPC-users] Restoring files without backuppc On 2/12/2010 4:40 AM, Huw Wyn Jones wrote: Hi folks, I have an old backuppc server which is no longer properly functional. I can get the server running reliably with just networking and sshd services, but if I start the backuppc service I get a kernel panic and the whole server freezes. However, I need to recover a user's files from this old system. I read that you can recover single files from the command line - but unfortunately I need around 70 MB's worth of files! Not really practical to do this one at a time. Please correct me if I'm wrong, but IIRC BackupPC encodes files as it backs them up. Is there a way I could scp/rsync this user's files to my desktop and decode them there? As ever, all suggestions greatly appreciated. Have you tried BackupPC_tarCreate or BackupPC_zipCreate? You should be able to execute them via ssh and collect the output elsewhere - or pipe to an appropriate extract command. http://backuppc.sourceforge.net/faq/BackupPC.html#commandline_restore_options -- Les Mikesell lesmikes...@gmail.com
[BackupPC-users] Mac Client Setup?
From the current BackupPC documentation: *MacOSX* In general this should be similar to Linux/Unix machines. In versions 10.4 and later, the native MacOSX tar works, and also supports resource forks. xtar is another option, and rsync works too (although the MacOSX-supplied rsync has an extension for extended attributes that is not compatible with standard rsync). Can someone explain what the parenthetical expression is supposed to mean? I can interpret it in the following ways: 1. The MacOSX-supplied rsync has an extension for extended attributes (what attributes specifically?) that is not compatible with standard rsync, and hence Mac OS 10.4 and greater will not work with BackupPC. 2. The same, but Mac OS 10.4 and greater will work correctly when used with BackupPC. 3. The same, but Mac OS 10.4 and greater will work correctly when used with BackupPC, while producing certain side effects... Which interpretation, if any, is correct? Since I am not very Mac OS savvy, can someone annotate this description a bit by describing how to check whether rsync is installed and running correctly on the Mac? It is implied that the service is installed and running by default, but that is not clear at all. Thanks for the clarification.
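[Editor's note] On the "how do I check rsync" part of the question, a minimal sketch that should work in a Mac OS X Terminal (the grep pattern is an assumption about how Apple's patched build describes its extended-attribute option in its help text; stock rsync of the same era does not list one):

```shell
# Is an rsync binary present at all, and which build is it?
command -v rsync || echo "rsync not installed"
rsync --version 2>/dev/null | head -1

# Apple's patched build mentions extended attributes in its help output;
# a stock rsync of the same version will fall through to the echo:
rsync --help 2>&1 | grep -i 'extended attributes' || echo "no xattr extension"
```

Note that rsync is a client program here, not a daemon: with BackupPC's usual rsync-over-ssh transfer there is no service that needs to be "running" on the Mac beyond sshd.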
Re: [BackupPC-users] Restoring files without backuppc
Huw Wyn Jones wrote: I'm a little confused by the usage of BackupPC_zipCreate. The directory I need a zip of is: /backup/data/pc/fshac1/530/f%2fhome/fhefinw The command I'm trying (unsuccessfully) to run (as user backuppc) is: /usr/local/BackupPC/bin/BackupPC_zipCreate -h localhost -n 530 -s fshac1 fhefinw /backup/hefin.zip The error is: bad backup number 530 for host localhost I guess I missed something here! In your directory structure it looks like you want to restore from a host named fshac1, but you are asking for -h localhost. And the -s option should match the share name that was used in backuppc. -- Les Mikesell lesmikes...@gmail.com
Re: [BackupPC-users] Restoring files without backuppc
Can you try /usr/local/BackupPC/bin/BackupPC_zipCreate -h fshac1 -n 530 -s /home hefinw /backup/hefin.zip please? Does hostname=fshac1 make sense to you? Luis A. Paulo On Mon, Feb 15, 2010 at 3:19 PM, Huw Wyn Jones huw.jo...@meirion-dwyfor.ac.uk wrote: I'm a little confused by the usage of BackupPC_zipCreate. The directory I need a zip of is: /backup/data/pc/fshac1/530/f%2fhome/fhefinw The command I'm trying (unsuccessfully) to run (as user backuppc) is: /usr/local/BackupPC/bin/BackupPC_zipCreate -h localhost -n 530 -s fshac1 fhefinw /backup/hefin.zip The error is: bad backup number 530 for host localhost I guess I missed something here! TIA Huw -- Huw Wyn Jones Systems Administrator Coleg Meirion Dwyfor huw.jo...@meirion-dwyfor.ac.uk - Original Message - From: Les Mikesell lesmikes...@gmail.com To: backuppc-users@lists.sourceforge.net Sent: Friday, 12 February, 2010 15:57:38 GMT +00:00 GMT Britain, Ireland, Portugal Subject: Re: [BackupPC-users] Restoring files without backuppc On 2/12/2010 4:40 AM, Huw Wyn Jones wrote: Hi folks, I have an old backuppc server which is no longer properly functional. I can get the server running reliably with just networking and sshd services, but if I start the backuppc service I get a kernel panic and the whole server freezes. However, I need to recover a user's files from this old system. I read that you can recover single files from the command line - but unfortunately I need around 70 MB's worth of files! Not really practical to do this one at a time. Please correct me if I'm wrong, but IIRC BackupPC encodes files as it backs them up. Is there a way I could scp/rsync this user's files to my desktop and decode them there? As ever, all suggestions greatly appreciated. Have you tried BackupPC_tarCreate or BackupPC_zipCreate? You should be able to execute them via ssh and collect the output elsewhere - or pipe to an appropriate extract command.
http://backuppc.sourceforge.net/faq/BackupPC.html#commandline_restore_options -- Les Mikesell lesmikes...@gmail.com
Re: [BackupPC-users] Tar exited...
Stefan Jurisch wrote: Hello together, I'm no Perl or tar expert, so an error that popped up leaves me with this question: what does this log entry mean? (see end of mail) Hopefully someone has an answer... Thanks in advance and best regards, Stefan Jurisch Running: /bin/tar -c -v -f - -C /mnt/vcb/vm/fichtenmopped --totals . full backup started for directory /mnt/vcb/vm/fichtenmopped Xfer PIDs are now 13446,13445 tarExtract: *** glibc detected *** /usr/bin/perl: corrupted double-linked list: 0x087ee100 *** A Google search on that error shows some hits in several Linux distributions, but it looks like a bug that has been fixed. Is the system where it happens up to date? -- Les Mikesell lesmikes...@gmail.com
Re: [BackupPC-users] Restoring files without backuppc
Thanks for the replies. Really helpful :-) OK, so given that the stuff I need is in /backup/data/pc/fshac1/530/f%2fhome/fhefinw, does that translate to: BackupPC_zipCreate -h fshac1 -n 530 -s fhefinw /backup/hefin.zip ? I'm a little unsure as to what exactly is expected in share, as I'd originally thought this was fshac1 (one of the hosts). Thanks as ever for all the assistance. Cheers Huw -- Huw Wyn Jones Systems Administrator Coleg Meirion Dwyfor
Re: [BackupPC-users] Restoring files without backuppc
Huw Wyn Jones wrote: Thanks for the replies. Really helpful :-) OK, so given that the stuff I need is in /backup/data/pc/fshac1/530/f%2fhome/fhefinw, does that translate to: BackupPC_zipCreate -h fshac1 -n 530 -s fhefinw /backup/hefin.zip ? I'm a little unsure as to what exactly is expected in share, as I'd originally thought this was fshac1 (one of the hosts). share is the name it had in the backuppc configuration. With smb or rsyncd it would be the name of the network share or module. With tar or rsync it would be the top directory of the tree for the run - if you only had one for the host it might be '/'. -- Les Mikesell lesmikes...@gmail.com
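[Editor's note] The share name can be read straight off the directory Huw quoted, because BackupPC stores files under the pc/ tree with mangled names: each path component gets a leading 'f', and '/' is escaped as %2f. A small sketch (the unmangle helper is hypothetical, and assumes BackupPC 3.x-style mangling with a lowercase %2f escape):

```shell
# Strip the leading 'f' from a mangled component and decode %2f -> /
unmangle() {
    printf '%s\n' "${1#f}" | sed 's|%2f|/|g'
}

unmangle 'f%2fhome'   # -> /home   : the share name
unmangle 'fhefinw'    # -> hefinw  : the path inside the share

# So for /backup/data/pc/fshac1/530/f%2fhome/fhefinw the call would be
# something like (note BackupPC_zipCreate writes the archive to stdout,
# so the output file is a redirect, not an argument):
#   /usr/local/BackupPC/bin/BackupPC_zipCreate -h fshac1 -n 530 \
#       -s /home hefinw > /backup/hefin.zip
```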
[BackupPC-users] Encrypting BackupPC TopDir
It would be cool to get the BackupPC TopDir onto an encrypted container, so I could back up machines that are running an encrypted OS. I found out that dm-crypt can be used with a loop-mounted filesystem container to encrypt all the data in that container; there's just one problem: the size of the container isn't dynamic. So as the backup space requirements change, you would need to manually shrink and grow the container and the filesystem in it. I don't want that. The only dynamic crypto-container solution I found was EncFS, and it seems to be the perfect tool for the job. I tried and tried on my Debian Lenny, but I just couldn't get the TopDir to work on an EncFS mount residing on an ext3 partition. EncFS has a command line argument '--public' that is supposed to make the mount act as a normal multiuser mount - well, it doesn't work. My backuppc user gets Permission Denied when trying to create new files, and I've checked that the filesystem permissions allow writing for user backuppc. Now, when I do the default EncFS single-user mount while logged in as user backuppc, the BackupPC 3.1.0 daemon starts up happily, but the WebUI refuses to connect to the daemon for some reason. Has anyone got EncFS to work with BackupPC? Are there any other solutions I might want to consider? |This was sent by tri...@r00t3d.com via Backup Central.
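[Editor's note] One thing worth checking, offered as a hedged guess rather than a confirmed fix: EncFS's --public option is documented as only being useful when encfs is run as root, since it relies on FUSE's allow_other behaviour. Mounted as an ordinary user it would not grant other users write access, which could match the Permission Denied seen by the backuppc user. A sketch of the setup (both paths are hypothetical placeholders):

```shell
# Hypothetical paths -- substitute your real backing store and TopDir:
#   /crypt/raw          encrypted EncFS backing store on the ext3 partition
#   /var/lib/backuppc   decrypted mount point used as BackupPC's TopDir

# --public should be run as root for multi-user access to work:
sudo encfs --public /crypt/raw /var/lib/backuppc

# Roughly equivalent lower-level form, passing the FUSE option directly:
#   sudo encfs -o allow_other /crypt/raw /var/lib/backuppc
```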
Re: [BackupPC-users] Restoring files without backuppc
I'm losing it here, Les :( OK, if I understood what you meant correctly, something like: /usr/local/BackupPC/bin/BackupPC_zipCreate -c 4 -h fshac1 -n 530 -s /home/hefinw /backup/hefin.zip should work, given the location of the directory (/backup/data/pc/fshac1/530/f%2fhome/fhefinw): -h fshac1 = the BackupPC host -n 530 = the backup no -s /home/hefinw = the 'share' on the host -c 4 = the compression level used when the backup was created When I run the above command I simply get a 'usage' message! I've tried all sorts of variations (fhefinw etc.) but without any greater success. Very frustrating. 12.15am. Time for bed I think. H -- Huw Wyn Jones Systems Administrator Coleg Meirion Dwyfor huw.jo...@meirion-dwyfor.ac.uk - Original Message - From: Les Mikesell lesmikes...@gmail.com To: General list for user discussion, questions and support backuppc-users@lists.sourceforge.net Sent: Monday, 15 February, 2010 22:59:46 GMT +00:00 GMT Britain, Ireland, Portugal Subject: Re: [BackupPC-users] Restoring files without backuppc Huw Wyn Jones wrote: Thanks for the replies. Really helpful :-) OK, so given that the stuff I need is in /backup/data/pc/fshac1/530/f%2fhome/fhefinw, does that translate to: BackupPC_zipCreate -h fshac1 -n 530 -s fhefinw /backup/hefin.zip ? I'm a little unsure as to what exactly is expected in share, as I'd originally thought this was fshac1 (one of the hosts). share is the name it had in the backuppc configuration. With smb or rsyncd it would be the name of the network share or module. With tar or rsync it would be the top directory of the tree for the run - if you only had one for the host it might be '/'. -- Les Mikesell lesmikes...@gmail.com