Re: [BackupPC-users] File Encryption

2008-01-16 Thread Martin Leben
Robert Fulcher wrote:
> I have a client that wants to backup there server (Ubuntu).  I have a backup
> running for them but they want to make sure that the files are encrypted.
> Will backuppc encrypt the files that are backed up?  If not is there a way
> to add this feature?
> 
> Thanks


Hi!

(I am a first time poster, so take my advice for what it is.)

No, BackupPC doesn't encrypt files. And in my very humble opinion it would be 
wrong to add it to BackupPC. These things are better solved at the filesystem 
level, I'd say.

There are two roads to take, as I see it. Which one to choose depends on your 
client's level of paranoia:

1) Not so high paranoia: The BackupPC data directory (/var/lib/backuppc) is on an 
encrypted filesystem. Suitable if the client trusts you both not to lose his files 
and to handle key management et cetera.

2) High paranoia: He should make sure that the data is encrypted before it is 
backed up. Suitable if the client only trusts you not to lose his files.


Route 1) is quite easy, since there are quite a few ways to have an encrypted 
filesystem. I will not go into that here.

Route 2) actually has two solutions, of which only one is currently available 
as far as I know.


2.1) The client uses an encrypting filesystem such as EncFS 
<http://www.arg0.net/encfs> on his computer(s). EncFS is a "pass-through 
filesystem" which stores the files in another part of the filesystem. Example:
   $ encfs /tmp/crypt /tmp/data
... makes it possible to use the directory /tmp/data just like any other 
directory, but the files are actually stored encrypted in /tmp/crypt. So in this 
case the client would tell BackupPC to back up the /tmp/crypt directory.
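
On the BackupPC side this is just a matter of pointing the share at the 
ciphertext directory. A minimal sketch, assuming rsync as the transfer method, 
the Debian config layout and a hypothetical host called "client1":

   # cat /etc/backuppc/client1.pl
   $Conf{XferMethod}     = 'rsync';
   $Conf{RsyncShareName} = ['/tmp/crypt'];   # back up the encrypted view, not /tmp/data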


2.2) The client uses a filesystem that works roughly as the opposite of EncFS: the 
files live unencrypted in e.g. /tmp/data, and the filesystem "mirrors" /tmp/data to 
another directory, e.g. /tmp/crypt. When something is read from /tmp/crypt, 
the actual reading takes place in /tmp/data, but before the data is returned by 
the filesystem to the reader, the filesystem encrypts it. The Debian 
developer Tollef Fog Heen is working on such a filesystem, according to his blog: 
<http://err.no/personal/blog/tech/2008-01-09-22-27_scramblefs_ctr_mode_and_choosing_a_nonce>


Best regards
/Martin Leben




Re: [BackupPC-users] Extending / adding another pool location

2008-02-11 Thread Martin Leben
Hi Michael!

Michael Mansour wrote:
> Can BackupPC use two disk areas?

The pool and the pc directories must be in the same FILESYSTEM because of the 
hardlinks. But that can be arranged. See below.

> Or will I need a new BackupPC installation somewhere?

No. Use LVM instead: configure your current disks and the NAS as separate 
physical volumes and place them in the same volume group. For the NAS to be 
usable as a physical volume it has to be exposed as a block device, so you would 
have to access it over iSCSI.
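
A rough sketch of the LVM side, assuming the iSCSI LUN shows up as /dev/sdc and 
the existing volume group and logical volume are called "backuppc" and "pool" 
(all of these names are assumptions):

   # pvcreate /dev/sdc                          (turn the NAS LUN into a physical volume)
   # vgextend backuppc /dev/sdc                 (add it to the existing volume group)
   # lvextend -l +100%FREE /dev/backuppc/pool   (grow the logical volume)
   # resize2fs /dev/backuppc/pool               (grow the ext3 filesystem to match)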

> Thanks.
> 
> Michael.

Best regards,
/Martin Leben
-- 
The email address is dated and will work until it attracts spam.
My permanent address is <[EMAIL PROTECTED]>.




Re: [BackupPC-users] Need access to raw mailing list archive

2008-02-22 Thread Martin Leben
Curtis Preston wrote:
> Email addresses are masked the same way they are at the backuppc-users
> archives, with the @ sign being changed to " < at > ".

Similar to gmane, which does the same kind of substitution.

<http://gmane.org/info.php?group=gmane.comp.sysutils.backup.backuppc.general>
<http://news.gmane.org/gmane.comp.sysutils.backup.backuppc.general>

/Martin Leben




Re: [BackupPC-users] Small patch to graph the pool size

2008-03-04 Thread Martin Leben
Ludovic Drolez wrote:
> On Thursday, 28 February 2008 at 18:50, you wrote:
>> I have an idea. It would be cool if the graphs also showed the total size
>> prior to pooling/compression as a line overlayed on the existing graphs.
>>
>> I think this data is exposed as $fullSizeTot and $incrSizeTot.  I'll see if
>> I can figure it out sometime today.
> 
> Yes it would be nice, but since, there's a 1 to 100 (or more) ratio between 
> the two numbers, I think that you cannot show them on the same graph.
> 
> Cheers,

Hi!
A logarithmic graph (use the flag "--logarithmic" to rrdtool) works better in 
that case. And, if you ask me, in almost every other case as well.
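
For what it's worth, a minimal sketch of such a graph with both series overlaid 
(the RRD file name and data source names are assumptions, not the ones used by 
the patch):

   rrdtool graph pool.png --logarithmic \
       --title "BackupPC pool size" \
       DEF:pool=pool.rrd:poolsize:AVERAGE \
       DEF:raw=pool.rrd:rawsize:AVERAGE \
       LINE1:pool#0000ff:"pooled/compressed" \
       LINE1:raw#ff0000:"prior to pooling/compression"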

BR
/Martin Leben




Re: [BackupPC-users] Update 3.0.0 -> 3.1.0 debian

2008-03-12 Thread Martin Leben
Nils Breunese (Lemonbit) wrote:
> Dale King wrote:
> 
>> Old thread I know but I was thinking about the current situation (at  
>> least
>> in debian) with config.pl being changed at each upgrade.
> 
> I'd file a bug with the Debian packager or is this normal behavior on  
> Debian?

Hi,

When upgrading a Debian package, the package is NOT allowed to silently make 
changes to config files. You should get a question about whether you want to 
keep your file, replace it with the new one, see a diff, or escape to a shell.

When you return from the diff or the shell, you should be presented with the 
same question again, if my memory serves me right.

If a package makes changes to a config file without your approval, that is a bug.

BR
/Martin Leben




Re: [BackupPC-users] backing on two raid arrays

2008-04-01 Thread Martin Leben
Gilles Guiot wrote:
> Hello Everybody
> I'm using backuppc 3.1.0 on a debian distro.
> My backup server has two raid 1 arrays
> I have been backing up some servers on the first array
> I need to backup other servers but not enough space on the first array 
> /dev/sda1. 
> I would like to use the same backuppc install and backup other servers on the 
> second array : dev/sdb
> Is it possible and if yes, how shall one proceed ? 


Hi,

No, it is not possible. The FAQ in the BackupPC documentation says:

> BackupPC uses hardlinks to pool files common to different backups. Therefore 
> BackupPC's data store (__TOPDIR__) must point to a single file system that 
> supports hardlinks. You cannot split this file system with multiple mount 
> points or using symbolic links to point a sub-directory to a different file 
> system (it is ok to use a single symbolic link at the top-level directory 
> (__TOPDIR__) to point the entire data store somewhere else). You can of 
> course use any kind of RAID system or logical volume manager that combines 
> the capacity of multiple disks into a single, larger, file system. Such 
> approaches have the advantage that the file system can be expanded without 
> having to copy it.

Best regards,
/Martin




Re: [BackupPC-users] mysql files

2008-04-11 Thread Martin Leben
Rob Morin wrote:
> Thanks for that link, but i wanted to know if it is possible that just 
> backuppc performing the backup of mysql files, if that can cause an 
> issue with teh files while the server is running, IE like corrupting  
> .myd files or stuff like that?

Hi Rob,

No, BackupPC will not cause corruption to your files, since it is only reading 
them.

But, as Nils already pointed out, it is a bad idea to back up a running database 
by copying the database files. The fact that you repeated the question makes me 
wonder what your goals are.

Are you still thinking about backing up the database files, rather than a 
database dump?

If so: Why?
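
For completeness, a hedged sketch of the usual alternative: dump the databases to 
a file that BackupPC then backs up as ordinary data. The paths and the option 
file are assumptions; such a dump can be run from cron shortly before the backup 
window, or from $Conf{DumpPreUserCmd}. (--single-transaction gives a consistent 
snapshot for InnoDB; for MyISAM tables --lock-all-tables is the safer choice.)

   mysqldump --defaults-extra-file=/etc/mysql/backup.cnf \
       --all-databases --single-transaction \
       | gzip > /var/backups/mysql/all-databases.sql.gz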

BR
/Martin




Re: [BackupPC-users] rsync xfer error

2008-04-12 Thread Martin Leben
Mauro Condarelli wrote:
> Tony Schreiner ha scritto:
>> 
>> There is a config variable called  BackupZeroFilesIsFatal.
>> If that is set to 1, and your share is empty files, the backup will fail.
>>
>> Set it to 0 or skip /srv.
>> Tony
>>
> Thanks,
> That was it.
> Now it is crunching (on another share).
> 
> Thanks again
> Mauro

Hi Mauro,

Setting BackupZeroFilesIsFatal to 0 might be dangerous. It is configurable for a 
reason. Think about what happens in the following scenario:

- BackupZeroFilesIsFatal is set to 0.
- On "/srv" you have mounted a disk or something.
- Suddenly the mount disappears due to you "fat fingering" the configuration 
(remember that human errors are the most common errors) or faulty hardware or 
something else.

Now when BackupPC comes along and wants to back up "/srv", it does so without 
complaining, even though the directory contains no files. Depending on the 
schedule and retention settings you might lose your backup completely, 
especially if this continues for some days or weeks and you don't notice it.

So the recommendation is to leave BackupZeroFilesIsFatal at 1. Don't add "/srv" 
to the backup until there is data on it. If the client machine has other 
directories you are backing up and "/srv" is a dynamic mount that sometimes 
isn't in use, I would recommend that you create a separate host alias in which 
you back up only "/srv" (see the sketch below).
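
A minimal sketch of such an alias, assuming the Debian file layout and a 
hypothetical client called "client1" (host and share names are assumptions):

   # grep client1 /etc/backuppc/hosts
   client1       0    backuppc
   client1-srv   0    backuppc

   # cat /etc/backuppc/client1-srv.pl
   $Conf{ClientNameAlias}        = 'client1';   # same physical machine as "client1"
   $Conf{RsyncShareName}         = ['/srv'];
   $Conf{BackupZeroFilesIsFatal} = 1;           # keep the safety net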

BR
/Martin Leben




Re: [BackupPC-users] rsync xfer error

2008-04-12 Thread Martin Leben
Daniel Denson wrote:
> Martin Leben wrote:
>> 
>> Setting BackupZeroFilesIsFatal to 0 might be dangerous. It is configurable 
>> for a 
>> reason. Think about what happens in the following scenario:
>>
>> - BackupZeroFilesIsFatal is set to 0.
>> - On "/srv" you have mounted a disk or something.
>> - Suddenly the mount disappears due to you "fat fingering" the configuration 
>> (remember that human errors are the most common errors) or faulty hardware 
>> or 
>> something else.
>>
>> Now when BackupPC comes along and wants to backup "/srv" it does that 
>> without 
>> complaining, even though it contains no files. Depending on the schedule and 
>> retention settings you might have lost your backup completely. Especially if 
>> this continues for some days/weeks and you don't notice it.
>>
>> So the recommendation is to leave BackupZeroFilesIsFatal at 1. Don't add 
>> "/srv" 
>> to the backup until there is data on it. If the client machine has other 
>> directories you are backing up and if "/srv" is a dynamic mount that 
>> sometimes 
>> isn't used, I would recommend that you create a separate host alias in which 
>> you 
>> backup only "/srv".
> 
> better to put a small token file in /srv so that it does not apear empty.
> 
> echo "backuppc token file, please do not delete" > /srv/.backuppc_token

Hi Daniel,

The only thing THAT would accomplish is that one doesn't even have to set 
BackupZeroFilesIsFatal to 0 in order to lose all the files in the backup, which 
is precisely what I warned about.


I know of a guy who used to say: "I would recommend all my competitors to do this."


BR
/Martin Leben
Ps/ Please don't change to top posting in the middle of a thread. /Ds





Re: [BackupPC-users] Backup to USB disk.

2008-04-14 Thread Martin Leben
Mauro Condarelli wrote:
> Hi,
> I asked this before, but no one answered, so I will try again :)
> 
> I am using a large (500G) external USB disk as backup media.
> It performs reasonably, so no sweat.
> 
> Problem is:
> Is there a way to do a pre-check to see if the drive is actually mounted
> and, if not, just skip the scheduled backup?
> It would be easy to put a do_not_backup file in the directory over which
> I mount the remote.
> I could then do a test to see if that file is present (no disk) or if it
> is absent (something was mounted over it.
> Unfortunately I have no idea where to put such a test in BackupPC!
> 
> Can someone help me, please?
> 
> Related issue:
> I would like to use a small pool of identical external HDs in order to
> increase further security.


Hi Mauro,

Considering what it seems like you want to achieve, I would suggest another 
approach: Use at least three disks in a rotating scheme and RAID1.

Say I have three disks labeled 1, 2 and 3. Then I would rotate them according to 
the schedule below, which guarantees that:
- there is always at least one disk in the BackupPC server.
- there is always at least one disk in the off-site storage.
- all disks are never at the same location.

1 2 3   (a = attached, o = off-site)
a o o
a a o -> RAID sync
o a o
o a a -> RAID sync
o o a
a o a -> RAID sync
. . .

An even safer approach would of course be to rotate four disks where at least 
two disks are always attached to the BackupPC server.

Good luck!
/Martin Leben




Re: [BackupPC-users] Backup to USB disk.

2008-04-15 Thread Martin Leben
Hi Daniel,

No, not quite.

If you want a clean file system on the drive you are going to remove, you must 
temporarily stop BackupPC and unmount the file system first. When the drive has 
been removed you can mount the file system again and start BackupPC. This will 
cause the RAID to run in degraded mode if you only have one disk left in the 
RAID, but it will otherwise work as usual. (You don't have to run in degraded 
mode if you are rotating four disks in the scheme I proposed.) Stopping BackupPC 
and unmounting the file system can't be done by hotplug, because by then it is 
already too late.

You don't have to stop BackupPC when you attach a drive. Just plug it in and add 
it to the RAID with "mdadm --add", which you might be able to get hotplug to do 
for you.
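
A rough sketch of both halves, assuming the pool sits on /dev/md0 (built from the 
external disks) and is mounted on /var/lib/backuppc; the device names are 
examples only:

   # /etc/init.d/backuppc stop
   # umount /var/lib/backuppc
   # mdadm /dev/md0 --fail /dev/sdc1 --remove /dev/sdc1   (detach the disk going off-site)
   # mount /var/lib/backuppc
   # /etc/init.d/backuppc start
   ... and later, when a returning disk is plugged in again:
   # mdadm /dev/md0 --add /dev/sdd1                        (starts the RAID resync)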

Best regards,
/Martin Leben


Daniel Denson wrote:
> I think I understand your want in that you would like to have backuppc 
> check that a drive is hooked up before trying to use it for backups.  If 
> I am correct then I would suggest you build a quick daemon script that 
> starts backuppc when the device is hotplugged and stops backuppc when 
> unplugged.  I dont know what your linux distro is but hotplug scripts 
> are pretty easy, do some google work to get the file and method for a 
> hotplug script on a specific device.




Re: [BackupPC-users] Backup of client never completes

2008-05-21 Thread Martin Leben
James wrote:
> Hm, this could explain it then. With the prior conditions, I could see  
> the time to do a full backup pushing the limits of a normal work day.  
> Especially if the user, say, takes the laptop to lunch or class  
> (professor). Perhaps I'll cut his $HOME up into several virtual  
> servers of a manageable size in backuppc. Considering that it would  
> take approximately 5 to 6 hours+ under ideal conditions simply to  
> transfer the data to my server, I have no doubt that the shear size of  
> the data is the root of the problem.


Hi James,

You still haven't told us which transfer method you use. But if you use rsync I 
would recommend convincing the professor to leave the computer connected 
overnight. It should suffice to do this just once, because once you have all 
the data on the server only diffs (plus some checksums) are transferred when 
using rsync.

Best regards,
/Martin Leben




Re: [BackupPC-users] Backup of client never completes

2008-05-22 Thread Martin Leben
James wrote:
> I only ask, because I've been attempting to convince him to leave the  
> laptop overnight for a while but he feels he cannot part from it.

Aha. Ok. It must be very dear to him. :-)

Then I would recommend that you start by excluding most of the data that should 
be backed up. Then:
1) Back up, and make sure that everything that wasn't excluded was copied to the server.
2) Remove something from the exclusion list.
3) Go to 1 until the exclusion list is empty.

That way you iteratively build up the backup without having to transfer it all in 
one chunk. And when the process is over, you don't have to deal with multiple 
shares. (A sketch of the config is below.)
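
Something along these lines, assuming the Debian layout, an rsync share of /home 
and a hypothetical host called "prof-laptop" (all names and paths are 
assumptions); remove one entry from the list after each successful pass:

   # cat /etc/backuppc/prof-laptop.pl
   $Conf{RsyncShareName}     = ['/home'];
   $Conf{BackupFilesExclude} = {
       '/home' => ['/prof/Videos', '/prof/Photos', '/prof/VMs'],
   };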

Good luck!
/Martin Leben




Re: [BackupPC-users] question about dhcp and subnets

2008-06-06 Thread Martin Leben
Chantal Rosmuller wrote:
> Hi list,
> 
> I am using backuppc 2.1.1 on a debian system. I use it to backup laptops 
> (among others) with dhcp addresses and it works great. However there is one 
> issue:
> 
> We have 2 offices that are connected through vpn, one with ip range 
> 192.168.2.0/24 other with 192.168.4.0/24
> the backupserver is in range 192.168.2.0/24. When backing up servers with 
> static ip addresses in the 192.168.4.0/24 range there is no problem, but 
> there is one user with a laptop who almost always works in the 192.168.4.0/24 
> office and backuppc cannot find his laptop. How can I resolve this? His 
> laptop is occasionally backed up when he is in the other office but that's 
> not enough. A static ip address for the laptop is not an option.
> 
> Thanks for the advice!
> 
> regards Chantal


Hi!

One solution is to configure the DHCP server and/or the DNS server to update the 
DNS record for the laptop when it acquires an address. You might want to 
consider using a rather short TTL in the DNS in order to minimize the risk of 
using outdated IP addresses due to caching.

Another solution is to make WINS name resolution work across the subnets. But 
someone else will have to fill in some info about that.

Good luck!
/Martin Leben




Re: [BackupPC-users] rsync error: unable to read 4 bytes

2008-06-30 Thread Martin Leben
Leandro Tracchia wrote:
> the ssh log on my linux client shows something odd when i am trying to
> login from the backuppc server...
> 
> Public key 00:6d:ce:5f:XX:XX:XX:XX... blacklisted (see ssh-vulnkey(1))
> 
> could this be causing the problem??? how do i fix this??


Hi!

Others have already told you what the problem is. What I would like to add is 
this: EVERYONE who administrates a computer should make sure that he subscribes 
to the relevant security announcements. And READ the announcements. And try to 
understand them. If something is unclear, don't hesitate to ask in a forum or on 
a list for users of the distribution.

Both Debian and Ubuntu have dedicated security-announce mailing lists.

/Martin "Besserwisser" Leben :-)




[BackupPC-users] "BackupPC_compressPool -t" says "Finished with 1423 errors!!!!"

2008-07-25 Thread Martin Leben
Hi,

I ran this yesterday:

[EMAIL PROTECTED]:~$ time /usr/share/backuppc/bin/BackupPC_compressPool -t
[...]
Error: Can't write 2855461 bytes to 
/var/lib/backuppc/pool/7/1/8/7183089ef8c34f68a56a2cbe5b84fbdf
Error: Can't write 2855461 bytes to 
/var/lib/backuppc/pool/7/1/8/7183089ef8c34f68a56a2cbe5b84fbdf
Error: Can't write 2855461 bytes to 
/var/lib/backuppc/pool/7/1/8/7183089ef8c34f68a56a2cbe5b84fbdf
2008-07-25 04:26:27 Done 100% (4096 of 4096 dirs, 304493 files, 49.13GB raw, 
34.5% reduce, 0 errors)
Finished with 1423 errors

A massive number of identical "Error: Can't write [...]" lines had filled the 
entire scroll buffer, so I don't know if I missed anything else.

Is this normal? Do I dare to run an actual compression?

I use version 3.1.0-2~bpo40+1 on Debian Lenny.

Thanks!
/Martin Leben




Re: [BackupPC-users] sync with bittorrent

2008-07-28 Thread Martin Leben
Daniel Denson wrote:
> I recently read a tip on lifehacker about checking and fixing downloaded 
> ISO media with bittorrent.  bittorrent is designed for small incremental 
> part downloads and organizing that data which could make it a nice fit 
> for remote filesystem syncing with any filesystem that can do readable 
> snapshots.
> 
> consider make an LVM snapshot and then a torrent file for it.  setup 
> your backuppc server as a bittorrent tacker.  send the torrent to the 
> remote machine and run it with rtorrent or some cli torrent client.


Hmm... Have I understood you correctly that what you want to achieve is a sync of 
a large file set without the huge memory overhead of rsync?

/Martin




Re: [BackupPC-users] Explanation of archiving process

2008-07-29 Thread Martin Leben
Hi Joanne,

I am no expert, but below is my understanding of how the archives work.

Joanne Cook wrote:
> The documentation says that the Archive function uses TarCreate, which merges
> incremental backups automatically. Does that mean that if you create an
> archive of a given incremental backup you are actually creating an archive of
> that increment plus the previous full backup?

Yes. BackupPC will make sure that the created archive contains (at least) all 
the files that were present on the client when the incremental was made.

If the backups were made using transfer method rsync or rsyncd, the archive 
would not even contain files that were deleted between the last full and the 
incremental. (Someone with more experience, please correct me if I am wrong.)


> And to restore from a given
> incremental all I would need is that archive and not the previous full
> backup?

Yes.


> Presumably also if I create nightly archives of the previous incremental
> backup they are going to get increasingly large?

Syntax error in your sentence...! ;-) Did you mean to ask "Will the nightly 
archives I create from incrementals get increasingly large?". Yes they will, 
unless you are using transfer method rsync or rsyncd. (See the note above.)
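
For reference, a hedged sketch of creating such an archive by hand (host name and 
paths are assumptions; if memory serves, a negative dump number counts back from 
the most recent backup):

   # /usr/share/backuppc/bin/BackupPC_tarCreate -h client1 -n -1 -s /home . \
         > /var/archives/client1-home.tar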

BR
/Martin Leben




Re: [BackupPC-users] Backups seem to work, but don't show up in web

2008-08-11 Thread Martin Leben
Alan McKay wrote:
> Oh, and I forgot to mention the number of files is about 60,000.
> 
> Is that a lot?

No. The opposite, actually.

BR
/Martin




Re: [BackupPC-users] Wiki instructions for installing/upgrading on Ubuntu 8.04.

2008-08-17 Thread Martin Leben
dan wrote:
> [...]
> apt-get install sendmail
> [...]
> 
> sendmail is not listed in the dependancies for backuppc so you need to
> install that or postfix which has sendmail emulations and a
> /usr/bin/sendmail script for that.


Hi,

If you can use an SMTP relay on some other machine or at your ISP, there is no 
need for a full-fledged mailer like sendmail, postfix or exim. After all, 
BackupPC only sends mail, so ANY package that provides /usr/sbin/sendmail will 
suffice.

<http://packages.debian.org/search?searchon=contents&keywords=sendmail&mode=exactfilename&suite=stable&arch=any>
lists the following:

courier-mta, esmtp-run, exim, exim4-daemon-heavy, exim4-daemon-light, masqmail, 
msmtp-mta, nullmailer, postfix, smail, ssmtp, xmail

I use "nullmailer" myself. Couldn't be easier.

Best regards,
/Martin Leben




Re: [BackupPC-users] Wiki instructions for installing/upgrading on Ubuntu 8.04.

2008-08-17 Thread Martin Leben
Martin Leben wrote:
> I use "nullmailer" myself. Couldn't be easier.

In fact, it is so easy that I include the entire configuration, although slightly 
munged:

# cat /etc/nullmailer/remotes
smtp.example.org


BR
/Martin Leben




Re: [BackupPC-users] scp as the transfer?

2008-08-18 Thread Martin Leben
Nils Breunese (Lemonbit) wrote:
> Ward... James Ward wrote:
> 
>> Since I don't have rsync or gnu tar on my busybox console servers,  
>> is it possible to configure BackupPC to use scp?
> 
> No, the only values for $Conf{XferMethod} for backups are smb, rsync,  
> rsyncd and tar. See the documentation: 
> http://backuppc.sourceforge.net/faq/BackupPC.html#step_5__client_setup

... which basically puts you in a position where you have to copy the busybox 
data to another machine, using for example scp from cron, and then back up the 
copy instead. Messy, but it works.
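
A hedged sketch of the workaround, run from cron on a staging machine that 
BackupPC already backs up (the host name, path and schedule are assumptions):

   # crontab entry on the staging machine: pull the console server's /etc nightly
   0 3 * * *   scp -q -r root@console1:/etc/ /srv/staging/console1/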

Good luck!
/Martin




Re: [BackupPC-users] Ping too slow

2008-09-04 Thread Martin Leben
Andrew wrote:
> [...] Instead, it's trying to DNS "shipping" using opendns.
> OpenDNS creates the same issue as my ISP's DNS: it will direct any host
> that isn't found to its own search servers. 

Hi Andrew,

This is a VERY good example of why you should avoid OpenDNS or any other DNS 
service that returns bogus data when queried for non-existing hosts. They create 
confusion and break the principle of least surprise. Run your own caching 
resolver instead, primed with knowledge of the root servers.
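
On Debian/Ubuntu that can be as simple as the sketch below (the package choice is 
just one option; unbound or a dedicated resolver box would do equally well):

   # apt-get install bind9                          (acts as a caching resolver with root hints)
   # echo "nameserver 127.0.0.1" > /etc/resolv.conf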

Best regards,
/Martin Leben




Re: [BackupPC-users] suffering from the backups have stopped syndrome

2008-10-04 Thread Martin Leben
Terri Kelley wrote:
> List,
> 
> I have had a backup running successfully running of a server for some 
> time. It uses automysqlbackup. Looking at the directory where that is 
> stored on the server to be backed up, that is still being executed by 
> BackupPC. However, the backup itself just stays running in the host 
> summary. Looking at the log file in backup, I see the following:
> 
> 2008-10-01 20:00:03 full backup started for directory 
> /home/backuppc/test (baseline backup #54)
> 2008-10-02 11:11:55 Aborting backup up after signal INT
> 2008-10-02 11:11:56 Got fatal error during xfer (fileListReceive failed)
> 2008-10-02 20:00:03 full backup started for directory 
> /home/backuppc/test (baseline backup #54)
> 2008-10-03 16:00:04 Aborting backup up after signal ALRM
> 2008-10-03 16:00:05 Got fatal error during xfer (fileListReceive failed)
> 2008-10-03 20:00:03 full backup started for directory 
> /home/backuppc/test (baseline backup #54)
> 
> There have been no changes to this server so I don't understand why it 
> would stop backing up. Anyone have a clue or pointers for things to look 
> at?
> 
> Terri Kelley
> Network Engineer


Hi,

On the backed-up machine, are you really sure that no one has changed or removed 
the key that BackupPC uses when connecting?

/Martin




Re: [BackupPC-users] Different TopDir for different clients.

2008-11-14 Thread Martin Leben
Hi Oz,

Read on...

Oz Dror wrote:
> I am sure that it was asked before, but I was not able to find 
> satisfying answer on the net.
> 
> 1. How can I have different TopDir assigned to different client computers.

No, you can't. Tell us more about what problem you are trying to solve instead.


> 2. How can I have different backup schedules for the same PC client.

Create an extra host in BackupPC that points to the same client machine. But why 
would you want to do that? Just as with the question above, tell us more about 
the actual problem you are trying to solve.

I am not really an expert on BackupPC, so if I have misunderstood anything, 
please correct me. And feel free to chime in if Oz follows up with more questions 
on this subject.

BR
/Martin Leben




Re: [BackupPC-users] Different TopDir for different clients.

2008-11-15 Thread Martin Leben
Oz Dror wrote:
> Thanks for responding:

You're welcome.

> Regarding the first issue.
> 
> I have a limited backup space in my server. On the other had I have disk 
> space in some clients that is wasted.
> Thus I was hopping to backup one client to another client's disk, rather 
> than to my main storage.

One of the fundamental principles of BackupPC is that a file with a certain 
content is (in general) only stored once, regardless of how many times and where 
such a file occurs on the same client or others. This is achieved with hardlinks, 
and hardlinks only work within a single filesystem.
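
A quick illustration of both properties, assuming /pool and /mnt/other are on 
different filesystems (the paths are just examples):

   $ echo test > /pool/a
   $ ln /pool/a /pool/b          (same inode; the link count becomes 2)
   $ ls -li /pool/a /pool/b
   $ ln /pool/a /mnt/other/c     (fails with "Invalid cross-device link")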

So to achieve what you want, I think you would have to use separate instances of 
BackupPC, and I am not sure that is even possible. But really, are disks that 
expensive? Just buy some disks instead of using a homegrown solution that will 
cost you more time to implement and maintain.


> Regarding the second issues. The client is a windows client. My 
> understanding is that when you use rsyncd in windows. you can either 
> exclude dirs or include
> dirs. not both.

Can you provide a source for that information? I find it very unlikely that 
rsyncd on Windows behaves differently from rsyncd on some Unix in this respect.

(Please note that I am not using include and exclude simultaneously on either 
Windows or Linux, so I am just making an educated guess. And my usual "I am not 
a BackupPC expert" disclaimer still applies.)

BR
/Martin




Re: [BackupPC-users] Different TopDir for different clients.

2008-11-15 Thread Martin Leben
Martin Leben wrote:
> [...] And I am not sure how if that is possible. But really, are disks that 
> expensive? Just buy some disks instead of using some homegrown solution that 
> will cost you more time to implement and maintain.

This of course means that you must use for example LVM or some other solution to 
combine several disks and be able to use them in ONE SINGLE filesystem.

/Martin




Re: [BackupPC-users] Different TopDir for different clients.

2008-11-15 Thread Martin Leben
Jack Coats wrote:
> I don't know what happened to it, but at one time there was development 
> being done on a 'distributed
> file system', where the data was 'raided' across many systems, so if 
> some of the systems 'went away'
> the data was still there and updated.  And when they came back, it was 
> automatically put back in and
> 'synced'.  It was supposed to use all much of the 'unused space' on 
> client desktops, but the people on
> the desktop would have to go into the 'front door' to see the data, not 
> just what was on their desk.

I don't know what specific file system you are thinking about, but there are 
some interesting alternatives if one wants to go down that avenue.

Lustre sounds interesting.

/Martin




Re: [BackupPC-users] Different TopDir for different clients.

2008-11-15 Thread Martin Leben
Mauro Condarelli wrote:
> I have several clients which are generally orthogonal (files in one
> group will not be found on the other).
> 
> I am backing up to removable media (eSATA disks) rotating the disks
> for added security.
> 
> I have a set of identical 500GB external disks.
> 
> Up to now I backed up just inserting the disk and letting backuppc
> take over, then I replace the disk with another and, next time I have
> another backup on the other disk of the set.
> 
> If no disk is inserted the scheduled backup is simply skipped because
> the mount-point directory is not writable by backuppc.
> 
> So far so good.
> 
> Now my total backup amount exceeds 500GB.
> I tried setting two different TopDirs (for different, orthogonal sets
> of clients), but that, as you very well know, fails.
> I do *not* want to have LVM, since that would mean a big hassle using
> removable drives!
> 
> What is the "recommended" way to achieve this?
> 
> TiA
> Mauro


Hi Mauro,

I can think of several options:

1) Replace your 500GB disks with 1TB disks instead. That would buy you some 
time.

2) Use a couple of internal disks combined with LVM. To have an off-site copy 
you could use BackupPC_tarCreate to export individual hosts to external disk(s).

3) Use at least three 1TB disks in a rotating scheme and RAID1.
Say you have three disks labeled 1, 2 and 3. Then you would rotate them 
according to the schedule below, which guarantees that:
- there is always at least one disk in the BackupPC server.
- there is always at least one disk in the off-site storage.
- all disks are never at the same location.

1 2 3   (a = attached, o = off-site)
a o o
a a o -> RAID sync
o a o
o a a -> RAID sync
o o a
a o a -> RAID sync
. . .

An even safer approach would of course be to rotate four disks where at least
two disks are always attached to the BackupPC server.

On top of the RAID1 I recommend that you use LVM, even though it is not strictly 
necessary right now if you use 1TB disks. The reason is that when 1TB is no 
longer sufficient, you can expand by using a second set of disks in a similar 
setup and adding that second RAID1 to the volume group.



I don't particularly like 2) because it requires extra work when adding/removing 
clients and it requires more storage space. And I prefer method 3) over 1) 
because it doesn't screw up the logs and you don't have to locate a particular 
disk when doing a restore.

Good luck!

BR
/Martin




Re: [BackupPC-users] Backuppc mirroring with rdiff-backup or not?

2008-11-17 Thread Martin Leben
Ermanno Novali wrote:
> [...]
> I'd like to mirror the backuppc pool - I searched through ml archives
> and found that mirroring the backuppc pool (wherever it is) with rsync
> on an external hard drive isn't efficient and doesn't scale good -
> i've tried myself and is cpu and time consuming and very very long for
> big pools - not very reliable.
> 
> So i've tried to mirror the pool with rdiff-backup, and it seems a
> little better, but not the optimal solution.
> 
> In this ml the best solutions for this task are two hdd with pool on
> them (two external, or two in raid maybe) or dd form pool to external
> mirror disk - but NOT mirroring the backup with rsync or something
> like that - right? can you confirm that?
> 
> And dd is time consuming like rsync but more reliable for backuppc pool?
> 
> Thank you so much,
> have a nice day
> 
> Ermanno


Hi,

Yes, use dd (or even better dd-rescue, which is restartable and gives a progress 
indication) for big pools. For smaller pools you might use "cp -a" or "rsync 
-aH" (restartable). You will have to find out the practical upper limit for the 
latter methods depending on your requirements.
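
Hedged examples of both, assuming the pool filesystem lives on /dev/sdb1, the 
mirror disk is /dev/sdc1 and BackupPC is stopped while the copy runs (the dd 
variant additionally requires the pool filesystem to be unmounted; device names 
and mount points are assumptions):

   # dd if=/dev/sdb1 of=/dev/sdc1 bs=1M            (block-level copy of the whole filesystem)
   # rsync -aH /var/lib/backuppc/ /mnt/mirror/     (file-level copy; -H preserves hardlinks)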


Another alternative is to use at least three disks in a rotating scheme and 
RAID1. (Those of you who have been reading the list for more than a few days are 
getting tired of hearing this by now, I imagine...!) Say you have three disks 
labeled 1, 2 and 3. Then you would rotate them according to the schedule below, 
which guarantees that:
- there is always at least one disk in the BackupPC server.
- there is always at least one disk in the off-site storage.
- all disks are never at the same location.

1 2 3   (a = attached, o = off-site)
a o o
a a o -> RAID sync
o a o
o a a -> RAID sync
o o a
a o a -> RAID sync
. . .

On top of the RAID1 I recommend that you use LVM, even though it might not be 
strictly necessary right now if your backups fit on one disk. The reason is that 
when your disks are too small, you can expand by using a second set of disks in a 
similar setup and adding that second RAID1 to the volume group.

Good luck!

/Martin

