Re: [BackupPC-users] destination directory and encryption

2015-09-02 Thread Tim Fletcher
On 02/09/15 14:35, Jan Novak wrote:
> Hi there,
> 
> two questions:
> Is it possible to switch the destination directory of the backup for 
> each host,
> and how can a backup be encrypted (because of an insecure cloud server
> or similar)?

I've read over the thread and BackupPC is in this case not the right answer.

The issues that you are trying to solve, i.e. separation of clients via
different encryption keys and not trusting the server, are fundamentally
opposed to BackupPC's design.

There are backup systems that make use of client-side encryption, which
prevents the server from ever seeing the plain-text file contents. The one
of these that I know best is duplicity; there is also one called tarsnap
that I know less well.
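
For illustration, a minimal duplicity run looks something like this (a
sketch, not from the thread; the source path, GPG key ID and target URL
are placeholders):

duplicity --encrypt-key ABCD1234 /home/user sftp://backup@cloudhost/backups/user

The data is encrypted with the GPG key on the client, so the cloud server
only ever stores ciphertext.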

BackupPC is designed as a trusted server-side pull backup system, i.e. the
server sees the plain text of files. BackupPC makes use of file-level
dedup, i.e. the same file from multiple clients is stored once, with the
same content, on the server. This is exactly what properly implemented
encryption is designed to avoid.

Encrypted file systems on cloud servers are hard to get right: the master
key for file systems such as TrueCrypt or LUKS is always in the machine's
memory. Unless you have physical hardware, the hypervisor can always read
the memory of a machine and thus extract the master key without alerting
the owner of the VM. Even with physical servers, given access to the
machine and a willingness to reboot it, the keys can be recovered.







Re: [BackupPC-users] BackupPC fails to start in daemon mode on Ubuntu 14.04

2015-05-14 Thread Tim Fletcher
 

On 2015-05-13 03:41, Stoyan Stoyanov wrote: 

 Hi Holger, 
 While it seems like a packaging issue, there are no bug reports on launchpad, 
 so I thought maybe someone on this list stumbled upon this problem. 
 Unfortunately, nothing gets recorded in the LOG file. I ran perl in debug 
 mode, but the program exits right after forking the child process, so there is 
 nothing really that hints at what the problem might be. The same code taken out 
 of the context of the BackupPC script works fine, i.e. the child is forked and 
 doesn't die immediately. 
 
 Stoyan

Hi Stoyan, 

Does the system that BackupPC fails to start on have ldap configured? 

I have this problem too and was looking at it again following your email;
I noticed from strace that a call was being made to .ldaprc during
backuppc startup.

I have removed all ldap config and packages from the server and backuppc
now starts correctly. 

I know this isn't a fix but at least it's a step forward. 

I removed libnss-ldap auth-client-config ldap-auth-config
ldap-auth-client libpam-ldap
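
For reference, on Ubuntu that removal amounts to something like this (a
sketch, assuming apt and the package list above):

apt-get remove --purge libnss-ldap auth-client-config ldap-auth-config \
    ldap-auth-client libpam-ldap
pam-auth-update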

I reran pam-auth-update and disabled the auto-creation of home dirs and
systemd registration that I had enabled for ldap support.

Tim Fletcher
t...@night-shade.org.uk


Re: [BackupPC-users] Performance reference (linux --(rsync)- linux)

2012-11-07 Thread Tim Fletcher
On 5 Nov 2012, at 19:02, Cassiano Surek c...@surek.co.uk wrote:

 Hello all,
 
 This is my first post to the list, so please be gentle. :)
 
 I have been running a backuppc server for a while, but recently it has been 
 running way too slow to be useful.
 
 Bearing in mind that every situation is different, it would be beneficial to 
 compare the performance of my setup with others to rule out any obvious 
 bottlenecks.
 
 Please read below the specs for the server and a client, with the reported 
 speed achieved with that combination.
 
 Server:
 
 Centos 5, Linux 2.6.18-53.el5 #1 SMP Mon Nov 12 02:22:48 EST 2007 i686 i686 
 i386 GNU/Linux
 Quad Intel(R) Xeon(TM) CPU 3.06GHz
 2Gb Ram
 ext3 raid 1 pool on separate disk/partition
 rsync 3.0.6
 backuppc 3.2.1 with the latest perl modules installed via perl CPAN.
 
 
 Client:
 
 Centos 4.9, Linux 2.6.9-103.ELsmp #1 SMP Fri Dec 9 04:31:51 EST 2011 i686 
 i686 i386 GNU/Linux
 Dual Intel(R) Pentium(R) 4 CPU 3.20GHz
 4Gb Ram
 ext3 raid 1
 rsync 2.6.3 (client) running via xinetd
 
 Both are connected to the same switch at the data centre using gigabit 
 interfaces.
 
 This yields:
 
 Backup size: 105.58 Gb (yep, it is quite big)
 Speed: 0.29 Mb/s
 

I think the likely problems are:

RAM on the server, as already pointed out.

The client's disk subsystem performance: try mounting the filesystem being
backed up with noatime.

Try changing from rsync to tar; there are trade-offs, but for small files
tar is faster.

The client's CPU performance: less likely, but it might be worth trying
the arcfour cipher on ssh.
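
A sketch of those last two tweaks (the mount point is a placeholder, and
the config line is the arcfour form that appears again later in this
digest):

mount -o remount,noatime /home

$Conf{RsyncClientCmd} = '$sshPath -q -x -C -c arcfour -l root $host $rsyncPath $argList+';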





Re: [BackupPC-users] backuppc slow rsync speeds

2012-09-17 Thread Tim Fletcher
You are being hit by disk I/O speeds; check you don't have atime turned on
on the filesystem. Also it's worth considering tar instead of rsync for
this sort of workload. 

--

Sent from a mobile device

Tim Fletcher

On 17 Sep 2012, at 10:08, Mark Coetser m...@tux-edo.co.za wrote:

 Hi
 
 backuppc   3.1.0-9.1
 rsync  3.0.7-2
 
 OK I have a fairly decent spec backup server with 2 gigabit e1000 nics 
 bonded together and running in bond mode 0, all working 100%. If I run 
 plain rsync between the backup server and a backup client both connected 
 on gigabit lan I can get sync speeds of +/- 300mbit/s but using backuppc 
 and rsync the max speed I get is 20mbit and the backup is taking 
 forever. Currently I have a full backup that's been running for 3461:23 
 minutes where as the normal rsync would have taken a few hours to complete.
 
 The data is users' maildirs and it's about 2.6Tb and I am not using rsync 
 over ssh, I have the rsync daemon running on the client and have setup 
 the .pl as follows.
 
 config
 
 #
 $Conf{ClientTimeout} = 28800;
 
 # Minimum period in days between full and incremental backups:
 $Conf{FullPeriod} = 6.97;
 $Conf{IncrPeriod} = 0.97;
 
 # Number of full and incremental backups to keep:
 $Conf{FullKeepCnt} = 2;
 $Conf{IncrKeepCnt} = 10;
 # Note that additional fulls will be kept for as long as is necessary
 # to support remaining incrementals.
 
 #$Conf{DumpPreUserCmd} = 'sudo /bin/mount -t nfs ns1:/var/mail /var/mail';
 #$Conf{DumpPostUserCmd} = 'sudo /bin/umount /mnt/mail';
 
 # What transport to use to backup the client [smb|rsync|rsyncd|tar|archive]:
 $Conf{XferMethod} = 'rsyncd';
 
 # The file system path or the name of the rsyncd module to backup when
 # using rsync/rsyncd:
 $Conf{RsyncShareName} = ['backuppc'];
 
 $Conf{RsyncdAuthRequired} = 0;
 
 $Conf{RsyncdUserName} = '';
 $Conf{RsyncdPasswd} = '';
 
 # If this is defined only these files/paths will be included in the backup:
 $Conf{BackupFilesOnly} = undef;
 
 # These files/paths will be excluded from the backup:
 $Conf{BackupFilesExclude} = [
 '/DONOTDELETE',
 '/lost+found'
 ];
 
 # Level of verbosity in Xfer log files:
 $Conf{XferLogLevel} = 1;
 
 # Commands to run for client backups:
 # Note the use of SSH's -C attribute. This enables compression in SSH.
 $Conf{RsyncClientCmd} = '$rsyncPath $argList+';
 
 # Commands to run for client direct restores:
 # Note the use of SSH's -C attribute. This enables compression in SSH.
 $Conf{RsyncClientRestoreCmd} = '$rsyncPath $argList+';
 
 # Compression level to use on files. 0 means no compression. See notes
 # in main config file before changing after backups have already been done.
 $Conf{CompressLevel} = 3;
 
 
 
 -- 
 Thank you,
 
 Mark Adrian Coetser
 
 



Re: [BackupPC-users] backuppc slow rsync speeds

2012-09-17 Thread Tim Fletcher
No it won't be hit in the same way: you are basically asking rsync to walk
the large and complex file tree checking the date of every file, whereas
with a full rsync all you are asking for is next file, next file, next
file...

--
Sent from a mobile device

On 17 Sep 2012, at 15:59, Mark Coetser m...@tux-edo.co.za wrote:

 On 17/09/2012 14:50, Tim Fletcher wrote:
 You are being hit by disk I/O speeds; check you don't have atime turned on on 
 the fs. Also it's worth considering tar instead of rsync for this sort of 
 workload.
 
 --
 Hi
 
 Surely disk I/O would affect normal rsync as well? Normal rsync and even nfs 
 get normal transfer speeds; it's only rsync within backuppc that is slow.
 
 Thank you,
 
 Mark Adrian Coetser
 



Re: [BackupPC-users] rsync: full backup more than twice faster than incremental backup

2012-08-16 Thread Tim Fletcher
On 16/08/12 09:47, Udo Rader wrote:
 One of the reasons I can think of is the file structure on that host. It
 serves as a special storage pool for a customer developed application
 and as such it has really really many subdirectories with really really
 many subdirectories with really really many subdirectories. And by
 really really many I mean really really many ... So my best guess would
 be that building the file list diff takes much longer than just fetching
 the files as they exist.

 I've now disabled incremental backups on this server, but maybe someone
 has an idea how to enable incremental backups for this host as well.

 thanks in advance!

Try using tar instead of rsync; you are correct that for large file 
trees rsync is very slow and also RAM-intensive when it's building the 
file lists.
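
For a single problem host that is a one-line change in its per-host config
file (a sketch; tar over ssh assumed to be acceptable):

$Conf{XferMethod} = 'tar';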

-- 
Tim Fletcher t...@night-shade.org.uk




Re: [BackupPC-users] BackupPC not starting at boot

2012-08-06 Thread Tim Fletcher
Sounds like backuppc is starting before the external disk has mounted.
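
One way to express that ordering under systemd (a sketch, not from the
thread; the mount point is an assumption) is to add, in the [Unit] section
of backuppc.service:

RequiresMountsFor=/mnt/external-drive

where /mnt/external-drive is whatever /var/lib/BackupPC points at, and
then run systemctl daemon-reload.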

--

Sent from a mobile device

Tim Fletcher

On 5 Aug 2012, at 21:42, Norman Goldstein norm...@telus.net wrote:

 I am running
 BackupPC.i686 3.2.1-7.fc17
 
 and BackupPC does not load at boot.  However, I can always get it 
 started manually with
 
 systemctl start backuppc.service
 
 I have issued the command
 
 systemctl enable backuppc.service
 
 so that backuppc should be starting at boot.
 In the system log, I used to see the message
 
 ... Can't create a test hardlink between a file in /var/lib/BackupPC//pc 
 and /var/lib/BackupPC//cpool
 
 but I am not even seeing this any more.
 
 My /var/lib/BackupPC is a soft-link to an external hard drive, which 
 supports hard links
 (obviously, since I am able to start BackupPC, manually).
 
 I did a test to have /var/lib/BackupPC as a plain directory, and then 
 BackupPC
 started up properly at boot.  It seems that BackupPC has a problem with
 the external drive, even though it worked fine under Fedora 15.  I moved the
 mount line of the external drive in /etc/fstab to the top of /etc/fstab, 
 but that
 did not help.
 
 
 
 
 
 
 
 



Re: [BackupPC-users] Running backuppc on a raspberry pi - possible ?

2012-07-23 Thread Tim Fletcher
On 22/07/12 20:27, Poul Nielsen wrote:
 I am considering using a raspberry pi which is a very low power, low
 spec linux unit.


 http://www.raspberrypi.org/
 CPU:  700 MHz ARM11 ARM1176JZF-S core
 Memory (SDRAM): 256 MiB
 USB 2.0 ports: 2 (via integrated USB hub)


 Wanted:
 - home use
 - versioned backup from a number of PCs and devices
 - speed is not essential
 - using rsync where possible

 rsync might need more memory than the max 192 Mb available?

 Any experience ?

I have a Pi and there are a few things to remember about the Pi:

1. The ethernet is on the USB bus, so you are sharing the USB bus between 
network and storage.
2. The SD card storage subsystem is slow.

As has been discussed on the list many times, backuppc is very heavy on 
the I/O subsystem.

If you are looking at this sort of solution you might be better off 
looking at something Kirkwood-based, such as a DreamPlug or a QNAP 
TS-219P, which has both gigabit ethernet and eSATA on a PCI Express bus, 
not USB.

Your other option is something more modern like a Mele A1000, which again 
has more RAM and better I/O.

-- 
Tim Fletcher t...@night-shade.org.uk



Re: [BackupPC-users] BackupPC data volume

2012-07-10 Thread Tim Fletcher
On 10/07/12 22:44, Bryan Keadle (.net) wrote:
 It's recommended to mount the BackupPC data volume with noatime:

 /dev/vg_backuppc_data/backuppc_data /var/lib/BackupPC ext4
 rw,noatime 0 0


 However, since my data volume is an iSCSI volume, I needed this entry to
 have the iSCSI mount properly mount at boot using _netdev:

 /dev/vg_backuppc_data/backuppc_data /var/lib/BackupPC ext4 _netdev 0 0

 so do I still need noatime and if so, how/where do I include it on
 that line?

in fstab add ,noatime,nodiratime in the same place as you had _netdev so 
you end up with:

noatime,nodiratime,_netdev
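
So the full line from the entry quoted above becomes:

/dev/vg_backuppc_data/backuppc_data /var/lib/BackupPC ext4 noatime,nodiratime,_netdev 0 0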

-- 
Tim Fletcher t...@night-shade.org.uk





Re: [BackupPC-users] Offsite copy

2012-07-01 Thread Tim Fletcher
On 27/06/12 00:02, shorvath wrote:
 Hi Timothy,

 Thanks for your comments, unfortunately however I think you're missing my 
 point.
 I very much like the way Backuppc handles backup, dedupe etc. and to use this 
 on the local site to handle the backups of each individual client would be 
 preferred,  plus it gives the client an interface to browse backups and do 
 restores. (One of the main reasons I want to use it)
 However for remote backups (eg offsite) I want to be able to have an rsync-
 style snapshot of the most recent backup.
 I do not want to have to pull backups from my remote site to each individual 
 server for this purpose because
 A) I'm already backing them up from the local site backup server and I don't 
 want to back them up twice (more io/cpu and it'll take longer) These servers 
 are production and heavily used with a lot of daily changes.
 Keep in mind that the backup server is used just for backups so it can spend 
 the whole day getting thrashed for all I care.
 and to be honest that just seems silly and more  than a solution.
 B) I don't like having multiple entry points .  actually no other points 
 are needed. Point A is enough reason not to do it.
 I already achieve my needs currently by simply using rsync for both local and 
 remote but as mentioned earlier I want a more elegant solution and I like 
 backuppc and how it handles the local backups.
 If there is no way of exposing the latest  backup of each host from the 
 backuppc server either via a fuse module (that works) or other means then my 
 solution to either stick to my current solution. eg rsync but spend some time 
 refining it.
 or
 use something like rdiff-backup or rsnapshot.
 rdiff-backup will allow me to see the most recent snapshot but has it's 
 caveats/pitfalls/complexities.
 Rsnapshot will do everything I need but maybe not the most efficient on space.
 Of course there are other...
 Bandwidth, not a problem, I can limit/shape that several ways.

Your options, as I see them, are:

1. Script up a recovery of each server to the backup server and then 
rsync that to the remote host. This means you lose a chunk of space on 
the backup server but is bandwidth-efficient.

Something like:

BackupPC_tarCreate -h host -n -1 -s sharename / | \
tar -C /path/to/localbackups/host -x -f -

and then:

rsync -avz /path/to/localbackups/ remotehost:/path/to/remotebackups/

This will cost you disk space on the backup server and some disk 
bandwidth during the recover.

2. If bandwidth truly isn't an issue then you can simply recover the 
latest backup into an ssh pipe, something like this:

BackupPC_tarCreate -h host -n -1 -s sharename / | \
ssh -c arcfour -C remotehost tar -C /path/to/remotebackups/host -x -f -

This will transfer the most recent backup via tar and ssh to the remote 
host, but it will transfer a full backup every time.

This will hammer the CPU on the backup server to compress and encrypt 
the transfer, and also hammer your bandwidth, both remote disk and network.

-- 
Tim Fletcher t...@night-shade.org.uk





Re: [BackupPC-users] Q: not-expected? behaviour for 'BackupPC_tarCreate' ?

2012-06-14 Thread Tim Fletcher
On 13 Jun 2012, at 20:26, Tim Chipman tchip...@gmail.com wrote:

 Thanks for the quick reply.  Good to know there is an issue... now I
 just need to figure it out.
 
 Dare I ask, does this sound familiar in any way ?
 
 I had another followup which suggested I check 2 things:
 (a) are files all present when I browse via web interface? Answer - no
 - I only see files in the latest incremental backup / not 'all files'
 'filled'.  If I go to day zero backup - I see all files in the
 original full backup though.
 (b) if I pull the tarball via web interface - I get exactly the same
 size (partial, not full) tarball that is returned when I do the
 command line method.

My guess is that the nightly process might not be running properly. Can
you see if there is anything in the log files?

Assuming that the filesystem layout on Debian is the same as Ubuntu, the
log file is in ~backuppc/log/LOG and is rotated daily. 

--

Sent from a mobile device

Tim Fletcher


Re: [BackupPC-users] Q: not-expected? behaviour for 'BackupPC_tarCreate' ?

2012-06-13 Thread Tim Fletcher
On 13/06/12 14:49, Tim Chipman wrote:

snip problem description

 So.  Just wanted to check,
 - is this expected behaviour?
 - if I want to get 'full  backup and latest' offsite, do I just need
 to take a copy of the 'zero day' tarball offsite, and then each night,
 generate a tarball of 'diffs for that day' using the n of minus one
 flag -- and rsync that out -- and I'll be 'ok'.
 - or is there some other way to get the BackupPC_tarCreate command -
 to automagically merge diffs with full - so that each night it dumps
 the 'whole, latest' ?

 Any comments are certainly greatly appreciated.


What if you trigger the tar file build from the web interface?

If you browse the most recent backup via the web interface, does it 
contain the files you expect?

-- 
Tim Fletcher t...@night-shade.org.uk



Re: [BackupPC-users] cheap encryption

2012-05-31 Thread Tim Fletcher
On Wed, 2012-05-30 at 14:05 -0400, Neal Becker wrote:
 Using rsync (via ssh), default is aes encryption, which is expensive.
 
 I wanted to try setting
 
 Host *
     Ciphers arcfour,blowfish-cbc
 
 I put that in user backuppc .ssh/config, but that didn't seem to work 
 (according 
 to the output of ps, not showing the args to ssh).
 
 I did find that putting it in 
 
 $Conf{RsyncArgs}
 
 as
 
   '-e ssh -c arcfour,blowfish-cbc'
 
 (note the quotes)
 
 appears to be working (I can see the args in ps, and the backup is running).

I normally use the following to enable arcfour for a particular target:

$Conf{RsyncClientCmd} = '$sshPath -q -x -C -c arcfour -l root $host
$rsyncPath $argList+';


-- 
Tim Fletcher t...@night-shade.org.uk




Re: [BackupPC-users] Pool filesystem wierdness

2012-05-30 Thread Tim Fletcher
On Tue, 2012-05-29 at 19:27 -0400, Brad Alexander wrote:
 I know it is bad form to respond to one's own post, but I was digging
 around in my Munin graphs, and noticed that the filesystem skyrocked
 from about 70% to 100% late on the 26th or early on the 27th. I have
 included both the 4-week pool graph from the backup machine and the
 weekly graph from my munin server. Note that while the pool filesystem
 is smooth and even at around 300GB, the munin graph climbs sharply...I
 wasn't able to find anything that might have caused it, though doing
 df on the pc directory, with all it's hard links, seems to be a Bad
 Idea.

I've been caught out by things like thunderbird suddenly indexing 10 gig
of mail and adding a massive chunk to my backups.

You can use du on parts of the pool filesystem, as modern versions of du
will skip hardlinks after counting the first instance of them.
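
For example (assuming the Debian-style pool location):

du -sh /var/lib/backuppc/pc/*

Each hardlinked pool file is counted only once, against the first
directory that reaches it.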

-- 
Tim Fletcher t...@night-shade.org.uk




Re: [BackupPC-users] Pool filesystem wierdness

2012-05-30 Thread Tim Fletcher
On Wed, 2012-05-30 at 10:15 -0400, Brad Alexander wrote:
 I think I just found it. Ironically enough, it was my workstation. I
 have an external drive that is normally plugged into my laptop for
 files I need to transport I had plugged (and left plugged) this drive
 into my desktop, which was being indexed and attempted to be backed
 up. Apparently, this filled the filesystem, and the backup never
 completed, but the files from the partial backup were apparently
 stored somewhere. Once I umounted the portable hard drive, the backup
 completed, and the drive space went back down to normal levels.
 
 I have since commented out all of /media.

Changing the rsync flags to include --one-file-system at the top level
will stop this catching you out again; I think it should be a
configuration default.
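
A sketch of that change in config.pl (BackupPC 3.x; the existing default
arguments are elided here):

$Conf{RsyncArgs} = [
    # ... the stock argument list ...
    '--one-file-system',
];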

-- 
Tim Fletcher t...@night-shade.org.uk




Re: [BackupPC-users] Correct rsync parameters for doing incremental transfers of large image-files

2012-05-12 Thread Tim Fletcher
On 12/05/12 11:57, Andreas Piening wrote:
 Hi Les,

 I already thought about that and I agree that the handling of large 
 image files is problematic in general. I need to make images for the 
 windows-based virtual machines to get them back running when a 
 disaster happens. If I go away from backuppc for transfering these 
 images, I don't see any benefits (maybe because I just don't know of a 
 image solution that solves my problems better).
 As I already use backuppc to do backups of the data partitions (all 
 linux based) I don't want my backups to become more complex than 
 necessary.
 I can live with the amount of harddisk space the compressed images 
 will consume and the IO while merging the files is acceptable for me, too.
 I can tell the imaging software (partimage) to cut the image into 2 GB 
 volumes, but I doubt that this enables effective pooling, since the 
 system volume I make the image from has temporary files, profiles, 
 databases and so on stored. If every image file has changes (even if 
 there are only a few megs altered), I expect the rsync algorithm to be 
 less effective than comparing large files where it is more likely to 
 have an unchanged long part which is not interrupted by artificial 
 file size boundaries resulting from the 2 GB volume splitting.

 I hope I made my situation clear.
 If anyone has experiences in large image file handling which I may 
 benefit from, please let be know!

The real question is what you are trying to do: do you want a backup (i.e. 
another single copy of a recent version of the image file) or an archive 
(i.e. a series of daily or weekly snapshots of the images as they change)?

BackupPC is designed to produce archives, mainly of small to medium sized 
files, and it stores the full file, not changes (aka deltas), so for 
large files (multi-gigabyte in your case) that change each backup it is 
much less efficient.

To my mind, if you already have backuppc backing up your data partitions 
and the issue is that you want to back up the raw disk images from your 
virtual machines' OS disks, the best thing is to snapshot them as you have 
already set up and then simply rsync that snapshot to another host, which 
will just transfer the deltas between the disk images. This will leave 
you with backuppc providing an ongoing archive for your data partitions 
and a simple rsync backup for your root disks that will at worst mean 
you lose a day's changes in case of a total failure.
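
A sketch of that nightly copy (paths and host are placeholders; --inplace
lets rsync apply deltas to the existing large image files on the far end):

rsync -av --inplace /var/snapshots/vm-images/ remotehost:/backups/vm-images/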

-- 
Tim Fletcher t...@night-shade.org.uk




Re: [BackupPC-users] Backups very slow after upgrade to squeeze

2012-05-12 Thread Tim Fletcher
On 09/05/12 18:03, Les Mikesell wrote:
 I generally use --one-file-system as an rsync option which will keep 
 it from wandering into /proc and /sys as well as any nfs or iso mounts 
 that are accidentally in the path. Of course if you do that, you have 
 to add explict 'share' entries for each mount point that you want 
 backed up and be careful to add new ones as needed. 

I would argue that should be a configuration default to stop exactly 
this sort of thing happening.

-- 
Tim Fletcher t...@night-shade.org.uk




Re: [BackupPC-users] Bare metal restores

2012-04-27 Thread Tim Fletcher
Try update-grub; there is also a grub config generator, but the name
escapes me. Check the list of binaries in the grub package. 

--

Sent from a mobile device

Tim Fletcher

On 27 Apr 2012, at 17:16, Brad Alexander stor...@gmail.com wrote:

 Grub uses them too...But I changed them in grub.cfg and fstab (and
 /etc/cryptab), and grub was still having issues...I tried both
 update-grub /dev/sda and dpkg-reconfigure linux-image-3.2.0-2-amd64
 (to rebuild the initramfs) and both gave me disk not found.
 
 --b
 
 On Fri, Apr 27, 2012 at 12:09 PM, Les Mikesell lesmikes...@gmail.com wrote:
 On Fri, Apr 27, 2012 at 10:53 AM, Till Hofmann
 hofmannt...@googlemail.com wrote:
 
 
 You can change the UUIDs of your new drive so they match the ones of the old
 drive.
 tune2fs -U NEWUUID /dev/HD-PARTITION
 see man tune2fs .
 
 Just make sure you don't mix up the two drives since they will have
 identical UUIDs (which defeats the purpose of unique IDs).
 
 Or, boot with a live CD and edit the places where they are used -
 probably just /etc/fstab unless grub uses them too.
 
 If you are looking for a painless way to back up and restore a whole
 linux system, look at clonezilla-live which will do partition-image
 copies  or ReaR which will will do a traditional tar backup but will
 also build a boot ISO with a script to reconstruct your filesystem
 layout and restore it.  With a little fiddling you can adjust the
 destination sizes.   I'm not sure how either will mesh with encrypted
 disks, though.   I think ReaR could be tuned to do the parts of a bare
 metal restore that backuppc needs fairly easily since it is just a
 bunch of shell scripts.
 
 --
   Les Mikesell
 lesmikes...@gmail.com
 
 



Re: [BackupPC-users] backing up machines with more than one IP

2012-04-25 Thread Tim Fletcher
On Tue, 2012-04-24 at 19:24 -0700, Joel Uckelman wrote:
 I have a laptop which is sometimes connected to the network via wireless
 and sometimes via a wired connection. My router has a built-in DNS server
 which will not permit the same name to be associated with two different
 IPs, even if those two IPs are never in use simultaneously. Hence, when
 BackupPC looks up the IP of my laptop, my laptop might not have that IP
 at that time, which causes the backup to fail. What I'd like to happen
 is for the backup to succeed so long as my laptop is reachable via
 either IP address.
 
 How can I do this? Is there some way to give BackupPC a list of IPs to
 try?

If the laptop is running MacOSX, modern linux, or Windows with the Apple
Bonjour software installed (it's installed automagically if you install
iTunes etc.), then try using the host name laptopname.local, obviously
replacing laptopname with your laptop's name with any spaces etc. taken
out.

-- 
Tim Fletcher t...@night-shade.org.uk




Re: [BackupPC-users] Moving backuppc to a new machine

2012-04-23 Thread Tim Fletcher
On Fri, 2012-04-20 at 12:41 -0500, Kameleon wrote:
 Currently our backuppc server is a Xen pv domU running Ubuntu 10.04.
 It has served us well over the past two years. However it is time to
 move it out of the virtual environment and back onto physical
 hardware. This is only so that it can be located on the far edge of
 our campus as far away from the physical servers it backs up as
 possible while still keeping it on the fiber network. So with that we
 are looking to install a fresh OS on the new hardware. We could stay
 with Ubuntu and just load 12.04. Most of our other servers are Centos
 or Fedora. Is there one distribution that is better than the other for
 backuppc? I will be moving the /var/lib/backuppc pool (it is on it's
 own lv) to the new machine. Should I expect any problems with this?

I have done something similar and migrated from an Ubuntu 10.04 machine
to a Fedora 15 (now 16) machine with a block-level copy over ssh.

The steps were thus:

1. Install Fedora or distro of choice
2. Create a BackupPC LV that is the size you want; it must be bigger than
the current install
3. From the new machine run: ssh -l root -c arcfour oldmachine dd
if=/dev/backuppc/volume bs=1M | dd of=/dev/backuppc/newvolume bs=1M
4. Wait
5. Resize the filesystem to the correct size with:
resize2fs /dev/backuppc/newvolume
6. Set up the new backuppc install; iirc you need to move a few config
files about and check pool mount points

-- 
Tim Fletcher t...@night-shade.org.uk




Re: [BackupPC-users] Restoring complete virtualized Windows-Servers / Saving MBR

2012-04-16 Thread Tim Fletcher
On 16/04/12 11:50, Andreas Piening wrote:
 Hi,

 thank you for your response.

 At the moment I don't use disk images. Instead I use LVM volumes which are 
 directly connected to my KVM-machines.
 There is a way like creating images of the LVM volumes with a image tool like 
 partimage. These images would be compressed like 50GB in combined size, no 
 problem to copy it to an external usb drive but much data to transfer over a 
 VDSL50 internet connection.
 The point is that the customer wants me to backup the whole system 
 (real-server including 2 VMs) over the network to a different location. So 
 when the server gets unusable damaged for instance by beeing flooded with 
 water because of a pipe-break, or the system gets stolen, I should be able to 
 buy new hardware and get back to the state one day before the disaster 
 occurred.

 I like the efficient way of file based backups backupPC uses, so I ask for 
 experiences on that. But maybe I should search for a partition image tool 
 that supports incremental backups. I only know of Acronis True Image but this 
 is a commercial (and not cheap) way.

If you rip the LVM disk images to a file, backuppc will just back up the 
changes, as that's how rsync works; however, this requires you to have 
storage in a working location to image the disk to.

Your other option, if you are looking for disaster recovery, is something 
like DRBD and a live snapshot of the VMs to a remote location. This is 
not backup, as the remote image is updated live (make a change and it's 
replicated to the remote location very quickly), but it is a DR option.
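
A sketch of the image-then-rsync route (volume and path names are
placeholders):

lvcreate --snapshot --size 5G --name vm1-snap /dev/vg0/vm1
dd if=/dev/vg0/vm1-snap of=/var/backups/images/vm1.raw bs=1M
lvremove -f /dev/vg0/vm1-snap
rsync -av --inplace /var/backups/images/ remotehost:/backups/images/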

-- 
Tim Fletcher t...@night-shade.org.uk




Re: [BackupPC-users] How to move to a new remove server?

2012-04-10 Thread Tim Fletcher

On 08/04/12 00:31, Alexander, Clint Mr ARMY GUEST USA USAMC USA wrote:


Hello everyone.

I'm joining the list for the first time in attempts to figure out a 
problem I cannot find a solution for no matter what keywords I have 
searched for...


I have 2 servers at a hosting company; an old one having a long and 
large pool of BackupPC data on it. The new one is replacing the old one.


The hard drive cannot be removed from the old server and added to the 
new one. It simply  needs to be copied from one location to the other 
via the internal GB network (no backup tapes or CD's or anything else, 
strictly network copy only).


My question is, How?




On the old server:

dd if=/dev/oldbackuppcfs bs=1M | ssh -c arcfour -l root newserver dd 
of=/dev/newbackuppcfs


This is the best way to move a backuppc filesystem by just copying the 
raw filesystem about.


--
Tim Fletcher t...@night-shade.org.uk



Re: [BackupPC-users] $hostIP does not seem to be taken into account

2012-04-10 Thread Tim Fletcher
On 04/04/12 16:46, Moritz Lennert wrote:
 Les Mikeselllesmikesellat  gmail.com  writes:
 An even better solution would be to arrange for dhcp to give a fixed
 IP address to that NIC or to dynamically update DNS so you can use dns
 instead of nmblookup.

 Yeah, that would be great, but that's not in my hands to decide and the answer
 is no. I'll have to look into the possibility of creating a local dns 
 server...


It might be worth looking into mDNS aka Bonjour aka Avahi depending on 
your flavor of OS

-- 
Tim Fletcher t...@night-shade.org.uk




Re: [BackupPC-users] Issues with DDNS and DHCP

2012-03-28 Thread Tim Fletcher
On Wed, 2012-03-28 at 09:44 -0600, Michael Coffman wrote:
 Hello,
 
 We have been using backuppc with great success until recently.   
 
 Our company has been mucking around with DHCP and DDNS configuration
 to support users migrating from a remote login to wireless to
 hardwired (I have no control or authority over this
 environment).   
 
 The end result is that backuppc for very mobile users seems to be
 perpetually broken.   As their laptop moves around it gets multiple
 IPs associated with it and the forward-reverse lookups don't always
 work.We have even had the case of a users laptop getting the IP
 that had previously belong to another system and the IP had not been
 flushed from DDNS cleanly - end result was laptop A backed up into the
 location form laptop B.Has anyone else had to deal with this?
 Any creative ideas for preventing this previous issue or for dealing
 with ever changing system names?I believe that the right answer is
 to fix our DHCP/DNS/DDNS environment, but I don't think that's going 
 to happen.   I am about ready to bail on backuppc for our highly
 mobile users :(Not sure what I expect from the list, just kind of
 grasping at straws at this point.

Can you use mDNS aka Bonjour aka Avahi depending on the flavor of the
implementation?

If you can ping machinename.local then you might have a chance to make
it work as the machine (normally) is the one that is replying to the
mDNS request.

-- 
Tim Fletcher t...@night-shade.org.uk




Re: [BackupPC-users] Issues with DDNS and DHCP

2012-03-28 Thread Tim Fletcher
On 28 Mar 2012, at 17:49, Tyler J. Wagner ty...@tolaris.com wrote:

 On 2012-03-28 17:26, Tim Fletcher wrote:
 Can you use mDNS aka Bonjour aka Avahi depending on the flavor of the
 implementation?
 
 If you can ping machinename.local then you might have a chance to make
 it work as the machine (normally) is the one that is replying to the
 mDNS request.
 
 mDNS works great for us. However, we also use normal DHCP+DNS integration
 for hosts outside the backup server's LAN (where mDNS doesn't reach).

Would an avahi reflector help you reach those remote hosts?

--

Sent from a mobile device
 




Re: [BackupPC-users] Issues with DDNS and DHCP

2012-03-28 Thread Tim Fletcher
On 28 Mar 2012, at 18:01, Tyler J. Wagner ty...@tolaris.com wrote:

 On 2012-03-28 17:58, Tim Fletcher wrote:
 mDNS works great for us. However, we also use normal DHCP+DNS integration
 for hosts outside the backup server's LAN (where mDNS doesn't reach).
 
 Would an avahi reflector help you reach those remote hosts?
 
 I hadn't even considered that. That might be easier than DHCP+DNS with some
 of our networks (f%*$ing Mikrotik broken DNS implementation).

I've blogged briefly about getting iTunes wireless sync across subnets with 
avahi acting as a reflector.

http://blog.night-shade.org.uk/

--

Sent from a mobile device

Tim Fletcher


Re: [BackupPC-users] Issues with DDNS and DHCP

2012-03-28 Thread Tim Fletcher

On 28/03/12 18:34, Michael Coffman wrote:


If you can ping machinename.local then you might have a chance
to make
it work as the machine (normally) is the one that is replying to the
mDNS request.


Not usually.  When they get bollixed up, DNS does not always have an 
IP for the original name..   I have 26 systems this morning that dig 
does not return an IP on :(


DNS and mDNS are different: mDNS normally relies on the host being 
searched for responding, not a server. There are server-based ways of 
doing it too, but it's really designed for small/medium LANs, so that 
things just work.


--
Tim Fletcher t...@night-shade.org.uk



Re: [BackupPC-users] howto cancel incremental backups

2012-03-19 Thread Tim Fletcher
On Mon, 2012-03-19 at 09:59 +0100, deconya wrote:
 Hi
 
 Im searching how to prepare a configuration to do only full backups,
 because I don't need incremental. I don't view any option to cancel
 this option, it's possible to do this? 

Reduce the full interval below the incremental interval and fulls will
take priority over incrementals.
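
A sketch of that inversion in config.pl (the values are illustrative):

$Conf{FullPeriod} = 0.97;    # a full becomes due roughly every day
$Conf{IncrPeriod} = 6.97;    # incrementals only fall due after a week,
                             # so the full always wins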

-- 
Tim Fletcher t...@night-shade.org.uk




Re: [BackupPC-users] problems with rsync in backuppc

2012-03-15 Thread Tim Fletcher
On Thu, 2012-03-15 at 10:35 +0100, deconya wrote:
 Hi
 
 I was commeting in other thread my problems using rsync for the
 message Unable to read 4 bytes. I was checking log messages and
 checking config. And at now my problem is other, appears 
 
 full backup started for directory /etc/
 Running: /usr/bin/ssh -q -x -l root 192.168.0.254 /usr/bin/rsync --server 
 --sender --numeric-ids --perms --owner --group -D --links --hard-links 
 --times --block-size=2048 --recursive --ignore-times . /etc/
 Xfer PIDs are now 7416
 Got remote protocol 1752392034
 Fatal error (bad version): bash: /usr/bin/rsync: No existe el fichero o el 
 directorio
 
 Read EOF: Conexión reinicializada por la máquina remota
 Tried again: got 0 bytes
 fileListReceive() failed
 Done: 0 files, 0 bytes
 Got fatal error during xfer (fileListReceive failed)
 Backup aborted (fileListReceive failed)
 
 --
 
 If someone understand the message will be agree with any suggestion. 
 
The remote end doesn't have rsync installed: bash on 192.168.0.254 is
reporting that /usr/bin/rsync does not exist (the Spanish messages read
"No such file or directory" and "Connection reset by remote host").
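
The fix, assuming a Debian-ish client as the bash error suggests:

apt-get install rsync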

-- 
Tim Fletcher t...@night-shade.org.uk




Re: [BackupPC-users] Ssh/Rsync transfer fails intermittently

2012-03-03 Thread Tim Fletcher
On Sun, 2012-03-04 at 01:40 +0200, Chris Mavrakis wrote:
 BackupPC works good on my LAN but has problems backing-up a remote
 server of mine. I'm using rsync over ssh and have increased max ping
 to 500 because the remote server is far away. 
 
 
 Almost always the transfer is ending after 1h01'or 1h02'. These values
 in seconds are 3660 and 3720. Maybe they appear in some option...? 
 
 If I manually login to this remote ssh server, and leave my terminal
 untouched for 1h26 I get disconnected. Looks like this is the reason
 BackupPC gets disconnected, too. Is there any way to simulate the
 execution of commands inside BackupPC's session?

Firewall timeout somewhere?

Could be a login time limit?
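
If it is a quiet-connection timeout, ssh keepalives are worth a try (a
sketch, not from the thread; it goes in the backuppc user's ~/.ssh/config):

Host remote-server
    ServerAliveInterval 60
    ServerAliveCountMax 5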

-- 
Tim Fletcher t...@night-shade.org.uk




Re: [BackupPC-users] Migrating backuppc (yes, that again...:)

2012-02-24 Thread Tim Fletcher
On Fri, 2012-02-24 at 14:57 -0500, Brad Alexander wrote:
 Hey all,
 
 I'm running into a problem migrating my /var/lib/backuppc pc
 directory. I got cpool, log, pool, tmp, and trash migrated via rsync,
 and I am attempting to migrate the pc directory.

It's seriously much easier to copy the raw filesystem, with dd and ssh
or netcat.

something like:

dd if=/dev/filesystem bs=1M | ssh -C farragut dd of=/dev/newfilesystem

If you are on the same network it might well be faster to use netcat.
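
The netcat variant looks something like this (a sketch; the port, host
and device names are placeholders, and flags vary between netcat
flavours):

# on the receiving machine
nc -l -p 1234 | dd of=/dev/newfilesystem bs=1M
# on the sending machine
dd if=/dev/filesystem bs=1M | nc farragut 1234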

-- 
Tim Fletcher t...@night-shade.org.uk




Re: [BackupPC-users] The famous Backup aborted (Unable to read 4 bytes) error

2012-02-21 Thread Tim Fletcher
On Tue, 2012-02-21 at 10:02 -0600, Les Mikesell wrote:
 On Tue, Feb 21, 2012 at 9:55 AM, Micha Kersloot mi...@kovoks.nl
 wrote:
  
 I've added this to the RsyncArgs hopefully it will make things
 clear.
 
 Is there anything that could be timing-related that could cause the
 machine or network to be unavailable at the scheduled time (machines
 going to sleep, etc.)?   I'd expect that to result in a ping failure
 instead of starting at all, but it might be worth checking.

I was also thinking about dns oddities, have you
changed /etc/resolv.conf recently?

-- 
Tim Fletcher t...@night-shade.org.uk




Re: [BackupPC-users] I've Tried Everything

2012-02-16 Thread Tim Fletcher
On Thu, 2012-02-16 at 09:59 -0500, Zach Lanich wrote:
 After using the # cat /var/lib/backuppc/.ssh/Config
 StrictHostKeyChecking on, im now getting:
 
 Running: /usr/bin/ssh -q -x -n -l root zachs-macbook-pro env LC_ALL=C 
 /usr/bin/tar -c -v -f - -C /Users/zlanich/Sites --totals .
 full backup started for directory /Users/zlanich/Sites
 Xfer PIDs are now 29485,29484
 Tar exited with error 65280 () status
 tarExtract: Done: 0 errors, 0 filesExist, 0 sizeExist, 0 sizeExistComp, 0 
 filesTotal, 0 sizeTotal
 Got fatal error during xfer (Tar exited with error 65280 () status)
 Backup aborted (lost network connection during backup)
 Not saving this as a partial backup since it has fewer files than the prior 
 one (got 0 and 0 files versus 35249)
 
 Using Tar. I have a Quick network with Gigabit and Wireless N. I'm
 dumfounded as to how it could be having network issues midbackup

OK, so a quick google of "Tar exited with error 65280" strongly suggests
it's a login problem, so I have a little test for you:

I'm not clear what version of Linux you are using on the BackupPC
server, but this should work on most of them. Pull up a terminal
window and type the following:

sudo -s -H -u backuppc 

This opens a shell as the backuppc user. Next, run this command:

/usr/bin/ssh -l root zachs-macbook-pro

You should see a new shell open up on your MacBook Pro without it
asking any questions or for any passwords. However, I don't think you
will, for one of three reasons:

1. You haven't enabled the root user on the mac, as it's disabled by
default.

2. You haven't put the right name in. If you are using Bonjour (aka
mDNS) then you need to give the name as zachs-macbook-pro.local;
the .local is important.

3. You haven't got your ssh keys setup correctly.
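If it turns out to be reason 3, a minimal key setup from that backuppc
shell looks something like this (no passphrase on the key, and
ssh-copy-id needs password logins working once):

ssh-keygen -t rsa -N '' -f ~/.ssh/id_rsa
ssh-copy-id root@zachs-macbook-pro.local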

-- 
Tim Fletcher t...@night-shade.org.uk


--
Virtualization & Cloud Management Using Capacity Planning
Cloud computing makes use of virtualization - but cloud computing 
also focuses on allowing computing to be delivered as a service.
http://www.accelacomm.com/jaw/sfnl/114/51521223/
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] I've Tried Everything

2012-02-16 Thread Tim Fletcher
On Thu, 2012-02-16 at 13:22 -0500, Zach Lanich wrote:
 also, because of small dns issues, i just added 192.168.1.2
 zachs-macbook-pro to the etc/hosts file and that seems to work fine
 for resolving my reserved ip

So now that you have sorted out this problem what happens if you try the
backup again?

-- 
Tim Fletcher t...@night-shade.org.uk


--
Virtualization & Cloud Management Using Capacity Planning
Cloud computing makes use of virtualization - but cloud computing 
also focuses on allowing computing to be delivered as a service.
http://www.accelacomm.com/jaw/sfnl/114/51521223/
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] which one better ssh+rsync or rsyncd ?

2012-02-15 Thread Tim Fletcher
On Wed, 2012-02-15 at 14:15 +0530, Anand Gupta wrote:
 Hi,
 
 In ssh + rsync, won't you still be running ssh service ?

Yes, you will; almost every modern Unix system has some sort of ssh
service running by default.

-- 
Tim Fletcher t...@night-shade.org.uk


--
Virtualization & Cloud Management Using Capacity Planning
Cloud computing makes use of virtualization - but cloud computing 
also focuses on allowing computing to be delivered as a service.
http://www.accelacomm.com/jaw/sfnl/114/51521223/
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] Network Backup of backuppc

2012-02-14 Thread Tim Fletcher
On Tue, 2012-02-14 at 08:38 +0700, hans...@gmail.com wrote:
 Either run a second BPC instance over the WAN directly to the target
 hosts, or send compressed tar snapshots, whichever is more appropriate
 for your combination of bandwidth, volume of data, backup time window,
 number of target hosts, degree of duplicated data etc 

Or option 3, which is to produce local snapshots of the backups and
then sync them over the network, giving you local backup and archive
plus an offsite backup.


-- 
Tim Fletcher t...@night-shade.org.uk


--
Keep Your Developer Skills Current with LearnDevNow!
The most comprehensive online learning library for Microsoft developers
is just $99.99! Visual Studio, SharePoint, SQL - plus HTML5, CSS3, MVC3,
Metro Style Apps, more. Free future releases when you subscribe now!
http://p.sf.net/sfu/learndevnow-d2d
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] Backuppc is Hammering My Clients

2012-02-02 Thread Tim Fletcher
On Wed, 2012-02-01 at 20:24 -0500, Jeffrey J. Kosowsky wrote:
 Les Mikesell wrote at about 07:40:20 -0600 on Wednesday, February 1, 2012:
   On Wed, Feb 1, 2012 at 2:30 AM, Kimball Larsen quang...@gmail.com wrote:
   
 Do any
have local time machine backups that might be included?
   
No, time machine is on external drives, specifically excluded from 
 backups.
   
   It might be worth checking that the excludes work and the links that
   make it show on the desktop aren't being followed.
   
Or
directories with very large numbers of files?
   
This I can check on.  What is considered very large numbers of files?  
 More than 1024?  More than 102400?
   
   It would be relative to the amount of RAM available - probably millions.
 
 I have no trouble backing up half a million files on a system with
 just 512MB. Surely, if you have high-powered relatively new PCs you
 will have many 4+ Gigabytes so it is unlikely that RAM swapping will
 be the problem. Plus any 3.0 version of rsync uses RAM quite
 efficiently.

I find that the speed of the underlying storage is more of a factor
than the RAM or CPU limits.

-- 
Tim Fletcher t...@night-shade.org.uk


--
Keep Your Developer Skills Current with LearnDevNow!
The most comprehensive online learning library for Microsoft developers
is just $99.99! Visual Studio, SharePoint, SQL - plus HTML5, CSS3, MVC3,
Metro Style Apps, more. Free future releases when you subscribe now!
http://p.sf.net/sfu/learndevnow-d2d
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] Backuppc is Hammering My Clients

2012-02-01 Thread Tim Fletcher
On Tue, 2012-01-31 at 19:47 -0500, Steve wrote:
  On Tue, Jan 31, 2012 at 5:25 PM, Kimball Larsen quang...@gmail.com wrote:
  Is there anything I can to to have the backups run in a more transparent 
  manner?  We are not all that concerned with speed of backup process - 
  we're all here all day anyway, so as long as everyone gets a backup at 
  least once a day we're happy.
 
 check the nice level during the backup (on the client) and see where
 rsync is running.  If it's the same as the other user processes, maybe
 change the $Conf{RsyncClientCmd} to include a nice level?  I know that
 is suggested here:
 http://backuppc.sourceforge.net/faq/ssh.html

If you add the flags -c arcfour to the ssh command line, that will use
the arcfour cipher, which is less secure but much less CPU intensive; I
use it on CPU-bound clients such as my iPhone.
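As a sketch, assuming the stock rsync-over-ssh client command (adjust
to match whatever your $Conf{RsyncClientCmd} already is):

$Conf{RsyncClientCmd} = '$sshPath -q -x -c arcfour -l root $host $rsyncPath $argList+';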

-- 
Tim Fletcher t...@night-shade.org.uk


--
Keep Your Developer Skills Current with LearnDevNow!
The most comprehensive online learning library for Microsoft developers
is just $99.99! Visual Studio, SharePoint, SQL - plus HTML5, CSS3, MVC3,
Metro Style Apps, more. Free future releases when you subscribe now!
http://p.sf.net/sfu/learndevnow-d2d
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] Bizarre connection failures

2012-01-31 Thread Tim Fletcher
On Tue, 2012-01-31 at 20:25 +0300, haruspex wrote:
 Hello.
 
 Name resolution, via nmblookup. IP addresses are assigned via DHCP, so
 subject to change. So you are saying that connection would have
 established fine, but it tries to establish it with a wrong place?..
 An interesting idea. Now, how can the result of the BackupPC name
 resolution can be checked?..

Run nmblookup on the server.

-- 
Tim Fletcher t...@night-shade.org.uk


--
Keep Your Developer Skills Current with LearnDevNow!
The most comprehensive online learning library for Microsoft developers
is just $99.99! Visual Studio, SharePoint, SQL - plus HTML5, CSS3, MVC3,
Metro Style Apps, more. Free future releases when you subscribe now!
http://p.sf.net/sfu/learndevnow-d2d
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] backuppc incremental taking up a lot of bandwidth with no additional changes

2012-01-20 Thread Tim Fletcher
On Fri, 2012-01-20 at 16:11 -0800, smallpox wrote:
 
 On 1/20/2012 11:49 AM, Jeffrey J. Kosowsky wrote:

  Second, what makes you think the issue is a bandwidth issue? You just
  said that it takes 10 minutes. Have you determined that the time is
  due to bandwidth bottlenecks and not just disk reads and rsync
  computations?
 
 I see mrtg's graph, it's doing about 5mbit for 10 minutes.

Could the problem be caused by the granularity of timestamps[1] on
Windows filesystems?

Try adding --modify-window=1 to the rsync command line.

[1]ftp://pserver.samba.org/pub/unpacked/rsyncweb/daylight-savings.html

-- 
Tim Fletcher t...@night-shade.org.uk


--
Try before you buy = See our experts in action!
The most comprehensive online learning library for Microsoft developers
is just $99.99! Visual Studio, SharePoint, SQL - plus HTML5, CSS3, MVC3,
Metro Style Apps, more. Free future releases when you subscribe now!
http://p.sf.net/sfu/learndevnow-dev2
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] trying to improve the speed at which backuppc rsync back up processes a large binary file in incremental backups

2012-01-13 Thread Tim Fletcher
On Fri, 2012-01-13 at 14:51 +1100, Adam Goryachev wrote:

 Definitely, I find that transferring (for example) uncompressed
 mysqldump files will transfer less bytes than transferring the
 compressed files. (Obviously because the compressed file will transfer
 100%, while the uncompressed will only transfer the changes).

http://beeznest.wordpress.com/2005/02/03/rsyncable-gzip/
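In practice that means compressing with gzip's --rsyncable option
(present in the Debian/Ubuntu gzip builds, but not in every gzip), for
example with a made-up database name:

mysqldump mydb | gzip --rsyncable > /backups/mydb.sql.gz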
-- 
Tim Fletcher t...@night-shade.org.uk


--
RSA(R) Conference 2012
Feb 27 - Mar 2
Save $400 by Jan. 27
Register now!
http://p.sf.net/sfu/rsa-sfdev2dev2
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] Problem with status graphs

2011-12-28 Thread Tim Fletcher
On Wed, 2011-12-28 at 10:28 -0300, Carlos Albornoz wrote:
 Hi, i have a backuppc and backups has working fine, but the graphs on
 status page dont work?
 
 Any idea why dont work? i dont see any log about this. The
 installation is by default.

The graphs are part of the Debian package, so they only work on Debian
and derivatives such as Ubuntu.

-- 
Tim Fletcher t...@night-shade.org.uk


--
Write once. Port to many.
Get the SDK and tools to simplify cross-platform app development. Create 
new or port existing apps to sell to consumers worldwide. Explore the 
Intel AppUpSM program developer opportunity. appdeveloper.intel.com/join
http://p.sf.net/sfu/intel-appdev
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] Problem with status graphs

2011-12-28 Thread Tim Fletcher
On Wed, 2011-12-28 at 10:57 -0300, Carlos Albornoz wrote:

 Justly is installed on Ubuntu Server 10.04 LTS
 
 I need any special package?

Just rrdtool, I think. The graphs are generated by BackupPC_nightly, so
they won't appear straight away.

-- 
Tim Fletcher t...@night-shade.org.uk


--
Write once. Port to many.
Get the SDK and tools to simplify cross-platform app development. Create 
new or port existing apps to sell to consumers worldwide. Explore the 
Intel AppUpSM program developer opportunity. appdeveloper.intel.com/join
http://p.sf.net/sfu/intel-appdev
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] URL to restore a particular file from the latest backup

2011-12-19 Thread Tim Fletcher
On Mon, 2011-12-19 at 12:13 +0100, Dmitry Katsubo wrote:
 Dear BackupPC users!
 
 I wonder if it is possible to construct URL for file restore, that will
 always refer the latest backup (in particular by skipping num
 parameter or giving it a special value)?
 
 For example, currently the file invoice.pdf can be restored from
 backup 204 as follows:

 It looks like currently num parameter is obligatory...

Does num=-1 work? -1 is shorthand for the latest backup in some of the
command-line tools.

-- 
Tim Fletcher t...@night-shade.org.uk


--
Learn Windows Azure Live!  Tuesday, Dec 13, 2011
Microsoft is holding a special Learn Windows Azure training event for 
developers. It will provide a great way to learn Windows Azure and what it 
provides. You can attend the event by watching it streamed LIVE online.  
Learn more at http://p.sf.net/sfu/ms-windowsazure
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] 8.030.000, Too much files to backup ?

2011-12-19 Thread Tim Fletcher
On Mon, 2011-12-19 at 12:32 -0600, Les Mikesell wrote:
 On Mon, Dec 19, 2011 at 12:04 PM, Jean Spirat jean.spi...@squirk.org wrote:
 
  I directly mount the nfs share on the backuppc server so no need for
  rsyncd here this is like local backup with the NFS overhead of course.
 
 The whole point of rsync is that it can read the files locally with
 block checksums to decide what it really has to copy over the network.
  Doing it over NFS, you've already had to copy if over the network so
 rsync at the wrong end can read it (and decide that it didn't have
 to...).

I think the real problem is the metadata access; after a bit of
digging I've dug up the extract below, which compares iSCSI with NFS.

But what might help is tweaking the NFS settings to improve the
metadata caching; there's a sketch of that after the quote.

6.2 Meta-data intensive applications
NFS and iSCSI show their greatest differences in their handling of
meta-data intensive applications. Overall, we find that iSCSI
outperforms NFS for meta-data intensive workloads, i.e. workloads
where the network traffic is dominated by meta-data accesses.
The better performance of iSCSI can be attributed to two factors.
First, NFS requires clients to update meta-data synchronously to the
server. In contrast, iSCSI, when used in conjunction with modern file
systems, updates meta-data asynchronously. An additional benefit of
asynchronous meta-data updates is that it enables update aggregation:
multiple meta-data updates to the same cached block are aggregated
into a single network write, yielding significant savings. Such
optimizations are not possible in NFS v2 or v3 due to their
synchronous meta-data update requirement.
Second, iSCSI also benefits from aggressive meta-data caching by the
file system. Since iSCSI reads are in granularity of disk blocks, the
file system reads and caches entire blocks containing meta-data;
applications with meta-data locality benefit from such caching.
Although the NFS client can also cache meta-data, NFS clients need to
perform periodic consistency checks with the server to provide weak
consistency guarantees across client machines that share the same NFS
namespace. Since the concept of sharing does not exist in the SCSI
architectural model, the iSCSI protocol also does not pay the
overhead of such a consistency protocol.

Full details are here: http://lass.cs.umass.edu/papers/pdf/FAST04.pdf
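On that practical point, and purely as a sketch for a Linux NFS client
(the export and mount point are placeholders, and looser attribute
caching means the backup sees staler metadata, so test carefully), the
mount options to experiment with look like:

mount -t nfs -o ro,noatime,actimeo=60,nocto server:/export /mnt/websrv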

-- 
Tim Fletcher t...@night-shade.org.uk


--
Write once. Port to many.
Get the SDK and tools to simplify cross-platform app development. Create 
new or port existing apps to sell to consumers worldwide. Explore the 
Intel AppUpSM program developer opportunity. appdeveloper.intel.com/join
http://p.sf.net/sfu/intel-appdev
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] Scary problem with USB3...

2011-12-16 Thread Tim Fletcher
On Thu, 2011-12-15 at 16:38 -0500, Zach La Celle wrote:

 Regarding some other responses, I'll be sure to try eSATA next time
 instead of USB to see if that tends to be more stable.  I could also use
 internal drives, I suppose...the real reason we're using external disks
 is so that we can replace them and store them off-site every once in a
 while.  Using a pre-packaged drive was simply more convenient.

I use one of these external USB/eSATA-to-SATA docks; they are about £20
in the UK and let you hot swap a SATA drive a bit like a tape.

http://www.amazon.co.uk/Startech-Esata-Sata-Dock-2-5-3-5/dp/B0026B8VR0/



-- 
Tim Fletcher t...@night-shade.org.uk


--
Learn Windows Azure Live!  Tuesday, Dec 13, 2011
Microsoft is holding a special Learn Windows Azure training event for 
developers. It will provide a great way to learn Windows Azure and what it 
provides. You can attend the event by watching it streamed LIVE online.  
Learn more at http://p.sf.net/sfu/ms-windowsazure
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] 8.030.000, Too much files to backup ?

2011-12-16 Thread Tim Fletcher
On Fri, 2011-12-16 at 10:42 +0100, Jean Spirat wrote:
 hi,
 
   I use backuppc to save a webserver. The issue is that the application 
 used on it is making thousand of little files used for a game to create 
 maps and various things. The issue is that we are now at 100GB of data 
 and 8.030.000 files so the backups takes 48H and more (to help the files 
 are on NFS share). I think i come to the point where file backup is at 
 it's limit.

 ps: backuppc server and the web server are debian linux,  i use rysnc 
 method and backup  the NFS that i mount localy on the backuppc server.

I have a backup with a similar number of files in it, and I have found
that tar is much better than rsync. Your issues are:

1. rsync will take a very long time and a very large amount of memory to
build the file tree, especially over NFS

2. NFS isn't really a high-performance filesystem; you are better off
working locally, via ssh, on the server being backed up.

I would suggest you try the following: 

Move to tar over ssh on the remote webserver; the first full backup
might well take a long time, but the following ones should be faster.

tar+ssh backups do use more bandwidth, but as you are already using
NFS I am assuming you are on a local network of some sort.
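If you want to try it, a sketch of the per-host settings, based on the
stock BackupPC 3.x tar-over-ssh command (check it against the defaults
in your own config.pl before trusting it):

$Conf{XferMethod} = 'tar';
$Conf{TarClientCmd} = '$sshPath -q -x -n -l root $host env LC_ALL=C $tarPath -c -v -f - -C $shareName+ --totals';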

-- 
Tim Fletcher t...@night-shade.org.uk


--
Learn Windows Azure Live!  Tuesday, Dec 13, 2011
Microsoft is holding a special Learn Windows Azure training event for 
developers. It will provide a great way to learn Windows Azure and what it 
provides. You can attend the event by watching it streamed LIVE online.  
Learn more at http://p.sf.net/sfu/ms-windowsazure
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] 8.030.000, Too much files to backup ?

2011-12-16 Thread Tim Fletcher
On Fri, 2011-12-16 at 11:49 +0100, Jean Spirat wrote:

  I would suggest you try the following:

  tar+ssh backups however use more bandwidth but as you are already using
  nfs I am assuming you are on a local network of some sort.

 for my understanding  rsync had allways seems to be the most efficient 
 of the two but i never challenged this fact ;p
 
   i will have a look at tar and see if i can work with it .

http://www.mail-archive.com/backuppc-users@lists.sourceforge.net/msg15217.html

That covers the pros and cons of tar and rsync in far more detail than
I can offer.

-- 
Tim Fletcher t...@night-shade.org.uk


--
Learn Windows Azure Live!  Tuesday, Dec 13, 2011
Microsoft is holding a special Learn Windows Azure training event for 
developers. It will provide a great way to learn Windows Azure and what it 
provides. You can attend the event by watching it streamed LIVE online.  
Learn more at http://p.sf.net/sfu/ms-windowsazure
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] 8.030.000, Too much files to backup ?

2011-12-16 Thread Tim Fletcher
On Fri, 2011-12-16 at 07:33 -0600, Les Mikesell wrote:
 On Fri, Dec 16, 2011 at 4:49 AM, Jean Spirat jean.spi...@squirk.org wrote:

  for my understanding  rsync had allways seems to be the most efficient
  of the two but i never challenged this fact ;p
 
 Rsync working natively is very efficient, but think about what it has
 to do in your case.   It will have to read the entire file across nfs
 just so rsync can compere contents and decide not to copy the content
 that already exists in your  backup.
 
   i will have a look at tar and see if i can work with it .
 
 I'd try rsync over ssh first, at least if most of the files do not
 change between runs.   If you don't have enough ram to hold the
 directory listing or if there are changes to a large number of files
 per run, tar might be faster.

The real issue with rsync is the memory usage for the 8 million entries
in the file list. This is because the first thing rsync does is walk
the tree, comparing against the already backed-up files to see if the
date stamp has changed. This puts memory and disk load on both the
backup server and the backed-up client. The approach that tar uses is
just to walk the directory tree and transfer everything newer than a
timestamp that BackupPC passes to it.
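For reference, that timestamp handover is just an extra argument
BackupPC substitutes into the tar command for incrementals; the stock
3.x setting is along these lines (check your own config.pl):

$Conf{TarIncrArgs} = '--newer=$incrDate+';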

This costs some extra network bandwidth but massively reduces the disk
and memory bandwidth needed on both the backuppc client and server.

The server that I am backing up with ~7 million files takes on the order
of 6000 minutes to backup with rsync, the bulk of that time is taken up
by rsync building the tree of files to transfer. The same server takes
about 2500 minutes with tar because of the simpler way of finding files.

Overall rsync makes better backups because it finds moved and deleted
files and is far, far more efficient with network bandwidth, but if you
understand the drawbacks and need the filesystem efficiency of tar then
it is still an excellent backup tool.

-- 
Tim Fletcher t...@night-shade.org.uk


--
Learn Windows Azure Live!  Tuesday, Dec 13, 2011
Microsoft is holding a special Learn Windows Azure training event for 
developers. It will provide a great way to learn Windows Azure and what it 
provides. You can attend the event by watching it streamed LIVE online.  
Learn more at http://p.sf.net/sfu/ms-windowsazure
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] Disk Full?

2011-12-12 Thread Tim Fletcher
On Mon, 2011-12-12 at 08:25 +0100, Christian Völker wrote:
 Hi,
 
 I got an email from my backuppc host this night:
 --
 Yesterday 160 hosts were skipped because the file system containing
 /var/lib/BackupPC/ was too full.  The threshold in the
 configuration file is 97%, while yesterday the file system was
 up to 96% full.
 --
 So I'm wondering about two facts there:
 1. I don't have 160 hosts- where is this number coming from?
 2. is 97% lower than 96%?

The filesystem may have become less full after the backups failed to
start but before the email was sent, as the nightly tidying run will
have taken place in between.

i.e. in time order:

backups (failed) -> nightly tidying run (removes some old files) ->
sending email

-- 
Tim Fletcher t...@night-shade.org.uk


--
Learn Windows Azure Live!  Tuesday, Dec 13, 2011
Microsoft is holding a special Learn Windows Azure training event for 
developers. It will provide a great way to learn Windows Azure and what it 
provides. You can attend the event by watching it streamed LIVE online.  
Learn more at http://p.sf.net/sfu/ms-windowsazure
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] Backing up from BackupPC to BackupPC

2011-12-09 Thread Tim Fletcher
On Fri, 2011-12-09 at 13:15 +, member horvath wrote:
 Hi,
 
 I have a requirement where I need to deploy a backuppc installation on
 a site that will connect to several servers and backup their required
 files.
 I need to keep a daily incremental of 30 daily and 6 monthly backups.
 This part is ok and I have no problem setting up (Except getting the
 schedule right - I find this hard to do with backuppc)
 
 As an offsite backup I'd like my onsite backuppc unit to inform my
 offsite backuppc unit that the backup is complete and then the remote
 needs to pull only the most current backup from the onsite.
 So basically a 30 day/6 month onsite backup with the most current
 backup stored offsite
 Can this be done?

I'd look into the archive functionality of BackupPC and push the
current backup as a tarball to the offsite host, rather than running a
remote BackupPC. As you say you are only looking to hold the current
backup offsite, you can simply transfer the current archive with
scp/rsync/tar over ssh to the offsite host.
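As a rough sketch, with Debian-ish paths and made-up host names
('archive' must be set up as an archive host in BackupPC, and the
output file name depends on your archive settings):

sudo -u backuppc /usr/share/backuppc/bin/BackupPC_archiveStart archive backuppc webserver
rsync -avP /path/to/ArchiveDest/webserver.*.tar.gz offsite:/srv/archives/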

-- 
Tim Fletcher t...@night-shade.org.uk


--
Cloud Services Checklist: Pricing and Packaging Optimization
This white paper is intended to serve as a reference, checklist and point of 
discussion for anyone considering optimizing the pricing and packaging model 
of a cloud services business. Read Now!
http://www.accelacomm.com/jaw/sfnl/114/51491232/
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] weird smörgåsbord of errors

2011-12-09 Thread Tim Fletcher
On Fri, 2011-12-09 at 21:50 +0100, Michel wrote:

  Why would your restore to /raid/home/user be affecting things under
  /var?   In any case, I'd recommend using rsync over ssh to the server
  for the backups instead of backing up the nfs-mounted view.
 
 That is a very good question indeed..
 
 The folder I was trying to restore was a harmless one in /raid that
 should have nothing to do with the rest of the system as far as I
 know... It's just the place I store the /home of a completely
 different system.

Could it be that the I/O of the BackupPC process is triggering a
hardware or software failure in the OS?

Is there anything in dmesg or /var/log?

-- 
Tim Fletcher t...@night-shade.org.uk


--
Cloud Services Checklist: Pricing and Packaging Optimization
This white paper is intended to serve as a reference, checklist and point of 
discussion for anyone considering optimizing the pricing and packaging model 
of a cloud services business. Read Now!
http://www.accelacomm.com/jaw/sfnl/114/51491232/
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] undefined symbol: Perl_Gthr_key_ptr

2011-12-06 Thread Tim Fletcher
On Tue, 2011-12-06 at 07:56 -0600, Richard Shaw wrote:
 On Tue, Dec 6, 2011 at 6:05 AM, Neal Becker ndbeck...@gmail.com wrote:
  Thanks.  I got it fixed.  A mix of f15 and f16 perl packages was to blame.
 
 Good to hear! Fix it up with a good ole' yum distro-sync?

package-cleanup --orphans is also a good tool for this

-- 
Tim Fletcher t...@night-shade.org.uk


--
Cloud Services Checklist: Pricing and Packaging Optimization
This white paper is intended to serve as a reference, checklist and point of 
discussion for anyone considering optimizing the pricing and packaging model 
of a cloud services business. Read Now!
http://www.accelacomm.com/jaw/sfnl/114/51491232/
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] starting backup-PC

2011-12-01 Thread Tim Fletcher
On Thu, 2011-12-01 at 12:21 +, Greer, Jacob - District Tech wrote:
 Thanks for the response I am using Ubuntu server instead.  I read online that 
 backupPC is an install option.

apt-get install backuppc covers most of it :)

-- 
Tim Fletcher t...@night-shade.org.uk


--
All the data continuously generated in your IT infrastructure 
contains a definitive record of customers, application performance, 
security threats, fraudulent activity, and more. Splunk takes this 
data and makes sense of it. IT sense. And common sense.
http://p.sf.net/sfu/splunk-novd2d
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] help with netbios error

2011-11-18 Thread Tim Fletcher
On Fri, 2011-11-18 at 06:51 -0800, AIM Systems wrote:

 I don't get it?! What am I missing?!

A quick google implies it might be related to having two connections to
the network, either 2x ethernet or ethernet+wireless.



http://www.backupcentral.com/phpBB2/two-way-mirrors-of-external-mailing-lists-3/backuppc-21/dum-failed-error-has-mismatching-netbios-name-workgroup-102938/

-- 
Tim Fletcher t...@night-shade.org.uk


--
All the data continuously generated in your IT infrastructure 
contains a definitive record of customers, application performance, 
security threats, fraudulent activity, and more. Splunk takes this 
data and makes sense of it. IT sense. And common sense.
http://p.sf.net/sfu/splunk-novd2d
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] help with netbios error

2011-11-18 Thread Tim Fletcher
On Fri, 2011-11-18 at 11:09 -0800, AIM Systems wrote:
 Thanks for your reply Tim.
 
 I, too, have read this when 'Google'ing the error and have so far been ruled 
 it out.
 Physically, there are no ethernet jacks in the walls of the office, hence the 
 WAPs
 How does one determine if a device has multiple IPs? 
 /bin/nmblookup only returns the one address.

What does ipconfig /all on the windows machine say?

-- 
Tim Fletcher t...@night-shade.org.uk


--
All the data continuously generated in your IT infrastructure 
contains a definitive record of customers, application performance, 
security threats, fraudulent activity, and more. Splunk takes this 
data and makes sense of it. IT sense. And common sense.
http://p.sf.net/sfu/splunk-novd2d
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] Child exited prematurely

2011-11-15 Thread Tim Fletcher
On Tue, 2011-11-15 at 17:36 +0100, Christian Völker wrote:
 Yohoo!
 
 On 15/11/2011 17:19, Les Mikesell wrote:
  2011/11/15 Christian Völker chrisc...@knebb.de:
  Can't open /var/lib/BackupPC/pc/netinstall/934/f%2fsrv%2fftp for MD4
  check (err=-3, .)
  .: md4 doesn't match: will retry in phase 1; file removed
  Parent read EOF from child: fatal error!
  Done: 0 files, 0 bytes
  Got fatal error during xfer (Child exited prematurely)
  Backup aborted (Child exited prematurely)
  

 Any other idea?

Disk or RAID faults? Check dmesg.

-- 
Tim Fletcher t...@night-shade.org.uk


--
RSA(R) Conference 2012
Save $700 by Nov 18
Register now
http://p.sf.net/sfu/rsa-sfdev2dev1
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] backuppc and excluding ip ranges?

2011-11-10 Thread Tim Fletcher
On Thu, 2011-11-10 at 15:56 -0500, SSzretter wrote:

 
 It would be great if a flag could be set to tell backuppc to only
 backup a machine if it is in a specific subnet range (192.168.2.x) and
 to skip it if not.

You can write a ping script that tests which IP the client currently
has, or you can tweak the latency test down a bit, as I am guessing the
T1 connection has higher latency.
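A sketch of the ping-script idea, using your 192.168.2.x example (the
script path and subnet are placeholders; it passes the real ping output
through so BackupPC can still read the latency):

$Conf{PingCmd} = '/usr/local/bin/subnet-ping.sh $host';

#!/bin/sh
# /usr/local/bin/subnet-ping.sh: only report the host alive when it
# currently resolves to an address on the office subnet
host="$1"
ip=$(getent hosts "$host" | awk '{print $1; exit}')
case "$ip" in
    192.168.2.*) exec /bin/ping -c 1 -w 3 "$host" ;;
    *) exit 1 ;;
esac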

-- 
Tim Fletcher t...@night-shade.org.uk


--
RSA(R) Conference 2012
Save $700 by Nov 18
Register now
http://p.sf.net/sfu/rsa-sfdev2dev1
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] Tar exited with error 65280 () status

2011-11-02 Thread Tim Fletcher
On Wed, 2011-11-02 at 10:27 -0500, Les Mikesell wrote:
 On Wed, Nov 2, 2011 at 9:55 AM, Joe Konecny jkone...@rmtohio.com wrote:

  It asks for a password then appears to start dumping to the screen.  Should 
  that be asking for a password?
 
 No, if you are running as the backuppc user and the ssh keys are
 configured correctly it should not be asking for a password.

Try sudo -H -s -u backuppc, which should give you a shell as the
backuppc user, then try the command again. I think it might be
permissions on the ssh key files, but ssh -v might help too.

-- 
Tim Fletcher t...@night-shade.org.uk


--
RSA(R) Conference 2012
Save $700 by Nov 18
Register now!
http://p.sf.net/sfu/rsa-sfdev2dev1
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] Tar exited with error 65280 () status

2011-11-02 Thread Tim Fletcher
On Wed, 2011-11-02 at 11:57 -0400, Joe Konecny wrote:
 debug1: Remote: Ignored authorized keys: bad ownership or modes for
 directory /root

Could this be the issue?

-- 
Tim Fletcher t...@night-shade.org.uk


--
RSA(R) Conference 2012
Save $700 by Nov 18
Register now!
http://p.sf.net/sfu/rsa-sfdev2dev1
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] restore from a read only file system possible?

2011-10-11 Thread Tim Fletcher
On Tue, 2011-10-11 at 19:57 +, John Rouillard wrote:
 On Tue, Oct 11, 2011 at 01:10:30PM +0200, Frank Wolkwitz wrote:
  After raid controller failure the pool file system is running in read 
  only mode.
  Making a file system check would take several days (ext3 fs, 13TB of 16 
  TB used) and success is not garanteed.
  
  So the question is: Is it possible to run backuppc in a read only 
  environment, not to make backups, but to restore files?
 
 Well maybe but why would you. If the filesystem is inconsistent, how
 do you know that the file you are restoring points to the proper data?

That's where checksums come in.

 Have you tried running BackupPC_tarCreate without the daemon
 running to see if you can extract data?

I'm pretty sure you can use BackupPC_tarCreate / BackupPC_zipCreate and
BackupPC_zcat without the services running.
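For example, something along these lines should stream the newest
backup of a share straight out of a read-only pool (run as the backuppc
user; the install path, host, share and file names are placeholders):

sudo -u backuppc /usr/share/backuppc/bin/BackupPC_tarCreate \
    -h myhost -n -1 -s /home /user/docs > /tmp/restore.tar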

-- 
Tim Fletcher t...@night-shade.org.uk


--
All the data continuously generated in your IT infrastructure contains a
definitive record of customers, application performance, security
threats, fraudulent activity and more. Splunk takes this data and makes
sense of it. Business sense. IT sense. Common sense.
http://p.sf.net/sfu/splunk-d2d-oct
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] Bad md5sums due to zero size (uncompressed) cpool files - WEIRD BUG

2011-10-07 Thread Tim Fletcher
On Thu, 2011-10-06 at 17:54 +0200, Holger Parplies wrote:

 To be honest, I would *hope* that only you had these issues and everyone
 else's backups are fine, i.e. that your hardware and not the BackupPC software
 was the trigger (though it would probably need some sort of software bug to
 come up with the exact symptoms).

So far my scan of one of my systems has finished and has given:

758080 files in 4096 directories checked, 0 had wrong digests, of these
0 zero-length.

Another system is currently up to a/f/d of a full scan and has found
the following errors:

tim@carbon:~$ grep -v ok /tmp/pool-check.txt ; tail -n 1 /tmp/pool-check.txt
[335403] 1ef2238fe0d1e5ffb7abe1696a32ae91 (   384) != 5c6bec8866797c63a7fbdc92f8678c1f
[397563] 2429be9ee43ac9c7d0cb8f0f4f759cd8 (   364) != 12351030008ccf626abd83090c0e5efa
[761269] 452017085ec5f0a21b272dac9cbaf51c (  2801) != b4f9ab837e47574f145289faddc38ca2
[1260873] 72ed33567c8fbda29d63ade20f13778d (   364) != 8521efc754784ac13db47545edb22fcd
[1380912] 7d264e0aedb7d6693946594b583643d6 (   270) != c15a891ef0ab8d4842196fcbaf3e6b9f
[1534431] 8a7e659dd6d0a4f45464cc3f55372323 (    58) != de0ac1b424d9a022b4f3415896817ec4
[1598997] 90097fdb369c0152e737c7b88c0f6ff6 (   282) != 16d682e1c10bee80844e6966eaabbbcf
[1873732] a9d171972fca12bf03c082b7fba542d1 (   364) != ee42dab9abc3486325a674779beaabcc
[1940164] afd73ec3463ea92cfd509ead19f938f5 (  5102) ok

Once the scan has finished I'll do a bit more digging into when the
files were created and which host they were backed up from.

The hardware in both cases is the same, an HP MicroServer with a RAID5
disk set.

The one without errors is running Fedora 15 32bit, but with a pool that
was moved from Ubuntu 10.04 32bit a few months ago; the pool dates
from 20/09/2010.

The one with errors has always been on current Ubuntu 32bit; the pool
dates back to 08/01/2010.

-- 
Tim Fletcher t...@night-shade.org.uk


--
All of the data generated in your IT infrastructure is seriously valuable.
Why? It contains a definitive record of application performance, security
threats, fraudulent activity, and more. Splunk takes this data and makes
sense of it. IT sense. And common sense.
http://p.sf.net/sfu/splunk-d2dcopy2
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] Bad md5sums due to zero size (uncompressed) cpool files - WEIRD BUG

2011-10-07 Thread Tim Fletcher
On Fri, 2011-10-07 at 10:21 +0100, Tim Fletcher wrote:

 Another system is currently up to a/f/d of a full scan and has found the 
 following errors

The final answer from the server with the larger and older install of
backuppc is:

2836949 files in 4096 directories checked, 13 had wrong digests, of
these 0 zero-length.

What sort of information would be helpful for follow-up?

-- 
Tim Fletcher t...@night-shade.org.uk


--
All of the data generated in your IT infrastructure is seriously valuable.
Why? It contains a definitive record of application performance, security
threats, fraudulent activity, and more. Splunk takes this data and makes
sense of it. IT sense. And common sense.
http://p.sf.net/sfu/splunk-d2dcopy2
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] Bad md5sums due to zero size (uncompressed) cpool files - WEIRD BUG

2011-10-06 Thread Tim Fletcher
On Wed, 2011-10-05 at 21:35 -0400, Jeffrey J. Kosowsky wrote:

 Finally, remember it's possible that many people are having this
 problem but just don't know it, since the only way one would know
 would be if one actually computed the partial file md5sums of all the
 pool files and/or restored  tested ones backups. Since the error
 affects only 71 out of 1.1 million files it's possible that no one has
 ever noticed...
 
 It would be interesting if other people would run a test on their
 pools to see if they have similar such issues (remember I only tested
 my pool in response to the recent thread of the guy who was having
 issues with his pool)...

Do you have a script or series of commands to do this check with?

I have access to a couple of backuppc installs of various ages and sizes
that I can test.

-- 
Tim Fletcher t...@night-shade.org.uk


--
All the data continuously generated in your IT infrastructure contains a
definitive record of customers, application performance, security
threats, fraudulent activity and more. Splunk takes this data and makes
sense of it. Business sense. IT sense. Common sense.
http://p.sf.net/sfu/splunk-d2dcopy1
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] Append share to RsyncShareName?

2011-10-03 Thread Tim Fletcher
On Mon, 2011-10-03 at 18:44 +0200, Holger Parplies wrote:

 yes.
 
   push @{$Conf{RsyncShareName}}, '/home';
 
 (you could list any number of shares, separated by commas, or you could add
 each share in an individual push command).
 
 Note that this requires at least BackupPC 3.2.0, and that this won't work well
 with the CGI editor, so you'd have to edit your host config file with a text
 editor.

Very useful to know, thanks

-- 
Tim Fletcher t...@night-shade.org.uk


--
All the data continuously generated in your IT infrastructure contains a
definitive record of customers, application performance, security
threats, fraudulent activity and more. Splunk takes this data and makes
sense of it. Business sense. IT sense. Common sense.
http://p.sf.net/sfu/splunk-d2dcopy1
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] BackupPC on NTFS, any expected problems?

2011-09-30 Thread Tim Fletcher
On Thu, 2011-09-29 at 23:31 -0400, Long V wrote:
 Hi,
 
 I have a RAID 1 partition formatted as NTFS. The server is linux so Ill 
 be accessing that partition using ntfs-3g.
 
 Is there any expected problems to use BackupPC on top of a NTFS partition?
 
 I know BackupPC heavily uses hard-link so is that fully supported by 
 NTFS (ntfs-3g)?

ntfs-3g performance is average at best.

-- 
Tim Fletcher t...@night-shade.org.uk


--
All of the data generated in your IT infrastructure is seriously valuable.
Why? It contains a definitive record of application performance, security
threats, fraudulent activity, and more. Splunk takes this data and makes
sense of it. IT sense. And common sense.
http://p.sf.net/sfu/splunk-d2dcopy2
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] Backup of dual-boot laptop

2011-09-28 Thread Tim Fletcher
On Wed, 2011-09-28 at 17:30 +0200, Dan Johansson wrote:
 I have a laptop that is dual-boot (Linux and WinXP) and gets the same IP from 
 DHCP in both OS's. Today I have two entries in BackupPC for this laptop 
 (hostname_lnx and hostname_win) with different backup methods for each (rsync 
 over ssh for Linux and SMB for WinXP). This works good for me with one small 
 exception - I always gets a Backup-Failed message for one of them each 
 night.
 Does someone have a suggestion on how to solve this in a more beautiful way?

Write a ping script that finds out whether the laptop is in Windows or
Linux, so that one or the other of the backup host entries won't ping.

You can also make use of the fact that most desktop distros have avahi
installed, and use the short hostname with .local appended as the
target host name.
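A sketch of the ping-script approach for the Linux entry, assuming the
Linux side answers ssh on port 22 and Windows doesn't (swap the test
for an SMB port check on the _win host):

#!/bin/sh
# PingCmd wrapper for hostname_lnx: only pass the ping through when
# the laptop is answering ssh, i.e. currently booted into Linux
host="$1"
nc -z -w 3 "$host" 22 || exit 1
exec /bin/ping -c 1 -w 3 "$host"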

-- 
Tim Fletcher t...@night-shade.org.uk


--
All the data continuously generated in your IT infrastructure contains a
definitive record of customers, application performance, security
threats, fraudulent activity and more. Splunk takes this data and makes
sense of it. Business sense. IT sense. Common sense.
http://p.sf.net/sfu/splunk-d2dcopy1
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] How does BackupPC rsync over ssh work?

2011-09-26 Thread Tim Fletcher
On Mon, 2011-09-26 at 15:37 +0200, Maarten te Paske wrote:

 OK, I will read a bit more into the rsync documentation. I thought this
 way I wouldn't be able to limit the privileges through sudo, but maybe
 I'm wrong.

I use the following line in /etc/sudoers to allow the user backups to
call rsync as root. Note that this line does break rsync-based
recovery, but that is part of the plan, as it prevents the BackupPC
server writing to the client being backed up.

backups ALL=NOPASSWD: /usr/bin/rsync --server --sender *

You will also need to remove the requiretty option from /etc/sudoers.

The $Conf{RsyncClientCmd} command becomes:
$sshPath -q -x -l backups $host sudo $rsyncPath $argList+


-- 
Tim Fletcher t...@night-shade.org.uk


--
All the data continuously generated in your IT infrastructure contains a
definitive record of customers, application performance, security
threats, fraudulent activity and more. Splunk takes this data and makes
sense of it. Business sense. IT sense. Common sense.
http://p.sf.net/sfu/splunk-d2dcopy1
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] xxShareName = /cygwin ??

2011-09-08 Thread Tim Fletcher
On Fri, 2011-09-09 at 01:54 +0700, hans...@gmail.com wrote:

 I haven't been able to get this to work so far. I've taken the
 --one-filesystem out of my rsync args, tested with no excludes at all,
 no dice.

Have you tried using tar instead of rsync as a backup method?

-- 
Tim Fletcher t...@night-shade.org.uk


--
Doing More with Less: The Next Generation Virtual Desktop 
What are the key obstacles that have prevented many mid-market businesses
from deploying virtual desktops?   How do next-generation virtual desktops
provide companies an easier-to-deploy, easier-to-manage and more affordable
virtual desktop model.http://www.accelacomm.com/jaw/sfnl/114/51426474/
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] cannot get backuppc to wol a client

2011-09-04 Thread Tim Fletcher
On Sat, 2011-09-03 at 17:14 -0500, Robert E. Wooden wrote:
 Tim, what OS is your BackupPC running on?

Ubuntu 11.04 32bit

-- 
Tim Fletcher t...@night-shade.org.uk


--
Special Offer -- Download ArcSight Logger for FREE!
Finally, a world-class log management solution at an even better 
price-free! And you'll get a free Love Thy Logs t-shirt when you
download Logger. Secure your free ArcSight Logger TODAY!
http://p.sf.net/sfu/arcsisghtdev2dev
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] cannot get backuppc to wol a client

2011-09-04 Thread Tim Fletcher
On Sat, 2011-09-03 at 22:33 +0100, Tim Fletcher wrote:

 And change the pingcmd for the hosts to the following line:

I should also say that this relies on the fact that BackupPC, once a
host has settled, only pings a host just before trying to back it up.

Also, I should have mentioned that you need to crank up the ping
timeout setting in BackupPC if you are using this wake-up method.

-- 
Tim Fletcher t...@night-shade.org.uk


--
Special Offer -- Download ArcSight Logger for FREE!
Finally, a world-class log management solution at an even better 
price-free! And you'll get a free Love Thy Logs t-shirt when you
download Logger. Secure your free ArcSight Logger TODAY!
http://p.sf.net/sfu/arcsisghtdev2dev
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] cannot get backuppc to wol a client

2011-09-03 Thread Tim Fletcher
On Sat, 2011-09-03 at 16:09 -0500, Robert E. Wooden wrote:
 I have worked on this issue in the past for myself. Never really got it 
 completed. However, your email has me interested again and this is good 
 weekend to work on this.

I've had it working for a year or so with the following setup:

In the file /etc/sudoers add the following lines:

Cmnd_Alias WOL = /usr/bin/wakeonlan,/usr/sbin/etherwake
backuppc  ALL=(root) NOPASSWD:WOL

And change the pingcmd for the hosts to the following line:

$Conf{PingCmd} = '/usr/local/bin/wakeup.sh $host';

I've also attached a slightly edited version of the script I use to fire
off the WoL packets.
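The attachment doesn't survive in the archive, so purely as a sketch of
the same idea (the MAC table path is made up, and the sleep should
match whatever you raise the ping timeout to):

#!/bin/sh
# wakeup.sh <host>: fire a WoL packet at the host's MAC, give it time
# to boot, then hand BackupPC a normal ping result
host="$1"
mac=$(awk -v h="$host" '$1 == h {print $2}' /etc/backuppc/macs.txt)
[ -n "$mac" ] && sudo /usr/bin/wakeonlan "$mac" >/dev/null
sleep 60
exec /bin/ping -c 1 -w 5 "$host"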

-- 
Tim Fletcher t...@night-shade.org.uk


wakeup.sh
Description: application/shellscript
--
Special Offer -- Download ArcSight Logger for FREE!
Finally, a world-class log management solution at an even better 
price-free! And you'll get a free Love Thy Logs t-shirt when you
download Logger. Secure your free ArcSight Logger TODAY!
http://p.sf.net/sfu/arcsisghtdev2dev___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] cannot get backuppc to wol a client

2011-09-02 Thread Tim Fletcher
On Fri, 2011-09-02 at 00:34 -0700, egrimisu wrote:
 Hi guys,
 
 backuppc won't wol client pc
 
 NmbLookupFindHostCmd = /etc/backuppc/wakeup.sh 00:1a:4d:8c:0a:78 1 $host
 
 wakeup.sh has 777 rights and contains:
 
 
 #!/bin/bash
 wakeonlan $1
 sleep ${2}m
 /usr/bin/nmblookup $3
 
 using the dos command wolcmd.exe 001a4d8c0a78 192.168.2.74 255.255.255.0  
 wakes the pc up.

Sending the WoL packets needs root (admin in the Windows world)
privileges, so you will need to set up sudo to allow the backuppc
user to run wakeonlan as root, or you need to setuid the wakeonlan
binary (a bad idea).

-- 
Tim Fletcher t...@night-shade.org.uk


--
Special Offer -- Download ArcSight Logger for FREE!
Finally, a world-class log management solution at an even better 
price-free! And you'll get a free Love Thy Logs t-shirt when you
download Logger. Secure your free ArcSight Logger TODAY!
http://p.sf.net/sfu/arcsisghtdev2dev
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/