Re: [BackupPC-users] Exclude not working as expected

2010-02-09 Thread Kameleon
Anything that you want explicitly excluded needs the full path in the
excludes. Otherwise, a pattern that matches anywhere else in the
filesystem will be excluded there as well.
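
For example, a minimal sketch of the anchored form (using the same
entries Mark asks about below; the leading slash pins each pattern to
the top of the share):

$Conf{BackupFilesExclude} = {
    '/' => [
        '/etc/fstab',                 # only the copy at the share root
        '/var/cache/apt/archives/*',  # only this exact subtree
    ],
};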

On Tue, Feb 9, 2010 at 5:03 PM, Mark Wass  wrote:

> Hi Bowie
>
> Thanks for clearing that up. So does that mean I should also amend these
> other excludes by putting a forward slash in front?
>
> 'etc/fstab', ==> '/etc/fstab',
>
> 'var/cache/apt/archives/*', ==> '/var/cache/apt/archives/*',
>
> Thanks
>
> Mark
>
> -----Original Message-----
> From: Bowie Bailey [mailto:bowie_bai...@buc.com]
> Sent: Wednesday, 10 February 2010 12:40 AM
> To: backuppc-users@lists.sourceforge.net
> Subject: Re: [BackupPC-users] Exclude not working as expected
>
> Mark Wass wrote:
> >
> > Hi Guys
> >
> > I have a config file that looks like this:
> >
> > $Conf{BackupFilesExclude} = {
> >     '/' => [
> >         'dev',
> >         'proc',
> >         'sys',
> >         'tmp',
> >         'var/lib/mysql',
> >         'etc/fstab',
> >         'var/log/mysql/mysql-bin.*',
> >         'var/log/apache2/*',
> >         'shares',
> >         'var/lib/cvs',
> >         'var/lib/cvs-old',
> >         'var/cache/apt/archives/*',
> >         'var/log/samba/*',
> >         'var/log/installer/*',
> >         'var/log/apt/*',
> >         'var/log/samba/*',
> >         'HDD2'
> >     ]
> > };
> >
> > $Conf{BackupFilesOnly} = {};
> >
> > $Conf{ClientNameAlias} = '192.168.1.3';
> >
> > $Conf{RsyncShareName} = [
> >     '/'
> > ];
> >
> I've got an exclude in there for "proc". The problem I'm getting is
> that "proc" is also getting excluded from "/opt/webmin/proc". I only
> want the proc directly on the root "/" share to be excluded. How can
> I make sure that no other "proc" folders are excluded?
> >
>
> You are telling it that you want all files/directories called 'proc' to
> be excluded. If you only want to exclude '/proc', then list it that way.
> You probably want to do the same thing with most of the rest of your
> list, unless you also want to exclude all 'tmp' directories, etc.
>
> $Conf{BackupFilesExclude} = {
>     '/' => [
>         '/dev',
>         '/proc',
>         '/sys',
>         '/tmp',
>         ...
>         'HDD2'
>     ]
> };
>
> --
> Bowie


Re: [BackupPC-users] BackupPC 3 question

2010-02-09 Thread Chris Robertson
James Ward wrote:
> I'm trying to figure out how to do something in the GUI.
>
> I have the following exclude: /data0*
>
> Now I would like to add an exception to that rule and back 
> up: /data02/vodvendors/promo_items/
>
> Is it possible to set this up in the GUI?  I can't figure it out.

$Conf{BackupFilesExclude} behaves differently depending on the 
underlying transport program (rsync, smbclient, or tar).  If you are using 
rsync as the backup method, I think you should be able to either add 
"/data02/vodvendors/promo_items/" as an RsyncShareName, or add...

--include="/data02/vodvendors/promo_items/"

...to your RsyncArgs.
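
A sketch of the RsyncArgs route (hypothetical; rsync applies include and
exclude rules in order, so the includes must come before the exclude they
override, and the parent directories have to be included too so rsync
will descend into them):

$Conf{RsyncArgs} = [
    # ... the existing arguments, then:
    '--include=/data02/',
    '--include=/data02/vodvendors/',
    '--include=/data02/vodvendors/promo_items/',
    '--include=/data02/vodvendors/promo_items/**',
    '--exclude=/data0*',
];

This assumes the includes land ahead of the /data0* exclude in the final
argument list; the separate-RsyncShareName route sidesteps that ordering
question entirely.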

>
> Ward... James Ward
> Tekco Management Group, LLC
> jew...@torzo.com 
> 520-290-0910x268
> ICQ: 201663408

Chris




Re: [BackupPC-users] Exclude not working as expected

2010-02-09 Thread Mark Wass
Hi Bowie

Thanks for clearing that up. So does that mean I should also amend these
other excludes by putting a forward slash in front?

'etc/fstab', ==> '/etc/fstab',

'var/cache/apt/archives/*', ==> '/var/cache/apt/archives/*',

Thanks 

Mark

-----Original Message-----
From: Bowie Bailey [mailto:bowie_bai...@buc.com] 
Sent: Wednesday, 10 February 2010 12:40 AM
To: backuppc-users@lists.sourceforge.net
Subject: Re: [BackupPC-users] Exclude not working as expected

Mark Wass wrote:
>
> Hi Guys
>
> I have a config file that looks like this:
>
> $Conf{BackupFilesExclude} = {
>     '/' => [
>         'dev',
>         'proc',
>         'sys',
>         'tmp',
>         'var/lib/mysql',
>         'etc/fstab',
>         'var/log/mysql/mysql-bin.*',
>         'var/log/apache2/*',
>         'shares',
>         'var/lib/cvs',
>         'var/lib/cvs-old',
>         'var/cache/apt/archives/*',
>         'var/log/samba/*',
>         'var/log/installer/*',
>         'var/log/apt/*',
>         'var/log/samba/*',
>         'HDD2'
>     ]
> };
>
> $Conf{BackupFilesOnly} = {};
>
> $Conf{ClientNameAlias} = '192.168.1.3';
>
> $Conf{RsyncShareName} = [
>     '/'
> ];
>
> I've got an exclude in there for "proc". The problem I'm getting is
> that "proc" is also getting excluded from "/opt/webmin/proc". I only
> want the proc directly on the root "/" share to be excluded. How can
> I make sure that no other "proc" folders are excluded?
>

You are telling it that you want all files/directories called 'proc' to
be excluded. If you only want to exclude '/proc', then list it that way.
You probably want to do the same thing with most of the rest of your
list, unless you also want to exclude all 'tmp' directories, etc.

$Conf{BackupFilesExclude} = {
    '/' => [
        '/dev',
        '/proc',
        '/sys',
        '/tmp',
        ...
        'HDD2'
    ]
};

-- 
Bowie






Re: [BackupPC-users] Large Linux backup Help

2010-02-09 Thread John Rouillard
On Tue, Feb 09, 2010 at 11:16:02AM -0800, Colin Blower wrote:
> Hello everyone,
> 
> I was hoping for a solution to my latest problem and also advice on my
> general backup. My latest problem is the backup seems to be running,
> but stalled.

Which version of BackupPC are you running?
 
> I see the ssh process on the server and the rsync process on the
> client, but in the web interface the Xfer PID has only one PID. It has
> been this way for ~12 hours.
> 
> Background: I am in the process of backing up a very large (~1TB,
> maybe 12million files?) filesystem using backuppc. It is a linux setup
> using rsync. Originally the RsyncShareName was '/home', but that back
> up failed to transfer any files after an Out of Memory error and the
> server killed the backup process.
> 
> I split up the RsyncShareName into
> '/home/aa','/home/ab',...,'/home/zy','/home/zz'. Ran another backup,
> ran into a filename too long error and removed that weird directory
> from the client.
> 
> The latest backup has been running for 24 hours, but for the last 12
> has been stuck on /home/hm. This directory is one of the empty
> directories and should not take more than a second to complete.

Can you do an lsof on the client rsync? File descriptor 3 should be
the file in the share you are currently processing. Maybe also run an
strace to see whether it's moving data. If it's stuck in a select loop
and never issuing reads or writes, I claim it's deadlocked and needs
to be killed.
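
Something like this on the client, with 12345 standing in for the rsync
PID from ps (or the Xfer PID column):

lsof -p 12345     # fd 3 should be the file currently being processed
strace -p 12345   # steady reads/writes mean it is moving data; an
                  # endless select() loop with no I/O suggests a deadlock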

> Could this be related to the ~400 zombie processes this backup has
> created?

Unlikely. However I was hoping that the zombie issue would be fixed in
BackupPC by now.

> Also, if i were to stop this full backup, should i try again
> with another full or an incremental ? ( i have no full backups in
> backuppc )

A full. If you abort, what you have transferred so far should be kept
as a partial backup, and the next full will reuse those files to build
a complete full backup that you can use. I have had to run 4 or 5 full
backups to get a usable base full backup on some of the larger
partitions.

-- 
-- rouilj

John Rouillard   System Administrator
Renesys Corporation  603-244-9084 (cell)  603-643-9300 x 111



Re: [BackupPC-users] Large Linux backup Help

2010-02-09 Thread Adam Goryachev

Colin Blower wrote:
> Hello everyone,
>
> I was hoping for a solution to my latest problem and also advice on
> my general backup. My latest problem is the backup seems to be
> running, but stalled.
>
> I see the ssh process on the server and the rsync process on the
> client, but in the web interface the Xfer PID has only one PID. It
> has been this way for ~12 hours.
>
> Background: I am in the process of backing up a very large (~1TB,
> maybe 12million files?) filesystem using backuppc. It is a linux
> setup using rsync. Originally the RsyncShareName was '/home', but
> that back up failed to transfer any files after an Out of Memory
> error and the server killed the backup process.
>
> I split up the RsyncShareName into
> '/home/aa','/home/ab',...,'/home/zy','/home/zz'. Ran another backup,
>  ran into a filename too long error and removed that weird
> directory from the client.
>
> The latest backup has been running for 24 hours, but for the last
> 12 has been stuck on /home/hm. This directory is one of the empty
> directories and should not take more than a second to complete.
>
> Could this be related to the ~400 zombie processes this backup has
> created? Also, if i were to stop this full backup, should i try
> again with another full or an incremental ? ( i have no full
> backups in backuppc )
I would advise you to split this into different hosts, where each host
has only aa-az or ba-bz, etc. That way, when a share fails to back up
for some reason, only a small portion of the overall backup fails
instead of the whole lot. Use ClientNameAlias (from memory) to point
hostname-a, hostname-b, hostname-c, etc. at the real hostname.
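
A sketch of one slice of that split (file and host names hypothetical;
each pseudo-host gets its own per-PC config pointing back at the real
machine):

# pc/hostname-a.pl -- the aa..az slice
$Conf{ClientNameAlias} = 'hostname';
$Conf{RsyncShareName}  = [ map { "/home/a$_" } 'a' .. 'z' ];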

Also check that your number of concurrent backups is low; you don't want
multiple backups of the same 'host' running at the same time. Some
people have written scripts that check for this and make the current
backup attempt fail if one is already running (search the
archives/Google with keywords like concurrent, lock, backup, or ask and
someone might have a better pointer).
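
A toy sketch of such a guard (hypothetical: run on the client, e.g. via
ssh from a DumpPreUserCmd; note BackupPC only aborts on a nonzero exit
if $Conf{UserCmdCheckStatus} is enabled):

#!/bin/sh
# Refuse to start if a BackupPC-driven rsync is already running here.
if pgrep -f 'rsync --server --sender' > /dev/null; then
    echo "another backup is already running" >&2
    exit 1
fi
exit 0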

Regards,
Adam




[BackupPC-users] Large Linux backup Help

2010-02-09 Thread Colin Blower
Hello everyone,

I was hoping for a solution to my latest problem and also advice on my
general backup. My latest problem is the backup seems to be running,
but stalled.

I see the ssh process on the server and the rsync process on the
client, but in the web interface the Xfer PID has only one PID. It has
been this way for ~12 hours.

Background: I am in the process of backing up a very large (~1TB,
maybe 12million files?) filesystem using backuppc. It is a linux setup
using rsync. Originally the RsyncShareName was '/home', but that backup
failed to transfer any files after an Out of Memory error and the
server killed the backup process.

I split up the RsyncShareName into
'/home/aa','/home/ab',...,'/home/zy','/home/zz'. Ran another backup,
ran into a filename too long error and removed that weird directory
from the client.

The latest backup has been running for 24 hours, but for the last 12
has been stuck on /home/hm. This directory is one of the empty
directories and should not take more than a second to complete.

Could this be related to the ~400 zombie processes this backup has
created? Also, if i were to stop this full backup, should i try again
with another full or an incremental ? ( i have no full backups in
backuppc )

Thanks,
-Colin B.



Re: [BackupPC-users] Auto Archiving of hosts

2010-02-09 Thread Danielle Tilley


-----Original Message-----
From: Michael Osburn [mailto:michael.osb...@echostar.com] 
Sent: Tuesday, February 09, 2010 8:24 AM
To: General list for user discussion, questions and support
Subject: Re: [BackupPC-users] Auto Archiving of hosts

On Tuesday 09 February 2010 08:01:59 am Timothy J Massey wrote:
> 
> I have a cron script that I run once a week to archive the latest backup 
> to a removable hard drive.  Let me know if you'd like to see it.  It's 
> really pretty simple, though.  It doesn't do as much error checking as I'd
> like (like making sure the removable drive is mounted properly), but the 
> archive part works perfectly, thanks to BackupPC.
> 
> Tim Massey
> 

Thanks Tim,

I need to get things confirmed with management first but I think that
doing the latest would work for what I need. I did not see the "-1"
option when looking at the archive host option. Do you mind sharing your
script?

Michael


Tim,

I would be interested in seeing this script as well -- I am just starting to
use the archive backups for my offsite backup solution.

Danielle




Re: [BackupPC-users] Auto Archiving of hosts

2010-02-09 Thread Les Mikesell
On 2/9/2010 10:24 AM, Michael Osburn wrote:
>
> I need to get things confirmed with management first but I think that
> doing the latest would work for what I need. I did not see the "-1"
> option when looking at the archive host option. Do you mind sharing
> your script?

The archive host setup is just a wrapper around BackupPC_tarCreate.  If 
you are doing it in a script you might as well run it directly and have 
more control.
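
For instance, a minimal sketch (host name and destination made up here;
run it as the backuppc user):

/usr/local/BackupPC/bin/BackupPC_tarCreate -h myhost -n -1 -s / . \
    | gzip > /data/archive/myhost.tar.gz

Here -n -1 picks the most recent backup, -s names the share, and the
trailing "." asks for everything in it.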

-- 
Les Mikesell
lesmikes...@gmail.com



Re: [BackupPC-users] Auto Archiving of hosts

2010-02-09 Thread Timothy J Massey
Michael Osburn wrote on 02/09/2010 11:24:01 AM:

> I need to get things confirmed with management first but I think 
> that doing the latest 
> would work for what I need.

I can't imagine why that wouldn't work (you're saving the same info, just 
a few weeks early!  :)  ), but sometimes management doesn't always see 
things the same way.

> I did not see the "-1" option when 
> looking at the archive 
> host option. Do you mind sharing your script?

It's not something you'd do from the GUI:  you would use it with the 
archive command from the command line.  I actually found it by reading the 
source code back when I was trying to figure out how to do this.

The biggest disadvantage is that all of the generated files from the same 
host will have the same name.  If you want to keep multiple copies of the 
archives you may have to either rename the files or put them into 
different subdirectories.  I asked that this be addressed in the BackupPC 
archive command (have it substitute the proper backup number for the -1), 
but the request wasn't followed up on.
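
A hypothetical post-archive step along those lines -- sweep each run's
output into a dated subdirectory so successive archives don't collide
(the .raw extension matches the compext setting in the script below):

d=/data/archive/$(date +%Y%m%d)
mkdir -p "$d" && mv /data/archive/*.raw "$d"/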


Here's the line in the backuppc user's crontab.  Note that it must be run 
as the backuppc user (or have some other script su to backuppc to run it).

04 07 *   *   6   /data/BackupPC/util/cron_archive localhost 5

And here's cron_archive.  It's designed for a Red Hat-based system.  The 
variables you might need to change are at the top.  ARCHIVE_HOST_PATH is 
the path to the archive host directory within BackupPC (where the archive 
request file will be generated), ARCHIVE_DEST_PATH is the path to store 
the generated archive, and BACKUPPC_BIN_PATH is the path to the BackupPC 
command files.

#!/bin/sh
# Script to create archive of BackupPC host for OBS Backup Server 2.1

ARCHIVE_HOST_PATH=/data/BackupPC/pc/archive
ARCHIVE_DEST_PATH=/data/archive
BACKUPPC_BIN_PATH=/usr/local/BackupPC/bin

parameters() {
    echo "$0 hostname par
where hostname is the host to archive and par is the percentage of parity
Example:  $0 localhost 5"
    exit 1
}

if [ -z "$1" ]; then
    echo "You must include the hostname in the first parameter."
    parameters
fi

if [ -z "$2" ]; then
    echo "You must include the percentage of parity in the second parameter."
    parameters
fi

echo "%ArchiveReq = (
    'archiveloc' => '"${ARCHIVE_DEST_PATH}"',
    'reqTime' => '946702800',
    'BackupList' => [
        '-1'
    ],
    'host' => 'archive',
    'parfile' => '"$2"',
    'archtype' => '0',
    'compression' => '/bin/cat',
    'compext' => '.raw',
    'HostList' => [
        '"$1"'
    ],
    'user' => 'backuppc',
    'splitsize' => '000'
);" > ${ARCHIVE_HOST_PATH}/archiveReq.cron.$$

${BACKUPPC_BIN_PATH}/BackupPC_serverMesg archive backuppc archive archiveReq.cron.$$



As you can see, the command generates an archive request file (which is 
exactly what the BackupPC archive GUI does) and saves it in the archive 
host's pc directory.  The archive is set up for no splits, no compression, 
and a variable amount of parity (some of my hosts are too big to generate 
parity in a timely manner).  If you want a different style of archive, 
simply capture the archive request file generated by the GUI and modify the above 
information to better match your desired style of archive.

If you have further questions or comments, please feel free to ask.  For 
the record, the above code is released into the public domain (FWIW), but 
I would appreciate it if you sent any corrections or updates my way.

Tim Massey




Re: [BackupPC-users] Auto Archiving of hosts

2010-02-09 Thread Michael Osburn
On Tuesday 09 February 2010 08:01:59 am Timothy J Massey wrote:
> Michael Osburn wrote on 02/08/2010 06:42:34 PM:
> 
> > I am looking for a way to automatically archive my oldest full 
> > backups to a different mount point. The goal is to have fairly 
> > recent backups stored off onto a SAN device, then remove the oldest 
> > backup. I have looked at the archive host option, but this would 
> > require a lot of manual intervention as we would need to select the 
> > backups some time during the week and then letting the fulls age away. 
> 
> Is there a reason you're moving the *oldest*?  The newest would be very 
> easy; the oldest is slightly more difficult.
> 
> The archive command accepts negative numbers for the backup number.  So, 
> to get the most recent, simply use "-1".  For the oldest, use the number 
> of fulls plus the number of incrementals.  Of course, if you haven't yet 
> done enough backups to get that far, that won't work.  That's the 
> "slightly more difficult" part...  :)
> 
> I have a cron script that I run once a week to archive the latest backup 
> to a removable hard drive.  Let me know if you'd like to see it.  It's 
> really pretty simple, though.  It doesn't do as much error checking as I'd 
> like (like making sure the removable drive is mounted properly), but the 
> archive part works perfectly, thanks to BackupPC.
> 
> Tim Massey
> 

Thanks Tim,

I need to get things confirmed with management first but I think that
doing the latest would work for what I need. I did not see the "-1"
option when looking at the archive host option. Do you mind sharing your
script?

Michael



Re: [BackupPC-users] cygwin, ssh, and rsync

2010-02-09 Thread Trey Nolen



Cody Dunne wrote:
> On 2/6/2010 2:18 PM, Trey Nolen wrote:
>> I don't know if this has been mentioned on the list or not, but the new
>> Cygwin seems to have fixed the long standing bug that prevents rsync
>> from running in server mode over ssh.  Now, we are able to use the
>> method "rsync" instead of "rsyncd".  This means that we no longer have
>> to maintain persistent SSH tunnels for our backups of Windows
>> machines.  The "rsync" method also seems to be faster for us on WANs as
>> well.  Just wanted to let everyone know if this hasn't been brought up.
>>
>> Trey Nolen
>
> I've had the reverse happen to me -- after the upgrade from Cygwin
> 1.5.25 to 1.7.1 my rsyncd backups over pre-established ssh tunnels
> started hanging randomly after ~24 minutes. I've reinstalled Cygwin,
> installed on fresh Win 7 machines, turned off all anti-virus and
> firewalls, and still can't eliminate the problem. It is definitely an
> rsyncd issue and not an ssh one, as the tunnel stays up and active
> waiting.
>
> I'll give rsync over ssh a try again, though.

Yes, that's what happened to us, too.  Like you, we were using rsyncd 
over tunnels maintained with autossh and that quit working.  Now we are 
able to do it using method "rsync" (not rsyncd) and go directly over ssh 
with no tunnel.
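
For anyone making the same switch, the relevant knobs look roughly like
this (the RsyncClientCmd shown is, as far as I know, the stock 3.x
default; adjust the user and paths to suit):

$Conf{XferMethod}     = 'rsync';
$Conf{RsyncClientCmd} = '$sshPath -q -x -l root $host $rsyncPath $argList+';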


Trey Nolen



Re: [BackupPC-users] cygwin, ssh, and rsync

2010-02-09 Thread Cody Dunne

On 2/6/2010 2:18 PM, Trey Nolen wrote:
> I don't know if this has been mentioned on the list or not, but the new
> Cygwin seems to have fixed the long standing bug that prevents rsync
> from running in server mode over ssh.   Now, we are able to use the
> method "rsync" instead of "rsyncd".  This means that we no longer have
> to maintain persistent SSH tunnels for our backups of Windows
> machines.   The "rsync" method also seems to be faster for us on WANs as
> well.   Just wanted to let everyone know if this hasn't been brought up.
>
>
> Trey Nolen

I've had the reverse happen to me -- after the upgrade from Cygwin 
1.5.25 to 1.7.1 my rsyncd backups over pre-established ssh tunnels 
started hanging randomly after ~24 minutes. I've reinstalled Cygwin, 
installed on fresh Win 7 machines, turned off all anti-virus and 
firewalls, and still can't eliminate the problem. It is definitely an 
rsyncd issue and not an ssh one, as the tunnel stays up and active waiting.

I'll give rsync over ssh a try again, though.

Cody



[BackupPC-users] BackupPC 3 question

2010-02-09 Thread James Ward
I'm trying to figure out how to do something in the GUI.

I have the following exclude: /data0*

Now I would like to add an exception to that rule and back up: 
/data02/vodvendors/promo_items/

Is it possible to set this up in the GUI?  I can't figure it out.

Ward... James Ward
Tekco Management Group, LLC
jew...@torzo.com
520-290-0910x268
ICQ: 201663408





Re: [BackupPC-users] problems with DumpPreUserCmd.

2010-02-09 Thread Jeffrey J. Kosowsky
Mauro wrote at about 11:07:34 +0100 on Tuesday, February 9, 2010:
 > For DumpPreUserCmd I've set $sshPath -p 2322 -q -x -l user $host,
 > /home/user/file.sh.
 > It returns: DumpPreUserCmd returned error status 65280... exiting.
That just means the exit code of your command was 255.
 > Why?
 > I've tried the command and it works, but set in DumpPreUserCmd it
 > doesn't.
What is the exit code when you manually run:
 ssh -p 2322 -q -x -l user $host /home/user/file.sh
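
(That 65280, incidentally, looks like the raw wait status from Perl's $?,
which packs the child's exit code into the high byte; it decodes like so:

perl -e 'printf "exit code = %d\n", 65280 >> 8'    # prints 255

so any status BackupPC reports can be decoded the same way.)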



Re: [BackupPC-users] Auto Archiving of hosts

2010-02-09 Thread Timothy J Massey
Michael Osburn wrote on 02/08/2010 06:42:34 PM:

> I am looking for a way to automatically archive my oldest full 
> backups to a different mount point. The goal is to have fairly 
> recent backups stored off onto a SAN device, then remove the oldest 
> backup. I have looked at the archive host option, but this would 
> require a lot of manual intervention as we would need to select the 
> backups some time during the week and then letting the fulls age away. 

Is there a reason you're moving the *oldest*?  The newest would be very 
easy; the oldest is slightly more difficult.

The archive command accepts negative numbers for the backup number.  So, 
to get the most recent, simply use "-1".  For the oldest, use the number 
of fulls plus the number of incrementals (e.g. with 3 fulls and 11 
incrementals on disk, -14 names the oldest).  Of course, if you haven't 
yet done enough backups to get that far, that won't work.  That's the 
"slightly more difficult" part...  :)

I have a cron script that I run once a week to archive the latest backup 
to a removable hard drive.  Let me know if you'd like to see it.  It's 
really pretty simple, though.  It doesn't do as much error checking as I'd 
like (like making sure the removable drive is mounted properly), but the 
archive part works perfectly, thanks to BackupPC.

Tim Massey


--
The Planet: dedicated and managed hosting, cloud storage, colocation
Stay online with enterprise data centers and the best network in the business
Choose flexible plans and management services without long-term contracts
Personal 24x7 support from experience hosting pros just a phone call away.
http://p.sf.net/sfu/theplanet-com
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] Exclude not working as expected

2010-02-09 Thread Bowie Bailey
Mark Wass wrote:
>
> Hi Guys
>
> I have a config file that looks like this:
>
> $Conf{BackupFilesExclude} = {
>     '/' => [
>         'dev',
>         'proc',
>         'sys',
>         'tmp',
>         'var/lib/mysql',
>         'etc/fstab',
>         'var/log/mysql/mysql-bin.*',
>         'var/log/apache2/*',
>         'shares',
>         'var/lib/cvs',
>         'var/lib/cvs-old',
>         'var/cache/apt/archives/*',
>         'var/log/samba/*',
>         'var/log/installer/*',
>         'var/log/apt/*',
>         'var/log/samba/*',
>         'HDD2'
>     ]
> };
>
> $Conf{BackupFilesOnly} = {};
>
> $Conf{ClientNameAlias} = '192.168.1.3';
>
> $Conf{RsyncShareName} = [
>     '/'
> ];
>
> I’ve got an exclude in there for “proc”. The problem I’m getting is
> that “proc” is also getting excluded from “/opt/webmin/proc”. I only
> want the proc directly on the root “/” share to be excluded. How can I
> make sure that no other “proc” folders are excluded?
>

You are telling it that you want all files/directories called 'proc' to
be excluded. If you only want to exclude '/proc', then list it that way.
You probably want to do the same thing with most of the rest of your
list, unless you also want to exclude all 'tmp' directories, etc.

$Conf{BackupFilesExclude} = {
    '/' => [
        '/dev',
        '/proc',
        '/sys',
        '/tmp',
        ...
        'HDD2'
    ]
};

-- 
Bowie



[BackupPC-users] problems with DumpPreUserCmd.

2010-02-09 Thread Mauro
For DumpPreUserCmd I've set: $sshPath -p 2322 -q -x -l user $host
/home/user/file.sh
It returns: DumpPreUserCmd returned error status 65280... exiting.
Why?
I've tried the command by hand and it works, but set in DumpPreUserCmd
it doesn't.
