Hi Holger,
I followed your suggestion and switched to rsync. Everything is working now
and the backup size isn't that big.
Thx, Andy.
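For anyone finding this thread in the archives, a per-host sketch of the localhost-over-rsync setup being discussed, assuming the backuppc user has passwordless sudo for rsync (the file name and values are illustrative, not Andy's actual config):

```perl
# Hypothetical pc/localhost.pl - back up the local machine without ssh.
$Conf{XferMethod}            = 'rsync';
$Conf{RsyncClientCmd}        = '/usr/bin/sudo $rsyncPath $argList+';
$Conf{RsyncClientRestoreCmd} = '/usr/bin/sudo $rsyncPath $argList+';
```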
Hi Holger,
thank you for your answers. I took the idea with the script from
http://backuppc.sourceforge.net/faq/localhost.html.
I will do a rollback with your suggestions, but I am not sure if rsync is the
right option for what I want. The backup files are stored on an NFS share
which is also reacha
Hi,
raceface wrote on 2014-07-15 10:40:34 +0200 [Re: [BackupPC-users] Incremental
Backup fail]:
> thank you for being that kind to non-professionals. I have only 3 blank
> lines in my last posting and 21 non-blank; I don't know why you get
> 50 times more blank lines.
aceface Cc: backuppc-users@lists.sourceforge.net Subject:
>> Re: [BackupPC-users] Incremental Backup fail
>>
>> Hi,
>>
>> thank you for sending us 175 blank lines. Unfortunately, the
>> content in your 28 non-blank lines doesn't make up for it, so
>> I'll
On Tue, Jul 15, 2014 at 3:40 AM, raceface wrote:
>
> thank you for being that kind to non-professionals. I have only 3 blank
> lines in my last posting and 21 non-blank; I don't know why you get
> 50 times more blank lines. Using this script is a suggestion of the
> backuppc FAQ and not
Best, Andy.
> -Original Message-
> From: Holger Parplies [mailto:wb...@parplies.de]
> Sent: Monday, 14 July 2014 18:14
> To: raceface
> Cc: backuppc-users@lists.sourceforge.net
> Subject: Re: [BackupPC-users] Incremental Backup fail
>
> Hi,
>
Hi,
thank you for sending us 175 blank lines. Unfortunately, the content in your
28 non-blank lines doesn't make up for it, so I'll quote sparingly.
raceface wrote on 2014-07-13 11:20:42 +0200 [[BackupPC-users] Incremental
Backup fail]:
> [...] I have a problem [...]
Obviously.
Hi,
I have a problem running incremental backups. At the end of an incremental
backup I get the following failure notes:
Contents of file /share/backup/vserver/pc/localhost/XferLOG.bad.z,
modified 2014-07-13 01:43:55 (Extracting only Errors)
Running: /usr/bi
Hi Luis,
> I did a script, largely based on BackupPC_deleteBackup by Matthias
> Meyer.
>
> I'm not a pro, but I'm sure it won't harm your files; still, take it as
> is.
>
> see
> ./BackupPC_listChangedFiles -h
Thank you, I tried it out on one server and it looks very useful. It
doesn't immediat
I did a script, largely based on BackupPC_deleteBackup by Matthias Meyer.
I'm not a pro, but I'm sure it won't harm your files; still, take it as
is.
see
./BackupPC_listChangedFiles -h
It can be used like this, for example
sudo /usr/share/backuppc/bin/BackupPC_listChangedFiles -c portatil -n 35
|
Thank you, Jeffrey
But again, I'm counting lines starting with " create" in the backup log. No
attrib files there.
On Wed, Apr 7, 2010 at 2:09 AM, Jeffrey J. Kosowsky
wrote:
> Matthias Meyer wrote at about 08:12:45 +0200 on Monday, April 5, 2010:
> > Luis Paulo wrote:
> >
> > > I was trying
Matthias Meyer wrote at about 08:12:45 +0200 on Monday, April 5, 2010:
> Luis Paulo wrote:
>
> > I was trying to get a way to find what files have changed in an
> > incremental backup.
> > Does anyone have a solution for it already?
> >
> > I've looked at /var/lib/backuppc/pc//backups and to
I'm not sure I understand you, but maybe this helps:
I'm not counting files. I'm counting lines starting with " create" in the
backup log
file /var/lib/backuppc/pc//XferLOG..z
> /usr/share/backuppc/bin/BackupPC_zcat
/var/lib/backuppc/pc/portatil/XferLOG.35.z |grep -c "^ create"
19491
If I ex
Luis Paulo wrote:
> I was trying to get a way to find what files have changed in an
> incremental backup.
> Does anyone have a solution for it already?
>
> I've looked at /var/lib/backuppc/pc//backups and to the XferLOG, but
> I'm getting nowhere.
> I did a manual incremental a few minutes after an
Right.
I guess the command gives a good start to see what files changed, if it is
accurate.
Then I can check the directory, even get the previous version and make a
diff.
Thanks
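That workflow can be sketched as follows, using a made-up log excerpt in place of real BackupPC_zcat output (the host name, backup number, and file paths are all placeholders):

```shell
# Fake uncompressed XferLOG content; in practice this would come from e.g.:
#   /usr/share/backuppc/bin/BackupPC_zcat /var/lib/backuppc/pc/HOST/XferLOG.N.z
cat > /tmp/xferlog.txt <<'EOF'
 create 644 1000/1000 1024 etc/hosts
 pool 644 1000/1000 2048 etc/passwd
 create 755 1000/1000 512 usr/local/bin/run.sh
 same 644 1000/1000 128 etc/motd
EOF

# Count the files newly transferred in this backup...
grep -c '^ create' /tmp/xferlog.txt

# ...and list their paths (the last field of each "create" line),
# so each one can be diffed against the previous backup's copy.
grep '^ create' /tmp/xferlog.txt | awk '{print $NF}'
```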
> For any particular directory you can click the 'history' link while browsing
> the backup and see which versions of ea
Luis Paulo wrote:
> I was trying to get a way to find what files have changed in an
> incremental backup.
> Does anyone have a solution for it already?
>
> I've looked at /var/lib/backuppc/pc//backups and to the XferLOG, but
> I'm getting nowhere.
> I did a manual incremental a few minutes after a
I was trying to get a way to find what files have changed in an incremental
backup.
Does anyone have a solution for it already?
I've looked at /var/lib/backuppc/pc//backups and to the XferLOG, but I'm
getting nowhere.
I did a manual incremental a few minutes after another, and I get:
> echo -e " n
> The interesting part of your log would only be the command shown at the
> top of the xferlog, and it needs to have appropriate quoting.
So this is what I have at the top of the xferlog for an incremental backup:
Running: /usr/bin/env LC_ALL=C LANG=en_EN /usr/bin/ssh -q -x -l backup
-o Preferred
Mester wrote:
>>> What about $Conf{TarIncrArgs}?
>> That is more interesting (because this is where the '+' might be missing). To
>> put it more general: if you want to avoid this debugging ping-pong, provide
>> some relevant information (like your configuration settings and log file
>> extracts, f
>> What about $Conf{TarIncrArgs}?
>
> That is more interesting (because this is where the '+' might be missing). To
> put it more general: if you want to avoid this debugging ping-pong, provide
> some relevant information (like your configuration settings and log file
> extracts, for example). Even
Hi,
Les Mikesell wrote on 2009-12-21 14:58:03 -0600 [Re: [BackupPC-users]
incremental backup question]:
> Mester wrote:
> > My TarClientCmd is:
> > '$sshPath -q -x -l backup -o PreferredAuthentications=publickey $host
> > /usr/bin/env LC_ALL=C LANG=en_EN /usr/bin/s
Mester wrote:
> My TarClientCmd is:
> '$sshPath -q -x -l backup -o PreferredAuthentications=publickey $host
> /usr/bin/env LC_ALL=C LANG=en_EN /usr/bin/sudo $tarPath -c -v -f - -C
> $shareName+ --totals'
Your extra sudo might make an extra level of shell escaping necessary.
What about $Conf{Tar
My TarClientCmd is:
'$sshPath -q -x -l backup -o PreferredAuthentications=publickey $host
/usr/bin/env LC_ALL=C LANG=en_EN /usr/bin/sudo $tarPath -c -v -f - -C
$shareName+ --totals'
Attila Mesterhazy
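As a sketch of how the two settings under discussion fit together (this mirrors the command quoted above; the trailing '+' asks BackupPC to shell-escape the substituted value, which matters once sudo adds an extra shell level; treat the exact values as assumptions to verify against your own config.pl):

```perl
# Sketch only; verify against your own setup.
$Conf{TarClientCmd} = '$sshPath -q -x -l backup'
                    . ' -o PreferredAuthentications=publickey $host'
                    . ' /usr/bin/env LC_ALL=C LANG=en_EN /usr/bin/sudo'
                    . ' $tarPath -c -v -f - -C $shareName+ --totals';
# The '+' after $incrDate and $fileList requests shell escaping.
$Conf{TarIncrArgs} = '--newer=$incrDate+ $fileList+';
```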
> Mester wrote on 2009-12-19 21:49:58 +0100 [[BackupPC-users] incremental
>
Mester wrote:
>>> I use backuppc on a Debian Linux 5.0 server for backing up another
>>> Debian Linux 5.0 server over the internet with tar over ssh.
>>> The first full backup is created successfully but the incremental backups
>>> always make a full backup. What could be the reason for this?
>>>
>> Are
Hi,
Mester wrote on 2009-12-19 21:49:58 +0100 [[BackupPC-users] incremental backup
question]:
> I use backuppc on a Debian Linux 5.0 server for backing up another
> Debian Linux 5.0 server over the internet with tar over ssh.
> The first full backup is created successfully but the in
>> I use backuppc on a Debian Linux 5.0 server for backing up another
>> Debian Linux 5.0 server over the internet with tar over ssh.
>> The first full backup is created successfully but the incremental backups
>> always make a full backup. What could be the reason for this?
>>
>
> Are you going by the
Mester wrote:
> Hi,
>
> I use backuppc on a Debian Linux 5.0 server for backing up another
> Debian Linux 5.0 server over the internet with tar over ssh.
> The first full backup is created successfully but the incremental backups
> always make a full backup. What could be the reason for this?
>
Are
Hi,
I use BackupPC on a Debian Linux 5.0 server for backing up another
Debian Linux 5.0 server over the internet with tar over ssh.
The first full backup is created successfully but the incremental backups
always make a full backup. What could be the reason for this?
Attila Mesterhazy
Les Mikesell wrote:
> Adam Goryachev wrote:
>> So, I guess this is why the following full backup is attempting to
>> re-transfer all of the files/directories after the "error"... So, the
>> million dollar question, why did backuppc see/log the error, a
Adam Goryachev wrote:
>
> So, I guess this is why the following full backup is attempting to
> re-transfer all of the files/directories after the "error"... So, the
> million dollar question, why did backuppc see/log the error, and then
> continue with marking the rest of the entries deleted, and m
Matthias Meyer wrote:
> Erik Hjertén wrote:
>
>> Matthias Meyer wrote:
>>> I make a full backup each 7 days and use $Conf{IncrLevels} = [1, 2, 3,
>>> 4, 5, 6];
>>> So the first incremental saves all files changed since last full. All
>>> following inc
Erik Hjertén wrote:
> Matthias Meyer wrote:
>> I'm with you. But your incrementals will backup all files changed since
>> last full. Do you want that?
>>
> Not necessarily, but now it behaves as expected.
>> I make a full backup each 7 days and use $Conf{IncrLevels} = [1, 2, 3,
>> 4, 5, 6];
>>
Matthias Meyer wrote:
> I'm with you. But your incrementals will backup all files changed since last
> full. Do you want that?
>
Not necessarily, but now it behaves as expected.
> I make a full backup each 7 days and use $Conf{IncrLevels} = [1, 2, 3, 4,
> 5, 6];
> So the first incremental saves
Erik Hjertén wrote:
> Matthias Meyer wrote:
>>
>>> -
>>> incr backup started back to 2009-06-13 10:59:37 (backup #341) for
>>> directory data Connected to eddie:873, remote version 29
>>> Negotiated protocol version 28
>>> Connected to module data
>>> Sending args: --server --sender --numeric-
Matthias Meyer wrote:
-
incr backup started back to 2009-06-13 10:59:37 (backup #341) for
directory data Connected to eddie:873, remote version 29
Negotiated protocol version 28
Connected to module data
Sending args: --server --sender --numeric-ids --perms --owner --group -D
--links --times
Erik Hjertén wrote:
>
>>> Please set $Conf{XferLogLevel} = 2;
>>> Then check the XferLOG after a backup.
>>> Look for:
>>> create = new for this backup
>>> pool   = found a match in the pool
>>> same   = file is identical to previous backup
>>> skip   = file skipped in incremental because att
>> Please set $Conf{XferLogLevel} = 2;
>> Then check the XferLOG after a backup.
>> Look for:
>> create = new for this backup
>> pool   = found a match in the pool
>> same   = file is identical to previous backup
>> skip   = file skipped in incremental because attributes are the same
>>
>> But
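Those four keywords can be tallied in one pass over the decompressed log. A minimal sketch on a made-up excerpt (a real XferLOG would first be piped through BackupPC_zcat; all paths here are placeholders):

```shell
# Fake uncompressed XferLOG excerpt with one status keyword per line.
cat > /tmp/xferlog2.txt <<'EOF'
 create 644 1000/1000 1024 home/user/new.txt
 pool 644 1000/1000 2048 home/user/cached.txt
 same 644 1000/1000 128 home/user/unchanged.txt
 skip 644 1000/1000 64 home/user/skipped.txt
 create 755 1000/1000 512 home/user/script.sh
EOF

# Count how many entries fall into each transfer-status category.
awk '{count[$1]++} END {for (k in count) print k, count[k]}' /tmp/xferlog2.txt | sort
```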
Matthias Meyer wrote:
> Erik Hjertén wrote:
>
>> Thanks for your reply.
>>
>> Matthias Meyer wrote:
>>> Erik Hjertén wrote:
>>>
>>>
Hi all
I'm using BackupPC on an Ubuntu server and DeltaCopy on Windows
clients.
>>> How do you call DeltaCopy from the BackupPC Ser
Erik Hjertén wrote:
>
>>
>> Unless you are doing different incremental levels, the two runs should
>> be exactly the same from the server's perspective, doing the rsync
>> comparison against the same full run as the base. Can anything have
>> changed on the files (timestamps, owner, etc.)
Les Mikesell wrote:
Erik Hjertén wrote:
I'm using BackupPC on an Ubuntu server and DeltaCopy on Windows clients.
How do you call DeltaCopy from the BackupPC server?
With rsync. XferMethod = rsyncd
The incremental backup runs include all files, but onl
Erik Hjertén wrote:
> Thanks for your reply.
>
> Matthias Meyer wrote:
>> Erik Hjertén wrote:
>>
>>
>>> Hi all
>>>
>>> I'm using BackupPC on an Ubuntu server and DeltaCopy on Windows clients.
>>>
>> How do you call DeltaCopy from the BackupPC server?
>>
> With Rsync. XferMethod = rsync
Erik Hjertén wrote:
>
>>> I'm using BackupPC on an Ubuntu server and DeltaCopy on Windows clients.
>>>
>> How do you call DeltaCopy from the BackupPC server?
>>
> With Rsync. XferMethod = rsyncd
>>
>>> The incremental backup runs include all files, but only every other day.
>>>
>>
Thanks for your reply.
Matthias Meyer wrote:
Erik Hjertén wrote:
Hi all
I'm using BackupPC on an Ubuntu server and DeltaCopy on Windows clients.
How do you call DeltaCopy from the BackupPC server?
With rsync. XferMethod = rsyncd
The incremental backup runs include all files,
Erik Hjertén wrote:
> Hi all
>
> I'm using BackupPC on an Ubuntu server and DeltaCopy on Windows clients.
How do you call DeltaCopy from the BackupPC server?
> The incremental backup runs include all files, but only every other day.
What did you mean by "but only every other day"?
> The back
Hi all
I'm using BackupPC on an Ubuntu server and DeltaCopy on Windows clients.
The incremental backup runs include all files, but only every other day.
The backups on the local host (the Ubuntu machine) seem fine and only
include all files on full runs. What can cause this and how can I fix it?
Holger Parplies wrote:
> Hi,
>
> Tino Schwarze wrote on 2009-03-26 18:21:53 +0100 [Re: [BackupPC-users]
> Incremental Backup]:
>> On Thu, Mar 26, 2009 at 12:56:36PM -0400, kyeto wrote:
>>
>>> Ah ok,
>>> I have copied the files from an another folder.
Hi,
Tino Schwarze wrote on 2009-03-26 18:21:53 +0100 [Re: [BackupPC-users]
Incremental Backup]:
> On Thu, Mar 26, 2009 at 12:56:36PM -0400, kyeto wrote:
>
> > Ah ok,
> > I have copied the files from another folder.
> >
> > How can i tell to backuppc to check t
On Thu, Mar 26, 2009 at 12:56:36PM -0400, kyeto wrote:
> Ah ok,
> I have copied the files from another folder.
>
> How can I tell BackupPC to check the creation date of the file and not the
> modification date?
As far as I know, you cannot. If you could, BackupPC would do it by
default.
T
kyeto wrote:
> Ah ok,
> I have copied the files from another folder.
>
> How can I tell BackupPC to check the creation date of the file and not the
> modification date?
I don't think that is possible with the smb method.
--
Les Mikesell
lesmikes...@gmail.com
Ah ok,
I have copied the files from another folder.
How can I tell BackupPC to check the creation date of the file and not the
modification date?
Thanks
+--
|This was sent by alexan...@gi-invest.com via Backup Central.
On Thu, Mar 26, 2009 at 10:53:57AM -0400, kyeto wrote:
> I am trying to configure BackupPC at my firm with SMB.
> I have no problem with the full backup.
>
> But when I create a new folder with files, they are not always saved by an
> incremental backup.
> Sometimes they are copied, but not alway
kyeto wrote:
> Hi,
>
> I am trying to configure BackupPC at my firm with SMB.
> I have no problem with the full backup.
>
> But when I create a new folder with files, they are not always saved by an
> incremental backup.
> Sometimes they are copied, but not always.
>
> How can I resolve this prob
Hi,
I am trying to configure BackupPC at my firm with SMB.
I have no problem with the full backup.
But when I create a new folder with files, they are not always saved by an
incremental backup.
Sometimes they are copied, but not always.
How can I resolve this problem?
Thanks
I am running BackupPC 3.0.0 on Ubuntu Hardy - installed from the apt
repository.
I have been using it for several months without issue; it's backing up
all the machines in my office (6 or 7 of them) as well as 1 server
that was on a different subnet though located in the same room as the
b
Craig writes:
> Yes it is. Since the Config hash is empty when each config file is parsed,
> this line in lib/BackupPC/Storage/Text.pm incorrectly forces a default value
> to $Conf{IncrLevels} when it might already be defined in the main config file:
>
> $conf->{IncrLevels} = [1] if ( !defin
John writes:
> Yup, that's the only way I can get it to work. If it's in the main
> config file, it doesn't work. Sounds like a bug to me.
Yes it is. Since the Config hash is empty when each config file is parsed,
this line in lib/BackupPC/Storage/Text.pm incorrectly forces a default value
to $C
On 11/05 06:39 , John Rouillard wrote:
> It may be, but mine is hand-crafted as well. What release of
> backuppc are you running?
this is v.3.0.0
--
Carl Soderstrom
Systems Administrator
Real-Time Enterprises
www.real-time.com
On Mon, Nov 05, 2007 at 09:38:16AM -0600, Carl Wilhelm Soderstrom wrote:
> On 11/05 03:22 , John Rouillard wrote:
> > Yup, that's the only way I can get it to work. If it's in the main
> > config file, it doesn't work. Sounds like a bug to me.
>
> I have
> $Conf{IncrLevels} = [1, 2, 3, 4, 5, 6];
On 11/05 03:22 , John Rouillard wrote:
> Yup, that's the only way I can get it to work. If it's in the main
> config file, it doesn't work. Sounds like a bug to me.
I have
$Conf{IncrLevels} = [1, 2, 3, 4, 5, 6];
in my config.pl, and it works fine for me. the web interface reports doing
incrementa
On Mon, Nov 05, 2007 at 07:43:40AM -0500, Rob Owens wrote:
>
> John Rouillard wrote:
> > On Fri, Nov 02, 2007 at 08:20:56AM -0400, Rob Owens wrote:
> >> I have it working in 3.0
> >>
> >> $Conf{IncrLevels} = [1, 2, 3, 4, 5, 6];
> >>
> >> The backup summary shows that it is, in fact, performing mul
John Rouillard wrote:
> On Fri, Nov 02, 2007 at 08:20:56AM -0400, Rob Owens wrote:
>> I have it working in 3.0
>>
>> $Conf{IncrLevels} = [1, 2, 3, 4, 5, 6];
>>
>> The backup summary shows that it is, in fact, performing multilevel backups.
>> John Rouillard wrote:
>>> Hi all:
>>>
>>> I am working
On Fri, Nov 02, 2007 at 08:20:56AM -0400, Rob Owens wrote:
> I have it working in 3.0
>
> $Conf{IncrLevels} = [1, 2, 3, 4, 5, 6];
>
> The backup summary shows that it is, in fact, performing multilevel backups.
> John Rouillard wrote:
> > Hi all:
> >
> > I am working with BackupPC 3.1.0Beta0. I
I have it working in 3.0
$Conf{IncrLevels} = [1, 2, 3, 4, 5, 6];
The backup summary shows that it is, in fact, performing multilevel backups.
-Rob
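As a sketch of what that setting implies when paired with a weekly full (the FullPeriod value here is the usual default, given as an assumption rather than Rob's actual config):

```perl
# One full backup roughly per week...
$Conf{FullPeriod} = 6.97;
# ...with multilevel incrementals in between: each level-n incremental
# backs up files changed since the most recent backup of a lower level,
# not simply since the last full.
$Conf{IncrLevels} = [1, 2, 3, 4, 5, 6];
```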
John Rouillard wrote:
> Hi all:
>
> I am working with BackupPC 3.1.0Beta0. I have
>
> $Conf{IncrLevels} = [1, 2, 3];
>
> set in my config file
Hi all:
I am working with BackupPC 3.1.0Beta0. I have
$Conf{IncrLevels} = [1, 2, 3];
set in my config file. However when I schedule a backup from the web
interface it always shows up as a level 1, even when the prior backup
was a level 1. Is this expected?
Also I sent an email earlier about
"Chir patel" <[EMAIL PROTECTED]> writes:
Hi Chir
> Hi there
>
> I am trying to do an incremental backup of departmental data on a data server
> (Word, Excel, Access, pics, PDF files) every two hours. The size of the
> departmental data folder is about 80 GB. I have installed BackupPC, and I have
> done the following
>
>
Hi there
I am trying to do an incremental backup of departmental data on a data server
(Word, Excel, Access, pics, PDF files) every two hours. The size of the
departmental data folder is about 80 GB. I have installed BackupPC, and I have
done the following:
1) Changed config.pl using the web interface so that backup starts
Hi,
Jesús Martel wrote on 16.05.2007 at 19:09:41 [[BackupPC-users] Incremental
backup and SMB protocol don't work correctly]:
> Backup Summary:
> ===
>
> Backup#  Type  Filled  Level  Start Date  Duration/mins  Age/days
> 0        full  yes
Hello! I have a problem with BackupPC 3.0.0. The incremental backups
don't work correctly.
Here is an example:
Backup Summary:
===
Backup#  Type  Filled  Level  Start Date  Duration/mins  Age/days
0        full  yes     0      5/7 21:21   0.0            8.9    <= No files
Les Mikesell wrote:
> On Sat, 2006-06-03 at 09:39, Michael Helmling wrote:
>
>>Hello,
>>I set up backuppc to back up my home directory (currently only this, I want
>>to
>>test the software).
>>
>>At first I did a full backup with tar, then switched over to rsync. Now every
>>incremental back
On Sat, 2006-06-03 at 09:39, Michael Helmling wrote:
> Hello,
> I set up backuppc to back up my home directory (currently only this, I want
> to
> test the software).
>
> At first I did a full backup with tar, then switched over to rsync. Now every
> incremental backup it transfers the whole di
Hello,
I set up BackupPC to back up my home directory (currently only this; I want
to test the software).
At first I did a full backup with tar, then switched over to rsync. Now every
incremental backup transfers the whole directory again (about 4
Gigabytes), even though only a few megabytes
"MOORE,JUSTIN THOMAS" <[EMAIL PROTECTED]> writes:
> Then what I was hoping to do was to schedule backups once a week or so
> on client machines to do incremental backups over the internet. My
> problem is that neither I, nor my clients have the bandwidth to send a
> complete copy of their files ov
Hey guys, I was hoping to use this software to assist with backing
up client winxx boxes. My plan was to take my linux laptop down to
the client site and rsync their files to an external hard drive,
then put these files on my backup server. Then what I was hoping
to do was to schedule backups o
Server: redhat ES 3 using Rsyncd
client: redhat with rsync 2.6.3
BackupPC version 2.1.2
The file still exists but has a size of 0.
And it created a new file:
data.10062005 with size = 0
in the Log file (LogLevel = 1) we see:
create 644 1020/10 0 data.10062005
create 744 102