Kiran writes:
> I am trying to install BackupPC on ubuntu server edition. I am running the
> configure command as
> sudo perl configure.pl
>
> it fails with the error message
>
> Making init.d scripts
> can't chown 1000, 1000 init.d/gentoo-backuppc.conf at configure.pl line 1011.
>
> Not sure w
Simone writes:
> I got a strange problem doing incrementals with tar over ssh using
> --newer=$incrDate+. It seems an "escape problem" of part of the time
> reference for the incremental.
Yes, the escaping isn't happening. The "$incrDate+" form means
to escape the value, so that is what you shou
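For context, in BackupPC command strings a trailing + on a substituted variable asks BackupPC to shell-escape the value. A hedged sketch of a typical tar-over-ssh setup (paths and options are illustrative defaults; check your own config.pl):

```perl
# Illustrative only; adjust paths and options for your site.
# The trailing "+" on $shareName+ and $incrDate+ tells BackupPC to
# shell-escape the substituted values.
$Conf{TarClientCmd} = '$sshPath -q -x -n -l root $host'
                    . ' env LC_ALL=C $tarPath -c -v -f - -C $shareName+'
                    . ' --totals';
$Conf{TarIncrArgs}  = '--newer=$incrDate+ $fileList+';
```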
Omar writes:
> $Conf{TarClientCmd} = ' env LC_ALL=C /usr/bin/sudo $tarPath -c -v -f -
> -C $shareName+'
> . ' --totals';
>
> $Conf{TarClientRestoreCmd} = ' env LC_ALL=C /usr/bin/sudo $tarPath -x -p
> --numeric-owner --same-owner'
>. ' -v -f - -C $shareName+
Matthias writes:
> If a user requests a restore I want to restore one extra file and handle it
> by the RestorePostUserCmd.
> Is it possible to request this additional restore with BackupPC_restore
> during the RestorePreUserCmd or RestorePostUserCmd?
Yes, you could do it by emulating what the C
Sil writes:
> $Conf{ArchiveClientCmd} = '$Installdir/bin/BackupPC_archiveHost' <=
> add -b 10
>. ' $tarCreatePath $splitpath $parpath $host $backupnumber'
>. ' $compression $compext $splitsize $archiveloc $parfile *';
>
> I don't know how to write this, or where to place it?
>
Jeff writes:
> Are you sure that you can't get rsync to calculate the checksums (both
> block and full-file) before file transfer begins -- I don't know I'm
> just asking..
I believe rsync's --checksum option precomputes and sends the whole
file checksum (which as has been noted is different to Ba
Jean-Michel writes:
> $Conf{BackupFilesExclude} = [
> '/Users/garant/Library/Preferences/ByHost/*00224126372e.plist' ];
>
> notice the wildcard '*' in the file list...
>
> but it seems that BackupPC_dump stats the file BEFORE excluding the
> file from the backup, because there is a "failed to ope
Christian writes:
> I'm having some issues with excluding directories.
>
> If have the following settings in the host.pl:
> =snip===
> $Conf{RsyncShareName} = [
> '/',
> '/srv'
> ];
> $Conf{BackupFilesExclude} = {
> 'srv' => [
>
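For reference, the keys of the $Conf{BackupFilesExclude} hash must match the entries in $Conf{RsyncShareName} exactly. A hedged sketch based on the quoted settings (the exclude paths themselves are illustrative):

```perl
$Conf{RsyncShareName} = [
    '/',
    '/srv',
];
# Keys must match the share names above exactly ('/srv', not 'srv'):
$Conf{BackupFilesExclude} = {
    '/'    => ['/proc', '/sys'],   # illustrative excludes
    '/srv' => ['/tmp'],            # illustrative
};
```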
Brian writes:
> * 0 pending backup requests from last scheduled wakeup,
> * 0 pending user backup requests,
> * 0 pending command requests,
> * Pool is 0.00GB comprising 0 files and 0 directories (as of 1/29 01:00),
> * Pool hashing gives 0 repeated files with longest chain 0,
> *
Nick writes:
> I've tried putting ";" in between the two commands; I've tried & and &&
> as well. With & in there, it seems to run both commands, but the
> variables aren't being pulled from BackupPC, so the email doesn't work
> correctly, and also the script that runs my vshadow commands doesn't
> see
Tony writes:
> I missed the original post, but I run rsync with the --whole-file
> option, but I still get RStmp files, is that not supposed to happen?
RStmp is a temporary file used to store the uncompressed pool file,
which is needed for the rsync algorithm. It's only used for larger
files -
Cody writes:
> I'd be willing to do a lot of the cleaning myself, though I don't want
> to step on anyone's toes without talking with you first. Also, my
> knowledge of BackupPC is fairly limited to my setup (XP/Vista clients &
> Ubuntu server).
I agree it isn't very well organized. I don't thin
BackupCentral.com has generously offered to contribute some free
banner ads for BackupPC on their site.
To take advantage of this offer I need someone with some graphic
skills to generate a couple of banner images with particular
geometries. If you are willing to contribute some time please
email
James writes:
> Is there a way to block users from editing or viewing the config of a host?
Yes:
$Conf{CgiUserConfigEditEnable} = 0;
If you leave that on, you can also enable/disable user editing of specific
parameters with
$Conf{CgiUserConfigEdit}
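Putting those two settings together, a hedged config.pl sketch (the parameter names in the hash are examples; verify the full list against your version's documentation):

```perl
# Disable the per-host config editor for ordinary users entirely:
$Conf{CgiUserConfigEditEnable} = 0;

# Or leave editing enabled and allow only selected parameters:
$Conf{CgiUserConfigEditEnable} = 1;
$Conf{CgiUserConfigEdit} = {
    FullPeriod         => 1,   # users may edit this
    IncrPeriod         => 1,
    BackupFilesExclude => 0,   # locked
};
```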
Craig
--
James writes:
> I wanted to be able to notify a group of people when any backup starts
> or ends, so I did some googling and found an archived email on this
> list about how I might do it.
>
> I made a script with the following contents as a test and named
> it startbkpemail.sh (just used the exam
James writes:
> Executing DumpPreUserCmd: '/usr/local/bin/startbkpemail.sh nxweb05
> incr';
> Exec of '/usr/local/bin/startbkpemail.sh nxweb05 incr'; failed
It thinks the command is
'/usr/local/bin/startbkpemail.sh nxweb05 incr';
so it is trying to run this executable:
'/usr/local/bin/
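The stray trailing semicolon in the quoted error suggests the command was entered with the semicolon (and perhaps quotes) included. A hedged sketch of the intended setting, using the script path quoted above ($host and $type are standard substitution variables; confirm against your version's docs):

```perl
# No trailing semicolon, and no extra quoting around the whole command:
$Conf{DumpPreUserCmd} = '/usr/local/bin/startbkpemail.sh $host $type';
```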
sabujp writes:
> In the last command that runs BackupPC_tarPCCopy, does this perl command look
> at any of the configuration files on the local host or does it just get what
> it needs to re-generate the hard links straight from the old "pc" directory?
> I looked through the code and don't see
Matthias writes:
> I backup a windows client with rsyncd over ssh. I am pretty sure the ssh
> connection was interrupted at 23:27.
> In the /var/lib/backuppc/pc/st-ms-wv/XferLOG.0.z I found the error message:
> create 770 4294967295/4294967295 240986
> Help/Windows/de-DE/artcone.h1s
> Re
Paul writes:
> In case this is of use to others, I tweaked the BackupPC_archiveStart
> script to properly (IMHO) deal with the ArchiveComp setting. While my
> coding style may be icky to some, I think my removal of the ".raw" file
> extension for uncompressed archive files may be of issue to oth
David writes:
> I took a closer look at the perl code and I see the cause of the problem.
> Please note I have no DNS. My PCs use DHCP, but are configured in BackupPC
> with the host table's DHCP flag set to zero.
>
> Here is what I think is happening:
>
> 1. BackupPC_dump is called periodically
Ski writes:
> I have a windows client that has been working fine for over a year and
> now there are three files in the 6 - 7GB range that it just ignores. I
> am using cygwin-rsyncd 2.6.6 and backuppc 2.1.2. I was able to force a
> backup of one large file by excluding all other directories exc
John writes:
> Can anybody confirm that xferlogs are not being written if
> DumpPreUserCmd exits non-zero with $Conf{UserCmdCheckStatus} = 1? Also
> does anybody know if it is fixed in a subsequent release?
Yes, this looks like a bug. An error will be written to the per-client
LOG file. But in
Pramathesh writes:
> The BackupPC documentation mentions that "old unmangled file
> names are still supported by the CGI interface". However, I have not been
> able to figure out how and where this option can be set.
What that means is backups taken with very old versions of BackupPC
(when
Xavier writes:
> *) BackupPC didn't work correctly on one host
>
> #86 was supposed to be a full backup, but when browsing I found that it's
> missing a lot of directories (/bin, /home, ...)
>
> size of backup# on disk
> 2,8G 86
> 9,3G 87
> 9,4G 88
>
> Moreover, when trying to read logfile,
Paul writes:
> I tried just changing 'RsyncClientCmd' to "$rsyncPath $argList+" but it
> seems BackupPC is expecting the SSH and is now improperly escaping
> 'RsyncArgs'. The hitch is with a space in one.
>
> $Conf{RsyncClientCmd} = '$rsyncPath $argList+';
> $Conf{RsyncArgs} = [
> '--numeric-i
Paul writes:
> Here's what I'm getting:
>
> full backup started for directory /etc (baseline backup #50)
> Running: /usr/bin/rsync --server --sender --numeric-ids --perms --owner
> --group -D --links --hard-links --times --block-size=2048 --recursive
> --filter dir-merge\\\ /.backuppc-filte
John,
> I am seeing corrupted directory listings using BackupPC_tarCreate. One
> of the reported filenames has a bunch of nulls in the middle of it
> using BackupPC-3.1.0.
I'd like to get to the bottom of this. Let's take this off list.
It would be great if you could get this to happen on as sm
J:
> Is it possible to get a CVS copy?
>
> I tried: "cvs -z3.2
> -d:pserver:anonym...@backuppc.cvs.sourceforge.net:/cvsroot/backuppc co
> BackupPC"
>
> ...but received the dreaded "__CONFIGURE_BIN_LIST__" error when I ran
> the "./configure.pl"
You need to read CVS_README (actually I need to up
Les writes:
> I would absolutely love it if the top level directories were still
> created by backuppc first before doing the hardlink test. If those
> directories are created because they don't exist and the hardlink
> test fails, then just remove the directories. Or leave them there;
> all you've
Pedro writes:
> That's good news. Where can we read about what new stuff is in this upgrade?
Here is the current ChangeLog. This should be pretty much what is
in 3.2.0beta0.
Craig
* Added BackupPC::Xfer::Protocol as a common class for each Xfer
method. This simplifies some of the xfer specific
Tomasz writes:
> Are there any plans to update File-RsyncP to make it compatible with
> newer rsync protocol versions?
I'm experimenting with FUSE to see if native rsync3 + FUSE will
be the best path. Otherwise, yes, I will update File-RsyncP.
Craig
David writes:
> The short story is that you need to configure BackupPC to wake up only
> once per day, in order for wakeonlan to work in a reasonable manner.
That should be fixed in 3.2.0beta0:
* Moved call to NmbLookupFindHostCmd in BackupPC_dump to after the
check of whether a backup
Jeff writes:
> Sounds cool... I imagine this is in line with the thread we had a few
> months ago.
Yes, that's right. I want to be sure the performance and reliability
are high enough before making the decision.
Craig
Obj writes:
> I am running version 3.2.0.
Actually you are running CVS.
> Can someone tell me why
> $Conf{BackupFilesExclude} is not working. It still backs up all Temp
> folders, and .mp3 files, etc. The backup method is SMB.
Can you send the first few lines for the XferLOG file?
Craig
---
Madcha writes:
> For the past few days, trashClean has been started but won't clean old
> backups, and I don't understand why.
> The trash folder is empty.
That means BackupPC_trashClean is working: its job is to remove
everything that appears in $TopDir/trash.
> It is perhaps for that reason that it does c
Mirco,
> Here error log, mailed from BackupPC:
>
> The following hosts had an error that is probably caused by a
> misconfiguration. Please fix these hosts:
> - elpra01lc (Call timed out: server did not respond after 2
> milliseconds opening remote file \ELPRA01WS\ELPRA06SV\E
> (\ELPRA01WS\
Bharat writes:
> On the host page - I selected SMB with sharename set to D$
> I've setup my username and password (administrator)
>
> I've unticked BackupFilesExclude and Ticked BackupFilesExclude and tried
> various formats
> (including */Temp/* as found in EXCLUDE!!! - yes there is a Temp fold
John,
It's still on my todo list - I didn't get around to it for 3.2.0beta0.
I'll see if I can get it in before the final 3.2.0 release.
Craig
--
BackupPC 3.2.0beta0 has been released on SF.net.
3.2.0beta0 is the first beta release of 3.2.0.
3.2.0beta0 has several new features and quite a few bug fixes
since 3.1.0. New features include:
* Added FTP xfer method, implemented by Paul Mantz.
* Added more options to server backup command: rat
Boniforti writes:
> carola/Desktop/CAROLA/Mise à jour des prix 2009-04-01.xls: size doesn't
> match (14702080 vs 0)
>
> Can you tell me why it should tell things like "size doesn't match"?
> Could you please explain what's going on?
You have a high log level enabled ($Conf{XferLogLevel}), so y
Chris writes:
> I didn't see any mention of lib/BackupPC/Lib.pm being updated for the
> case when XFS is used as the pool file system and IO::Dirent is
> installed (as per
> http://www.mail-archive.com/backuppc-de...@lists.sourceforge.net/msg00195.html).
>
> Looking at the source of the Beta, th
Fatih writes:
> I want to translate BackupPc's CGI and the installation part from
> English to Turkish.
> How can I do this? Is there anyone who is responsible for this kind of
> work? Who can give me some advice on where I could begin
You should look in lib/BackupPC/Lang. Each language has it
Tim writes:
> Hi I just installed the latest backuppc
> version 3.2.0beta0. When I try to do
> a full backup of a test host I'm seeing
> this error in the log
>
> 2009-04-12 11:07:14 User backuppc requested backup of scvffs09 (scvffs09)
> Can't locate File/Listing.pm in @INC (@INC contains: /usr
Holger writes:
> two things are really confusing me:
>
> 1.) The title claims that it is supposed to be an *rsync* xfer, the error
> message clearly indicates that *ftp* is attempted (and fails). Tim, could
> you please clarify which transfer method you are trying to use?
The code loads
Obj writes:
> I am running version 3.2.0. Can someone tell me why
> $Conf{BackupFilesExclude} is not working. It still backs up all Temp
> folders, and .mp3 files, etc. The backup method is SMB.
You sent me offlist your config file and XferLOG file. Thanks.
The problem is that if you use wild
Tim writes:
> This is a new install so I thought I would try the beta version
> do you recommend I go back to the stable version?
If you are willing to test the beta version some more that
would be great. You've already found one bug :).
Holger told you where to get the File::Listing module.
An
Oblivian,
As you point out, there isn't a general method for merging per-client
settings with the global ones.
However, in 3.2.0beta0 there is a new config variable
$Conf{RsyncArgsExtra} which is combined with $Conf{RsyncArgs}.
The purpose is exactly for your application: you can use
$Conf{RsyncA
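A hedged sketch of how that combination looks in a per-client config file (the extra option shown is illustrative, not from the original thread):

```perl
# In pc/HOST.pl: entries here are appended to the global
# $Conf{RsyncArgs}, so you don't have to copy the whole global
# list just to add one per-client option.
$Conf{RsyncArgsExtra} = [
    '--exclude-from=/etc/backuppc/HOST.excludes',  # illustrative
];
```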
John writes:
> $Conf{SmbShareName} = [
> 'C$'
> ];
> #FILES TO BACKUP
> #-
> $Conf{BackupFilesOnly} = {
> 'c' => ['/MS_OUTLOOK/*'],
> };
First, the 'c' should be 'C$' - it should match the share name.
Also, you can't use wildcar
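A hedged sketch of the corrected setting, with the hash key matching the share name exactly and the wildcard dropped per the reply above:

```perl
$Conf{SmbShareName} = [
    'C$',
];
# The key must match the share name above exactly:
$Conf{BackupFilesOnly} = {
    'C$' => ['/MS_OUTLOOK'],   # directory path only; no wildcards here
};
```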
Alex writes:
> The folder shows up in the backup as 0750. The -p is present
Do you mean when you look at the directory permissions below
the PC directory on the backup server; eg, the output from:
ls -ld /TOPDIR/pc/HOST/nnn/fshare/fhome
What permissions are shown when you browse to that dir
Alex writes:
> [r...@qsbackup f%2f]# pwd
> /opt/backuppc/files/pc/mail/184/f%2f
> [r...@qsbackup f%2f]# ll
> total 16
> -rw-r----- 3 backuppc backuppc 26 Apr 17 05:04 attrib
> drwxr-x--- 5 backuppc backuppc 4096 Apr 17 06:00 fetc
> drwxr-x--- 3 backuppc backuppc 4096 Apr 17 06:03 fhome
> drwx
iazmin writes:
> I am using ffp 0.5 on my D-Link DNS-323 NAS.
> I am using this wiki http://wiki.dns323.info/howto:backuppc to install
> backuppc.
>
> I am still unable to start the BackupPC to run on the device. The command
> that I use to start backupPC is:
>
> home/root# /ffp/usr/BackupPC/b
iazmin writes:
> I installed everything provided in
> http://www.inreto.de/dns323/fun-plug/0.5/extra-packages/perl/. So the perl
> version is 5.10-2 while for backuppc I am using 3.1.0 version.
At line 1635 of bin/BackupPC, try replacing this:
foreach my $host ( sort(HostSortCompare keys(%
John,
> Would making the three buttons read something like:
>
>
>
>
>
> with support in the cgi for detecting ?button="Start%20Incr%20Backup"
> with no action or a blank action work?
The reason I moved away from this to small snippets of JavaScript
is that the value string encoding and
Boniforti writes:
> Well, I'll have to wait until *all backups* are done. Do you think the
> graphs will also be updated? Right now I see they don't
> correspond to the pool size.
What is your new filesystem type? Do you have IO::Dirent installed?
IO::Dirent doesn't work correctly
BackupPC users,
If you are interested in nominating your favorite open source
projects on SF.net please see below.
If that happens to be BackupPC, you can use this link:
http://sourceforge.net/community/cca09/nominate/?project_name=BackupPC&project_url=http://backuppc.sourceforge.net/
Than
Ralf writes:
> thanks, this seems to solve the problem:
Sounds like you have the IO::Dirent + xfs problem. It's fixed
in 3.2.0 beta0.
Craig
--
Craig writes:
> Do you have release notes posted anywhere describing the new features
> included in this release 3.2.0beta0?
http://sourceforge.net/project/shownotes.php?release_id=673692&group_id=34854
Craig
--
Interesting thread!
Jeffrey writes:
> That being said, I agree that using a database to store both the
> hardlinks along with the metadata stored in the attrib files would be
> a more elegant, extensible, and platform-independent solution though
> presumably it would require a major re-write of B
G.W. Haywood writes:
> Is there any chance that the daily digest could be, well, daily?
I just changed digest_size_threshhold from 30k to 500k.
Craig
--
Tino writes:
> Well, we've already got MD4 checksums of file blocks. And if I
> understand everything correctly, we DO GET collisions, therefore the
> hash chains.
These collisions are because the BackupPC digest is only computed
over the first part of the file.
> Of course, this if for 256k blo
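The collision point above can be illustrated with a short sketch. This is a simplified illustration of hashing only a file's length plus its leading chunk, not BackupPC's exact pooling scheme (the chunk size and length-mixing here are assumptions for demonstration):

```python
import hashlib

def partial_digest(data: bytes, chunk: int = 256 * 1024) -> str:
    """Hash only the length plus the first `chunk` bytes.

    Two files that agree on length and leading chunk get the same
    digest, which is why a pool built this way needs hash chains
    to resolve collisions.
    """
    h = hashlib.md5()
    h.update(str(len(data)).encode())   # mix in the file length
    h.update(data[:chunk])              # ...and only the leading chunk
    return h.hexdigest()

# Two files differing only beyond the hashed prefix collide by design:
a = b"x" * (256 * 1024) + b"tail-A"
b = b"x" * (256 * 1024) + b"tail-B"
assert partial_digest(a) == partial_digest(b)
assert partial_digest(a) != partial_digest(b"different")
```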
ckandreou writes:
> Executing: /usr/local/BackupPC/bin/BackupPC_archiveHost
> /usr/local/BackupPC/bin/BackupPC_tarCreate /usr/bin/split ccdev10 348
> /usr/bin/bzip2 .bz2 000 /dev/sa0 *
> Xfer PIDs are now 81631
> Writing tar archive for host ccdev10, backup #348 to /dev/sa0
> Got unknown t
Hereward writes:
> I'm looking for a way of extract details of backuppc jobs so that I
> can parse them and produce some graphs of usage and general activity.
For current running jobs, host status, server info (pool stats etc)
and command queues you can do this:
bin/BackupPC_serverMesg statu
Jason writes:
> New to Backuppc. Is there a log for files that have been recovered or
> restored?
If an individual file is downloaded (ie: if a user clicks on a
file name) or a zip or tar file is downloaded that is logged to
BackupPC's main log file. These aren't technically restores,
because Ba
Holger writes:
> Craig was kind enough to unsubscribe him for that before; apparently
> he has now resubscribed. Can you do it again, please, Craig?
Done.
Craig
--
Matthias writes:
> I see that BackupPC knows how often a host was reachable, but I
> can't find the file where this information is stored.
The host status is stored in the BackupPC server. It is periodically
saved to the $TopDir/log/status.pl file so that it can be restored
if you restart Ba
error403 writes:
> incr backup started back to 2009-06-26 22:53:27 (backup #0); for directory
> /var/lib/backuppc/sshfsuser/
> Xfer PIDs are now 30107,30106
> /bin/tar: Substituting 1901-12-13 15:45 for unknown date format `2009-06-26\\
> 22:53:27'
You didn't include your $Conf{TarClientCmd} se
Mirco writes:
> Is there a way to retrieve other information at run time, to
> pass as args to my scripts?
> I mean (for now :-D):
> - user & password (for smb XferMethod, defined in main config.pl or in
> host.pl),
> - values listed in $Conf{BackupFilesOnly}
To get that information
Axel,
> I can also reproduce this problem on the command line, so it looks to me
> as if this could be a pure smbclient bug -- if there isn't something
> bogus in the commandline options or parameters to smbclient:
>
> backu...@backuppc:~ $ /usr/bin/smbclient fnord.example.net\\C\$ -U
> ba
Matthias writes:
> I use backuppc 3.1.0-4lenny1 (Debian).
> I have a bash script, run as DumpPreShareCmd, which should return 0 or 1.
> Unfortunately
> BackupPC didn't receive the return codes from my script.
>
> For test I have made a script which contains only two lines:
> #! /bin/bash
> exit
gimili writes:
> I switched back from tar to rsync. It sounds like rsync is far
> superior. I ran a full backup which took 302 minutes and then an
> incremental which only took 26 minutes. So it seems like things are
> working now as the incremental was much quicker. I am not sure what
> happe
ummmax writes:
> Could anyone give me a hand with a description of the
> /backuppc/pc/$host/backups log? The first two columns make sense, and I
> figured out 3 (start time), 4 (end time), and 6 (size). What about the rest? :)
Look in the documentation.
Craig
---
Kanwar writes:
> full backup started for directory /home; updating partial #147
> started full dump, share=/home
> Running: /usr/bin/sudo /usr/bin/rsync --server --sender --numeric-ids
> --perms --owner --group -D --links --hard-links --times
> --block-size=2048 --recursive --checksum-seed=32761 -
Christian writes:
> Hi,
>
> I'm experiencing some strange difficulties with BackupPC
> (3.1.0-3ubuntu1 on Ubuntu 8.04 LTS). It appears that BackupPC is not
> "recognizing" that it put files into the pool already. The log shows
> nightly a message according to which the pool is 0 GB, consisting of
SourceForge has discontinued Wikispaces, and they are migrating
the wiki pages to MediaWiki or Trac.
I have a support request in to SF for them to port the pages
to MediaWiki. Once that's done, the Wiki link should work again.
I have a snapshot of the final Wikispaces pages, so if it doesn't
mi
SourceForge has completed the transition to MediaWiki, but it
looks like some of the formatting is messed up. Hopefully all
the content is there, but I haven't checked.
I'd appreciate if people could look at pages they have edited or
created to get them back to their former state. SF was meant
t
Matthias writes:
> Every day I get a message in __LOGDIR__/LOG:
> 2009-08-12 02:35:49 Cpool is 322.19GB, 1142028 files (860 repeated, 31 max
> chain, 11424 max links), 4369 directories
>
> What is the meaning of:
> repeated
The total number of pool files with hash collisions. Since the hash is
Jim writes:
> I get decent transfers with actual rsync, but File::RsyncP has some
> serious design issues (see my other post with profiling information
> titled "File::RsyncP issues"). Is the author of that module (Craig
> Barratt) still around and/or maintaining it?
Yes
Holger writes:
> I believe Craig is researching other alternatives (a fuse FS to handle
> compression and deduplication, so BackupPC could, in fact, use native rsync).
> If that proves unviable, upgrading File::RsyncP to protocol version 30 would
> probably be next. But File::RsyncP is open source
my email describes
some of the methodology that you could use to test your current setup.
Craig
-- Forwarded message --
To: rs...@lists.samba.org
Subject: Rsync performance increase through buffering
From: Craig Barratt
Message-ID:
Date: Sun, 08 Dec 2002 23:48:57 -0800
I've b
BackupPC 3.2.1 has been released on SF.net. It is a bug
fix release for 3.2.0. The ChangeLog is appended.
Craig
#
# Version 3.2.1, 24 Apr 2011
#
* Ens
Over the last year there has been some interesting work in discovering
softlink vulnerabilities that affect tar and rsync. The implication
for BackupPC is significant - these vulnerabilities are exposed in
the typical manner that BackupPC uses tar and rsync - in particular,
due to the elevated pri
On Mon, Jul 2, 2012 at 5:10 AM, Bryan Keadle (.net) wrote:
> I see that the perl module,
> /usr/share/BackupPC/lib/BackupPC/CGI/ReloadServer.pm allows the daemon to
> become aware of the hosts file change. How can I call this module from the
> SSI bash script I have that does the provisioning of th
I just released BackupPC 3.3.0 on SF.net. It is mostly minor bug fixes,
with three additional languages and a couple of small features.
Sorry about the long delay, but I've been pretty busy the last few years...
Thanks to everyone who submitted patches or improvements, and also for the
on-going
g new in the next few months, which probably means there
won't be enough time to finish 4.0.
Craig
On Sun, Apr 14, 2013 at 7:38 PM, Adam Goryachev <
mailingli...@websitemanagers.com.au> wrote:
> On 15/04/13 11:05, Craig Barratt wrote:
> > I just released BackupPC 3.3.0 on SF
Eduardo,
Can you send me a complete debian-init file please?
Craig
--
> Craig, are you going to add this one to the next revision of backuppc?..
Yes.
Craig
On Mon, Jun 3, 2013 at 9:28 AM, Eduardo Díaz Rodríguez wrote:
> **
>
> Craig, are you going to add this one to the next revision of backuppc?..
>
>
> I am glad to help. :)
>
> On 2013-06-02 21:32, Eduardo Día
BackupPC community,
I'm pleased to announce that BackupPC 4.0.0alpha0 has been released on
SourceForge at:
https://sourceforge.net/projects/backuppc/files/backuppc-beta/4.0.0alpha0/
4.0.0 is a significant upgrade and rewrite. It should be backward
compatible with existing installations.
The r
>
> Would appending the file size to the md5 (either literally or
> notionally) further decrease the astronomically small chance of a
> non-purposely constructed collision?
I don't think so. It would also cause the digests to be different
from the full-file digests in rsync 3.X. (In Backu
>
> I find it interesting that the client file path is being cut off ~= 100
> characters.
That's a very good observation and an important clue. The file header in
any tar file is limited to 100 characters, and there is a special extension
(basically another dummy file header with a payload conta
t::Std;
use File::Path;
On Sun, Jun 23, 2013 at 6:10 PM, Craig Barratt <
cbarr...@users.sourceforge.net> wrote:
> BackupPC community,
>
> I'm pleased to announce that BackupPC 4.0.0alpha0 has been released on
> SourceForge at:
>
> https://sourceforge.net/pro
n has not changed since then, nor
> has the system been updated.
>
> Is there a way to increase the debug reporting level to get more clues?
> On 06/25/2013 04:42 AM, Craig Barratt wrote:
>
> I find it interesting that the client file path is being cut
BackupPC community,
I'm pleased to announce that BackupPC 4.0.0alpha1 has been released on
SourceForge at:
https://sourceforge.net/projects/backuppc/files/backuppc-beta/4.0.0alpha1/
BackupPC 4.0.0alpha1 contains a few bug fixes since BackupPC 4.0.0alpha0
was released just over a week ago. If y
On Tue, Jul 2, 2013 at 4:39 PM, Brad Alexander wrote:
> I know that the web interface backfills backups from previous ones. But I'm
> wondering... Is there a straightforward way to see which files were backed up
> in an incremental from the web?
Les' suggestions are the best idea for getting all
>
> Starting BackupPC: Can't locate BackupPC/Lib.pm in @INC
The $Conf{InstallDir} setting isn't used to find the libraries. It's just
to help configure.pl find the installation location the next time you
upgrade.
Look at the first few lines of code in /usr/local/backuppc/bin/BackupPC for
a line
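For reference, that line typically looks like the following (the actual path is whatever configure.pl substituted at install time; the one shown is an assumed default, not necessarily yours):

```perl
use lib "/usr/local/BackupPC/lib";
```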
3 PM, Igor Sverkos wrote:
> Hi,
>
> Craig Barratt wrote:
> > - rsync-bpc-3.0.9.1.tar.gz: a modified rsync that
> > runs on the server that has a shim layer that
> > interfaces directly to the BackupPC file system.
>
> Not sure where this modified rsync version will be us
Yes, the cygwin-rsync release is very out of date.
That old patch contained some performance improvements and added the
--checksum-seed option, both of which have been included upstream circa
rsync 2.6.2. So the README file is out of date - there is no longer a diff
file.
Cygwin actively discour
Jean,
I should have been clearer in my explanation. This isn't a user feature in
4.0.0 that allows users to delete any backup.
What I was trying to say was that 4.0.0 includes the ability to delete any
backup, while maintaining the consistency of the reverse deltas. That
simplifies the deletion
Yes - you have the right solution. These files shouldn't be in the release.
I was using a new MacOSX machine, and I botched the final tarballs for
alpha1 by using the default bsdtar on MacOSX (instead of gnu tar), and it
added the MacOSX AppleDouble files, which contain metadata which is useless
Thanks to Ray Frush, I have released a new version of cygwin-rsyncd, the
cygwin-based rsync for WinXX-based client machines. As users have noted,
the prior version is many years old.
The released files can be found on SF.net at:
https://sourceforge.net/projects/backuppc/files/cygwin-rsyncd/3.0.9
John,
> I would like to post questions for BackupPC; kindly grant me the access
> and advise how it is done?
You just did ask a question of the entire user list. Nothing more is
needed. Send your question to this list, as you just did.
Craig