Updating the list.
With the help of Daniel and Chris we figured out the problem. With
mod_perl enabled you need to restart apache after upgrading BackupPC,
otherwise apache will still be running parts of the old BackupPC code.
Craig
---
John,
> Well some more digging and a recompile of perl set to use system
> malloc() instead of its built-in malloc() and the problem has gone
> away (usemymalloc=n). Something the restore process does really
> causes perl malloc to go crazy on large restore jobs. Playing with
> the write buf
Rich writes:
> I don't think BackupPC will update the pool with the smaller file even
> though it knows the source was identical, and some tests I just did
> backing up /tmp seem to agree. Once compressed and copied into the
> pool, the file is not updated with future higher compressed copies.
>
Scott writes:
> TarClientCmd is
>
> /usr/bin/sudo /bin/tar -c -v -f - -C $shareName+ --totals
>
> and /bin/tar --version gives:
>
> tar (GNU tar) 1.15.92
>
> The backup fails with:
>
> 2007-12-04 22:09:50 full backup started for directory /data
> 2007-12-04 22:27:23 Got fatal error during xfe
Jon writes:
> > I'm sure it works, there must be some error in your settings. Could you
> > maybe post your BackupFilesExclude and BackupFilesOnly settings?
>
> Sure. In /etc/BackupPC/config.pl
>
> $Conf{BackupFilesOnly} = {};
> $Conf{BackupFilesExclude} = {};
>
> In /etc/Backu
Nils writes:
> Maybe because you override $Conf{RsyncClientPath} *after* setting
> $Conf{RsyncClientCmd} for this host?
Nice try, but the config settings are order-independent. The values get
interpolated just before the command is run.
Jon, most likely the per-client config isn't getting read.
Jim writes:
> I have a server that has been happily running backuppc 2.1.0 for well
> over a year, and decided to upgrade it to 3.1.0. The server is Debian
> stable, but BackupPC has only been installed from source, never
> through apt-get or aptitude. So I stopped the BackupPC process, and
> moved
Samrat writes:
> Samrat Mitra wants you to join Yaari!
>
> Is Samrat your friend?
You are now unsubscribed from this list.
Craig
-
SF.Net email is sponsored by:
Check out the new SourceForge.net Marketplace.
It's the best
John writes:
> I am using BackupPC-3.1.0 and I want to back up a share using the
> following exclusion patterns under rsync:
>
> + */
> + */Maildir/**
> - *
>
> basically I want to include any directory tree starting with the
> directory Maildir one level below the root of the transfer: sp
Alexander writes:
> >#
> ># Version 3.1.0beta0, 3 Sep 2007
> >#
> >* Made the default charset for BackupPC_zipCreate cp1252, which
> > appears to work co
Les writes:
> i usually achieve this via a perl one-liner like below.
>
> perl -pi -e "s|$Conf{CgiAdminUsers} =
> ''|$Conf{CgiAdminUsers} = 'root administrator'|g"
> /etc/BackupPC/config.pl
> perl -pi -e "s|$Conf{CgiAdminUserGroup} =''|$Conf{CgiAdminUserGroup}
> = 'administrators'|g"
Jesus writes:
> NT_STATUS_ACCESS_DENIED is not reported, why?
When this error happens at the top-level share it is considered
a fatal error, so it doesn't increment the Xfer error count.
Just to confirm: is the backup complete (ie: are there any other
files or directories that should be backed u
Erik writes:
> I recently did a large restore, 25Gb in 177 minutes. At a first
> glance it seemed to have succeeded, but now I find that in some
> directories files have been misplaced. I am restoring from one
> windows client to another using rsyncd.
This sounds like a problem specific to rsync
Erik writes:
> Ubuntu release 3.0. I haven't upgraded to 3.1 because I prefer to stick
> to the repository, unless it can't be avoided.
It appears the file list sort order is different between BackupPC
and rsync. Let's take this off list. I'll send some debug code
that we can use to try to figu
Sean writes:
> I'm backing up a Linux server via SSH+tar, and it keeps aborting with
> a file name as the error. Most files back up just fine, however.
> The file names are not unusual, except that they have spaces in them.
> If I delete the file, the same thing happens with files further down.
>
>
Mark writes:
> My home backuppc server has filled up (500GB) and I'd like to delete
> some old backups. To free some space, in the config.pl file I changed:
>
> $Conf{FullKeepCnt} = 3;
>
> $Conf{FullKeepCntMin} = 1;
> $Conf{FullAgeMax} = 30;
>
> I then restarted backuppc and ran BackupPC_ni
Jesus writes:
> There are more files in the share and the full backup is created
> successfully. This is the XferLOG of the full backup:
Ok, this is a bug that I will fix in the next version.
The fix is simple, just add an increment of the error counter
when NT_STATUS_ACCESS_DENIED is seen. A patch i
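The patch is cut off above; the fix Craig describes would look something like this sketch (variable names are assumed for illustration, not taken from the actual BackupPC_dump source):

```perl
# Bump the Xfer error count whenever smbclient output reports
# NT_STATUS_ACCESS_DENIED, so the error is no longer silently dropped.
if ( $line =~ /NT_STATUS_ACCESS_DENIED/ ) {
    $XferErrCnt++;
}
```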
Dan writes:
> I've written and tested a start script for FreeBSD. Attached are
> freebsd-backuppc (for init.d/src) and a section for the README in
> init.d.
Thanks. 3.1.0 already includes one from Gabriel Rossetti.
Craig
Paul writes:
> for a long time there was an issue with using rsync as the transport,
> since hard links would not be preserved in the backups. i believe this
> is fixed now -- can someone remind me in which release the fix appeared?
> (i'm running 3.0.0.)
Yes, hardlinks with rsync work in 3.x.
Pablo writes:
> Is it possible to achieve backups with a granularity finer than a day? I
> see that a lot of options have 1 day as their smallest value; is it
> possible to use backuppc to create backups every hour?
You need to:
- make sure $Conf{WakeupSchedule} wakes up the server
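The list above is cut off; hourly backups generally involve a couple of settings along these lines in config.pl (the values here are illustrative examples, not recommendations):

```perl
# Wake the server every hour so backups can be scheduled that often.
$Conf{WakeupSchedule} = [0..23];

# Allow an incremental roughly every hour; the period is expressed
# in days, so ~1 hour is about 0.04.
$Conf{IncrPeriod} = 0.04;
```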
Pablo writes:
> Thanks for that information, this brings another question:
>
> What's the point of having multiple full backups? Wouldn't that be
> redundant with the information of a full backup and the incrementals?
Full backups provide two things:
- a reference point for the following incr
[EMAIL PROTECTED] writes:
> Backuppc (3.0.0) has been running smoothly for me for a week or so
> (backing up 2 linux boxes and 2 windows boxes).
>
> But just yesterday, my log was filled with errors of BackupPC_link
> (usually -3 or -4).
>
> I searched the mailing list archives and often this
Paul writes:
> i may have been mistaken about what changed. googling, i just
> found the following thread, from this list, from august of this
> year. rob owens describes my problem exactly, and claims the
> buttons stopped working with 3.0.0, not with a change in browser:
>
>
> http://www.ma
Christoph writes:
> As far as I read the docs, starting with 3.0.x backuppc supports
> (unix/linux) acls and extended attributes (MacOS). But when adding
> option --acls to RsyncArgs, backuppc stalls. The BackupPC_dump process
> and the rsync process on the client are just hanging in select calls.
Frederic writes:
> Since I upgraded from 3.0 to 3.1 I've had this error when I try to view
> the directory listing of a client.
>
> Software error:
>
> Can't locate object method "dirRead" via package "BackupPC::Lib" at
> /usr/local/BackupPC/lib/BackupPC/View.pm line 133.
>
>
> What can I do to resol
Paul writes:
> sigh. i'm not doing well on this, am i. you're right -- i just checked
> the 3.0.0 tarball. i must have dropped '--hard-links' when i brought my
> config forward from 2.1.x.
>
> sorry, again, for the noise.
No, this time it's not your issue. During an upgrade existing
config s
Jockes writes:
> $Conf{TarClientCmd} = '/usr/bin/env LC_ALL=C $tarPath -c -v -f - -C
> $shareName'
> . ' --totals';
You are running tar as the backuppc user, so it probably
doesn't have access to /var/mail.
Craig
-
Simon writes:
> I'm trying to run a backup from a Netware 6.5 server running rsync 2.6.3. I
> tested if the netware side is working by executing the following command on
> my backuppc system directly on the console:
>
> rsync -D --numeric-ids --perms --owner --group --links --hard-links --times
>
Martin writes:
> Don't ask me why $Conf{BackupFilesOnly} did not work and using a file
> gets this right, but it works now!
$Conf{BackupFilesOnly} is less flexible than rsync's
exclude/include options, since it is common to all the
XferMethods. $Conf{BackupFilesOnly} assumes that it is
rooted t
Arron writes:
> Yup, it's in there and set to en
>
> $Conf{Language} = 'en';
The error is actually a bit more serious: it can't
read or find the config at all. Something is wrong
with the install: incorrect paths or permissions.
Craig
Rich,
> The DHCP flag is a bit of a misnomer.. I have DHCP configured to
> register names into my local bind/named and I do not need the DHCP flag.
You're right. I regret calling it that. It really only needs
to be set for hosts where a broadcast nmb lookup fails, and
a unicast one works. If n
Brad writes:
> I've seen a couple of threads regarding this topic suggesting for the most
> part installing the RPMs via Dag Wieers' repo or ATrpms. I've got RHEL5 and
> backuppc 3.0 and 3.1. I did a system update in November and it upgraded
> perl from 5.8.7 to 5.8.8. Since then I have a couple syst
Tim writes:
> archive: /usr/share/BackupPC/bin/BackupPC_archive: bad reqFileName (arg
> #3): archiveReq.host-name.cron.0
The error checking needs to be a bit more generous. In particular, this
line in BackupPC_archive
if ( $ARGV[2] !~ /^([\w.]+)$/ ) {
should be something like:
if ( $A
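The replacement code is cut off above; the failing name, archiveReq.host-name.cron.0, contains a "-", which the character class rejects. One hedged guess at the more generous check (the error handling here is a placeholder, not the released source):

```perl
# Also accept "-" in the request file name (e.g. host-name).
if ( $ARGV[2] !~ /^([-\w.]+)$/ ) {
    print(STDERR "BackupPC_archive: bad reqFileName (arg #3): $ARGV[2]\n");
    exit(1);
}
```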
David writes:
> ./backuppc start
> Starting backuppc
> syntax error at /usr/share/backuppc/lib/BackupPC/Lib.pm line
> 139, near "} else"
> Compilation failed in require at /usr/share/backuppc/bin/BackupPC line 60.
> BEGIN failed--compilation aborted at /usr/share/backuppc/bin/BackupPC line 60.
>
Adam writes:
> But now my backup is timing out on this file, and it doesn't even run
> long enough to backup this entire file, so it removes the file each time
> it attempts to run the backup.
What sort of timeout do you get? If it is an ALARM signal you
should significantly increase $Conf{Clie
Tim writes:
> Will this get added to future releases?
Yes.
Craig
David writes:
> Well, all was going well, but almost 3 hours into my backup I get the
> following error:
>
> 2008-01-14 13:23:06 Got fatal error during xfer (Tar exited with error 512
> () status)
> 2008-01-14 13:23:11 Backup aborted (Tar exited with error 512 () status)
Look in the XferLOG.bad
David writes:
> I took a look at XferLOG.bad.z file and I couldn't see any errors from tar
> at all. I'm assuming that the errors would be at the end of the file. At
> the time the error occurred it was backing up /music. This mount point
> contains 9351 files. Would that cause a problem ? Wo
John writes:
> I'm running BackupPC 3.0.0 on CentOS 4, with ext3 file systems. When
> trying to backup one of our servers, BackupPC logs the below error:
>
> Too many links at /backuppc/bin/BackupPC_tarExtract line 320
mkdir() is failing. It appears that certain file systems have a limit
of ~32
dan writes:
> it would be useful to be able to selectively limit the restore
> methods: allow tar and zip but disallow direct restores, for instance.
You can. Direct restores are disabled by making the corresponding
restore command empty (eg: $Conf{TarClientRestoreCmd},
$Conf{RsyncClientRestore
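The list above is cut off; in config.pl the idea looks roughly like this sketch (which restore commands exist depends on your XferMethod):

```perl
# Emptying a restore command disables direct restores for that
# XferMethod; tar and zip downloads through the CGI still work.
$Conf{TarClientRestoreCmd}   = undef;
$Conf{RsyncClientRestoreCmd} = undef;
$Conf{SmbClientRestoreCmd}   = undef;
```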
Lorenz writes:
> i'm using backuppc 3.10 on debian 4 to backup a linux machine via rsync.
> backuppc logs
>
> backup/mmf/admins.MYD got digests fa621bfabbcb2994d9b4f7f004a57d2a vs
> fa621bfabbcb2994d9b4f7f004a57d2a
>
> there are about 25 similar lines.
> what does that mean? those files aren't
jbk writes:
> I have had to implement a work around under Fedora 7. It all
> surrounds getting 'BackupFilesExclude' to work using 'smb'
> as the 'Xfermethod'. The main issue is that variable
> substitution for the file list does not work.
>
> Xfer.log output for a full backup:
> #
Gaetano writes:
> After full localhost backup (many gigabyte) I got the following error:
>
> BackupPC_link got error -4 when calling
> MakeFileLink(/data/BackupPC/pc/localhost/1/f%2f/famule/fIncoming/attrib,
> cb5bab76b89c25b11c8f575a567b571a, 1)
>
> I checked the following facts:
>
> 1) /dat
Coyote writes:
> /usr/share/backuppc/bin/BackupPC_tarCreate -h 192.168.0.11 -n 0 -s \/ -t *
> >bambi.tar
You need to quote or escape the "*", eg:
/usr/share/backuppc/bin/BackupPC_tarCreate -h 192.168.0.11 -n 0 -s / -t '*'
>bambi.tar
Craig
Tim writes:
> Some time ago, someone e-mailed a script that performed a dd/netcat in
> an rsync-like manner: it hashed blocks of the disk and if they matched
> between the two sides they were not sent. If they didn't, the block was
> sent. The idea was to limit the amount of data that would be
Bradley writes:
> * I'm getting various failures on defiant. Everything from killing the KDM
> login to killing the machine:
>
> 2008-02-07 07:53:06 Got fatal error during xfer (Unexpected end of tar
> archive)
> 2008-02-07 07:53:14 Backup aborted (lost network connection during backup)
> 2008
Pete writes:
> With that said I do have a small problem which has so far eluded my
> attempts at a solution, it has to do with the BlackoutPeriods.
> If I understand it correctly this parameter sets the window in which no
> backups are supposed to happen, and that is exactly what I want, yet all
>
jbk writes:
> My mistake but that still does not explain why $fileList is
> being backslash escaped.
No, $filelist is not a recognized substitution, so BackupPC
thinks you really want a literal '$filelist' in the argument
list, so the '$' is backslash escaped. In contrast, $fileList
gets replace
KOUAO writes:
> Fatal error (bad version): Permission denied, please try again.
Your ssh command (or sudo) is giving the error:
Permission denied, please try again.
You need to find out what part of the command is failing.
Craig
Steven writes:
> We are running backuppc 2.1.2 on a debian 4 host, backing up other
> debian hosts. Rsync version is 2.6.9 protocol version 29 on server and
> clients.
>
> I've set up backuppc on other systems and never run into this issue, but
> whenever we run backups the $Conf{BackupFilesExc
Steven writes:
> Running: /usr/bin/ssh -q -x -l netbackup freedom.rapidxdev.com
> /usr/bin/sudo /usr/bin/rsync --server --sender --numeric-ids --perms
> --owner --group --links --times --block-size=2048 --recursive -D
> --bwlimit=200 --ignore-times . /
Yes, you can see none of the excludes make i
Alex writes:
> I've got a laptop (mine :)) which has been backed up successfully via
> smb for a while now. All of a sudden, it's losing the network
> connection, along with the following errors.
I've seen cases where WinXX disk corruption can cause smbclient to fail.
Have you tried running the
Steven writes:
> Yes, I know that seems like the likely answer, but unfortunately it
> isn't. This happens on all of the hosts, regardless of whether or not
> there are per host configuration files.
Well, if you are confident the likely explanation is wrong, I'd
recommend adding some debug state
Mirco writes:
> On the server I have an ext3 fs, and in this case the client is Win XP
> with an NTFS filesystem.
If you look at the source for Archive::Zip, the zip file format
it writes uses a 32 bit unsigned integer for the file length.
So zip (or Archive::Zip) imposes a 4GB file size limit.
Craig
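The 4GB figure follows directly from the field width: a 32-bit unsigned length can count at most 2^32 bytes.

```shell
# Largest value a 32-bit unsigned field can hold, in bytes (= 4 GiB):
echo $((1 << 32))
# 4294967296
```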
Curtis writes:
> Hey, folks! It's W. Curtis Preston, the author of Backup & Recovery and
> the webmaster of www.backupcentral.com.
>
> I'd like to add the backuppc-users list to the backup software mailing
> lists that I support on BackupCentral.com. I added the NetWorker,
> NetBackup, and TSM
Stian writes:
> When I replied to the thread four months later, I got no reply, so I am
> resending it in case Craig doesn't check such old threads...
Yes, your reblocking patch is already on my todo list.
Craig
Jean-Claude,
> RFC 2822 forbids the inclusion of non-ASCII characters in any e-mail
> header, including the subject.
You're right. Try this:
- Add:
use Encode;
near the top of bin/BackupPC_sendEmail (after the other "use" lines).
- Change the charset in $Conf{EMailHeaders
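The instructions above are cut off; the general technique with Encode looks like this sketch (the subject string is just an example):

```perl
use Encode;

# RFC 2047 encoding makes a non-ASCII subject legal in a mail header:
my $subject = "Sauvegarde termin\x{e9}e";
my $encoded = encode("MIME-Header", $subject);
# $encoded is now something like "=?UTF-8?B?...?="
```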
Krsnendu writes:
> I want to exclude the directory which contains vmware virtual machines.
> I have tried to exclude it on my localhost computer and on my windows
> laptop. But both of them are still trying to backup the directory I
> want to exclude.
>
> Have I used the wrong formatting?
>
> I
Tim writes:
> BackupPC seems to be working great, however even on a fresh initial
> full backup, it reports there are existing files. How can that be if
> it's the first backup? Thanks for the info.
There are two places duplicates are detected.
Initially there are no files in the pool, so all fi
Rahul writes:
> I have enabled ssh for the user and then tried to take a backup, but this
> time I got the following error
>
> --Running: /usr/bin/ssh -q -x -n -l root 192.168.0.83 env LC_ALL=C
> --/bin/gtar -c -v -f - -C /data/BackupPC --totals .
If you pasted this correctly, then the "--/bin/g
Carlos writes:
> For some reason all backups for a host have disappeared, but the data is
> still there, with correct permissions and there is free space. According
> to the web interface, that host has never been backed up.
>
> -rw-r- 1 backuppc backuppc 1940766 2008-02-23 04:09 backups
> -
Raman writes:
> Ahh, that's it -- since the backup of the previous hosts the night
> before finished about an hour later than the current night, only 23
> hours had passed since the previous backup. My incremental period is
> set to 0.97 (equivalent to 23.28 hours), so therefore it did not run.
>
Tim writes:
> I'm using the Rsync xfer method. From my config here
> is the RsyncShareName => /mnt/wkm_main/vol1
>
> and here is what I'm trying to exclude
> BackupFilesExclude => ['/mnt/wkm_main/vol1/Email'];
The exclude should be relative to the share, so
you need:
BackupFilesExclude => ['/
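The example above is cut off; since the exclude is relative to the share root (/mnt/wkm_main/vol1 here), it would presumably look like this sketch:

```perl
$Conf{RsyncShareName}     = '/mnt/wkm_main/vol1';
# Relative to the share, not an absolute filesystem path:
$Conf{BackupFilesExclude} = ['/Email'];
```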
Jean-Claude writes:
> I finally managed to solve the problem. This is a patch for BackupPC 3.0
> (in addition to your configuration changes) :
Thanks for the patch. However, I'm wondering why this existing
code doesn't fix the subject:
if ( $utf8 ) {
binmode(MAIL, ":utf8");
Koen writes:
> Ok. Another shorter post about this problem (more info 7/03 -
> BackupPC_nightly not running)
>
> BackupPC_nightly doesn't run anymore. Backuppc also doesn't start the mail
> alert system, since it seems to start after a successful nightly cleanup.
>
> The only thing I think which
Sandro writes:
> Now there's something strange: 0 errors, but the backup aborted!
> Are there any known problems with this samba release and
> backuppc?
I've heard of this once before.
I need to send you some extra debug statements to figure
out why BackupPC_dump thinks something went wro
Nils writes:
> I'm still seeing this error for this one machine. Doesn't anybody have
> any idea? Craig, what could this be?
Sorry about the delay in replying. I don't have a theory
about why the link should fail. The path name length
should be ok.
However, when the link fails, BackupPC_dump t
Sean writes:
> Is there a backuppc command line reference anywhere? I would like to
> write a script to disable backups on a host until the next wakeup.
No, sorry it isn't documented.
You should just look at the code to figure out the commands.
I'm not sure if this is exactly what you need, bu
Gavin writes:
> etc/config.pl
> $Conf{WakeupSchedule} = [ 1, '1.5', 2, '2.5', 3, '3.5', 4, '4.5', 5,
> '5.5', 6, '6.5', 7, '7.5', 8, '8.5', 9, '9.5', 10, '10.5', 11,
> '11.5', 12, '12.5', 13, '13.5', 14, '14.5', 15, '15.5', 16, '16.5',
> 17, '17.5', 18, '18.5
Daniel writes:
> does anyone know what the specific dependency on file_rsyncp_perl 0.68
> is in backuppc? I'm working on this nexenta install and only have 0.52
> available. I forced the backuppc install but I'd like to know if the
> version number was just chosen as that is the version backupp
Matthias writes:
> After removing/purging backuppc and install again (apt-get remove /
> apt-get install) I have strange effects.
>
> The title "Host Backup Summary" is displayed with a green
> background. Does this indicate access problems?
Sounds like you installed a much older version of Ba
Claude writes:
> I use backuppc V2.1.2 on a Debian V4. All is ok, but I have a problem
> with accented characters in the CGI and when I restore files with
> accents. I have configured it in French but the accents are not good. Where
> can I configure this? My Debian uses UTF8 locales.
Correct Utf8 s
WebIntellects writes:
> Running: /usr/bin/ssh -l bupc admin.DOMAIN.com sudo /usr/bin/rsync --server
> --sender --numeric-ids --perms --owner --group --devices --links --times
> --block-size=2048 --recursive --ignore-times . /var/lib/mysql/
> Xfer PIDs are now 12716
> Got remote protocol 29
> Neg
Paul writes:
> I repeatedly get messages like
>
> Backup failed Call timed out: server did not respond after 2
> milliseconds opening remote file
>
> when backing up relatively large files (photos, videos, generally
> bigger than several mb) off a laptop via smb.
>
> Unfortunately there
John writes:
> I am noticing an issue in our backuppc installation.
>
> Every Monday we have 20-30 hosts (of 84) that were not backed up in
> the prior 24 hours. It seems the /var/log filesystem takes much longer
> on the sunday/monday backups than it takes during the rest of the
> week.
>
> [sni
Jason writes:
> Hi, I have BackupPC 3.1.0 installed on FreeBSD 6.2 but I require the
> per-host configuration files to be stored in the same locations as
> they were in version 2.0.2 (ie. TOP_DIR/pc/HOST/config.pl )? Is there
> a way I can achieve this without upgrading from 2.0.2 to 3.1.0?
Tino writes:
> I found a problem. IO::Dirent returns 0 as the type for the directories,
> so BackupPC::Lib->find() doesn't descent into them. Why it does so if
> run manually - I don't know.
>
> It does return a type 4 on ext3, on xfs it's always 0.
Good detective work.
There is a check in Back
Micha writes:
> I've been scratching my head over this for more than a week.
> I have two backup servers running, both on CentOS (64 bit). One has been
> humming along nicely for several months now, backing up several servers.
>
> The newer one, configured to backup some workstations, won't star
Micha writes:
> $Conf{BlackoutPeriods} = [
> {
> 'hourEnd' => '11.5',
> 'weekDays' => [
> 0,
> 1,
> 2,
> 3,
> 4
> ],
> 'hourBegin' => 7
> }
> ];
Looks ok.
> I don't see any mention on the host summary page of blackout period.
I meant the per-ho
dan writes:
> zip does not have a 2GB limit. you can use BackupPC_zipCreate
> to make a very large zip file, well over 2GB, just not through
> the CGI interface.
Tino meant a per-file limit, not an archive file limit.
Craig
--
Tony writes:
> On one of my BackupPC setups, I back up a lot of data. On occasion
> things run for more than 24 hours and I start getting
>
> Botch on admin job for admin : already in use!!
>
> messages in the log file. I'm guessing that this means that a
> BackupPC_nightly has been queue
Tino,
Very good explanation. One minor comment...
> The whole process works roughly like this:
> - backups are done by BackupPC_dump on a per-client basis, no explicit
> pooling involved yet (apart from full backups where existing/unaltered
> files are linked into the new backup from last full,
komodo writes:
> I need to define more than one ArchivePreUserCmd but I don't know how.
> Is there any way to do this ?
Put the multiple commands into a shell script and point
ArchivePreUserCmd at that script.
Craig
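In script form the suggestion might look like this sketch (the commands and path are placeholders for whatever your pre-archive steps really are):

```shell
# Hypothetical wrapper script; point $Conf{ArchivePreUserCmd} at it.
cat > /tmp/archive_pre.sh <<'EOF'
#!/bin/sh
set -e                          # stop at the first failing command
echo "pre-step 1: snapshot db"  # placeholder for a real command
echo "pre-step 2: rotate logs"  # placeholder for a real command
EOF
chmod +x /tmp/archive_pre.sh
/tmp/archive_pre.sh
```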
[EMAIL PROTECTED] writes:
> Is there anything we can do about this? In the last week, BackupPC has
> become my #1 source of such e-mails!
Assuming there are no complaints, starting this weekend I'm planning
on restricting backuppc-users and backuppc-devel to subscribed users
only. This should
[EMAIL PROTECTED] writes:
> When backing up a certain host running Debian stable I get some
> comprehensible directories like /home or /srv but most directories or
> files are like this &íÊH àà,GFäTþwÝh Cp¶?ÑSákhU¹:!¤WB.
> This can be localhost, so no encryption seems to be involved.
Tony Del Porto writes:
> I'm testing BackupPC with a couple of machines and full backups don't
> complete. The test machines are a Redhat 7.2 machine and an OS 10.3.9
> machine. Both are using tar (or xtar) and ssh. The errors in the Xfer
> log are basically the same:
>
> Redhat 7.2
> ...
>
John Pettitt writes:
> > > Actually no - I specifically don't want to go though a tar/untar step
> > > because for some reason on my FreeBSD 5.4 box BackupPC_tarCreate has a
> > > memory leak that causes it to fail after a few thousand files. I was
> > > looking for a way round that bug.
> >
> >
"Chuck Witt" writes:
> I am wondering if there is a way to "regenerate" the data that exists in a
> file named "backups". It is located in /data/pc/username/. There appears to
> be an accompanying file named "backups.old". I don't know what happened but
> my backuppc machine shut down unexpectedly
Olivier LAHAYE writes:
> Is there a way to get an NTFS encrypted file using smbclient (even not
> decrypted) so it can be backed up using backuppc
> (http://backuppc.sourceforge.net) ?
>
> I have exported the NTFS keys in a pfx file. could it help copy the file?
>
> the aim is that files must b
Olivier LAHAYE writes:
> Is there a way to use 7zip (/usr/bin/7za) to compress files?
> If yes, what variables control that?
The compression performance looks good, but unless there is a
perl module interface for 7zip then there is no chance to
integrate it into BackupPC. Currently BackupPC use
Les Mikesell writes:
> On Wed, 2005-08-03 at 12:54, Craig Barratt wrote:
>
> > Assuming there are no complaints, starting this weekend I'm planning
> > on restricting backuppc-users and backuppc-devel to subscribed users
> > only. This should eliminate the annoyin
Kanwar Ranbir Sandhu writes:
> Hello fellow BackupPCians,
>
> I noticed that on each hosts summary page, the blackout period isn't
> being printed. Instead, it stops short of printing the times. For
> example:
>
> Because mona has been on the network at least 2 consecutive
> times
Thanks to everyone for their offers of help in administration
of the mail lists. Paul Lukins is going to help out.
Starting yesterday, only subscribers can post to backuppc-users
and backuppc-devel. Non-subscriber email will be moderated by
Paul - it will be either discarded (already 3 paypal em
Chris Horn writes:
> I have been having trouble getting BackupPC to do a proper tar incremental
> backup. The problem seems to be with tar and the $incrDate format.
> My setup is all on localhost, with no SSH or other remote access.
>
> Here is some sample log output:
>
> - SNIPPET OF LOG
"Gerald Richter" writes:
> I have been using backuppc for over a year now without any problems; recently I
> started to get the following error when I do a full backup of a Windows XP
> PC:
>
> Call timed out: server did not respond after 2 milliseconds opening
> remote file \Programme\Creative\Sha
Carl Wilhelm Soderstrom writes:
> On 08/18 01:56 , Paul Fox wrote:
> > unless you care about preserving hardlinks. which probably isn't
> > why the original poster is using it, but might be.
>
> AFAIK, rsync preserves hardlinks. that's what the -H option is for.
rsync does, but BackupPC's File:
Craig Barratt writes:
> One explanation (not proven) is that antivirus SW, together with
> other paging activity, can infrequently produce these very long
> response times. You could do a test by disabling your AV SW.
Doug just reminded me that this is in fact establishe
"Kris S. Amundson" writes:
> I would like to purge a single file from ever existing on our backup
> system. How can I do this?
If it is a particular path common to all backups, you can
manually "mangle" the name (add "f" to the start of each
directory and file) and just rm the files.
The copy of
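The manual "mangle and rm" recipe described above, sketched in shell against a hypothetical pc/ tree (host, backup number, and file name are made up):

```shell
# Hypothetical layout: each path component is "mangled" with a leading "f",
# and the share name is URI-escaped (/ becomes %2f).
mkdir -p '/tmp/pc/host1/0/f%2f/fhome/falice'
touch    '/tmp/pc/host1/0/f%2f/fhome/falice/fsecret.txt'

# Remove every backed-up copy of /home/alice/secret.txt for this host:
find /tmp/pc/host1 -type f -name 'fsecret.txt' -delete
```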
[EMAIL PROTECTED] writes:
> I've seen on the BackupPC-devel mailing list archive that a new xfer
> method called BackupPCd seems to be currently in development.
>
> I have a few questions about that:
>
> - What will be the benefits of this client compared to rsync/smb or tar
> transfer meth
Tony Del Porto writes:
> On Aug 4, 2005, at 7:32 PM, Craig Barratt wrote:
>
> > Tony Del Porto writes:
> >
> >> I'm testing BackupPC with a couple of machines and full backups don't
> >> complete. The test machines are a Redhat 7.2 machine and an