Ralf Gross wrote:
Gerald Brandt wrote:
You may want to look at this thread
http://www.mail-archive.com/backuppc-users@lists.sourceforge.net/msg17234.html
I've seen this thread, but the pool sizes there are at most in the low-TB
range.
Ralf
Not all of them...
Richard Shaw wrote:
On Fri, Feb 19, 2010 at 11:39 AM, Mike Bydalek
mbyda...@compunetconsulting.com wrote:
On Fri, Feb 19, 2010 at 10:11 AM, John Moorhouse
john.moorho...@3jays.me.uk wrote:
I'm happily using backupPC to backup a number of machine within our home
network, I'm
Ralf Gross wrote:
Hi,
I'm faced with the growing storage demands in my department. In the
near future we will need several hundred TB. Mostly large files. ATM
we already have 80 TB of data which gets backed up to tape.
Providing the primary storage is not the big problem. My biggest
concern
James Ward wrote:
I'm trying to figure out how to do something in the GUI.
I have the following exclude: /data0*
Now I would like to add an exception to that rule and back
up: /data02/vodvendors/promo_items/
Is it possible to set this up in the GUI? I can't figure it out.
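There doesn't appear to be a GUI field for an exclude exception, but in the per-host config file one hedged approach (assuming the rsync XferMethod, where include rules listed before excludes win; the option names depend on your BackupPC version — newer releases have `$Conf{RsyncArgsExtra}`, older ones require appending to `$Conf{RsyncArgs}`) might look like:

```perl
# Hypothetical sketch: pass include rules ahead of the exclude so that
# /data02/vodvendors/promo_items/ survives the /data0* exclusion.
$Conf{RsyncArgsExtra} = [
    '--include=/data02/',
    '--include=/data02/vodvendors/',
    '--include=/data02/vodvendors/promo_items/**',
];
$Conf{BackupFilesExclude} = { '*' => ['/data0*'] };
```

rsync evaluates filter rules in order, so the include lines must come before the exclude takes effect.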
Chris Baker wrote:
I don't know what your level of expertise is. Please accept my apology if
you already know this.
The two particular distributions you mentioned are pretty different. Debian
basically started as its own branch of distribution, and other distributions
like Ubuntu and Mepis
Huw Wyn Jones wrote:
Hi folks,
I'm trying to recover a BackupPC server which is crashing. The system is
throwing kernel panics on normal boot up. I can get the system up only when I
log in interactively and turn off all services. It looks like a software
issue rather than hardware, but
Tino Schwarze wrote:
Hi,
On Tue, Jan 26, 2010 at 12:22:45PM -, PD Support wrote:
We are going to be backing up around 30 MS-SQL server databases via ADSL to
a number of regional servers running CentOS (about 6 databases per backup
server). 10 sites are 'live' as of now and this is
Dan Smisko wrote:
Yes, only the backup data (pool, cpool, etc) was on the dead drive. The
config is in /etc/BackupPC.
I guess I will try to re-create the backup directories and try another
backup.
Certainly a RAID is worth considering, but the next question is what to
put on a RAID
Stuart Matthews wrote:
Hi all,
I am currently running BackupPC on the following:
1.5GB RAM
older processor - not sure how fast
2TB external USB hard drive
Clearly this isn't cutting it, although it was barely cutting it for a
few months. This wasn't my optimal setup but I have to be cost
Tony Schreiner wrote:
On 12/23/2009 10:06 PM, Claude Gélinas wrote:
On Wednesday, 23 December 2009 at 21:33:50, Adam Goryachev wrote:
Les Mikesell wrote:
No, it should be the same. Look in the root/.ssh/authorized_keys file to
see if the ssh-copy-id command put the
Claude Gélinas wrote:
On Wednesday, 23 December 2009 at 22:14:41, Tony Schreiner wrote:
I forget if anybody has mentioned wrong file permissions as a
possibility. The ~/.ssh directory must not be group- or world-writable.
This will be logged in /var/log/messages if set incorrectly.
Matthias Meyer wrote:
Claude Gélinas wrote:
I'm trying to setup the backup of the localhost with backuppc. I already
backup several other Linux machines via ssh. I set them all up by
running the following command as the backuppc user:
ssh-keygen -t dsa
cd .ssh
ssh-copy-id -i
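A complete version of that key setup might look like the following (a sketch only: modern OpenSSH builds disable DSA, so an RSA key is shown instead, and `client.example.com` is a placeholder hostname; the key directory is parameterised purely so the sketch is safe to dry-run — in real use it would be `~/.ssh`):

```shell
# Run as the backuppc user on the BackupPC server.
keydir=$(mktemp -d)   # in real use: ~/.ssh

# Generate a passphrase-less key pair (RSA; the original post used -t dsa).
ssh-keygen -q -t rsa -N '' -f "$keydir/id_rsa"
ls "$keydir"

# Then push the public key to each client's root account:
# ssh-copy-id -i "$keydir/id_rsa.pub" root@client.example.com

# Verify a non-interactive login works (it must not prompt):
# ssh -o BatchMode=yes root@client.example.com whoami
```

The commented-out lines need a reachable client, so they are left for manual use.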
Matthias Meyer wrote:
Hi,
I assume BackupPC_zipCreate read the files from the numbered dump and write
them local into a .zip file. This local .zip file will be transfered to the
destination.
I can't find out where this local .zip file is located.
Is my assumption wrong?
Does
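If memory serves, the assumption is off in one respect: BackupPC_zipCreate writes the zip stream to stdout rather than to a hidden temporary file, so the caller chooses where the bytes land. An illustrative (untested) invocation, with host, dump number, and paths as placeholders:

```shell
# Stream the most recent dump (-n -1) of all shares (-s '*') of "myhost"
# to a local file...
# BackupPC_zipCreate -h myhost -n -1 -s '*' . > /tmp/myhost.zip

# ...or straight across the network without touching local disk:
# BackupPC_zipCreate -h myhost -n -1 -s '*' . | ssh dest 'cat > myhost.zip'
```

If that holds, no local .zip ever needs to exist on the BackupPC server itself.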
Guido Schmidt wrote:
Matthias Meyer schrieb:
Guido Schmidt wrote:
Matthias Meyer wrote:
Guido Schmidt wrote:
What works? The opening and closing of the tunnel.
What does not? The connection to it. Nothing in the rsyncd-logs on
host.example.com.
If I leave
Claude Gélinas wrote:
I've setup a new backuppc server on my main workstation which is a FC12.
Everything looks fine except I can't back up my workstation via ssh; it keeps
asking for the root password.
I've followed the BackupPC FAQ: SSH Setup for this workstation and a remote
machine FC9. No problem
M. Sabath wrote:
Hello all,
I use backuppc on Debian 5.
Since I upgraded from Debian 4 to Debian 5 backuppc doesn't run
automatically.
Our server runs only during daytime between 7am and 19 pm
Let me see if I have this right... Your server is only powered on from
7 am to 7 pm...
Kameleon wrote:
I have a few remote sites I am wanting to backup using backuppc.
However, two are on slow DSL connections and the other 2 are on T1's.
I did some math and roughly figured that the DSL connections, having a
256k upload, could do approximately 108MB/hour of transfer. With
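That rough figure checks out; a quick back-of-the-envelope in shell (assuming a sustained 256 kbit/s and ignoring protocol overhead):

```shell
# Convert a 256 kbit/s upload into MiB transferred per hour.
kbits_per_sec=256
bytes_per_sec=$((kbits_per_sec * 1000 / 8))     # 32000 bytes/s
bytes_per_hour=$((bytes_per_sec * 3600))        # 115200000 bytes/h
mib_per_hour=$((bytes_per_hour / 1024 / 1024))  # integer MiB/hour
echo "$mib_per_hour"                            # prints 109
```

109 MiB/hour before overhead is consistent with the ~108 MB/hour quoted above.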
sabujp wrote:
The problem seems to be that this file:
[r...@gluster3 data_jsmith]#
/usr/local/BackupPC/bin/BackupPC_deleteBackup.sh -c data_jsmith -l
/usr/local/BackupPC/bin/BackupPC_deleteBackup.sh: line 93:
/glfsdist/backuppc3/pc/data_jsmith/backups: No such file or
sabujp wrote:
I ran an incremental and it completed but it's not being recorded in backups
but instead in backups.new. When does backups.new get copied to backups so
that the new incremental will show up on the webpage or what process causes
this to happen?
The file backups.new is written
Robin Lee Powell wrote:
On Tue, Dec 15, 2009 at 02:33:06PM +0100, Holger Parplies wrote:
Robin Lee Powell wrote on 2009-12-15 00:22:41 -0800:
Oh, I agree; in an ideal world, it wouldn't be an issue. I'm
afraid I don't live there. :)
none of us do, but you're having
Pat Rice wrote:
Hi all
Well at the moment I am recovering from a flooding situation.
I had my office flooded to 2.5ft of water. Luckily the Backup server
(backup pc) was above the water line and also my hard drive for my
backup server. Unfortunately my machines that were on the ground, were
ckandreou wrote:
I have the following files
/cmroot/ems_src/view/2010_emsmadd.vws/.pid
/cmroot/ems_src/view/2010_deva.vws/.pid
/cmroot/ems_src/view/emsadmcm_01.03.006.vws/.pid
/ccdev10/cmroot/ems_src/vob/mems.vbs/.pid
I would like backuppc to exclude .pid
I used the following exclude
Timothy J Massey wrote:
Hello!
I have a shell script that I use to install BackupPC. It takes a standard
CentOS installation and performs the configuration that I would normally
do to install BackupPC. There are probably way better ways of doing this,
but this is the way I've chosen.
Ian Levesque wrote:
On Sep 15, 2009, at 7:12 PM, Chris Robertson wrote:
...even though they have more than a mile of physical separation. I
don't currently have good data as to the bandwidth utilization during
backups (the DRBD config is set to limit it to 10M, which is about
110Mbit
Jim Leonard wrote:
James Ward wrote:
I forgot to mention there are 16 disks in the big array. So you'd
recommend RAID5 or 6?
I'd recommend RAID1+0 actually (RAID10) if you have that many disks.
You'll have half the available disk space, but the speed of a stripe.
Plus you can
dan wrote:
On Wed, Sep 16, 2009 at 3:24 AM, Tino Schwarze backuppc.li...@tisc.de
mailto:backuppc.li...@tisc.de wrote:
On Tue, Sep 15, 2009 at 03:12:28PM -0800, Chris Robertson wrote:
In short, it works for me.
[...]
Wow, thanks for sharing your experience. I figure
Les Mikesell wrote:
Chris Robertson wrote:
Read it again. :o) Both the external XFS journal (logdev=/dev/drbd1)
AND the data partition (/dev/drbd0) are DRBD mirrored. It would be
silly to have only one or the other saved in a DR scenario.
Have you investigated/tested what
backu...@omidia.com wrote:
So I have a question about email reminders.
I don't see a way to customize who the emails are sent to, short of
changing the usernames. (I see a way to customize the domain, with
$Conf{EMailUserDestDomain} = '';, but that's it.)
But I have a user with username
Les Mikesell wrote:
Chris Robertson wrote:
Les Mikesell wrote:
Chris Robertson wrote:
Read it again. :o) Both the external XFS journal (logdev=/dev/drbd1)
AND the data partition (/dev/drbd0) are DRBD mirrored. It would be
silly to have only one or the other saved
In short, it works for me.
Machine specs:
CPU : Intel Xeon X3320 (Quad Core @2.50GHz)
Memory: 8GB DDR2-667 ECC
Storage Controller: Adaptec 51645 (BIOS Firmware 5.2-1 17380, driver
1.1-5 2465)
Drives: 16 Seagate ST31000340NS (1TB ES.2) w/AN05 firmware
OS: CentOS 5.3
[r...@archive-1 ~]# uname
James Ward wrote:
I forgot to mention there are 16 disks in the big array. So you'd
recommend RAID5 or 6?
Your best bet is to set it up and run some benchmarks*. Anything else
is just speculation.
For what it's worth, I have a similar setup (Intel Xeon X3320 Quad Core,
8GB RAM, 16 drives
Matthias Meyer wrote:
Is there a way to retain the job queue? Or to check if anything is in it?
Not that I'm aware of.
In theory, storing the job queue over a shutdown shouldn't be tough (it
should just be a matter of writing a construct to a file, and reading it
in on startup). At the
Matthias Meyer wrote:
Hello,
I plan to periodically e2fsck my /var/lib/backuppc.
I want to write a bash script which check if BackupPC_dump is running.
If not, it will stop backuppc, unmount the device and run
e2fsck -fp $device
What is about BackupPC_link? Should I check for this process
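A minimal sketch of that pre-fsck check (assuming pgrep is available; checking for BackupPC_link as well seems prudent, since it also writes to the pool — the stop/umount lines are left commented because they need root and a real device):

```shell
#!/bin/sh
# Return success if any BackupPC worker that touches the pool is active.
backuppc_busy() {
    # -x matches the exact process name
    pgrep -x BackupPC_dump >/dev/null || pgrep -x BackupPC_link >/dev/null
}

if backuppc_busy; then
    echo "BackupPC is busy; skipping fsck"
else
    echo "idle"
    # /etc/init.d/backuppc stop
    # umount "$device" && e2fsck -fp "$device"
    # mount "$device" && /etc/init.d/backuppc start
fi
```

On a machine with no BackupPC processes running, the script prints "idle" and would proceed to the (commented) maintenance steps.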
Volker Thiel wrote:
On 27.08.2009 at 23:15, Chris Robertson wrote:
Volker Thiel wrote:
Also, I'd like to know if there's a way to start BackupPC in daemon
mode?
/path/to/installation/bin/BackupPC -d
Sometimes it is as simple as this. :) Where can I find information
Brent Clark wrote:
Hiya
I got quite a few servers around the world that I need to backup.
The question I would like to ask is: how does backuppc scale compared to other
backup solutions, in large environments. Also for those running backuppc for
large scale environment, would you be so kind as
Volker Thiel wrote:
Hello everyone, I'm currently trying to install BackupPC on the QNAP
TS-509 Pro NAS system. So far I managed to get through the
installation and I can start the server by calling the executable as
user backuppc: /path/to/installation/bin/BackupPC
This call results
txoof wrote:
Update:
BackupPC sent an email this morning for the system in question. The mail was
for a system that has had 16.5 days since its last backup. Is there a
setting that I'm missing? Or is this controlled only by EMailNotifyMinDays?
From my understanding of the documents,
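For reference, the two config knobs that govern these reminder mails (values here are illustrative defaults, not recommendations) look like:

```perl
# Minimum days between repeat reminder mails to the same user.
$Conf{EMailNotifyMinDays}       = 2.5;
# Send a reminder when the host's most recent backup is older than this.
$Conf{EMailNotifyOldBackupDays} = 7.0;
```

A 16.5-day-old backup would comfortably exceed the 7-day threshold, which is consistent with the mail described above.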
wirehead wrote:
I've recently been installing backuppc (for the first time) on CentOS 5.3.
I had to set selinux to permissive in order to get it to run - for some
reason sealert doesn't exist on my system and I couldn't figure out how else
to view the selinux logs to debug it. However,
Adam Goryachev wrote:
I've removed a hosts from backuppc (hosts file, removed the config file,
done a rm -rf /var/lib/backuppc/pc/hostname and the host no longer shows
up in the web interface in the dropdown, host summary page, etc.
However, I still get information on the host from this
Adam Goryachev wrote:
I'm trying to backup a remote host which has recently had a lot of
changes. Initially it kept getting read errors, but after manually (from
the command line) re-running the full backup, it almost completed
(continuing the partial each time). However, eventually the
Adam Goryachev wrote:
One of my remote server being backed up had a very large log file, which
was growing rather quickly (4GB growing at 2k/sec). This caused the
backup to timeout sometimes...
Anyway, see an extract of the log file which 'causes' the problem:
Executing DumpPreUserCmd:
Michał Sawicz wrote:
Hi there,
I have a 3.1.0 installation with apache and mod_perl. Everything is fine
except the messages displayed on the web page.
Most of the language files are latin1 encoded, except for zh_CN and pl
which are utf8, because they're incompatible with latin1.
The
Adam Goryachev wrote:
Chris Robertson wrote:
This depends on how your backups are set up (smbclient being a special
case, apparently), but in general...
$Conf{BackupFilesExclude} = { '*' => [ '/log' ] };
...should prevent a directory named log from being backed up.
Does this also
Matthias Meyer wrote:
Adam Goryachev wrote:
-BEGIN PGP SIGNED MESSAGE-
Hash: SHA1
Langdon Stevenson wrote:
I have a number of servers at remote sites that get backed up over ADSL
connections. Usually the backups run in an hour or two outside of
business hours which is
glubby wrote:
Hi,
I'm a new user of BackupPC. I'm using it to make a backup of some of my linux
servers. It is working pretty well, but there are some large log files on my
servers I don't need to back up.
I looked over the forum but didn't find anything.
Heh. You might also want to read
Joseph Holland wrote:
Hi, I have been using BackupPC now for the last year or so but now I
want to be able to keep more backups. I have read the documentation
many times, but don't understand exactly which options I need to change
(and to what). I want to keep the last 7 daily backups (1
Ted To wrote:
Hi,
I'm not sure what the problem is but after reinstalling OS X on my
wife's laptop, a backup has never been successfully completed. I did
not change any of the configuration files and am using xtar as
suggested at: http://wiki.nerdylorrin.net/wiki/Wiki.jsp?page=BackupPC
.
Admiral Beotch wrote:
It sounds like this might be helpful for me:
You can execute the following command as root to relabel your
computer system:
touch /.autorelabel; reboot
As an aside, you can get the same effect, without the reboot with
restorecon -R /. Using restorecon -Rv
Admiral Beotch wrote:
I fixed my SELinux problem by changing the context of the mounted
partition that holds TOPDIR... I can't say for certain that I got the
context 100% accurate, but it seems to be a secure choice given how
the httpd process is trying to interact with that part of the
Les Mikesell wrote:
Matthias Meyer wrote:
Assumed I have 10 hosts and the maximum number of simultaneous backups to
run is 5.
Further assumed all 10 hosts have to make a backup today.
4 of them have their last backup made 2 days ago, and 3 of them yesterday.
2 have a partial backup and 1
Admiral Beotch wrote:
I recently installed BackupPC (BackupPC-3.1.0-3.el5) on a CentOS 5.3
server from the epel repo.
It appears that backups are occurring but I am unable to view host
logs or browse backups.
I can see the data being collected into the TOPDIR/pc directories, but
I the
Nick Smith wrote:
Does anyone know if i can change the password to the backuppc user in linux
and not have any adverse effects with the backuppc system?
Yes, you can. The account doesn't actually NEED a password.
Chris
Boniforti Flavio wrote:
Does this mean that the tarball is gzipped? In that case, what are
the parameters that follow the binary path?
/bin/gzip is the compression program
.gz is the extension of the output filename (blah.gz)
* I think means all shares...
Hope that helps a little,
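The same pattern can be reproduced by hand, which makes the roles of the two parameters obvious (a throwaway sketch using a temp directory, not BackupPC's actual invocation; BackupPC is simply configured with the compression program's absolute path, e.g. /bin/gzip):

```shell
# Tar the share contents, pipe through the compression program,
# and name the result with that program's extension (.gz).
workdir=$(mktemp -d)
echo "backup me" > "$workdir/file.txt"
tar -C "$workdir" -cf - file.txt | gzip > "$workdir/host.share.tar.gz"

# Verify the compressed archive is intact.
gzip -t "$workdir/host.share.tar.gz" && echo "archive OK"
```

Swapping gzip for another program (and .gz for its extension) is all the archive-host settings amount to.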
Tino Schwarze wrote:
On Wed, Jun 10, 2009 at 01:30:35PM -0400, Jim McNamara wrote:
Have you specifically done a dist-upgrade from etch to lenny?
[...90 lines snipped...]
By the way, top posting (writing above the previous post) is frowned upon by
most mailing lists. Most
error403 wrote:
Hi, I'm trying to find a way to send an email to the personal address of the
people whose backups I'm doing. I tried to search, but the terms email
and message are so general it gives me almost all the posts on the forum! :?
Something like
Les Mikesell wrote:
Chris Robertson wrote:
error403 wrote:
I'm thinking of installing/using some sftp server sofware on their
computer.
Better would be an rsyncd service, as that would allow you to only
transfer changes.
If they are unix/linux/mac boxes you
Steve Redmond wrote:
Hi,
I've run in to a bit of a strange issue. We have a large number of
backups running on backuppc and up until recently they have all been
working fine. Now I see everything as Idle with aging last backups
Now, when I attempt to kick backups off manually I get the
Steve Redmond wrote:
Hiya,
Thanks for your reply.
I assume this means the main config file. There are also per-host
config files that will override the main one.
Correct. I have checked that there were no overriding settings in other
configuration files. Did a grep on the
Skip Guenter wrote:
On Tue, 2009-06-02 at 16:36 +1000, Adam Goryachev wrote:
So, using 4 x 100G drives provides 133G usable storage... we can lose
any two drives without any data loss. However, from my calculations
(which might be wrong), RAID6 would be more efficient. On a 4 drive 100G
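Capacity for the two layouts is easy to sanity-check (pure arithmetic, assuming n equal drives: RAID6 usable = (n-2) × size, RAID10 usable = (n/2) × size; the 133G figure quoted above presumably comes from some other layout):

```shell
n=4; size_gb=100
raid6_usable=$(( (n - 2) * size_gb ))   # two drives' worth of parity
raid10_usable=$(( n / 2 * size_gb ))    # mirrored pairs, striped
echo "RAID6: ${raid6_usable}G  RAID10: ${raid10_usable}G"
```

At 4 drives the two tie at 200G usable; RAID6's space efficiency only pulls ahead as the drive count grows, while RAID10 keeps the speed advantage.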
Bernhard Ott wrote:
Ralf Gross wrote:
Hi,
I use BackupPC since many years without hassle. But something seems to
be broken now.
BackupPC 3.1 (source)
Debian Etch
xfs fs
Hi Ralf,
look for the thread no cpool info shown on web interface (2008-04)in
the archives, Tino
Daniel Carrera wrote:
Yes. I thought BackupPC was more like a cron job that runs once every
hour. My current script runs every 2 hours, so I always know when it's
not running. But if BackupPC runs all the time, then that's different.
Are you aware of any backup tool that might be more
Boniforti Flavio wrote:
You will find it in $HOME/pc/host/backups See backuppc
online documentation for a description of this file
Hi Matthias,
the only thing I got in the docs is:
The file /var/lib/backuppc/pc/$host/backups is read to decide whether a
full or incremental backup
Matthias Meyer wrote:
Hi,
I think about an encrypted backup and find rsyncrypto.
Is there a BackupPC_dump support for rsyncrypto?
Or any other way to use rsyncrypto with backuppc?
From the looks of it, you would just run the rsyncrypto on the client
as a pre-backup command, and then
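That suggestion could be wired up roughly like this (paths, script name, and the staging directory are all placeholders; rsyncrypto's actual flags should be checked against its man page — only `$Conf{DumpPreUserCmd}`, `$Conf{RsyncShareName}`, and the `$sshPath`/`$host` substitutions are real BackupPC features):

```perl
# Hypothetical: before each dump, have the client encrypt a staging tree
# with rsyncrypto, then back up the encrypted copy instead of the plaintext.
$Conf{DumpPreUserCmd} = '$sshPath -q -x -l root $host'
    . ' /usr/local/bin/encrypt-staging.sh';
$Conf{RsyncShareName} = ['/var/backup-staging-encrypted'];
```

Since rsyncrypto is designed to keep encrypted output rsync-friendly, incrementals against the staging tree should still transfer only changed blocks.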
Holger Parplies wrote:
Hi,
Boniforti Flavio wrote on 2009-04-29 14:22:36 +0200 [Re: [BackupPC-users]
Defining data retention periods]:
OK, now the main question of my post: data retention.
[...]
In my actual setup when the oldest FULL gets deleted (because it is now
older
Matthias Meyer wrote:
Chris Robertson wrote:
Matthias Meyer wrote:
Hi,
I think about an encrypted backup and find rsyncrypto.
Is there a BackupPC_dump support for rsyncrypto?
Or any other way to use rsyncrypto with backuppc?
From the looks of it, you would just run
Craig Barratt wrote:
BackupPC 3.2.0beta0 has been released on SF.net.
3.2.0beta0 is the first beta release of 3.2.0.
3.2.0beta0 has several new features and quite a few bug fixes
since 3.1.0. New features include:
I didn't see any mention of lib/BackupPC/Lib.pm being updated for the
case
Tino Schwarze wrote:
Hi there,
I've got several retired hosts and want to keep only the latest backup
from them. I've set $Config{BackupsDisable}=2 in the server's config.pl
and $Config{FullKeepCntMin}=1 but backups are still kept (I see these
values if I edit the config via web interface).
Mike Dresser wrote:
Chris Robertson wrote:
How many hosts do you back up?
About 30 are active, 12 are sporadic (laptops, etc). Total that gets
written out to off site backup is about 300GB of data a day, compressed.
What does df -i show for the mount point?
/dev
Les Mikesell wrote:
Chris Robertson wrote:
Thanks for the numbers. I'm starting to think my problems might be
related to the kernel I'm running (default Centos 5.2, with xfs-kmod).
It's been years since I rolled my own kernel, but I might just have to
break out the compiler
Mike Dresser wrote:
Chris Robertson wrote:
Hopefully my original message didn't come across as negative of either
XFS or BackupPC. Due to how well BackupPC and XFS handled the load I
threw at it initially, I expanded the retention policy of my backups
without thought, planning
Mike Dresser wrote:
Matthias Meyer wrote:
Dear all,
How scalable is backuppc?
Where are the limits or what can produce performance bottlenecks?
I've heard that hardlinks can be a problem if there are millions of
them. Is that true?
The file system can become...
Les Mikesell wrote:
Chris Robertson wrote:
I've heard that hardlinks can be a problem if there are millions of
them. Is that true?
The file system can become... interesting to fix or backup when you get
a few million hard links, especially if you're using XFS
Peter Walter wrote:
All,
I have implemented backuppc on a Linux server in my mixed OSX / Windows
/ Linux environment for several months now, and I am very happy with the
results. For additional disaster recovery protection, I am considering
implementing an off-site backup of the backuppc
Nate wrote:
We seem to be routinely having this issue where the server backuppc
is running on throws a kernel panic and thus hard locks the
machine. It's completely random, sometimes happens daily, sometimes
we can have a lucky 2-3 weeks without a lockup. I've taken a
screenshot and
Nate wrote:
Yeah, I doubt very much it's a backuppc issue, sorry if I may have
implied that. I'm fairly confident it's a ext3/driver issue. But as
this popped up when we began using backuppc I suspect it may have to
do with the massive quantities of files and had hoped another
backuppc
Tino Schwarze wrote:
On Mon, Feb 23, 2009 at 04:58:32PM -0600, Les Mikesell wrote:
Craig Barratt wrote:
I don't think there is a way to transfer a single host's backups using
BackupPC_tarPCCopy.
What happens if you just copy a single host's backup tree without regard
to
Brian Woodworth wrote:
Yes, I have complete backups. My problem appears to be a known bug as
stated earlier in the thread. Craig was kind enough to post a
solution, but the problem is I don't know how to go about following
his instructions.
thanks for the response
First, try...
Odhiambo Washington wrote:
On Fri, Feb 6, 2009 at 11:43 PM, Chris Robertson crobert...@gci.net
mailto:crobert...@gci.net wrote:
Odhiambo Washington wrote:
Surprisingly, I am not able to get the CGI interface to show up
things
as beautifully as I see in the screenshots
Odhiambo Washington wrote:
Hello list,
I am new but I am kind of old hand.
Haven't seen you on the Squid-Users list in a while... :o)
Surprisingly, I am not able to get the CGI interface to show up things
as beautifully as I see in the screenshots on the website and I wonder
what I am
Chris Baker wrote:
Which logs should I check?
/var/log/messages
And what should I look for in these logs?
Run the command dmesg (first, man dmesg so you know what this command
does) and take a look at the output. This is the bootup message and
should be replicated in /var/log/messages.
Alex wrote:
Hi there,
I'm not familiar with Perl scripting, but I'm OK with bash / PHP.
I'd like to know if there's a way to set values in a per-PC config file
using a command line tool?
For example, I'd like to change only the FullKeepCnt value, but as it is by
default written that way:
Anand Gupta wrote:
Hi Les,
Thanks for the link. I see BackupPC_tarCreate and BackupPC_zipCreate
for tar and zip. Is there an rsync version ? So instead of creating a
tar or zip, it can rsync the data over to another location ?
The reason i asked is because the amount of data i am going
thomat...@gmail.com wrote:
How dangerous is it to run xfs without write barriers?
http://oss.sgi.com/projects/xfs/faq.html#nulls
As long as your computer shuts down properly, sends a flush to the
drives, and the drives manage to clear their on-board cache before power
is removed or the chip
dan wrote:
If the disk usage is the same as before the pool, the issue isnt
hardlinks not being maintained. I am not convinced that XFS is an
ideal filesystem. I'm sure it has it's merits, but I have lost data
on 3 filesystems ever, FAT*, XFS and NTFS. I have never lost data on
Thomas Smith wrote:
Hi,
No, it continues to take 22 hours or so each day.
-Thomas
How is your XFS volume mounted? Did you add the noatime and
nodiratime directives? If you have battery backed storage, I would
highly recommend using nobarrier as well
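Those directives go on the mount line in /etc/fstab (an illustrative entry — the device and mount point are placeholders, and nobarrier is only sane on battery-backed write cache, as noted above):

```
/dev/sdb1  /var/lib/backuppc  xfs  noatime,nodiratime,nobarrier  0  0
```

noatime/nodiratime avoid an inode write per file read during backups; nobarrier skips cache flushes, which helps hardlink-heavy workloads but trades away crash safety without a battery.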
Lofton H Alley Jr wrote:
It's been a small struggle with slow progress. I am stymied over this one
though.
Here is the network layout: two desktops and a lappie on wifi. This
should be easy right? One deskie has an 80 GB primary and a 320G storage
HDD divided into 3 partitions. The other
dtktvu wrote:
Hmm, I see...
Thanks very much for pointing that out.
BTW, what do you mean by "You might run into some other problems if using GPL
code in a C# environment" if you are using shared libraries that are not GPL?
If you are statically linking libraries that do not use a GPL
Ray Todd Stevens wrote:
We have a backuppc system setup that has been running for a while now. We
are
expanding the office and I am going to need more storage space. To do this I
will need to
copy the data off, reconfigure the array with more drives and then reload the
system.
How
James Sefton wrote:
Hi,
Please excuse me if I am using this wrong, in all my years in IT, it
seems this is the first time I have used a mailing list for support.
(I’m usually pretty good at the whole RTFM thing)
We have a backup box (FC6) that is running backups from a lot of
windows
John Goerzen wrote:
Hi everyone,
I installed BackupPC to try it out for backing up Linux systems, and I
have a few questions about it.
First, the on-disk compression format makes me nervous. It appears to
use the deflate algorithm, but cannot be unpacked with either gzip or
unzip. It
Les Mikesell wrote:
Christian Völker wrote:
| If the full backup fails, does it start from scratch every time or are
| some files already stored in the backup and used during the next try, so
| it'll finish some day?
| If you are using rsync as the transfer method it will continue
|
Jeff Rippy wrote:
Yes, I had thought of that too and have already added the backuppc user
to the tape group. Permissions are 660 (rw-rw----) with owner root
and group tape. Also the backuppc documentation and even the default
configuration uses /dev/st0 so why exactly do you recommend
brunal wrote:
Hi,
One question that I don't understand: can I use another user than
root to connect to my host?
The only reason to connect as root is to make sure you have access to
all the files you want to back up.
And on the host side, a user backuppc exists and has access to all
Kurt Tunkko wrote:
Hello Holger,
Holger Parplies wrote:
3. Mirror partition tables from one of the existing disks:
# sudo sfdisk -d /dev/sda | sfdisk /dev/sdc
apart from something having been mangled (???), I tend to wonder why you
need root permission to read the
Bruno Faria wrote:
Hi,
I have setup backupPC to run 3 backups at the same time, and this was
working very well since all host were pretty much getting a backup
once a day. But for some reason, now the backups are not starting as
often as they were. Sometimes all I have running is just
Leandro Tracchia wrote:
i didn't think to check this log before... it has interesting entries.
this is the rsyncd log from the windows server. it has errors right
around the same time the backuppc log shows the 'child exited
prematurely' error. most of the errors complain about the file names
Johnny Stork wrote:
I just installed BPC on a RHEL4 machine, but when trying to access the
gui at http://serverip/cgi-bin//BackupPC_Admin I just see the
text/contents of the perl script? No gui?
Any suggestions?
Here are the steps I used to get the CGI interface working on a fresh
Bruno Faria wrote:
Hi,
I've posted a similar a question before but since I didn't get any response
the first time, I guess I'll try again . :)
You might be edified by reading the documentation
(http://backuppc.sourceforge.net/faq/BackupPC.html). I highly recommend
the section detailing
Sam Przyswa wrote:
Hi,
We have to change our BackupPC server to a new machine, how to copy the
entire BackupPC directory (120Gb) to an other machine ?
I tried rsync; it crashed after a long, long time. I tried scp, but it
doesn't preserve the hardlinks, and the destination directory blew up in size after