Sorry, last time I used a wrong sender address...
Hello,
we use BackupPC as our primary backup and everything works fine except for two points.
The first is that some of our clients aren't online any more, but we would like to keep their backups. According to the help I changed the
Tino Schwarze wrote:
I found an article somewhere online saying that I should add the BackupPC user to the sudoers file; see the /etc/sudoers file below for the syntax I used:
IIRC, this is only for local backup via rsync.
I think that article was talking about using another user
Hi,
I've found that backing up any Linux server's data works using tar and ssh, but when trying to back up a FreeBSD server it gives the following error:
2006-12-19 11:30:03 full backup started for directory /etc
2006-12-19 11:30:04 Got fatal error during xfer (No files dumped for share /etc)
Unfortunately, tar is not always tar. BackupPC is written for GNU tar, and FreeBSD is most likely using BSD tar. An additional problem is that GNU tar 1.16 does not work on changing (in-use) filesystems due to a changed exit code.
Try tar --version and see if it does not say:
tar (GNU tar) 1.14
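As a sketch of that check (the version banner match and the gtar hint are illustrative assumptions, beyond the fact that BackupPC's tar method expects GNU tar):

```shell
#!/bin/sh
# Check whether the tar on this host is GNU tar, which BackupPC's tar
# transfer method expects; BSD tar will not work.
if tar --version 2>/dev/null | grep -q 'GNU tar'; then
    echo "OK: $(tar --version | head -n 1)"
else
    echo "Not GNU tar; on FreeBSD, install GNU tar (the gtar port) and" >&2
    echo "point \$Conf{TarClientPath} at it." >&2
    exit 1
fi
```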
Hmmm... quite right.
bsdtar 1.02.023, libarchive 1.02.026
The funny thing is that it actually does do the copy, but it still spits out the error. I can see all the backups (and retrieve them), but unfortunately I don't get the warm and fuzzy feeling of seeing the backup completed successfully.
The
On 12/19 02:06 , Michael Mansour wrote:
BackupPC: Host Summary
This status was generated at 19/12 15:04.
Could we have a setting in there to refresh that browser page every x
seconds?
I would suggest that you just do it by hand whenever you care to look, rather than wasting CPU
On 12/19 10:11 , Nils Breunese (Lemonbit) wrote:
Tino Schwarze wrote:
IIRC, this is only for local backup via
Hi Carl,
Could we have a setting in there to refresh that browser page every x
seconds?
I would suggest that you just do it by hand whenever you care to look, rather than wasting CPU time rendering pages
On Tue, Dec 19, 2006 at 08:08:46AM -0600, Carl Wilhelm Soderstrom wrote:
Could we have a setting in there to refresh that browser page every x
seconds?
I would suggest that you should just do it by hand whenever you
Sorry for the newbie questions here, but:
if I use rsync instead of SMB, will new files be backed up during incremental backups?
Also... if a laptop is unavailable for backup for an extended period of time, does BackupPC know to do a full backup as soon as it is next available?
--
Regards,
dbp
On 12/19 08:56 , dbp lists wrote:
if I use rsync instead of SMB, will new files be backed up during incremental backups?
Yes. AFAIK, more reliably so. I believe rsync is better at catching new files than tar (which is what's run over SMB, again AFAIK).
Also... if a laptop is unavailable for
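The point about catching new files can be sketched with a toy demo (GNU touch/find assumed; the dates are arbitrary). A file that appears with an old mtime, e.g. copied with cp -p, is invisible to a purely timestamp-based incremental, while rsync compares file lists against the previous backup:

```shell
#!/bin/sh
# Toy demo: timestamp-based incrementals miss files that appear with old
# mtimes (e.g. copied with "cp -p" or restored from an archive).
mkdir -p demo && cd demo
touch -d '2006-01-01' old_file                 # "new" to the backup, but old mtime
find . -newermt '2006-06-01' -name old_file    # mtime-based check: finds nothing
```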
On Tue, Dec 19, 2006 at 08:56:57AM -0800, dbp lists wrote:
if I use rsync instead of SMB, will new files be backed up during incremental backups?
Yes, of course. rsync is just the transfer method.
I'm not sure about your second question, though. In fact, a full or
incremental backup doesn't
Hi.
I've been using BackupPC (Debian) successfully in my company for about 5 months, backing up 6 WinXX shares using Samba.
The machine is a simple Pentium 4 with a 40GB HD that is getting full. I've already ordered a 250GB ATA100 HD, but I'm not sure what to do... should I use the 250GB for the data only, or
Hello again everyone!
I solved the earlier problems I had with rsyncd by using the Cygwin package from the BackupPC page rather than forcing my own install. rsyncd now seems to work, but I have a strange error. I have 2 full backups from when BackupPC operated over SMB, and have now
On 12/19 07:17 , Filipe wrote:
Using Backuppc (debian) successfully in my company for about 5 months,
backing up 6 winXX shares using samba.
The machine is a simple Pentium 4 with a 40GB HD that is getting full.
Already ordered a 250GB ata100 hd. but I'm not sure what to do...
should I use
Hi,
How close is BackupPC to implementing VSS shadow copies?
--
Best regards,
Sherman Boyd
I'm wondering now how to exclude things like /proc globally and per-PC.
You cannot exclude something globally and then exclude more per-PC: the per-PC settings simply override whatever was set globally, since it's just setting a Perl variable. I suppose you could actually write Perl code
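As a sketch of that "write Perl code" idea (the paths here are just examples; $Conf{BackupFilesExclude} is the standard exclude setting), a per-PC config.pl can append to the global list instead of replacing it:

```perl
# Per-PC config.pl fragment (a sketch, not tested against every version).
# Plain assignment replaces the global exclude list entirely:
#   $Conf{BackupFilesExclude} = { '*' => ['/proc'] };
# Because per-PC config is just Perl, you can instead append to what the
# global config.pl already set for the '*' (all shares) key:
push @{ $Conf{BackupFilesExclude}{'*'} }, '/proc', '/sys';
```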
Hi Tino,
Could we have a setting in there to refresh that browser page every x
seconds?
I would suggest that you just do it by hand whenever you care to look, rather than wasting CPU time rendering
Yes, of course. rsync is just the transfer method.
The reason I asked was because with smb method (which is the only
thing I've tried so far), new files are only backed up during full
backup, not incremental backup.
From http://backuppc.sourceforge.net/faq/BackupPC.html#backup_basics:
For SMB
dbp lists wrote:
The reason I asked was because with smb method (which is the only
thing I've tried so far), new files are only backed up during full
backup, not incremental backup.
That should not be the case if 'new' files are defined as having
timestamps later than the last run. There are