Re: [BackupPC-users] BackupPC_Trashclean

2008-04-01 Thread James
Yeah, I'm suspecting an endless loop of some sort. Problem is, as soon
as I kill it, it restarts. I've even tried:
 kill $pid; ktrace BackupPC_Trashclean
and couldn't catch it.

(I'm on osx, so it's ktrace)
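(A rough sketch of one way to catch the respawned process and attach ktrace
to it; the process name matches the subject line, but the loop itself is
illustrative, not the exact commands used:

  kill "$pid"
  # busy-wait until the replacement process shows up, then attach ktrace
  until newpid=`ps ax | grep '[B]ackupPC_Trashclean' | awk '{print $1}'`; [ -n "$newpid" ]; do :; done
  ktrace -p "$newpid"
)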
On Mar 31, 2008, at 9:07 PM, Les Mikesell wrote:
 James wrote:
 The process has been running almost 24 hours. I tried killing it,
 but it just starts right back up. This is quite the load on my
 backup disks. I'd very much appreciate any tips on how to begin
 debugging the issue, as I truly don't want to shut down backup
 services for my lab.

 I haven't seen anything like that and would suspect filesystem  
 corruption.  You might try an 'strace -p ' on the process id to see  
 if you can make sense out of what it is doing.   Maybe it is getting  
 an error and repeating something.
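 (Something along these lines, with the real PID substituted; the trace file
 name is just an example:

   strace -f -tt -p <pid> -o /tmp/trashclean.trace
 )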

 -- 
  Les Mikesell
   [EMAIL PROTECTED]




Re: [BackupPC-users] Using a NAS Server to backups

2008-04-01 Thread Ronny Aasen

On Mon, 2008-03-31 at 15:53 +0200, Nils Breunese (Lemonbit) wrote:
 Ronny Aasen wrote:
 
  Are there any NFS/SAN solutions that _are_ usable with backuppc
 
  for example iscsi/fc/afs/codafs etc etc.
 
 Do you really need a networked solution? Local or externally connected  
 drives are not an option?

correct

Ronny Aasen




[BackupPC-users] Incrementals contain full directory structure?

2008-04-01 Thread Christoph Litauer
Hi,

I am new to BackupPC, so please excuse me if this has been asked before.

I have been testing BackupPC for a few weeks now. It works very robustly 
and fast. But I wonder about the number of files and directories BackupPC 
creates in an incremental dump.

I am backing up server2 with a filesystem containing about 3 million 
files spread over about 300,000 directories. BackupPC's summary for 
server2 tells me
Level 0:
   3084565 files (470 GB), 1159994 files (60GB), 2150650 files (410GB)
Level 1:
   11024 files (2.8 GB), 8482 files (101 MB), 60139 files (2.7 GB)

Counting the number of files on the backup-device leads to
/path/to/backup/pc/server2/0: 3634406 files (including directories)
/path/to/backup/pc/server2/1: 429770 files (including directories)

Investigating the differences for the level 1 dump, I found that the 
whole directory structure of the filesystem is recreated for the 
incremental dumps, regardless of whether a directory contains files to 
back up or not.
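(For reference, a quick way to see how much of that count is bare directory 
scaffolding versus file entries; the paths simply mirror the ones above:

   find /path/to/backup/pc/server2/1 -type d | wc -l   # directories only
   find /path/to/backup/pc/server2/1 -type f | wc -l   # plain files only
)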

I think this may lead to problems regarding the number of files on the 
backup device. My whole test setup, running for 2 months, led to 140 
million files/hardlinks/directories. Filesystem checks or removal of 
large parts of the backup took substantial amounts of time.

The IncrFill option is not set, neither globally nor for the client.

Is this layout really necessary? Can I avoid creating the whole dir 
structure?

-- 
Regards
Christoph

Christoph Litauer  [EMAIL PROTECTED]
Uni Koblenz, Computing Center, http://www.uni-koblenz.de/~litauer
Postfach 201602, 56016 Koblenz Fon: +49 261 287-1311, Fax: -100 1311
PGP-Fingerprint: F39C E314 2650 650D 8092 9514 3A56 FBD8 79E3 27B2




Re: [BackupPC-users] Using a NAS Server to backups

2008-04-01 Thread Ronny Aasen

On Mon, 2008-03-31 at 16:22 +0200, [EMAIL PROTECTED] wrote:
 So, it works with NFS ?

Yes, it works with NFS, but it scales poorly.

I am running a small installation: only 23 hosts on a 6 TB NFS NAS,
using mostly gigabit ethernet.
5.9T  3.4T  2.6T  57% /var/lib/backuppc

Speed varies between 1 MB/sec and 8 MB/sec, with the average lying around
3 MB/sec for a full backup and 0.5 MB/sec for an incremental.

The CPU is often in iowait during backups.

Note: this is all server hardware: Intel server gigabit network cards
and an LSI SAS RAID controller with 500 GB SATA RAID Edition hard drives.






 Romain


 Stephen Joyce [EMAIL PROTECTED] (unc.edu), sent via the BackupPC-users list,
 wrote on 31/03/2008 16:17, subject "Re: [BackupPC-users] Using a NAS Server to backups":

 On Mon, 31 Mar 2008, Tino Schwarze wrote:
 
   On Mon, Mar 31, 2008 at 03:14:34PM +0200, Ronny Aasen wrote:
 
 I would like to know if it's possible and how to configure BackupPC
 in
 order to use a NAS server instead of the path /datas of BackupPC
 (for
 example).
   
Sure, it's possible to do an NFS mount of a remote disk, and use that
 for
your BackupPC storage.
In all likelihood tho, it will be terribly slow. Don't try it.
 BackupPC is
too disk-I/O intensive.
   
  
   Are there any NFS/SAN solutions that _are_ usable with backuppc
   for example iscsi/fc/afs/codafs etc etc.
 
  A SAN would do since you're creating an ordinary file system on top of it.
  The file system needs to support real hardlinks to be usable with
  BackupPC. AFS doesn't really fit since you'd need to create one large
  BackupPC volume, so you're just moving the problem from local to remote
  server (that is managing a very large chunk of file system).
 
 AFS doesn't support hard links between directories, which BackupPC
 requires.
 
 Cheers, Stephen
 --
 Stephen Joyce
 Systems AdministratorP A N I C
 Physics  Astronomy Department Physics  Astronomy
 University of North Carolina at Chapel Hill Network Infrastructure
 voice: (919) 962-7214and Computing
 fax: (919) 962-0480   http://www.panic.unc.edu
 
Lazy Programmers know that if a thing is worth doing, it's worth
 doing well -- unless doing it well takes so long that it isn't worth
doing any more. Then you just do it 'good enough'
   --- Programming Perl, p 282.
 

Re: [BackupPC-users] Disable No Backup Warning Emails

2008-04-01 Thread Tino Schwarze
On Mon, Mar 31, 2008 at 12:57:41PM -0500, Carl Wilhelm Soderstrom wrote:
 On 03/31 09:29 , Vince wrote:
  I tried setting EMailNotifyMinDays, EMailNotifyOldBackupDays, 
  EMailNotifyOldOutlookDays all to 0 or -1 to disable it but it has not 
  worked.  I also could not find anything about this in the documentation.
 
 Try setting EMailNotifyMinDays to 365 or 999 or whatever your preference is.
 If you set it to 365 days it'll remind you to check again in a year whether
 or not you want to keep these old backups around.
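 (In config.pl terms the suggestion amounts to something like:

    $Conf{EMailNotifyMinDays} = 365;   # push the reminder out to roughly once a year
 )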

Very useful advice!

Thank you,

Tino.

-- 
„Es gibt keinen Weg zum Frieden. Der Frieden ist der Weg.” (Mahatma Gandhi)

www.craniosacralzentrum.de
www.forteego.de



Re: [BackupPC-users] Using a NAS Server to backups

2008-04-01 Thread romain . pichard
Hi,

Thanks a lot for your answers.
I will take your answers into consideration. It's very interesting.

Thanks.

Romain



   
 Tino Schwarze [EMAIL PROTECTED], sent via the BackupPC-users list,
 wrote on 01/04/2008 10:35, subject "Re: [BackupPC-users] Using a NAS Server to backups":

On Tue, Apr 01, 2008 at 09:56:27AM +0200, Ronny Aasen wrote:

  So, it works with NFS ?

 Yes, it works with NFS, but it scales poorly.

 I am running a small installation: only 23 hosts on a 6 TB NFS NAS,
 using mostly gigabit ethernet.
 5.9T  3.4T  2.6T  57% /var/lib/backuppc

 Speed varies between 1 MB/sec and 8 MB/sec, with the average lying around
 3 MB/sec for a full backup and 0.5 MB/sec for an incremental.

 The CPU is often in iowait during backups.

 Note: this is all server hardware: Intel server gigabit network cards
 and an LSI SAS RAID controller with 500 GB SATA RAID Edition hard drives.

My impression is that it scales so badly because there are a lot of
seeks between pc backup directories and the pools. The MD5 hashing
distributes files randomly over the pool, so you end up looking at
another place on disk for each file which involves seeking.
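(A rough sketch of why that happens -- not BackupPC's actual code, just the
shape of the pool layout with an invented digest: the first hex digits of a
file's pool hash pick its directory, so files that sit next to each other in
a backup land in unrelated spots in the pool:

   # illustrative only; $TopDir and the digest are made up
   my $hash = "a3f90c1d...";                        # per-file pool digest
   my ($x, $y, $z) = split //, substr($hash, 0, 3);
   my $poolPath = "$TopDir/cpool/$x/$y/$z/$hash";
)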

I've got a Dell machine here with PERC5/i (which is by no means fast)
and I also have three WD RAID Edition 500 GB SATA drives. Somehow it
doesn't fit very well. I'm seeing I/O starvation, throughput is at about
300-500 kB/s during backups and device utilisation is at 100% (measured
with iostat).

I've got a second RAID5 on the same controller, but with three 750 GB
SATA drives (Hitachi HUA721075KLA330). This seems to perform better, but
mainly takes large files, so the seeking is not a problem. I wish there
were a "swap file systems" operation so I could test how BackupPC
performs on the second RAID. Hm. Maybe I'll use a weekend to copy the
raw xfs file system to another logical volume? But then there's still
the file system tuning issue. Or am I just suffering partition
misalignment, so FS accesses cross RAID stripes too often? So many
questions, so little time... :-(
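(If it comes to that, two hedged options for the raw copy, assuming the
source is unmounted first and the target LV is at least as large; the paths
and device names are made up:

   umount /mnt/backuppc
   dd if=/dev/vg0/backuppc of=/dev/vg0/backuppc_test bs=4M
   # or, XFS-aware:
   xfs_copy /dev/vg0/backuppc /dev/vg0/backuppc_test
)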

Or maybe I've just got too little memory? I see only 600MB of cached
memory while the backup (our largest one: 2.9 million files, 120 GB)
takes 640 MB plus 480 MB for each of the BackupPC_dump processes (using
rsync). The machine has 2 GB.

Bye,

Tino.

--
„Es gibt keinen Weg zum Frieden. Der Frieden ist der Weg.” (Mahatma Gandhi)

www.craniosacralzentrum.de
www.forteego.de


Re: [BackupPC-users] [BackupPC] Cpool and pc quite are big according to size of backuped data and backup config

2008-04-01 Thread Hervé Richard

Les Mikesell wrote:

Hervé Richard wrote:


Just one thing I forgot :-/
There is a symlink between /opt/backuppc/files/pc/localhost (OS file 
system) -> /mnt/hdd1/backuppc/pc/localhost (backup HDD file system)


Maybe the problem comes from this symlink; what do you think about it?
Maybe I should have made a hardlink?
Or maybe I shouldn't have copied the /localhost subdirs to the backup HDDs.


The cpool and pc directories must be on the same filesystem so if you 
are going to symlink to a different disk the symlink must be to the 
directory above cpool and pc.
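(In other words, a layout along these lines, reusing the paths from the
earlier mail; the device name is made up, and the old /opt/backuppc/files
would need to be moved aside first:

   mount /dev/sdb1 /mnt/hdd1/backuppc            # cpool, pc, etc. all live here
   ln -s /mnt/hdd1/backuppc /opt/backuppc/files  # one symlink at the top level only
)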


If I understand what you say, on my backup HDD I need conf, cpool, log, 
pc, pool, save and trash to all be there?

Is it for that reason that it's not working :-/
I wish I could do a rotation between 2 HDDs every 2 weeks.
Maybe I should do the same thing on the other HDD: copy all the needed 
directories and mount the HDD partition under /files/.



Many thanks for your help :-)
I'll try this and let you know

Cheers

Hervé

P.S.: As you probably guessed, I'm quite new to using BackupPC :-/


[BackupPC-users] Error - Unable to read 4 bytes in BackupPC-3.1.0

2008-04-01 Thread kanti

Thanks a lot, Mike. Now everything is fine. But when I am trying to take a 
backup of a client, that same error has occurred again (Unable to read 4 
bytes). 
The error is as follows: 
full backup started for directory / 
Running: /usr/bin/ssh -q -x -l root scn-ws9 /usr/bin/rsync --server --sender 
--numeric-ids --perms --owner --group -D --links --hard-links --times 
--block-size=2048 --recursive --ignore-times . / 
Xfer PIDs are now 11798 
Read EOF: Connection reset by peer 
Tried again: got 0 bytes 
Done: 0 files, 0 bytes 
Got fatal error during xfer (Unable to read 4 bytes) 
Backup aborted (Unable to read 4 bytes) 
Not saving this as a partial backup since it has fewer files than the prior one 
(got 0 and 0 files versus 0) 

I have checked everything; also there is no problem with the SSH key, it's 
resolving properly. 

Please help me out with this problem. 
I am waiting for your response. 

Thanks a lot in advance. 
Thanks & Bye
Appu

+--
|This was sent by [EMAIL PROTECTED] via Backup Central.
|Forward SPAM to [EMAIL PROTECTED]
+--





[BackupPC-users] Cannot edit config via the CGI

2008-04-01 Thread Emmanuel Lesouef
Hello there,

I am trying to install backuppc on a debian stable vserver. I use the
stable package of backuppc : 2.1.2-6 

The installation went ok (except that apache was installed instead of
apache2, I had to do a symlink to /etc/backuppc/apache.conf by hand).

The CGI is working great, but there's still something that gives me a
headache: I cannot edit host configs using the CGI.

There are no links to edit a host.

Could it be related to : 

http://backuppc.sourceforge.net/faq/security.html#cgi_interface

Can someone help me ?

Thank you in advance.

-- 
Emmanuel Lesouef
DSI | CRBN
t: 0231069671
e: [EMAIL PROTECTED]



Re: [BackupPC-users] backing on two raid arrays

2008-04-01 Thread Tino Schwarze
On Tue, Apr 01, 2008 at 12:21:39PM +, Gilles Guiot wrote:

 I'm using backuppc 3.1.0 on a Debian distro.
 My backup server has two RAID 1 arrays.
 I have been backing up some servers on the first array.
 I need to back up other servers, but there is not enough space on the first 
 array, /dev/sda1. 
 I would like to use the same BackupPC install and back up the other servers 
 on the second array, /dev/sdb.
 Is it possible, and if yes, how should one proceed? 

It's not directly possible - the pool and the backups need to be on the
same file system. You could create an LVM starting with the /dev/sdb as
first physical volume, then migrate the whole file system from /dev/sda1
to the LVM, then extend the LVM by /dev/sda1 and extend the logical
volume and file system.
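(A rough sketch of those steps; the device names come from the mail above,
everything else is illustrative, and copying a hardlink-heavy BackupPC pool
this way can be very slow, so test before relying on it:

   pvcreate /dev/sdb
   vgcreate backupvg /dev/sdb
   lvcreate -l 100%FREE -n backuppc backupvg
   mkfs.ext3 /dev/backupvg/backuppc
   mount /dev/backupvg/backuppc /mnt/newpool
   rsync -aH /var/lib/backuppc/ /mnt/newpool/    # -H keeps hardlinks, but needs lots of RAM
   # after switching BackupPC over to the new volume:
   pvcreate /dev/sda1
   vgextend backupvg /dev/sda1
   lvextend -l +100%FREE /dev/backupvg/backuppc
   resize2fs /dev/backupvg/backuppc
)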

HTH,

Tino.

-- 
„Es gibt keinen Weg zum Frieden. Der Frieden ist der Weg.” (Mahatma Gandhi)

www.craniosacralzentrum.de
www.forteego.de



Re: [BackupPC-users] backing on two raid arrays

2008-04-01 Thread Martin Leben
Gilles Guiot wrote:
 Hello Everybody
 I'm using backuppc 3.1.0 on a debian distro.
 My backup server has two raid 1 arrays
 I have been backing up some servers on the first array
 I need to backup other servers but not enough space on the first array 
 /dev/sda1. 
 I would like to use the same backuppc install and backup other servers on the 
 second array : dev/sdb
 Is it possible and if yes, how shall one proceed ? 


Hi,

No it is not possible. The FAQ at 
http://backuppc.sourceforge.net/faq/BackupPC.html#what_type_of_storage_space_do_i_need
 
says:

 BackupPC uses hardlinks to pool files common to different backups. Therefore 
 BackupPC's data store (__TOPDIR__) must point to a single file system that 
 supports hardlinks. You cannot split this file system with multiple mount 
 points or using symbolic links to point a sub-directory to a different file 
 system (it is ok to use a single symbolic link at the top-level directory 
 (__TOPDIR__) to point the entire data store somewhere else). You can of 
 course use any kind of RAID system or logical volume manager that combines 
 the capacity of multiple disks into a single, larger, file system. Such 
 approaches have the advantage that the file system can be expanded without 
 having to copy it.

Best regards,
/Martin




Re: [BackupPC-users] Using a NAS Server to backups

2008-04-01 Thread Les Mikesell
Tino Schwarze wrote:
 
 My impression is that it scales so badly because there are a lot of
 seeks between pc backup directories and the pools. The MD5 hashing
 distributes files randomly over the pool, so you end up looking at
 another place on disk for each file which involves seeking.
 
 I've got a Dell machine here with PERC5/i (which is by no means fast)
 and I also have three WD RAID Edition 500 GB SATA drives. Somehow it
 doesn't fit very well. I'm seeing I/O starvation, throughput is at about
 300-500 kB/s during backups and device utilisation is at 100% (measured
 with iostat).

Small writes on raid5 always have a performance hit.

 I've got a second RAID5 on the same controller, but with three 750 GB
 SATA drives (Hitachi HUA721075KLA330). This seems to perform better, but
 mainly takes large files, so the seeking is not a problem. I wish there
 were a "swap file systems" operation so I could test how BackupPC
 performs on the second RAID. Hm. Maybe I'll use a weekend to copy the
 raw xfs file system to another logical volume? But then there's still
 the file system tuning issue. Or am I just suffering partition
 misalignment, so FS accesses cross RAID stripes too often? So many
 questions, so little time... :-(

Big writes aren't quite so bad.  But you would probably get much better 
performance if you had 6 identical drives running in a striped/mirrored 
configuration (at some cost in disk space).
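(With Linux software RAID that could look roughly like this; device names
are made up and the chunk size is left at the default:

   mdadm --create /dev/md0 --level=10 --raid-devices=6 /dev/sd[b-g]
   mkfs.xfs /dev/md0
)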

 Or maybe I've just got too little memory? I see only 600MB of cached
 memory while the backup (our largest one: 2.9 million files, 120 GB)
 takes 640 MB plus 480 MB for each of the BackupPC_dump processes (using
 rsync). The machine has 2 GB.

If you aren't actively swapping, the ram is probably OK.  Unless you 
have slow network links you will probably want to only run 2 backups 
concurrently.
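(The relevant knob, for reference:

   $Conf{MaxBackups} = 2;   # at most two backups running at the same time
)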

-- 
   Les Mikesell
[EMAIL PROTECTED]



[BackupPC-users] How to take Backup of windows machine through Backuppc-3.1.0

2008-04-01 Thread kanti

Hi all, I want to take a backup of a Windows client machine with BackupPC 
3.1.0, and my BackupPC server is running on Fedora 7. Can anyone tell me how 
I can get this done? Any ideas?

Please help me out with this problem. 

Thanks a lot in advance.

Thanks & Bye 

Appu

+--
|This was sent by [EMAIL PROTECTED] via Backup Central.
|Forward SPAM to [EMAIL PROTECTED]
+--





Re: [BackupPC-users] Cannot edit config via the CGI

2008-04-01 Thread Nils Breunese (Lemonbit)
Emmanuel Lesouef wrote:

 I am trying to install backuppc on a debian stable vserver. I use the
 stable package of backuppc : 2.1.2-6

 The installation went ok (except that apache was installed instead of
 apache2, I had to do a symlink to /etc/backuppc/apache.conf by hand).

 The CGI is working great but there's still something that gives me
 headache : I cannont edit hosts config using the CGI.

 There are no links to Edit host.

 Could it be related to :

 http://backuppc.sourceforge.net/faq/security.html#cgi_interface

 Can someone help me ?

Short answer: the user you're logging in as needs be an admin user.

Long answer: see 
http://backuppc.sourceforge.net/faq/BackupPC.html#step_9__cgi_interface 
  and start reading at 'BackupPC_Admin requires that users are  
authenticated by Apache (...)'.

Nils Breunese.



Re: [BackupPC-users] How to take Backup of windows machine through Backuppc-3.1.0

2008-04-01 Thread Les Mikesell
kanti wrote:
 Hie All , I want to take backup of a windows client machine by Backuppc-3.1.0 
 .And my backuppc server is running on Fedora 7.  Can anyone tell me how can i 
 out from this problem . Any Idea ???
 
 Plz try to help me out this problem . 
 
 Thanks a lot in Advance ..

Add the target to the hosts first, then edit the config for that host. 
The XferMethod should be smb, the SmbShareName can be either the 
administrative drive-letter-dollar-sign shares (C$, etc.) or explicit 
shares that you have created.  The SmbShareUserName and SmbSharePasswd 
must be for a user with at least read access for the share, so if you 
use the administrative shares, the user must be in the administrator or 
backup user groups.
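(A minimal per-host config.pl sketch along those lines; the share, user name 
and password are placeholders:

   $Conf{XferMethod}       = 'smb';
   $Conf{SmbShareName}     = ['C$'];        # administrative share, or one you created
   $Conf{SmbShareUserName} = 'backupuser';
   $Conf{SmbSharePasswd}   = 'secret';
)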

-- 
   Les Mikesell
[EMAIL PROTECTED]



Re: [BackupPC-users] How to take Backup of windows machine through Backuppc-3.1.0

2008-04-01 Thread Daniel Denson
I'm sorry to be the guy that does this, but this is well documented.  If 
you are having a problem AFTER you have followed the instructions, you 
were not clear about that; in that case you must post what error you are 
getting.

kanti wrote:
 Hie All , I want to take backup of a windows client machine by Backuppc-3.1.0 
 .And my backuppc server is running on Fedora 7.  Can anyone tell me how can i 
 out from this problem . Any Idea ???

 Plz try to help me out this problem . 

 Thanks a lot in Advance ..

 Thanks  Bye 

 Appu

 +--
 |This was sent by [EMAIL PROTECTED] via Backup Central.
 |Forward SPAM to [EMAIL PROTECTED]
 +--





Re: [BackupPC-users] Error :- No ping Response

2008-04-01 Thread Les Mikesell
kanti wrote:
 Hie thanx for ur valuable reply , now every thing is fine . But when i am 
 trying to take a backup of client again that same error has occurred. (Unable 
 to read 4 bytes).
 The error is like as follows :- 
 full backup started for directory /
 Running: /usr/bin/ssh -q -x -l root scn-ws9 /usr/bin/rsync --server --sender 
 --numeric-ids   --perms --owner  --group -D --links --hard-links --times 
 --block-size=2048 --recursive --ignore-times . /
 Xfer PIDs are now 11798

What happens if you try to execute that exact command line as the 
backuppc user yourself? It should send some odd character as the start 
of the transfer protocol and wait - hit a control-C to exit.  If you get 
a password prompt or ssh error, you still don't have the keys set up 
correctly.  If you see some other text message it may be coming from the 
  login scripts on the remote side and causing trouble with the rsync 
protocol.
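(A quick way to check just the ssh/key part first, assuming the server-side 
user is called backuppc; the hostname is the one from the log above. It 
should print "root" without asking for a password:

   sudo -u backuppc /usr/bin/ssh -q -x -l root scn-ws9 whoami
)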

-- 
   Les Mikesell
[EMAIL PROTECTED]



Re: [BackupPC-users] [BackupPC] Cpool and pc quite are big according to size of backuped data and backup config

2008-04-01 Thread Les Mikesell
Hervé Richard wrote:
 
  If I understand what you say, on my backup HDD I need conf, cpool, 
  log, pc, pool, save and trash to all be there?
  Is it for that reason that it's not working :-/
  I wish I could do a rotation between 2 HDDs every 2 weeks.
  Maybe I should do the same thing on the other HDD: copy all the needed 
  directories and mount the HDD partition under /files/.


  P.S.: As you probably guessed, I'm quite new to using BackupPC :-/
  OK, I've done some cleaning; it was a little bit of a mess :-/
  BackupPC behaves a little better now, but I still have old backups that 
  BackupPC_nightly doesn't want to remove :-/
  I think I have two choices:
  1) Comment out the line for the host whose files I back up and let backuppc 
  propose to delete its backup files
  2) Use the BackupPC_deletebackup script I saw on the wiki: 
  http://backuppc.wiki.sourceforge.net/How+to+delete+backups

  What do you think about it?

Since you have 2 disks, you could just start from scratch on one of them 
and wait until it has a complete set before you swap again.  Another 
approach is to always run on an internal drive, but periodically 
image-copy to a matching external disk that you rotate offsite.  I do it 
that way myself but recently switched from firewire externals to a 
swappable SATA enclosure.  I do the mirroring with software raid, but it 
would work as well to unmount the partition temporarily and dd it.

-- 
   Les Mikesell
[EMAIL PROTECTED]




Re: [BackupPC-users] [BackupPC] Cpool and pc quite are big according to size of backuped data and backup config

2008-04-01 Thread Hervé Richard

Hervé Richard wrote:

Les Mikesell wrote:

Hervé Richard wrote:


Just one thing I forgot :-/
There is a symlink between /opt/backuppc/files/pc/localhost (os file 
system) - /mnt/hdd1/backuppc/pc/localhost (backup hdd file system)


Maybe the problem comes from this symlink, what do you think about?
Maybe I should have make a hardlink?
Or maybe I shouldn't have copied /localhost subdirs to backup hdds


The cpool and pc directories must be on the same filesystem so if you 
are going to symlink to a different disk the symlink must be to the 
directory above cpool and pc.


If I understand what you say, on my backup HDD I need conf, cpool, 
log, pc, pool, save and trash to be there ?

Is for that reason that's not working :-/
I wish I could do a rotation between 2 hdd every 2 weeks
Maybe if I do the same thing on the other HDD: copy all needed 
directory and mount the hdd partition under /files/



Many thanks for your help :-)
I'll try this and let you know

Cheers

Hervé

P.S: As you guessed I supposed, I'm quite new as using backuppc :-/

OK, I've done some cleaning; it was a little bit of a mess :-/
BackupPC behaves a little better now, but I still have old backups that 
BackupPC_nightly doesn't want to remove :-/

I think I have two choices:
1) Comment out the line for the host whose files I back up and let backuppc 
propose to delete its backup files
2) Use the BackupPC_deletebackup script I saw on the wiki: 
http://backuppc.wiki.sourceforge.net/How+to+delete+backups


What do you think about it?

Thanks,

Cheers

Hervé






[BackupPC-users] Transfer just rsync differences?

2008-04-01 Thread Hereward Cooper
I'm new to BackupPC, having just come across it via
debian-administration.org last week.

I have one query about my newly setup system:

I presumed that an incremental backup (via rsync) would only copy over
the changed files. However it seems to copy over all files, then only
store the changed ones. 

There is one machine we want to back up which has to be done over the net,
so copying all files nightly takes a large chunk of our bandwidth. 

Is there a solution to this, as I'd love to keep using this program
rather than going back to my custom rsync script. 

Coops.




Re: [BackupPC-users] Transfer just rsync differences?

2008-04-01 Thread Les Mikesell
Hereward Cooper wrote:
 I'm new to Backuppc, having just came across it via
 debian-administration.org last week.
 
 I have one query about my newly setup system:
 
 I presumed that an incremental backup (via rsync) would only copy over
 the changed files. However it seems to copy over all files, then only
 store the changed ones. 

The other xfer methods transfer everything, but rsync should only 
transfer changes from your last full run.

 There is one machine we want to backup which has to be done via the net,
 so copying all files nightly takes a large chunk of our bandwidth. 

Full rsync runs should do block-checksum compares of all files which may 
take a long time but should not take a lot of bandwidth. Incremental 
runs will be faster because they only transfer the differences in files 
where the timestamp or lengths have changed.

 Is there a solution to this, as I'd love to keep using this program
 rather than going back to my custom rsync script. 

It should be doing what you want now.  You just need to balance the full 
vs. incremental runs for the tradeoff you want in runtime vs. rebuilding 
the tree for the next comparison.

-- 
   Les Mikesell
[EMAIL PROTECTED]




Re: [BackupPC-users] Transfer just rsync differences?

2008-04-01 Thread David Rees
On Tue, Apr 1, 2008 at 11:52 AM, Les Mikesell [EMAIL PROTECTED] wrote:
 Hereward Cooper wrote:
   Is there a solution to this, as I'd love to keep using this program
   rather than going back to my custom rsync script.

  It should be doing what you want now.  You just need to balance the full
  vs. incremental runs for the tradeoff you want in runtime vs. rebuilding
  the tree for the next comparison.

I believe you can make each incremental act like a full by setting
$Conf{IncrFill} to 1, so that incrementals don't have to transfer all
the data since the last full backup and only transfer the changes since
the last backup. This way you can get the best of both worlds (reduced
I/O when performing an incremental backup, plus periodic full backups to
really ensure that everything is there).
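(For reference, the setting being referred to is:

   $Conf{IncrFill} = 1;
)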

-Dave



Re: [BackupPC-users] Rotating log files causing backups to fall behind.

2008-04-01 Thread Craig Barratt
John writes:

 I am noticing an issue in our backuppc installation.
 
 Every Monday we have 20-30 hosts (of 84) that were not backed up in
 the prior 24 hours. It seems the /var/log filesystem takes much longer
 on the sunday/monday backups than it takes during the rest of the
 week.

 [snip]
 
 I think the problem is that log files are rotated on Sunday AM, so the
 Sunday night/Monday morning backup run has to transfer the entire
 contents of these log files because the names change during the log
 rotation.

Yes, that makes sense.  Unfortunately a renamed file cannot be easily
matched, so it has to be transferred and then matched against the pool.

(A better approach with log files is to append the date stamp to the
name, rather than an incrementing number.)

Any chance you have sparse files in /var/log?  They can be very large,
and none of the Xfer methods detect sparse files.

Craig



Re: [BackupPC-users] Transfer just rsync differences?

2008-04-01 Thread Les Mikesell
Hereward Cooper wrote:
 On Tue, 2008-04-01 at 13:52 -0500, Les Mikesell wrote:
 
 The other xfer methods transfer everything, but rsync should only 
 transfer changes from your last full run.
 
 Thanks for the quick replies.
 
 Glad to here it behaves like I had expected and hoped.
 
 But I've attached 3 quick screen shots as illustration of my problem.
 
  * Looking at summary.png you can see Backup #2 (04/01 1am) was an
 incremental backup.
 
  * So this should have just transferred the changes yeah? And as shown
 in count.png is says there was only 24Mb of new files, and 1.3Gb already
 backed up.
 
 * So why when I look at my network graphs (see network.png) do I see
 ~1.4Gb outbound starting at 1am for 40mins (the total size of the data
 to be backed up).

There is something very wrong with your initial full. Look at the time 
it took and the number of files compare to the subsequent incrementals. 
  Get a good full run and the files will stop being transferred in the 
incrementals.

-- 
   Les Mikesell
[EMAIL PROTECTED]



Re: [BackupPC-users] Rotating log files causing backups to fall behind.

2008-04-01 Thread Les Mikesell
Craig Barratt wrote:
 John writes:
 
 I am noticing an issue in our backuppc installation.

 Every Monday we have 20-30 hosts (of 84) that were not backed up in
 the prior 24 hours. It seems the /var/log filesystem takes much longer
 on the sunday/monday backups than it takes during the rest of the
 week.

 [snip]

 I think the problem is that log files are rotated on Sunday AM, so the
 Sunday night/Monday morning backup run has to transfer the entire
 contents of these log files because the names change during the log
 rotation.
 
 Yes, that makes sense.  Unfortunately a renamed file cannot be easily
 matched, so it has to be transferred and then matched against the pool.
 
 (A better approach with log files is to append the date stamp to the
 name, rather than an incrementing number.)
 
 Any chance you have sparse files in /var/log?  They can be very large,
 and none of the Xfer methods detect sparse files.

Some versions of 64-bit linux will have a sparse /var/log/lastlog that 
will be 1.2 terabytes if you transfer it the hard way:
http://en.wikipedia.org/wiki/Lastlog
It might be a good thing to exclude on general principles.
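(One hedged way to do that, assuming '/' is the share being backed up:

   $Conf{BackupFilesExclude} = { '/' => ['/var/log/lastlog'] };
)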

-- 
   Les Mikesell
[EMAIL PROTECTED]



Re: [BackupPC-users] Transfer just rsync differences?

2008-04-01 Thread Hereward Cooper

On Tue, 2008-04-01 at 21:52 +0100, Hereward Cooper wrote:

 But I've attached 3 quick screen shots as illustration of my problem.

Sorry - I think I've chosen a bad example. Backup #2 is level 1 so it's
run in comparison to Backup #0, not #1 - yeah?

But I'm still having this issue. I've just done a series of test backups
with small (~30Mb) batches of files, and each time it transfers the
entire load over, not just the new batch of files I've added between
running incrementals. Hhhmm. 

Might it be to do with the article I followed? [1]

Coops.

[1] http://www.debian-administration.org/articles/588




Re: [BackupPC-users] Transfer just rsync differences?

2008-04-01 Thread Les Mikesell
Hereward Cooper wrote:
 On Tue, 2008-04-01 at 21:52 +0100, Hereward Cooper wrote:
 
 But I've attached 3 quick screen shots as illustration of my problem.
 
 Sorry - I think I've chosen a bad example. Backup #2 is level 1 so it's
 run in comparison to Backup #0, not #1 - yeah?

Yes, incrementals are done against the previous full run.

 But I'm still having this issue. I've just done a series of test backups
 with small (~30Mb) batches of files, and each time it transfers the
 entire load over, not just the new batch of files I've added between
 running incrementals. Hhhmm. 

That's expected.  Do a full run to get a new base.

The latest version has an option to do incremental levels but I don't 
think that's in the debian package yet.

-- 
   Les Mikesell
[EMAIL PROTECTED]




Re: [BackupPC-users] Per-host configuration files ver 3.1.0

2008-04-01 Thread Craig Barratt
Jason writes:

 Hi, I have BackupPC 3.1.0 installed on FreeBSD 6.2 but I require the
 per-host configuration files to be stored in the same locations as the
 they were in version 2.0.2  (ie. TOP_DIR/pc/HOST/config.pl )? Is there
 a way I can achieve this without upgrading from 2.0.2 to 3.1.0?

If you did an upgrade it should maintain the original locations.

If it's a new install from sources you can specify the --no-fhs option
to configure.pl:

   --fhs   Use locations specified by the Filesystem Hierarchy Standard
   for installing BackupPC.  This is enabled by default for new
   installations.  To use the pre-3.0 installation locations,
   specify --no-fhs.
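(i.e., when installing from the tarball, something along the lines of:

   perl configure.pl --no-fhs
)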

Craig



[BackupPC-users] Backup SMB mount via SSH+Rsync

2008-04-01 Thread Nicholas Hall
Hello All,

I am working on building a small distro for backing up Windows clients over
the WAN.  My goal is to be able to hand a customer a small piece of hardware
(i.e. http://www.soekris.com/products.htm), have them plug it into their
network, open port 22 to the device, and be able to do some simple
configuring via a web interface to select which hosts and SMB shares to back
up.  The selected shares will be mounted on the device via smbmount, which
should allow BackupPC to SSH into the device and effectively back up
internal Win32 hosts.

Before I commit any more time to this I would like to see if anyone else is
backing up hosts in a similar fashion (SMB mounts).  I do not know the
internals of rsync or how it checks to see whether a file has changed, and
I'm concerned that an SMB mount point won't allow rsync to do its magic
(incrementals) and therefore won't allow BackupPC to do _its_ magic
(pooling, hardlinks).

Someone please ease my nerves and tell me this will work.

cheers
-- 
Nicholas Hall
[EMAIL PROTECTED]


Re: [BackupPC-users] Cannot edit config via the CGI

2008-04-01 Thread Adam Goryachev
-BEGIN PGP SIGNED MESSAGE-
Hash: SHA1

Nils Breunese (Lemonbit) wrote:
 Emmanuel Lesouef wrote:
 
 I am trying to install backuppc on a debian stable vserver. I use the
 stable package of backuppc : 2.1.2-6

 The installation went ok (except that apache was installed instead of
 apache2, I had to do a symlink to /etc/backuppc/apache.conf by hand).

 The CGI is working great but there's still something that gives me
 headache : I cannont edit hosts config using the CGI.

 There are no links to Edit host.
 Short answer: the user you're logging in as needs be an admin user.
 
 Long answer: see 
 http://backuppc.sourceforge.net/faq/BackupPC.html#step_9__cgi_interface 
   and start reading at 'BackupPC_Admin requires that users are  
 authenticated by Apache (...)'.

Ugh, shorter answer (I think this is more accurate):
backuppc 2.1.2 doesn't have an Edit Host feature; this wasn't added until
backuppc 3.0 or thereabouts...

Either use backuppc from testing, or don't use the edit-host feature and
modify the config files from the command line.

Regards,
Adam
-BEGIN PGP SIGNATURE-
Version: GnuPG v1.4.6 (GNU/Linux)
Comment: Using GnuPG with Mozilla - http://enigmail.mozdev.org

iD8DBQFH8srbGyoxogrTyiURAkSkAKChZrpa48e7YowUsAzHimsoB0BM5gCgw14R
3cP+LjPfSMOLiT726It/YrM=
=x70X
-END PGP SIGNATURE-



Re: [BackupPC-users] Backup SMB mount via SSH+Rsync

2008-04-01 Thread Les Mikesell
Nicholas Hall wrote:
 Hello All,
 
 I am working on building a small distro for backing up Window's clients 
 over WAN.  My goal is to be able to hand a customer a small piece of 
 hardware (i.e. http://www.soekris.com/products.htm), have them plug it 
 in their network, open port 22 to the device be able to do some simple 
 configuring via web interface to select which hosts and SMB shares to 
 backup.  The selected shares will be mounted on the device via smbmount 
 which should allow BackupPC to SSH in to the device and effectively 
 backup internal Win32 hosts.
 
 Before I commit anymore time into this I would like to see if anyone 
 else is backuping up hosts in a similiar fashion (SMB mounts).  I do not 
 know the internals of Rsync on how it checks to see if a file has 
 changed and I'm concerned that an SMB mount point won't allow rsync to 
 do it's magic (incrimentals) and therefor won't allow BackupPC to do 
 _it's_ magic (pooling, hardlinks).
 
 Someone please ease my nerves and tell me this will work.

Conceptually it should work.  The issues you'll have to deal with will 
be the smbmount credentials, especially if the site has a policy that 
forces regular password changes, and the fact that rsync full runs will 
read the entire filesystem contents over the smb mount although it will 
only exchange block checksums to find the changes across the WAN side.
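(For the credentials part, one hedged option is a root-only credentials file
instead of a password on the command line; names and paths here are
illustrative:

   # /etc/backup-credentials, mode 600, containing:
   #   username=backupuser
   #   password=secret
   mount -t cifs '//winhost/c$' /mnt/winhost -o credentials=/etc/backup-credentials,ro
)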

If I were installing something like this, I'd want at least an option to 
  add disks to the device to hold a current copy of the backups with an 
additional copy and the history kept remotely via backuppc.  Disks are 
fairly cheap and it would give a quicker restore of the copy you are 
most likely to want.

-- 
   Les Mikesell
[EMAIL PROTECTED]



Re: [BackupPC-users] Rotating log files causing backups to fall behind.

2008-04-01 Thread John Rouillard
On Tue, Apr 01, 2008 at 01:46:46PM -0700, Craig Barratt wrote:
 John writes:
  I am noticing an issue in our backuppc installation.
  
  Every Monday we have 20-30 hosts (of 84) that were not backed up in
  the prior 24 hours. It seems the /var/log filesystem takes much longer
  on the sunday/monday backups than it takes during the rest of the
  week.
 
  [snip]
  
  I think the problem is that log files are rotated on Sunday AM, so the
  Sunday night/Monday morning backup run has to transfer the entire
  contents of these log files because the names change during the log
  rotation.
 
 Yes, that makes sense.  Unfortunately a renamed file cannot be easily
 matched, so it has to be transferred and then matched against the pool.
 
 (A better approach with log files is to append the date stamp to the
 name, rather than an incrementing number.)

Yup, but most syslogs don't support that, nor does logrotate support
date stamping during rotation. That would definitely reduce it to just
one file having to be totally re-copied, though.
 
 Any chance you have sparse files in /var/log?  They can be very large,

Nope, well none except for lastlog and that doesn't get rotated weekly
and isn't very large.

 and none of the Xfer methods detect sparse files.

Doesn't rsync support sparse files, or do you mean the rsync Perl
module doesn't support them?

-- 
-- rouilj

John Rouillard
System Administrator
Renesys Corporation
603-244-9084 (cell)
603-643-9300 x 111



[BackupPC-users] IncrPeriod

2008-04-01 Thread shacky
My configuration of BackupPC is the following:

$Conf{WakeupSchedule} = [21];

$Conf{FullPeriod} = 6.97;
$Conf{IncrPeriod} = 0.97;
$Conf{FullKeepCnt} = 2;
$Conf{FullKeepCntMin} = 2;
$Conf{FullAgeMax} = 90;
$Conf{IncrKeepCnt} = 6;
$Conf{IncrKeepCntMin} = 1;
$Conf{IncrAgeMax} = 30;
$Conf{PartialAgeMax} = 3;

But the latest backup for that client was made 10 days ago.
I only have 3 backups for that client, one full (14 days ago) and two
incrementals (10 and 9 days ago).
The latest status of BackupPC was "nothing to do" yesterday at 21:00.

I don't understand why BackupPC isn't making backups for my clients.
Could you help me, please?

Thank you very much!
Bye.



Re: [BackupPC-users] IncrPeriod

2008-04-01 Thread Les Mikesell
shacky wrote:
 My configuration of BackuPC is the following:
 
 $Conf{WakeupSchedule} = [21];
 
 $Conf{FullPeriod} = 6.97;
 $Conf{IncrPeriod} = 0.97;
 $Conf{FullKeepCnt} = 2;
 $Conf{FullKeepCntMin} = 2;
 $Conf{FullAgeMax} = 90;
 $Conf{IncrKeepCnt} = 6;
 $Conf{IncrKeepCntMin} = 1;
 $Conf{IncrAgeMax} = 30;
 $Conf{PartialAgeMax} = 3;
 
 But the latest backup for that client was made 10 days ago.
 I only have 3 backups for that client, one full (14 days ago) and two
 incrementals (10 and 9 days ago).
 The latest state of BackupPC was nothing to do yesterday at 21:00.
 
 I don't understand why BackupPC isn't making backups for my clients.
 Could you help me, please?

How full is your disk and what is your setting for $Conf{DfMaxUsagePct}?
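(Quick things to check; the paths are just examples, and 95 is the shipped
default for DfMaxUsagePct if it hasn't been changed:

   df -h /var/lib/backuppc
   grep DfMaxUsagePct /etc/backuppc/config.pl
)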

-- 
   Les Mikesell
[EMAIL PROTECTED]
