[BackupPC-users] Keep the last n revisions of files
Hello all,

Maybe this is a feature request, and maybe it is just a show-off of how dumb I am---let's see.

What would be really great to have is the possibility to ensure that I have the last n revisions of files, no matter how many fulls or incrementals. I guess this is not the main goal of BackupPC (that is, to somewhat be a revision control system), but nonetheless it is a feature (at least) I would highly appreciate, and I guess my users would also appreciate it.

I read the comprehensive config.pl file, the FAQ, scanned through the latest 50 e-mails in -users and -devel, searched the mailing list and of course googled. However, I haven't found anything about what I've just described. Maybe I have just searched for the wrong terms?

Any pointers, good ideas, work-arounds or whatever is of course appreciated. Thanks in advance!

OS: FreeBSD 6.0
Port: http://lists.freebsd.org/pipermail/freebsd-ports-bugs/2005-November/070243.html
BackupPC: This documentation describes BackupPC version 2.1.2, released on 5 Sep 2005.

--
Casper Thomsen

___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/backuppc-users
http://backuppc.sourceforge.net/
Re: [BackupPC-users] Keep the last n revisions of files
Casper Thomsen said:
> Maybe this is a feature request, and maybe it is just a show-off of how
> dumb I am---let's see. What would be really great to have is the
> possibility to ensure that I have the last n revisions of files, no
> matter how many fulls or incrementals. I guess this is not the main goal
> of BackupPC (that is, to somewhat be a revision control system), but
> nonetheless it is a feature (at least) I would highly appreciate, and I
> guess my users would also appreciate it.

I also think this would be the job of a revision control system.

> I read the comprehensive config.pl file, the FAQ, scanned through the
> latest 50 e-mails in -users and -devel, searched the mailing list and of
> course googled. However, I haven't found anything about what I've just
> described. Maybe I have just searched for the wrong terms? Any pointers,
> good ideas, work-arounds or whatever is of course appreciated. Thanks in
> advance!

How will you ensure that a file has not been changed several times since the last backup? In what cycle would you start your backup to get every existing version of that file?

Ralf
Re: [BackupPC-users] Keep the last n revisions of files
On Wed, 9 Aug 2006, Ralf Gross wrote:
> (...)
> > What would be really great to have is the possibility to ensure that I
> > have the last n revisions of files, no matter how many fulls or
> > incrementals.
> (...)
>
> I also think this would be the job of a revision control system.

Or the job of a really smart backup system ;-).

> (...)
> > Any pointers, good ideas, work-arounds or whatever is of course
> > appreciated. Thanks in advance!
>
> How will you ensure that a file has not been changed several times since
> the last backup? In what cycle would you start your backup to get every
> existing version of that file?

I will not ensure that the file has not changed several times between backups. My formulation was inexact, sorry! What I want is the last n revisions of files as of when they are checked for changes (once a day, week or whatever). I only want the last n backed-up revisions, not every real revision of the file.

Just to make it totally clear (I hope): if a file has changed when it is being backed up, and there are fewer than n revisions of the file backed up, do nothing special (just back it up as usual); otherwise (back it up and) delete the oldest revision of the file.

Actually there would be other possibilities: you could also (1) set a bit that indicates that the file can be deleted, or (2) delete all the oldest revisions such that there are exactly n left, and maybe some other strategy I haven't thought of yet.

The different approaches imply new decisions.

Ad (1). When should it be deleted? Should it be decompressed and deleted immediately, once a day, during BackupPC_nightly runs, or only when all (or x percent) of the files in the compressed file are set for removal?

Ad (2). Maybe it would be possible to have a flag, or even the possibility of specifying an algorithm, to decide how many revisions should be deleted (dependent on how often the file changes, how many revisions there are, how recently the revisions changed (spaced roughly uniformly or not), etc.). This seems, admittedly, quite strange.
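The per-file retention rule described above — strategy (2), delete the oldest copies until exactly n remain — can be sketched in a few lines. This is a hypothetical illustration of the retention logic only, not anything BackupPC implements; the per-file revision directory and the zero-padded revision names are assumptions made for the sketch.

```python
import os
import shutil

def store_revision(revision_dir, source_path, n):
    """Copy source_path into revision_dir as a new revision and keep
    only the newest n revisions (hypothetical sketch, not BackupPC)."""
    os.makedirs(revision_dir, exist_ok=True)
    # Zero-padded names sort lexicographically in chronological order.
    existing = sorted(os.listdir(revision_dir))
    next_id = int(existing[-1]) + 1 if existing else 0
    shutil.copy2(source_path, os.path.join(revision_dir, "%06d" % next_id))
    revisions = sorted(os.listdir(revision_dir))
    # Strategy (2): delete the oldest until exactly n remain.
    while len(revisions) > n:
        os.remove(os.path.join(revision_dir, revisions.pop(0)))
    return revisions
```

For example, after storing the same file five times with n=3, only the three newest copies survive. The open design question from the thread — when the deletion actually happens, and whether it is deferred to a nightly run — is exactly what makes this expensive inside a pooled backup store.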
> Ralf

--
Casper Thomsen
Re: [BackupPC-users] Keep the last n revisions of files
On Wed, 2006-08-09 at 03:56, Ralf Gross wrote:
> > What would be really great to have is the possibility to ensure that I
> > have the last n revisions of files, no matter how many fulls or
> > incrementals. I guess this is not the main goal of BackupPC (that is,
> > to somewhat be a revision control system), but nonetheless it is a
> > feature (at least) I would highly appreciate, and I guess my users
> > would also appreciate it.
>
> I also think this would be the job of a revision control system.

This would be difficult even with a revision control system, because they tend to assume that you should never lose the old versions. Subversion with apache/webdav has a mode where you can mount a webdav filesystem and the versions are tracked transparently, but the only way to trim the oldest versions is to use a utility to dump and reload the whole repository.

--
Les Mikesell
[EMAIL PROTECTED]
Re: [BackupPC-users] Keep the last n revisions of files
In the message dated: Wed, 09 Aug 2006 11:21:44 +0200,
The pithy ruminations from Casper Thomsen on
Re: [BackupPC-users] Keep the last n revisions of files were:

= On Wed, 9 Aug 2006, Ralf Gross wrote:
= (...)
= What would be really great to have is the possibility to ensure that I
= have the last n revisions of files, no matter how many fulls or
= incrementals.

Interesting idea, probably best suited to a revision control system.

= (...)
=
= I also think this would be the job of a revision control system.
=
= Or the job of a really smart backup system ;-).
=
= (...)
= Any pointers, good ideas, work-arounds or whatever is of course
= appreciated. Thanks in advance!
=
= How will you ensure that a file has not been changed several times since
= the last backup? In what cycle would you start your backup to get every
= existing version of that file?
=
= I will not ensure that the file has not changed several times between
= backups. My formulation was inexact, sorry! What I want is the last n
= revisions of files when they are checked for changes (once a day, week
= or whatever). I only want the last n backed-up revisions, not every
= real revision of the file.

OK. Do you plan to list specific files for which revisions should be retained (in which case there's much more overhead and a more complicated config, but the storage requirement would be lower), or apply the revision settings to every file?

If the former (a list of files), then it sounds like something that's best handled via a revision control system. In the simplest sense, you could check in files manually. This could also be automated, so that when BackupPC connects to the client to initiate a backup, it runs a script first. The script would traverse the filesystem (i.e., using find -newer) or check specified files, and automatically check them in to a revision control system running on the client prior to the backup.
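The pre-backup check-in idea above is easy to prototype. Below is a minimal Python equivalent of the find -newer step, with the actual check-in command (RCS ci, svn commit, or whatever runs on the client) left as a placeholder callable; the stamp-file convention is an assumption of this sketch, as is wiring it to a BackupPC pre-dump hook such as $Conf{DumpPreUserCmd}.

```python
import os

def files_newer_than(root, stamp_path):
    """Rough Python equivalent of `find root -newer stamp_path`:
    list regular files modified after the stamp file's mtime."""
    cutoff = os.path.getmtime(stamp_path)
    changed = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            if os.path.getmtime(path) > cutoff:
                changed.append(path)
    return changed

def pre_backup_checkin(root, stamp_path, checkin_cmd):
    """Check in every changed file, then touch the stamp so the next
    run only sees newer changes. checkin_cmd is a placeholder for the
    real check-in (e.g. invoking RCS on each path)."""
    for path in files_newer_than(root, stamp_path):
        checkin_cmd(path)
    os.utime(stamp_path, None)
```

This keeps all the revision-control overhead on the client, which matches the suggestion: BackupPC then just backs up the working files plus the repository as ordinary data.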
If you're thinking about applying the concept of saving revisions to the whole system, then it sounds more like you want a filesystem snapshot rather than a file-by-file revision history. This is much more common in backup systems, and would give you the ability to restore the entire system (or individual files) to a specified point in time. Be aware that there are many files that change often where you probably don't need or want to keep successive revisions (caches, mail spool files, mailboxes, config files that maintain a list of last-used files, etc.).

=
= Just to make it totally clear (I hope): if a file has changed when it is
= being backed up, and there are fewer than n revisions of the file backed
= up, do nothing (just back it up as usual), otherwise (back it up and)
= delete the oldest revision of the file.

That sounds computationally expensive, and would significantly increase both storage and I/O. Without a database backend to track file versions (the number of revisions and the backup number), it would be extremely impractical. You're describing a much more traditional backup system, where each backup is stored onto separate volumes (CDs, disk-based files, tapes, etc.), and each backup has its own file list and expiration period. This has some advantages (and disadvantages) over BackupPC, and would be much better suited to your revision scheme.

=
= Actually there would be other possibilities: you could also (1) set a bit
= that indicates that the file can be deleted, or (2) delete all the oldest
= revisions such that there are exactly n left, and maybe some other
= strategy I haven't thought of yet.

Again, that sounds like the act of expiring all backups older than a given date. Consider looking at a traditional backup system (i.e., not using BackupPC's concept of pooling and the use of a single storage volume), such as amanda or bacula.

=
= The different approaches imply new decisions.
=
= Ad (1). When should it be deleted?
= Should it be decompressed and deleted
= immediately, once a day, during BackupPC_nightly runs, or only when all
= (or x percent) of the files in the compressed file are set for removal?

That would add huge overhead. It's much more efficient to deal with an entire backup on a given day as a single revision.

=
= Ad (2). Maybe it would be possible to have a flag, or even the
= possibility of specifying an algorithm, to decide how many revisions
= should be deleted (dependent on how often the file changes, how many
= revisions there are, how recently the revisions changed (spaced roughly
= uniformly or not), etc.). This seems, admittedly, quite strange.

Interesting. I like the idea of the dynamic algorithm. This is similar to amanda, in that it dynamically chooses which filesets to back up, based on the queue and backup frequency. However, I see this as having limited application: the idea of keeping successive revisions, in addition to basic backups, seems to be at odds with the idea that older revisions would be deleted dynamically.

Mark
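The "dynamic algorithm" idea — keeping revisions densely when they are recent and sparsely when they are old — resembles what backup tools often call thinning or grandfather-father-son schedules. A toy sketch, with the age buckets purely illustrative and not drawn from any particular tool:

```python
def thin_revisions(ages_in_days, buckets=(1, 7, 30, 365)):
    """Keep at most one revision per age bucket: one under a day old,
    one under a week, one under a month, one under a year. Revisions
    older than the largest bucket are dropped. ages_in_days are ages of
    the stored revisions; returns the ages to keep, youngest first."""
    keep = []
    seen = set()
    for age in sorted(ages_in_days):  # youngest (smallest age) first
        for b in buckets:
            if age <= b:
                if b not in seen:
                    seen.add(b)
                    keep.append(age)
                break  # revision falls in bucket b; done either way
    return keep
```

A real scheduler would key this off backup timestamps rather than precomputed ages, but it shows how "how many revisions to keep" can be a function of revision spacing instead of a fixed n — which is what makes it sit awkwardly next to a plain keep-last-n rule.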