Hello,

We create a filelist.txt here before the backup jobs start and include it in
the incremental backups. That way, if a server crashes and we have to restore
it from scratch (incremental backups included), we don't risk the disk
running full during the restore because users had moved or deleted big
directories in the meantime. In the worst case a restore can take hours while
the users "jump around", and your restore job cancels just because the hard
disk fills up when some big folder was renamed or (re)moved by your users in
the past, and you have to start all over. Even in this bad situation you may
not know which folder names are the current ones, so you have to hope your
users can tell you (the folders are heavily used by them, and you just back
them up).
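For illustration, a minimal sketch of such a pre-job script (in Bacula this
would typically run via something like ClientRunBeforeJob; the directory
names here are made up, and a throwaway demo tree is created so the sketch
runs standalone -- in real use, point DATADIR at the tree your FileSet
covers):

```shell
#!/bin/sh
# Sketch of a pre-backup script. DATADIR is an assumption -- replace the
# demo tree below with the real directory tree your backup job covers.
DATADIR=$(mktemp -d)
mkdir -p "$DATADIR/projects/alpha"
touch "$DATADIR/projects/alpha/report.txt"

# The list is written inside the backed-up tree itself, so every
# incremental set automatically carries the current directory layout.
LIST="$DATADIR/filelist.txt"
find "$DATADIR" -not -name filelist.txt > "$LIST"
```

After a crash, restoring just this one file from the newest incremental set
tells you what the layout looked like at the last backup.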

So in theory, with such a filelist we only have to restore the filelist.txt
from the last incremental set and then run the full restore based on it,
using option "7: Enter a list of files to restore" and entering
"<filelist.txt". However, this function seems to take long with more than
10,000 entries and uselessly long with more than 100,000 entries here. Does
anybody have an idea for another workaround, or for speeding this up
significantly?
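In case it helps anyone experiment: one generic way to keep each run below
the ~10,000-entry range would be to split the restored list into chunks and
run several smaller restores, one per chunk. A plain-shell sketch (nothing
Bacula-specific; the chunk size and the demo paths are assumptions):

```shell
#!/bin/sh
# Generic sketch: break one huge filelist into smaller chunks so each
# restore run only has to process a fraction of the entries. The demo
# list is generated here; in practice, use the restored filelist.txt.
WORK=$(mktemp -d)
seq -f "/data/file%g" 1 25000 > "$WORK/filelist.txt"

# 10000 entries per chunk -> filelist.txt.aa, filelist.txt.ab, ...
split -l 10000 "$WORK/filelist.txt" "$WORK/filelist.txt."
```

Each chunk file could then be fed to option 7 in its own restore run.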


Greetings,
User100







_______________________________________________
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users
