On 15.02.2021 09:27, 'Frank Kirschner | Celebrate Records GmbH' via bareos-users wrote:

Secondly, "First, I will will do copy all audio files to a local hard disk on the same host, where the tape is connected directly, because copying files of the network from different host a slower than writing to tape". Not necessarily. That's what you use spooling for.
Spooling does not work for this scenario, because I have to back up multiple clients, and the manual says: "Each Job will reference only a single client." So I use a "run before" script which collects the data from the 3 clients. On each client, the files are placed manually in an "archiving" folder by the operator.

Sure. If this is the case, it sounds reasonable :-)
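
Roughly, such a collector script would be wired into the Job resource with a RunScript block - a minimal sketch, where the job name and script path are just placeholders:

Job {
    Name = "archive-to-tape"                            # placeholder name
    # ... usual Job directives (Client, FileSet, Storage, Pool) ...
    RunScript {
        RunsWhen = Before
        RunsOnClient = No                               # run on the director side
        FailJobOnError = Yes                            # don't write tape if collection failed
        Command = "/usr/local/bin/collect-archives.sh"  # placeholder collector script
    }
}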

You might also just have three separate clients from which you back up with spooling, but it's of course up to you. I don't know your setup well enough to suggest one solution or another.
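
For reference, data spooling is enabled per Job and configured on the tape Device - a rough sketch with placeholder names, paths and sizes:

Job {                             # in the director configuration
    Name = "client1-to-tape"      # one job per client
    Spool Data = yes              # write to a disk spool first, then despool to tape
    # ... Client, FileSet, Storage, Pool as usual ...
}

Device {                          # in the storage daemon configuration
    Name = "LTO-Drive"
    Spool Directory = /var/lib/bareos/spool
    Maximum Spool Size = 200 G
    # ... usual tape device directives ...
}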


Thirdly - I used to run a "copy and delete" scenario a few years ago, but I had a slightly different setup, so my solution is not directly copy-paste applicable to you. Still, I'd suggest you look into:

1) Dynamically create a list of files to back up (this might involve checking the clients' files for ctime, or querying the Bareos database to verify whether a file has already been backed up)

2) Create a post-job script which removes files that have already been backed up in a proper way (i.e. included in a given number of backup jobs, if you want to have several copies) - this definitely involves querying the director's database.
That's a good idea for my scenario. Thanks for this good hint.


For example, my fileset included something like this:

FileSet {
    Name = "Local-archives"
    Include {
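        # The leading "\|" (written "\\|" inside quotes) makes Bareos execute
        # the command at job time and use its output, one path per line,
        # as the list of files to back up.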
        File = "\\| find /srv/archives -type f -not -path '*backup*' -ctime +60"
    }
}

This copied onto tape only the files located under /srv/archives whose path does not contain "backup" (in a file or directory name) and whose ctime is more than 60 days old - roughly two months.

Then I would run a script (in my case it was run asynchronously by cron, not from a post-job trigger, but a post-job script is just as good here) involving a query like:

select
    concat(Path.Path, File.Name) as filepath,
    count(distinct Job.JobId) as jobcount
from
    Path
    join File on File.PathId = Path.PathId
    join Job on File.JobId = Job.JobId
where
    Job.JobStatus = 'T'
    and Job.Name like '%my_srv%'
group by filepath
having jobcount >= 3;


This finds files that have already been backed up by 3 different jobs, so they can be removed from disk. Of course, you might want to extend the query to include - for example - the Media table, to make sure that the files have been copied to separate tapes.
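
As a sketch of the removal step (assuming a PostgreSQL catalog; adjust the query for MySQL, and treat the script name, database credentials and archive path as placeholders), a post-job or cron script could feed that query into the shell and delete the matching files:

#!/bin/sh
# cleanup-archives.sh (hypothetical): remove local files that the Bareos
# catalog shows were written by at least 3 successful jobs.
psql -U bareos -d bareos -At -c "
    select Path.Path || File.Name
    from Path
    join File on File.PathId = Path.PathId
    join Job on File.JobId = Job.JobId
    where Job.JobStatus = 'T'
      and Job.Name like '%my_srv%'
    group by Path.Path, File.Name
    having count(distinct Job.JobId) >= 3
" | while IFS= read -r f; do
    # Safety net: only touch files that still exist under the archive area.
    case "$f" in
        /srv/archives/*) [ -f "$f" ] && rm -- "$f" ;;
    esac
done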


