Managed to finally create a decent scheme for what I needed, using two pools for
my job.
Thank you everyone for your help!!
Regards,
Larry
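For anyone landing here from a search: a two-pool scheme along these lines keeps
the fulls and the differentials on separate volumes, so each can have its own
retention. This is only a sketch; the names and numbers below are made up for
illustration, not Larry's actual config:

Pool {
  Name = Full-Pool                # fulls only (hypothetical name)
  Pool Type = Backup
  Recycle = yes
  AutoPrune = yes
  Volume Retention = 30 days
  Maximum Volume Bytes = 300G
}

Pool {
  Name = Diff-Pool                # differentials only (hypothetical name)
  Pool Type = Backup
  Recycle = yes
  AutoPrune = yes
  Volume Retention = 2 days       # differentials can expire much sooner
  Maximum Volume Bytes = 50G
}

The Job resource then routes each backup level to the right pool with the
"Full Backup Pool" and "Differential Backup Pool" directives.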
+--
|This was sent by larryb...@gmail.com via Backup Central.
Hello Larrybwoy,
2016-05-27 6:40 GMT-03:00 Larrybwoy:
> Hello again,
>
> A new question: Is there a way I can set up Bacula so that it recycles
> only the differential jobs from a volume? For example, I have one volume
> that has the initial full backup in it and 1 other differential job. Can I
> set it up so that only the differential job in there gets recycled?
Hello again,
A new question: Is there a way I can set up Bacula so that it recycles only
the differential jobs from a volume? For example, I have one volume that has
the initial full backup in it and 1 other differential job. Can I set it up so
that only the differential job in there gets recycled?
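Short answer from the Bacula side: pruning and recycling operate on whole
volumes, not on individual jobs inside a volume, so a differential sitting next
to a full cannot be recycled on its own. The usual workaround is to steer
differentials into a pool of their own via the Job resource. A sketch, with
hypothetical names:

Job {
  Name = "server1-job"                 # hypothetical
  Type = Backup
  Client = server1-fd
  FileSet = "Full Set"
  Schedule = "HourlyCycle"             # hypothetical
  Storage = File
  Messages = Standard
  Pool = Full-Pool
  Full Backup Pool = Full-Pool         # fulls land here
  Differential Backup Pool = Diff-Pool # differentials land here
}

Once fulls and differentials never share a volume, a short Volume Retention on
the differential pool expires them independently of the full.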
On 05/23/16 11:09, Larrybwoy wrote:
> I see your point. Interesting. So I can make it so that each new differential
> backup that gets created writes itself into a totally new volume for that
> particular job? Is there an option for this I need to add to the pool?
>
Here's how I do it…
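The rest of that reply was lost to the archive, but the standard knob for "one
job per volume" on disk lives in the Pool resource. A sketch with made-up
values:

Pool {
  Name = Diff-Pool            # hypothetical
  Pool Type = Backup
  Maximum Volume Jobs = 1     # close the volume after a single job
  Label Format = "Diff-"      # auto-label new volumes: Diff-0001, Diff-0002, ...
  Recycle = yes
  AutoPrune = yes
  Volume Retention = 2 days
}

With Maximum Volume Jobs = 1, every run gets a fresh (or recycled) volume, so a
volume becomes prunable as soon as its single job expires.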
On 05/23/2016 11:15 AM, shouldbe q931 wrote:
> Several years ago I did something similar where there was 24 hours of
> hourly rsync "backups" and daily runs of Bacula to tape
Thankfully you don't have to deal with rsync anymore: there's now ZFS
with COW snapshots and the ability to mirror snapshots.
On Mon, May 23, 2016 at 9:20 AM, Larrybwoy wrote:
I do not need to have backups that are old; in case of a disaster, I
need to be able to bring back the data that was lost during the past
hour at most, so that the people working with the applications only
lose 1 hour of work in the worst-case scenario.
I see your point. Interesting. So I can make it so that each new differential
backup that gets created writes itself into a totally new volume for that
particular job? Is there an option for this I need to add to the pool?
Think about it this way:
If you're backing up to tape, jobs smaller than a full tape waste tape,
because you can't make the tape smaller.
But if you're backing up to disk, putting more than one batch of jobs
into a volume wastes disk, because you can't compact a disk volume and
you can't delete individual jobs out of one; volumes are pruned and
recycled whole.
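One mitigation worth knowing about (available since Bacula 5.0): a purged disk
volume can be truncated so the space actually comes back. The pool opts in, and
a console command does the shrink. Names and values here are illustrative:

Pool {
  Name = Disk-Pool             # hypothetical
  Pool Type = Backup
  Recycle = yes
  AutoPrune = yes
  Volume Retention = 7 days
  Action On Purge = Truncate   # allow purged volumes to be truncated
}

Then, from bconsole:

  purge volume action=truncate storage=File allpools

This doesn't compact a volume that still holds live jobs, but it does reclaim
the disk once everything on the volume has been purged.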
On 05/23/16 04:20, Larrybwoy wrote:
> The only problem is the filesystem that I back up has 91 gigs, and
> the backup keeps getting bigger and bigger with all the differential
> jobs. So far the max vol size is set to 300GB, and with the backups
> running all weekend it has now created a second volume.
On 5/23/2016 5:44 AM, Larrybwoy wrote:
> Hey guys,
>
> Thanks for the replies and good advice. The reason I thought of this backup
> plan is because what I need to back up are multiple dynamic file systems from
> about 20 servers. These file systems contain data that is always changing
> since they contain various dynamic application files.
Hey guys,
Thanks for the replies and good advice. The reason I thought of this backup
plan is because what I need to back up are multiple dynamic file systems from
about 20 servers. These file systems contain data that is always changing since
they contain various dynamic application files. I do…
The CLIENT config:
Client {
  Name = server1-fd
  Address = server1.com
  FDPort = 9102
  Catalog = MyCatalog
  Password = "NDdkODYyMDM4NTZmNjYzNjYwZmE5MzIwZ"  # password for remote FileDaemon
  File Retention = 30 days
}
On 19.05.2016 14:20, Larrybwoy wrote:
> Hello dear community,
>
> I could use some advice on the following scenario:
>
> The Bacula job that I have set up is run on an HOURLY basis, the first backup
> being FULL and the rest that follow are all differential. What I need to know
> is how do I set up my job so that it deletes the previous differential backups?
Hello dear community,
I could use some advice on the following scenario:
The Bacula job that I have set up is run on an HOURLY basis, the first backup
being FULL and the rest that follow are all differential. What I need to know
is how do I set up my job so that it deletes the previous differential backups?
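An hourly full-then-differential cycle like the one described is normally
driven by a Schedule resource. A sketch; the times and the name are invented:

Schedule {
  Name = "HourlyCycle"
  Run = Level=Full 1st sun at 01:05          # occasional full
  Run = Level=Differential hourly at 0:35    # a differential every hour
}

There is no schedule option that deletes the previous differential directly; in
Bacula that effect comes from giving the differential pool a short Volume
Retention (plus AutoPrune and Recycle) so old differential volumes get
reclaimed on their own.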