Re: [Bacula-users] Allow Higher Duplicates directive

2010-01-31 Thread ganiuszka
Dan Langille writes:
> ganiuszka wrote:
>> Dan Langille writes:
>>> Resending with additional information
>>>
>>> ganiuszka wrote:
 I used the following directives for this job:

 Allow Duplicate Jobs = no
 Allow Higher Duplicates = yes

 I ran the job, and a moment later I started a job of the same name but 
 with a higher priority. The first job kept running, and the second job 
 had the status "waiting for higher priority jobs to finish". Why doesn't 
 the first job abort so that the second one can start right away?
>>> My guess: Because it was already running.  Bacula does not cancel
>>> running jobs.  The directives are applied to jobs as they are being
>>> added to the queue.
>>>
>>> Try running three jobs and then you'll see.
>>
>> Thanks for the reply.
>>
>> Yes, I ran three jobs, and I got the same result: all three finished OK.
>>
>> I suspect that "Allow Higher Duplicates" does not work correctly. I looked 
>> at the source code, and it appears that if Allow Higher Duplicates is set 
>> to "yes", then the actions described in the Bacula documentation for this 
>> directive are not performed.
> 
> In my previous message I mentioned these items.  Did you try them?

Yes, I did. They did not work.

If "Allow Higher Duplicates" is set to "yes", then the directives below are 
not taken into account:

Cancel Queued Duplicates
Cancel Running Duplicates
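For reference, this is a sketch of the combination I am describing, based on 
the QemuImages Job resource quoted further down in this thread, with the two 
cancel directives added:

```conf
Job {
  Name = QemuImages
  Type = Backup
  Level = Full
  Pool = Tescik
  Client = darkstar-fd
  Messages = DirMessages
  FileSet = QemuImages_FileSet
  Allow Duplicate Jobs = no
  Allow Higher Duplicates = yes
  # In my tests these two directives appear to be ignored
  # whenever Allow Higher Duplicates = yes:
  Cancel Queued Duplicates = yes
  Cancel Running Duplicates = yes
  Maximum Concurrent Jobs = 10
}
```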

Regards
gani

--
The Planet: dedicated and managed hosting, cloud storage, colocation
Stay online with enterprise data centers and the best network in the business
Choose flexible plans and management services without long-term contracts
Personal 24x7 support from experienced hosting pros just a phone call away.
http://p.sf.net/sfu/theplanet-com
___
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users


Re: [Bacula-users] Allow Higher Duplicates directive

2010-01-31 Thread Dan Langille
ganiuszka wrote:
> Dan Langille writes:
>> Resending with additional information
>>
>> ganiuszka wrote:
>>> I used the following directives for this job:
>>>
>>> Allow Duplicate Jobs = no
>>> Allow Higher Duplicates = yes
>>>
>>> I ran the job, and a moment later I started a job of the same name but 
>>> with a higher priority. The first job kept running, and the second job 
>>> had the status "waiting for higher priority jobs to finish". Why doesn't 
>>> the first job abort so that the second one can start right away?
>> My guess: Because it was already running.  Bacula does not cancel
>> running jobs.  The directives are applied to jobs as they are being
>> added to the queue.
>>
>> Try running three jobs and then you'll see.
> 
> Thanks for the reply.
> 
> Yes, I ran three jobs, and I got the same result: all three finished OK.
> 
> I suspect that "Allow Higher Duplicates" does not work correctly. I looked 
> at the source code, and it appears that if Allow Higher Duplicates is set 
> to "yes", then the actions described in the Bacula documentation for this 
> directive are not performed.

In my previous message I mentioned these items.  Did you try them?

> Then you might want to look into these directives:
> 
> Cancel Queued Duplicates = <yes|no>
> If this directive is set to yes (the default), any job that is already 
> queued to run but not yet running will be canceled.
> 
> Cancel Running Duplicates = <yes|no>
> If this directive is set to yes, any job that is already running will be 
> canceled. The default is no.





Re: [Bacula-users] Allow Higher Duplicates directive

2010-01-31 Thread ganiuszka
Dan Langille writes:
> Resending with additional information
> 
> ganiuszka wrote:
>> I used the following directives for this job:
>>
>> Allow Duplicate Jobs = no
>> Allow Higher Duplicates = yes
>>
>> I ran the job, and a moment later I started a job of the same name but 
>> with a higher priority. The first job kept running, and the second job 
>> had the status "waiting for higher priority jobs to finish". Why doesn't 
>> the first job abort so that the second one can start right away?
> 
> My guess: Because it was already running.  Bacula does not cancel
> running jobs.  The directives are applied to jobs as they are being
> added to the queue.
> 
> Try running three jobs and then you'll see.

Thanks for the reply.

Yes, I ran three jobs, and I got the same result: all three finished OK.

I suspect that "Allow Higher Duplicates" does not work correctly. I looked 
at the source code, and it appears that if Allow Higher Duplicates is set 
to "yes", then the actions described in the Bacula documentation for this 
directive are not performed.

I created a Duplicate Job Control actions diagram. You can see it here:

http://www.image-share.com/image.php?img=159/63.jpg

Here are my steps with three jobs:

*run job=QemuImages storage=UP pool=Paktos priority=15
Run Backup job
JobName:  QemuImages
Level:Full
Client:   darkstar-fd
FileSet:  QemuImages_FileSet
Pool: Paktos (From User input)
Storage:  UP (From command line)
When: 2010-01-31 19:18:37
Priority: 15
OK to run? (yes/mod/no): yes
Job queued. JobId=276
*run job=QemuImages storage=UP pool=Paktos priority=10
Run Backup job
JobName:  QemuImages
Level:Full
Client:   darkstar-fd
FileSet:  QemuImages_FileSet
Pool: Paktos (From User input)
Storage:  UP (From command line)
When: 2010-01-31 19:18:42
Priority: 10
OK to run? (yes/mod/no): yes
Job queued. JobId=277
You have messages.
*run job=QemuImages storage=UP pool=Paktos priority=5
Run Backup job
JobName:  QemuImages
Level:Full
Client:   darkstar-fd
FileSet:  QemuImages_FileSet
Pool: Paktos (From User input)
Storage:  UP (From command line)
When: 2010-01-31 19:18:50
Priority: 5
OK to run? (yes/mod/no): yes
Job queued. JobId=278
*status dir
.
.
Running Jobs:
Console connected at 31-sty-10 19:16
  JobId Level   Name   Status
==
  276 Full    QemuImages.2010-01-31_19.18.40_15 is running
  277 Full    QemuImages.2010-01-31_19.18.44_16 is waiting for higher priority jobs to finish
  278 Full    QemuImages.2010-01-31_19.18.51_17 is waiting for higher priority jobs to finish

.
.

Regards.
gani



Re: [Bacula-users] Allow Higher Duplicates directive

2010-01-31 Thread Dan Langille
ganiuszka wrote:
>   Hi,
> 
>   I am testing Duplicate Job Control. I have found a few strange 
> behaviours; one of them concerns Allow Higher Duplicates.
> 
> I used the following directives for this job:
> 
> Allow Duplicate Jobs = no
> Allow Higher Duplicates = yes
> 
> I ran the job, and a moment later I started a job of the same name but 
> with a higher priority. The first job kept running, and the second job had 
> the status "waiting for higher priority jobs to finish". Why doesn't the 
> first job abort so that the second one can start right away?

My guess: Because it was already running.  Bacula does not cancel 
running jobs.  The directives are applied to jobs as they are being 
added to the queue.

Try running three jobs and then you'll see.

> 
> This is my configuration and steps for this situation:
> 
> 
> Storage {
>  Name = UrzadzeniePlikowe
>  Address = darkstar
>  SDPort = 9103
>  Password = "*"
>  Media Type = Plik
>  Device = UrzadzeniePlikoweDev
>  Maximum Concurrent Jobs = 10
> }
> 
> Storage {
>  Name = UP
>  Address = darkstar
>  SDPort = 9103
>  Password = "*"
>  Media Type = Plik
>  Device = UPDev
>  Maximum Concurrent Jobs = 10
> }
> 
> Pool {
>  Name = Tescik
>  Pool Type = Backup
>  Recycle = no
>  Storage = UrzadzeniePlikowe
> }
> 
> Pool {
>  Name = Paktos
>  Pool Type = Backup
>  Recycle = no
>  Storage = UP
> }
> 
> Job {
>  Name = QemuImages
>  Type = Backup
>  Level = Full
>  Pool = Tescik
>  Client = darkstar-fd
>  Messages = DirMessages
>  FileSet = QemuImages_FileSet
>  Allow Duplicate Jobs = no
>  Allow Higher Duplicates = yes
>  Maximum Concurrent Jobs = 10
> }
> 
> run job=QemuImages storage=UP pool=Paktos priority=15
> run job=QemuImages storage=UrzadzeniePlikowe pool=Tescik priority=10
> 
> *run job=QemuImages storage=UP pool=Paktos priority=15
> Run Backup job
> JobName:  QemuImages
> Level:Full
> Client:   darkstar-fd
> FileSet:  QemuImages_FileSet
> Pool: Paktos (From User input)
> Storage:  UP (From command line)
> When: 2010-01-30 22:01:17
> Priority: 15
> OK to run? (yes/mod/no): yes
> Job queued. JobId=246
> *run job=QemuImages storage=UrzadzeniePlikowe pool=Tescik priority=10
> Run Backup job
> JobName:  QemuImages
> Level:Full
> Client:   darkstar-fd
> FileSet:  QemuImages_FileSet
> Pool: Tescik (From Job resource)
> Storage:  UrzadzeniePlikowe (From command line)
> When: 2010-01-30 22:01:58
> Priority: 10
> OK to run? (yes/mod/no): yes
> Job queued. JobId=247
> 
> * status dir
> 
> ..
> ..
> Running Jobs:
> Console connected at 30-sty-10 22:00
>   JobId Level   Name   Status
> ==
> 246 Full    QemuImages.2010-01-30_22.01.48_28 is running
> 247 Full    QemuImages.2010-01-30_22.02.00_29 is waiting for higher priority jobs to finish
> 
> .
> .
> 




Re: [Bacula-users] Allow Higher Duplicates directive

2010-01-31 Thread Dan Langille
Resending with additional information

ganiuszka wrote:
>   Hi,
> 
>   I am testing Duplicate Job Control. I have found a few strange 
> behaviours; one of them concerns Allow Higher Duplicates.
> 
> I used the following directives for this job:
> 
> Allow Duplicate Jobs = no
> Allow Higher Duplicates = yes
> 
> I ran the job, and a moment later I started a job of the same name but 
> with a higher priority. The first job kept running, and the second job had 
> the status "waiting for higher priority jobs to finish". Why doesn't the 
> first job abort so that the second one can start right away?

My guess: Because it was already running.  Bacula does not cancel
running jobs.  The directives are applied to jobs as they are being
added to the queue.

Try running three jobs and then you'll see.

Then you might want to look into these directives:

Cancel Queued Duplicates = <yes|no>
If this directive is set to yes (the default), any job that is already queued 
to run but not yet running will be canceled.

Cancel Running Duplicates = <yes|no>
If this directive is set to yes, any job that is already running will be 
canceled. The default is no.
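For example, a Job resource that cancels the older duplicate rather than the 
new one might look like this (a sketch only; the resource names are taken 
from the configuration quoted below):

```conf
Job {
  Name = QemuImages
  Type = Backup
  Level = Full
  Pool = Tescik
  Client = darkstar-fd
  Messages = DirMessages
  FileSet = QemuImages_FileSet
  Allow Duplicate Jobs = no
  # Cancel an older duplicate that is queued but not yet running
  # (yes is the default):
  Cancel Queued Duplicates = yes
  # Also cancel an older duplicate that is already running
  # (the default is no):
  Cancel Running Duplicates = yes
}
```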

> 
> This is my configuration and steps for this situation:
> 
> 
> Storage {
>  Name = UrzadzeniePlikowe
>  Address = darkstar
>  SDPort = 9103
>  Password = "*"
>  Media Type = Plik
>  Device = UrzadzeniePlikoweDev
>  Maximum Concurrent Jobs = 10
> }
> 
> Storage {
>  Name = UP
>  Address = darkstar
>  SDPort = 9103
>  Password = "*"
>  Media Type = Plik
>  Device = UPDev
>  Maximum Concurrent Jobs = 10
> }
> 
> Pool {
>  Name = Tescik
>  Pool Type = Backup
>  Recycle = no
>  Storage = UrzadzeniePlikowe
> }
> 
> Pool {
>  Name = Paktos
>  Pool Type = Backup
>  Recycle = no
>  Storage = UP
> }
> 
> Job {
>  Name = QemuImages
>  Type = Backup
>  Level = Full
>  Pool = Tescik
>  Client = darkstar-fd
>  Messages = DirMessages
>  FileSet = QemuImages_FileSet
>  Allow Duplicate Jobs = no
>  Allow Higher Duplicates = yes
>  Maximum Concurrent Jobs = 10
> }
> 
> run job=QemuImages storage=UP pool=Paktos priority=15
> run job=QemuImages storage=UrzadzeniePlikowe pool=Tescik priority=10
> 
> *run job=QemuImages storage=UP pool=Paktos priority=15
> Run Backup job
> JobName:  QemuImages
> Level:Full
> Client:   darkstar-fd
> FileSet:  QemuImages_FileSet
> Pool: Paktos (From User input)
> Storage:  UP (From command line)
> When: 2010-01-30 22:01:17
> Priority: 15
> OK to run? (yes/mod/no): yes
> Job queued. JobId=246
> *run job=QemuImages storage=UrzadzeniePlikowe pool=Tescik priority=10
> Run Backup job
> JobName:  QemuImages
> Level:Full
> Client:   darkstar-fd
> FileSet:  QemuImages_FileSet
> Pool: Tescik (From Job resource)
> Storage:  UrzadzeniePlikowe (From command line)
> When: 2010-01-30 22:01:58
> Priority: 10
> OK to run? (yes/mod/no): yes
> Job queued. JobId=247
> 
> * status dir
> 
> ..
> ..
> Running Jobs:
> Console connected at 30-sty-10 22:00
>   JobId Level   Name   Status
> ==
> 246 Full    QemuImages.2010-01-30_22.01.48_28 is running
> 247 Full    QemuImages.2010-01-30_22.02.00_29 is waiting for higher priority jobs to finish
> 
> .
> .
> 


