Re: Continuous Job

2019-11-26 Thread Sreejith Variyath
Thanks, Karl.

On Tue, Nov 26, 2019 at 1:39 PM Karl Wright  wrote:

> No, just changing the job characteristics will NOT cause the incremental
> behavior to be erased.
>
> Karl



Re: Continuous Job

2019-11-26 Thread Karl Wright
No, just changing the job characteristics will NOT cause the incremental
behavior to be erased.
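
For reference, switching an existing job's run model in code looks roughly like
the sketch below. It reuses only the IJobDescription setters that appear
elsewhere in this thread; the thread-context bootstrap and the IJobManager
load/save calls are assumptions about the ManifoldCF job-management API and may
need adjusting to your version.

  import org.apache.manifoldcf.core.interfaces.*;
  import org.apache.manifoldcf.crawler.interfaces.*;

  public class SwitchJobToContinuous {
    public static void main(String[] args) throws Exception {
      // Assumed bootstrap: obtain a thread context and a job manager.  The
      // exact factory calls (and any required environment initialization)
      // depend on the ManifoldCF version and how the code is deployed.
      IThreadContext tc = ThreadContextFactory.make();
      IJobManager jobManager = JobManagerFactory.make(tc);

      Long jobID = Long.valueOf(args[0]);                       // existing job id
      IJobDescription jobDescription = jobManager.load(jobID);  // assumed signature

      // Change the run model.  As noted above, this does not erase the
      // incremental state: documents already indexed keep their recorded
      // versions and are only reprocessed when those versions change.
      jobDescription.setType(IJobDescription.TYPE_CONTINUOUS);
      jobDescription.setInterval(Long.valueOf(30L));            // values from the thread
      jobDescription.setReseedInterval(Long.valueOf(12L));

      jobManager.save(jobDescription);                          // assumed signature
    }
  }

Creating a brand-new job instead would start with an empty incremental state,
which is exactly what editing the existing job description avoids.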

Karl


On Mon, Nov 25, 2019 at 10:20 PM Sreejith Variyath <
sreejith.variy...@tarams.com> wrote:

> Yes, understood. Thanks, Karl.
>
> I have another question. If I update the job type from TYPE_SPECIFIED to
> TYPE_CONTINUOUS, will the document versioning reset and the job pick up all
> the documents again?
>


Re: Continuous Job

2019-11-25 Thread Sreejith Variyath
Yes, understood. Thanks, Karl.

I have another question. If I update the job type from TYPE_SPECIFIED to
TYPE_CONTINUOUS, will the document versioning reset and the job pick up all
the documents again?

On Tue, Nov 26, 2019, 05:12 Karl Wright  wrote:

> One of the characteristics of continuous jobs is that they call
> addSeedDocuments multiple times on a single job run.  The job run never
> ends, so this is how the job picks up documents for the infinitely-running
> job.  That's just the way it works.  Have you read the book?
>
> Karl
>
>


Re: Continuous Job

2019-11-25 Thread Karl Wright
One of the characteristics of continuous jobs is that they call
addSeedDocuments multiple times on a single job run.  The job run never
ends, so this is how the job picks up documents for the infinitely-running
job.  That's just the way it works.  Have you read the book?
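
To make that concrete, below is a minimal repository-connector sketch of the
seeding hook, assuming the ManifoldCF 2.x signature in which addSeedDocuments
receives the string returned by its previous invocation (lastSeedVersion). The
findIdsModifiedSince helper and the watermark scheme are hypothetical, not part
of any real connector; the point is that because the framework hands the
previous return value back on each call, a connector can seed only what changed
since the last pass instead of re-adding the same documents every time it is
invoked.

  import org.apache.manifoldcf.core.interfaces.*;
  import org.apache.manifoldcf.crawler.interfaces.*;
  import org.apache.manifoldcf.crawler.connectors.BaseRepositoryConnector;
  import org.apache.manifoldcf.agents.interfaces.ServiceInterruption;

  import java.util.ArrayList;
  import java.util.List;

  /** Sketch only: the rest of the connector (processDocuments etc.) is omitted. */
  public abstract class SampleSeedingConnector extends BaseRepositoryConnector {

    @Override
    public String addSeedDocuments(ISeedingActivity activities, Specification spec,
      String lastSeedVersion, long seedTime, int jobMode)
      throws ManifoldCFException, ServiceInterruption {

      // lastSeedVersion is whatever the previous call returned (null the very
      // first time), so it can serve as a "seeded up to" watermark.
      long seededUpTo = (lastSeedVersion == null) ? 0L : Long.parseLong(lastSeedVersion);

      // Hypothetical lookup: ids of repository rows modified after the watermark.
      // A real connector would run its configured seeding query here.
      for (String documentIdentifier : findIdsModifiedSince(seededUpTo)) {
        activities.addSeedDocument(documentIdentifier);
      }

      // Return this pass's timestamp; the next invocation receives it as
      // lastSeedVersion, so repeated seeding calls in a continuous run do not
      // keep re-adding the same documents.
      return Long.toString(seedTime);
    }

    // Hypothetical helper, standing in for the connector's seeding query.
    protected List<String> findIdsModifiedSince(long watermark)
        throws ManifoldCFException {
      return new ArrayList<String>();
    }
  }

In the scenario described below, where addSeedDocuments is invoked twice at job
startup and the 10-document sample cap is exceeded, a watermark of this kind is
one way to keep the second invocation from re-seeding the same rows.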

Karl


On Mon, Nov 25, 2019 at 5:37 PM SREEJITH va  wrote:

> Hi everyone,
>
> I am trying to set up a job that has a JDBC repository connector, one
> transformation connector, and a custom output connector.
>
> I need this job to run in two modes:
>
>    - Sample Mode: This is a sample migration mode. The job will pick 10
>      documents, migrate them to the output repository, and then pause. I am
>      planning to pause the job with a Quartz job, based on the counts of
>      processed and queued documents. This sample run can happen "n" times.
>    - Actual Mode: This is the actual migration mode. In this mode, the same
>      job needs to run continuously. The documents remaining after the sample
>      migration, as well as any new documents, should be picked up and migrated.
>
> Could anyone please advise on the schedule settings I should use for this
> kind of job? Currently I have tried the following settings:
>
> jobDescription.setStartMethod(IJobDescription.START_DISABLE);
> jobDescription.setType(IJobDescription.TYPE_CONTINUOUS);
> jobDescription.setInterval(30L);
> jobDescription.setReseedInterval(12L);
>
> I also have idNodeQuery set to pick 10 documents (in sample mode) in the
> JDBC repository connector. But during job startup in sample mode, the
> addSeedDocuments(...) API gets invoked twice, which causes more than 10
> documents to be processed. I do have a version query, and the documents are
> not reprocessed in subsequent runs unless there is a version change.
>
> I would really appreciate it if someone could help me with these two
> questions.
>
>
>
>
>
> --
> Regards
> -Sreejith
>


Re: Continuous Job Crawling

2014-12-08 Thread Karl Wright
Hi Babita,

How you use continuous crawling depends on what you are trying to
accomplish with it.  In the continuous crawling model, ManifoldCF requeues
documents after it crawls them, and checks them again after an interval
that is determined in part by how often they've changed in the past.  The
job therefore runs forever, or at least until there are no documents
whatsoever in the job queue.

Continuous crawling also has no way of deleting documents that are no
longer reachable from seeds, EXCEPT when hop count is in play.  For this
reason, in many cases it is a good idea to have documents expire after a
time.

Since you are crawling SharePoint, I can say the following:

- There is no point in reseeding, because only one document is ever seeded
(the root document).  So set the reseed interval to infinity (blank).
- Refetching of documents is sufficient to determine if a document has been
deleted.  So set the expiration interval to infinity (blank).
- The recrawl interval and maximum recrawl interval are up to you.  You would
need to set these based on how often you want MCF to recheck any given
SharePoint document for changes.  Setting them too small means that MCF will
be refetching documents constantly, which would place a heavy load on
SharePoint.  Setting them too high means that changes might not be noticed
for a longer time.  (A code sketch of these settings follows below.)
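
Expressed with the IJobDescription setters that appear elsewhere in these
threads, the recommendations above look roughly like the sketch below. setType,
setInterval, and setReseedInterval are taken from the earlier posts; setExpiration
and setMaxInterval are assumed names for the other two fields, and passing null
is assumed to correspond to "blank = infinity". The interval values are
placeholders, so check what units your ManifoldCF version expects (the UI takes
minutes).

  import org.apache.manifoldcf.core.interfaces.*;
  import org.apache.manifoldcf.crawler.interfaces.*;

  public class SharePointContinuousSettings {

    /** Apply the recommended continuous-crawl settings to an existing job. */
    public static void configure(IJobDescription jobDescription) {
      jobDescription.setType(IJobDescription.TYPE_CONTINUOUS);

      // Only the root document is ever seeded for SharePoint, so reseeding
      // buys nothing: leave it blank (= infinity), modeled here as null
      // (assumption).
      jobDescription.setReseedInterval(null);

      // Refetching already detects deleted documents, so no expiration either.
      jobDescription.setExpiration(null);                     // assumed setter name

      // Recrawl interval vs. maximum recrawl interval is the real trade-off:
      // too small hammers SharePoint with refetches, too large delays change
      // detection.  Placeholder values below.
      jobDescription.setInterval(Long.valueOf(60L));          // e.g. ~hourly rechecks
      jobDescription.setMaxInterval(Long.valueOf(24L * 60L)); // assumed setter name
    }
  }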

Thanks,
Karl


On Mon, Dec 8, 2014 at 9:34 AM, Babita Bansal babita.ban...@gmail.com
wrote:

 Hi Karl

 Hope you are doing well.

 We will be scheduling SharePoint jobs continuously. Could you please let me
 know the recommended values for these four parameters?

 - Recrawl interval (if continuous): minutes (blank=infinity)
 - Maximum recrawl interval (if continuous): minutes (blank=infinity)
 - Expiration interval (if continuous): minutes (blank=infinity)
 - Reseed interval (if continuous): minutes (blank=infinity)


 Thanks
 Babita Bansal