The kind of program I intend to submit would be one that sets up
a StreamExecutionEnvironment, connects to a stream from a Kafka topic, and
uses a PatternStream over the Kafka events. I could have the jar for this
program readily available in the "jobmanager.web.upload.dir", and use the
REST API to submit the program with some configuration params.
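For concreteness, here is a minimal sketch of such a job. The topic name,
broker address, event type (plain strings), and the pattern itself are
placeholders, and the Flink 1.3 CEP API plus the Kafka 0.10 connector are
assumed:

import org.apache.flink.cep.CEP;
import org.apache.flink.cep.PatternSelectFunction;
import org.apache.flink.cep.PatternStream;
import org.apache.flink.cep.pattern.Pattern;
import org.apache.flink.cep.pattern.conditions.SimpleCondition;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.windowing.time.Time;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer010;
import org.apache.flink.streaming.util.serialization.SimpleStringSchema;

import java.util.List;
import java.util.Map;
import java.util.Properties;

public class KafkaCepJob {

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env =
                StreamExecutionEnvironment.getExecutionEnvironment();

        // Kafka source -- broker address, group id, and topic are placeholders.
        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092");
        props.setProperty("group.id", "cep-demo");
        DataStream<String> events = env.addSource(
                new FlinkKafkaConsumer010<>("events", new SimpleStringSchema(), props));

        // Toy two-step pattern: a "start" event followed by an "end" event
        // within 10 seconds.
        Pattern<String, ?> pattern = Pattern.<String>begin("first")
                .where(new SimpleCondition<String>() {
                    @Override
                    public boolean filter(String value) {
                        return value.contains("start");
                    }
                })
                .followedBy("second")
                .where(new SimpleCondition<String>() {
                    @Override
                    public boolean filter(String value) {
                        return value.contains("end");
                    }
                })
                .within(Time.seconds(10));

        PatternStream<String> patternStream = CEP.pattern(events, pattern);
        patternStream.select(new PatternSelectFunction<String, String>() {
            @Override
            public String select(Map<String, List<String>> match) {
                return "matched: " + match;
            }
        }).print();

        env.execute("kafka-cep-job");
    }
}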

Does that sound like it'd work or am I missing something?

Moiz

On Wed, May 3, 2017 at 3:23 PM, Moiz S Jinia <moiz.ji...@gmail.com> wrote:

> I'm not sure I understand operators. What I need is to have a Pattern that
> starts consuming from a Kafka stream. And I need the Patterns to come and
> go.
>
> Another option that comes to mind is this -
> The Patterns I'll need are well known in advance. Only certain parameters
> such as the time duration of the within clause, and maybe certain filter
> conditions of the where clause need tweaking. So I could pre-deploy the
> Patterns (or jobs) and start or stop them (with parameters).
>
> Does that sound feasible?
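
A minimal sketch of that parameterization idea. The argument names
within.seconds and filter.keyword are invented for illustration; the values
would be passed as program arguments when the jar is (re)submitted through
the REST API:

import org.apache.flink.api.java.utils.ParameterTool;
import org.apache.flink.cep.pattern.Pattern;
import org.apache.flink.cep.pattern.conditions.SimpleCondition;
import org.apache.flink.streaming.api.windowing.time.Time;

public class ParameterizedPattern {

    // Builds the well-known pattern, with the tweakable knobs taken from the
    // program arguments instead of being hard-coded.
    public static Pattern<String, ?> fromArgs(String[] args) {
        ParameterTool params = ParameterTool.fromArgs(args);
        final long withinSeconds = params.getLong("within.seconds", 10); // within clause
        final String keyword = params.get("filter.keyword", "start");    // where clause

        return Pattern.<String>begin("first")
                .where(new SimpleCondition<String>() {
                    @Override
                    public boolean filter(String value) {
                        return value.contains(keyword);
                    }
                })
                .within(Time.seconds(withinSeconds));
    }
}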
>
> On Wed, May 3, 2017 at 3:15 PM, Aljoscha Krettek <aljos...@apache.org>
> wrote:
>
>> What would the pattern be added to? An existing custom operator?
>>
>> The REST interface only allows for managing the lifecycle of a job, not
>> modifying its graph structure.
>>
>> On 3. May 2017, at 11:43, Moiz S Jinia <moiz.ji...@gmail.com> wrote:
>>
>> Thanks for the references. Looking at the REST API, would adding new
>> Patterns not work via this?
>>
>> https://ci.apache.org/projects/flink/flink-docs-release-1.3/monitoring/rest_api.html#submitting-programs
>>
>> Moiz
>>
>> On Wed, May 3, 2017 at 2:52 PM, Aljoscha Krettek <aljos...@apache.org>
>> wrote:
>>
>>> Hi,
>>> For managing a job you can either use the bin/flink command-line tool or
>>> the REST API [1]. As for dynamically adding patterns, that's outside the
>>> scope of Flink right now. There are, however, some users who have
>>> implemented this on top of Flink; see for example RBEA [2]. The basic idea
>>> is to use a ConnectedStreams where one input is the main input and the
>>> other is a control stream that updates the existing patterns.
>>>
>>> [1] https://ci.apache.org/projects/flink/flink-docs-release-1.3/monitoring/rest_api.html
>>> [2] https://techblog.king.com/rbea-scalable-real-time-analytics-king/
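
A stripped-down sketch of that control-stream idea (this is not actual CEP
pattern injection: the "rule" is reduced to a numeric threshold, and the two
socket sources are placeholders):

import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.co.CoFlatMapFunction;
import org.apache.flink.util.Collector;

public class ControlStreamSketch {

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env =
                StreamExecutionEnvironment.getExecutionEnvironment();

        // Main input: the events to evaluate (placeholder source).
        DataStream<String> events = env.socketTextStream("localhost", 9000);
        // Control input: rule updates, here just a new threshold (placeholder source).
        DataStream<String> control = env.socketTextStream("localhost", 9001);

        events.connect(control)
                .flatMap(new CoFlatMapFunction<String, String, String>() {
                    // Latest rule, kept in a plain field for brevity. A real job
                    // would broadcast the control stream and keep the rule in
                    // Flink state so it survives failures and reaches all
                    // parallel instances.
                    private long threshold = Long.MAX_VALUE;

                    @Override
                    public void flatMap1(String event, Collector<String> out) {
                        if (Long.parseLong(event.trim()) > threshold) {
                            out.collect("match: " + event + " (threshold " + threshold + ")");
                        }
                    }

                    @Override
                    public void flatMap2(String rule, Collector<String> out) {
                        threshold = Long.parseLong(rule.trim()); // rule update
                    }
                })
                .print();

        env.execute("control-stream-sketch");
    }
}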
>>>
>>> On 3. May 2017, at 10:02, Moiz S Jinia <moiz.ji...@gmail.com> wrote:
>>>
>>> Is there an API that allows remotely adding, modifying, and cancelling
>>> Flink jobs? Example - changing the time window of a deployed Pattern,
>>> adding new Patterns, etc.
>>>
>>> What's the best way to go about this? To the end user the Pattern would
>>> manifest as rules that can be updated anytime.
>>>
>>>
>>> Moiz
>>>
>>>
>>>
>>
>>
>
