Fetching data from database on cluster.

2018-11-27 Thread Bronislav Jitnikov
I have a small cluster with 3 instances of NiFi, and I think I have found a
problem.
The QueryDatabaseTable processor is set to run on the Primary Node only, with
Concurrent Tasks set to 1 and the Run Schedule set to a large value (something
like 20 minutes), so I expect only one execution at a time. However, while a
query is executing, the primary node can change and a new task starts on the
new primary node, so two executions overlap. I see two ways to resolve this
problem:
1. Create some sort of lock around QueryDatabaseTable (maybe a custom
processor that holds a run lock across the cluster via the StateManager); see
the sketch below.
2. Add a check in connectableTask.invoke() (better for me, because I have a
similar problem when getting data from REST).
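
A minimal sketch of option 1, assuming a custom processor that tries to claim
a cluster-scoped flag through the StateManager before doing any work (the
class name and state key are only illustrative, not an existing processor):

    import java.io.IOException;
    import java.util.Collections;
    import java.util.HashMap;
    import java.util.Map;

    import org.apache.nifi.components.state.Scope;
    import org.apache.nifi.components.state.StateManager;
    import org.apache.nifi.components.state.StateMap;
    import org.apache.nifi.processor.AbstractProcessor;
    import org.apache.nifi.processor.ProcessContext;
    import org.apache.nifi.processor.ProcessSession;
    import org.apache.nifi.processor.exception.ProcessException;

    public class ClusterLockedQueryProcessor extends AbstractProcessor {

        @Override
        public void onTrigger(final ProcessContext context, final ProcessSession session)
                throws ProcessException {
            final StateManager stateManager = context.getStateManager();
            try {
                final StateMap state = stateManager.getState(Scope.CLUSTER);
                if ("true".equals(state.get("running"))) {
                    // Another node still holds the lock; skip this run.
                    context.yield();
                    return;
                }
                // Try to claim the lock atomically; replace() returns false if
                // another node changed the state since we read it.
                final Map<String, String> claimed = new HashMap<>();
                claimed.put("running", "true");
                if (!stateManager.replace(state, claimed, Scope.CLUSTER)) {
                    context.yield();
                    return;
                }
                try {
                    // ... run the long query / REST call here ...
                } finally {
                    // Release the lock so the next scheduled run can proceed.
                    stateManager.setState(Collections.singletonMap("running", "false"), Scope.CLUSTER);
                }
            } catch (final IOException e) {
                throw new ProcessException("Failed to access cluster state", e);
            }
        }
    }

Note this still leaves the lock held if the node dies mid-run, so a real
implementation would probably also store a timestamp and treat a stale lock
as released.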

Maybe I am missing something, so any help and ideas would be appreciated.

Bronislav Zhitnikov

PS: and sorry for my bad English.


Re: Reading avro encoded message key from kafka

2018-11-27 Thread Ashwin Konale
Hi,
It's not an encoding issue. I am not able to figure out how to read the Kafka
key itself.
e.g.
Kafka key = {type: foo, meta: etc, etc }
Kafka message = {Avro payload}

I want to use the RouteOnAttribute processor based on type = foo or bar. For
this to happen, I need to extract the value foo from the Kafka key into the
flow file. Basically, I am not able to figure out how to read the key of a
Kafka message and extract attributes from it in NiFi. Could you suggest
something here?

Thanks


On 2018/11/23 15:14:53, Mike Thomsen wrote:
> If you are having encoding-related issues with reading that attribute, try
> switching to the Kafka string serializer in your producer.
>
> On Fri, Nov 23, 2018 at 10:12 AM ashwin konale wrote:
>
> > Hi,
> > I have key-value pairs of avro messages in the kafka topic I want to
> > consume from. I can easily do modifications on the message value using
> > the nifi consumeKafkaRecord processor, but it doesn't show the key of
> > the message. The ConsumeKafka processor has a kafka.key attribute, but
> > I am not sure how to read its contents (since it is avro encoded) and
> > add certain values as attributes to the flowfile. Any pointers would be
> > much appreciated.
> >
> > Thanks


Re: Reading avro encoded message key from kafka

2018-11-27 Thread Bryan Bende
Unfortunately I don't think there is a good way to interpret the value
of the key when it is Avro because we don't have any expression
language functions that understand Avro or record-oriented values.

The main option would be to change how the data is being produced in some way...

- Put the value you are interested in into a message header; it will then
come across as a string key/value pair in a flow file attribute, and you
can use RouteOnAttribute
- Put the value you are interested in somewhere in the message body and
use PartitionRecord to route on the value of that field
- Use a different kind of key serialization, like JSON, which can then
be parsed with expression language functions (see the example below)

A possible improvement we could make is to add some kind of
"avro-to-json" EL function, then from there use the EL jsonPath
function.
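
For the JSON-key option, assuming the producer wrote the key as a JSON string
such as {"type":"foo"}, a dynamic property on RouteOnAttribute could look
roughly like this (the property name is just an illustration):

    foo.key = ${kafka.key:jsonPath('$.type'):equals('foo')}

Flow files whose key has type = foo would then be routed to the relationship
named after that property.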


Re: API to get all Policies

2018-11-27 Thread Lars Francke
Kevin,

thank you for your answer and sorry for the long delay.

I agree that paging is a concern, and I thought about it as well, but I also
agree that in this case it's probably not worth the hassle.

I'll go ahead and create a Jira. I plan on providing a patch, but I'm not sure
when I'll get to it.

Cheers,
Lars
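
For illustration only, the kind of endpoint being discussed might look roughly
like the JAX-RS sketch below; the class, path, and service names are
hypothetical, not the actual NiFi resource class or the eventual patch:

    import java.util.Set;

    import javax.ws.rs.GET;
    import javax.ws.rs.Path;
    import javax.ws.rs.Produces;
    import javax.ws.rs.core.MediaType;
    import javax.ws.rs.core.Response;

    // Hypothetical sketch only; not NiFi's actual resource class.
    @Path("/policies")
    public class AccessPolicyListingResource {

        // Hypothetical service that only returns policies the caller is
        // already authorized to view.
        private final PolicyService policyService;

        public AccessPolicyListingResource(final PolicyService policyService) {
            this.policyService = policyService;
        }

        @GET
        @Produces(MediaType.APPLICATION_JSON)
        public Response getAllPolicies() {
            // Return summaries rather than full policies to keep the
            // list response small.
            final Set<PolicySummary> summaries = policyService.getPolicySummaries();
            return Response.ok(summaries).build();
        }

        // Minimal placeholder types so the sketch is self-contained.
        public interface PolicyService {
            Set<PolicySummary> getPolicySummaries();
        }

        public static class PolicySummary {
            public String id;
            public String resource;
            public String action;
        }
    }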

On Fri, Nov 9, 2018 at 6:38 PM Kevin Doran  wrote:

> Hi Lars,
>
> I think as long as the following are true (it sounds like they are from
> what you have looked at):
>
> 1. the proposed endpoint does not require adding any additional
> Authorizable or policy to protect, and
> 2. the proposed endpoint does not expose any information that the
> authenticated client/user would not already have access to view, and is
> merely acting as a convenience method to return a list of things they could
> fetch individually
>
> then in that case this is probably fine. No objection from me.
>
> Any time we are adding a collection endpoint, my main concern is whether
> pagination of the results also needs to be added (i.e., whether, for typical
> usage of NiFi, the response size of the JSON result would be larger than is
> reasonable to transmit in a single HTTP round trip, or whether creating the
> response would put unreasonable load on the server). In typical usage of
> NiFi, I don't think the number of policies is that large (perhaps others
> can chime in if they feel differently?), so it would come down to the
> size of a policy element when returned in a list. If it is very large,
> you may also want to introduce a summary view/perspective of the policy
> that reduces the information to the minimum required for a list view... I
> think that may already exist for NiFi in the AccessPolicySummary object,
> but it's been a while since I've looked at the code, so I may be
> forgetting the details or confusing it with the NiFi Registry
> implementation, which does have a get-all-policies endpoint.
>
> Lastly, take care that the Swagger annotations used to drive the REST API
> documentation are accurate. If you have any questions on that, let me
> know. Happy to help review a PR if you submit one.
>
> Regards,
> Kevin
>
> On 11/9/18, 06:23, "Lars Francke"  wrote:
>
> I've just tried implementing my new resource, and it seems to work fine
> and as I expected, also with regard to authorization. Users cannot see
> anything that they are not allowed to access anyway.
>
> Regarding your other comments: I agree that it's probably not a super
> common use case.
>
> Either way, I'd love to have an API that I can access remotely, as I need
> to connect to other systems as well (e.g. Kafka, HBase, etc.), so I don't
> want to colocate my service on one of the NiFi machines.
> But yes, I could probably get a list of all resources somehow using the
> API and then send one request per resource. That seems overly
> complicated, though.
>
> So if you don't object, I'll create a Jira.
>
> Cheers,
> Lars
>
>
> On Fri, Nov 9, 2018 at 10:01 AM Lars Francke 
> wrote:
>
> > Andy,
> >
> > that's a good question. I have to admit that I thought about it and
> > then saw that there is already an Authorizable for this scenario so I
> > assumed that part was already taken care of. So whoever has the
> > permission to view "access all policies" would also be able to use the
> > API? Were you thinking of something different?
> >
> > Cheers,
> > Lars
> >
> >
> >
> > On Fri, Nov 9, 2018 at 12:35 AM Andy LoPresto 
> > wrote:
> >
> >> Lars,
> >>
> >> What access controls do you anticipate putting on this API endpoint
> >> and what potential issues do you see? I could see this being abused
> >> if not secured very carefully, and it doesn’t seem like a common use
> >> case (notwithstanding your current requirement). Is this something
> >> that can be done by using the NiFi CLI to iterate/recurse through the
> >> various PGs and retrieve these policies?
> >>
> >> Andy LoPresto
> >> alopre...@apache.org
> >> alopresto.apa...@gmail.com
> >> PGP Fingerprint: 70EC B3E5 98A6 5A3F D3C4  BACE 3C6E F65B 2F7D EF69
> >>
> >> > On Nov 9, 2018, at 3:31 AM, Lars Francke wrote:
> >> >
> >> > Hi,
> >> >
> >> > I was tasked with writing a tool to generate a kind of "audit report".
> >> > For that I need to get all policies that people have across various
> >> > systems. NiFi is one of them.
> >> >
> >> > I see that we have a REST API for policies, but it doesn't expose a
> >> > method to retrieve _all_ policies. I'd like to add a REST endpoint
> >> > that allows retrieving all policies.
> >> > Before I open a Jira I wanted to get feedback on whether this
> >> > addition would be acceptable.
> >> >
> >> > Implementation notes
> >> > This is how I see the current flow

Re: Reading avro encoded message key from kafka

2018-11-27 Thread Ashwin Konale
Hi,
Thanks a lot for the suggestion. I didn't know about the jsonPath EL function.
I can easily implement that in my use case.

- Ashwin

