Hi,
I am pushing some database records into HDFS using Sqoop.
I want to perform some validations on each record in the HDFS data. Which
NiFi processor can I use to split each record (separated by a new line
character) and perform validations?
For validations I want to verify a particular column.
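In a flow like this, SplitText can break the content into one record per line, and the per-record column check could live in RouteText or a scripted processor. A minimal Python sketch of the idea (the comma delimiter, the column index, and the numeric check are all assumptions for illustration):

```python
# Sketch of per-record validation: records separated by newlines,
# comma-delimited fields, and (as an assumed rule) column 2 must be
# a non-empty numeric value.

def validate_record(record: str, column: int = 2) -> bool:
    fields = record.split(",")
    if column >= len(fields):
        return False
    value = fields[column].strip()
    return value != "" and value.isdigit()

def split_and_validate(content: str):
    # Split on newlines the way SplitText would, skipping empty lines.
    records = [r for r in content.split("\n") if r]
    return [(r, validate_record(r)) for r in records]
```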
>
> I'm interested to understand your case more if you don't mind though.
> You mention you're getting data from Sqoop into HDFS. How is NiFi
> involved in that flow - is it after data lands in HDFS you're pulling
> it into NiFi?
>
> Thanks
Hi,
I can cache some data to be used in a NiFi flow. I can see the
processor PutDistributedMapCache in the documentation, which saves key-value
pairs in the DistributedMapCache for NiFi, but I do not see any processor to read
this data. How can I read data from the DistributedMapCache in my data flow?
Thanks
>> use the DetectDuplicate processor to find
>> those duplicates.
>>
>> Was there a different use case you were looking to solve using the
>> Distributed cache service?
>>
>> Thanks,
>> Matt
>>
>> On Tue, Jan 12, 2016 at 4:36 AM, sudeep mishra
>
> Let me know or comment on the ticket.
>
> [1] https://issues.apache.org/jira/browse/NIFI-1382
>
> Joe
> - - - - - -
> Joseph Percivall
> linkedin.com/in/Percivall
> e: joeperciv...@yahoo.com
>
>
>
> On Tuesday, January 12, 2016 9:46 AM, sudeep mishra <
> sudeepshekh...@gmail.com> wrote:
Hi,
Do we have any processor to push and retrieve data from Redis?
Thanks & Regards,
Sudeep Shekhar Mishra
> know what is in there in order to use it later.
>
> [1] https://issues.apache.org/jira/browse/NIFI-1382
>
> Joe
> - - - - - -
> Joseph Percivall
> linkedin.com/in/Percivall
> e: joeperciv...@yahoo.com
>
>
>
> On Tuesday, January 12, 2016 11:34 PM, sudeep mishra <
> linkedin.com/in/Percivall
> e: joeperciv...@yahoo.com
>
>
>
> On Wednesday, January 13, 2016 10:56 AM, sudeep mishra <
> sudeepshekh...@gmail.com> wrote:
>
>
>
> Thank you very much Joe.
>
> Can you please let me know how I can use the .patch file? I am using the
> N
Is it possible to build the code for only a particular processor? Just
curious if we can build and deploy a particular processor in an existing
NiFi environment.
On Wed, Jan 13, 2016 at 9:33 PM, sudeep mishra
wrote:
> Thanks Joe. I will try out the patch.
>
> On Wed, Jan 13, 2016 a
Upon building the repository we get different .nar files, which can be
copied into the lib directory for my requirement.
Thanks for your help.
On Thu, Jan 14, 2016 at 9:27 AM, sudeep mishra
wrote:
> Is it possible to build the code for only a particular processor? Just
> curious if we can build and
Hi,
Can we configure a processor to run only 'N' times? In my data flow
I want some processors to run only once. How can I achieve this?
Thanks & Regards,
Sudeep
> See the thread here <
> https://www.mail-archive.com/users@nifi.apache.org/msg01051.html>.
>
> The short answer currently is: No, it's not possible.
>
> Cheers,
> Lars
>
> On Thu, Jan 14, 2016 at 8:42 AM, sudeep mishra
> wrote:
>
>> Hi,
>>
>> C
Thanks Joe. The GetDistributedMapCache seems to be working fine.
Is there a way to clear DistributedMapCache on demand?
Regards,
Sudeep
On Thu, Jan 14, 2016 at 12:42 PM, sudeep mishra
wrote:
> Upon building the repository we get different .nar files which can be
> updated in the lib
> Joseph Percivall
> e: joeperciv...@yahoo.com
>
>
>
> On Thursday, January 14, 2016 7:04 AM, sudeep mishra <
> sudeepshekh...@gmail.com> wrote:
>
>
> Thanks Joe. The GetDistributedMapCache seems to be working fine.
>
> Is there a way to clear DistributedMapCache on demand?
Hi,
Do we have any processors in NiFi to push data from HDFS to Hive and to
read data from Hive?
Thanks & Regards,
Sudeep Shekhar Mishra
you will want to make sure that you use the hive
> standalone jar which contains all of the required classes
>
> Hope this helps
>
> Sent from my iPhone
>
> > On Jan 18, 2016, at 2:24 AM, sudeep mishra
> wrote:
> >
> > Hi,
> >
> > Do we have
Hi,
I am getting an error as 'failed to invoke @OnSchedule method due to
java.net.SocketException:Permission denied' when using 'ListenHTTP
processor'.
Please suggest how to resolve the issue.
Thanks & Regards,
Sudeep Shekhar Mishra
> That is a privileged port.
> All ports below 1024 are privileged ports and must be opened by root.
> Try setting the port number on ListenHTTP to something higher than 1024.
>
> Sent from my iPhone
>
> > On Jan 19, 2016, at 2:11 AM, sudeep mishra
> wrote:
> >
> > Hi,
> >
> >
Listening on port 81 on Linux RHEL 6.5
On Tue, Jan 19, 2016 at 6:04 PM, wrote:
> Hi Sudeep,
>
> On which port are you listening ?
> Also which OS are you using ?
>
> Best regards,
> Louis-Etienne
>
> > On Jan 19, 2016, at 2:11 AM, sudeep mishra
> wrote:
> >
e comment made
>
> ListenHTTP by default does not set an input port and instead it is a
> required input.
>
> Thanks
> Joe
>
> On Tue, Jan 19, 2016 at 7:49 AM, sudeep mishra
> wrote:
> > Listening on port 81 on Linux RHEL 6.5
> >
> > On Tue, Jan 19, 2016 at 6:04 P
This will allow you
> to run your NiFi as a non-root user.
> On Jan 20, 2016 1:23 AM, "sudeep mishra" wrote:
>
>> Thanks Joe.
>>
>> One doubt though... maybe some Linux changes are required, but do I have to
>> run NiFi as the root user to make use of the ListenHTTP processor?
Hi,
I need to create some audits around my NiFi flows and want to record the time
a flow file was received by a particular processor. Is there a way to add
this date to the flow file's attributes?
I can see a date in the 'Details' section of a data provenance entry, but
can we get such a date into the flow file attributes?
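For reference, an UpdateAttribute expression such as `${now():format("MM/dd/yyyy HH:mm:ss")}` can stamp the current time into an attribute as each flow file passes through. A rough Python sketch of the equivalent (the attribute name 'received.at' and the date pattern are assumptions):

```python
from datetime import datetime

# Sketch of stamping a received-time attribute onto a flow file's
# attribute map, mimicking UpdateAttribute with a ${now():format(...)}
# expression. 'received.at' is an invented attribute name.
def stamp(attributes: dict) -> dict:
    attributes = dict(attributes)  # copy, don't mutate the caller's map
    attributes["received.at"] = datetime.now().strftime("%m/%d/%Y %H:%M:%S")
    return attributes
```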
age-guide.html#format
>
> Hope that helps,
> Joe
> - - - - - -
> Joseph Percivall
> linkedin.com/in/Percivall
> e: joeperciv...@yahoo.com
>
>
>
> On Tuesday, February 2, 2016 1:17 AM, sudeep mishra <
> sudeepshekh...@gmail.com> wrote:
>
>
>
> H
> - - - - - -
> Joseph Percivall
> linkedin.com/in/Percivall
> e: joeperciv...@yahoo.com
>
>
>
> On Tuesday, February 2, 2016 12:11 PM, sudeep mishra <
> sudeepshekh...@gmail.com> wrote:
>
>
>
> Thanks Joe.
>
> The UpdateAttribute processor can be helpful for
AttributesToJSON:
>>
>> "
>> *Destination* flowfile-attribute
>>
>>- flowfile-attribute
>>- flowfile-content
>>
>> Control if JSON value is written as a new flowfile attribute
>> 'JSONAttributes' or written in the flowfile content.
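In rough Python terms, the two Destination options of AttributesToJSON amount to the following (the attribute names here are made up for illustration):

```python
import json

# Sketch of AttributesToJSON's two Destination options.
attributes = {"filename": "data.csv", "location": "warehouse-1"}

# Destination = flowfile-attribute: the JSON is stored as a new
# attribute named 'JSONAttributes'; the content is left alone.
as_attribute = dict(attributes, JSONAttributes=json.dumps(attributes))

# Destination = flowfile-content: the JSON replaces the content.
as_content = json.dumps(attributes)
```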
Hi,
I have following schema of records in MongoDB.
{
    "_id" : ObjectId("56b1958a1ebecc0724588c39"),
    "ContractNumber" : "ABC87gdtr53",
    "DocumentType" : "TestDoc",
    "FlowNr" : 3,
    "TimeStamp" : "03/02/2016 05:51:09:023"
}
How can I query for a particular ContractNumber?
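Assuming the contract number arrives as a flow file attribute, the MongoDB filter itself is a simple equality match. A pymongo-style sketch (the attribute name mirrors the "flow.file.contract.number" convention from this thread; the helper itself is hypothetical):

```python
# Sketch of building the MongoDB filter from a flow file attribute.
def build_query(attributes, attr_name="flow.file.contract.number"):
    return {"ContractNumber": attributes[attr_name]}

# With a real pymongo collection this would be something like:
#   docs = collection.find(build_query(flowfile_attributes))
```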
> "flow.file.contract.number" as an
> attribute and it would fetch documents matching that.
>
> I don't know that much about MongoDB, but does that sound like what you
> need?
>
> -Bryan
>
>
> On Thu, Feb 4, 2016 at 8:00 AM, sudeep mishra
> wrote:
>
>
, 2016 at 9:11 PM, sudeep mishra
wrote:
> Thanks for the feedback Bryan.
>
> Yes I need a processor similar to what you described.
>
> On Thu, Feb 4, 2016 at 7:38 PM, Bryan Bende wrote:
>
>> Hi Sudeep,
>>
>> From looking at the GetMongo processor, I don
g to every few seconds) ->
> UpdateAttribute (set your id attribute) -> FetchMongo
>
> Let us know if that doesn't help.
>
> On Thu, Feb 4, 2016 at 11:28 AM, sudeep mishra
> wrote:
>
>> Hi Bryan,
>>
>> I am trying to create a processor on the lines of g
Hi,
What is the preferred practice for logging details of a NiFi data flow? How
can I use my own log4j logging to record custom details for a NiFi data
flow?
Thanks & Regards,
Sudeep
ifi/blob/31fba6b3332978ca2f6a1d693f6053d719fb9daa/nifi-api/src/main/java/org/apache/nifi/processor/AbstractProcessor.java
> [4] http://www.slf4j.org/legacy.html
>
>
> Andy LoPresto
> alopresto.apa...@gmail.com
> PGP Fingerprint: 70EC B3E5 98A6 5A3F D3C4 BACE 3C6E F65B 2F7D EF69
>
> O
Hi,
Can someone please guide me on how to use the ExtractText processor to add the
entire flowfile content to an attribute?
Thanks & Regards,
Sudeep
Hi,
I am trying to build a data flow where I want the outputs of two ExecuteSQL
processors to be combined, with certain extra metadata added. The resulting
data should be in JSON format.
What is a good approach to achieve this?
Can I use the funnel component for this purpose?
Thanks & Regards,
Sudeep
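On the funnel question: a funnel only merges connections; it does not combine flow file contents. The combining step (e.g. MergeContent followed by a transform) would do something like this Python sketch (the field and metadata names are invented for illustration):

```python
import json

# Sketch of combining two ExecuteSQL result sets and tagging each row
# with extra metadata before emitting JSON.
def merge_results(rows_a, rows_b, metadata):
    combined = [dict(row, **metadata) for row in rows_a + rows_b]
    return json.dumps(combined)
```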
> might be a
> good 'right now' answer.
>
> Thanks
> Joe
>
> On Wed, Feb 24, 2016 at 10:43 PM, sudeep mishra
> wrote:
> > Hi,
> >
> > I am trying to build a data flow where I want outputs of two ExecuteSQL
> > processors to be combined and add ce
> For example:
> property name: MyContent
> value: (.*)
>
> The above value is a Java regular expression contained in a capture group.
>
> Matt
>
> On Wed, Feb 24, 2016 at 9:22 AM, sudeep mishra
> wrote:
>
>> Hi,
>>
>> Can someone please gui
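The capture-group behaviour described above can be sketched in Python (ExtractText itself is Java; the exact attribute naming NiFi produces, e.g. 'MyContent.1', may differ, and multi-line capture depends on ExtractText's DOTALL setting):

```python
import re

# Sketch of what a '(.*)' value on an ExtractText property does:
# capture group 1 swallows the entire content. DOTALL lets '.' match
# across newlines, mirroring multi-line flow file content.
def extract_all(content: str) -> str:
    return re.match("(.*)", content, re.DOTALL).group(1)
```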
Hi,
How can I search provenance data based on specific attributes?
Let us say in my flow I add an attribute named 'location'. How can I search
the provenance data for a particular location? Also, is the set of
attributes by which we can search the provenance data fixed for a
flow?
Thanks & Regards,
> Indexing does not apply to Provenance Events that already exist, only to
> those that are created after
> that value was set.
>
> Thanks
> -Mark
>
>
> > On Mar 11, 2016, at 1:40 PM, sudeep mishra
> wrote:
> >
> > Hi,
> >
> > How can I search provenance data based on specific attr
Hi,
What information is stored as part of NiFi database_repository?
Thanks & Regards,
Sudeep
Hi,
Can someone please share the API documentation for NiFi?
Thanks & Regards,
Sudeep
From a running instance of NiFi you can click Help. Or, you can click
> here https://nifi.apache.org/docs.html
>
> From that page scroll all the way down to the bottom. You'll see a
> developer section that says "REST API". Click on that and you're
> there.
>
> Thanks
>
Hi,
I have a few questions regarding NiFi clusters.
1) If we schedule a processor to run on only one node by using 'On
primary node' as the 'Scheduling Strategy', is the processor still
configured on all nodes? If yes, which step in the processor life
cycle takes place only on the primary node?