Hi Sangavi,
Good question. I thought it could be a nice example to illustrate how to
use LookupService.
I wrote a simple Gist entry with a NiFi template file to do what you
are looking for.
"Join, Enrich multiple columns by looking up an external CSV file"
https://gist.github.com/ijokarumawak/b9c95
Hi Austin,
I think there are a couple of ways to do that:
1. UpdateRecord with a CSVReader and CSVWriter, updating a column with a
Record Path and Expression Language, e.g. add a dynamic property with
key=/filename, value=${filename}
2. Use SplitText to split each CSV record into a FlowFile, then
combine
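To make option 1 concrete, here is a plain-Python sketch of the transformation that UpdateRecord with key=/filename would perform on the flow file content before PutDatabaseRecord inserts it — a column holding the source file's name is appended to every record. The function name and the column name "filename" are illustrative, not NiFi code.

```python
import csv
import io
import os

def add_filename_column(csv_text, source_path):
    """Append a 'filename' column to every CSV record, mirroring what
    UpdateRecord with a /filename record path would produce."""
    reader = csv.DictReader(io.StringIO(csv_text))
    fieldnames = list(reader.fieldnames) + ["filename"]
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=fieldnames)
    writer.writeheader()
    name = os.path.basename(source_path)
    for row in reader:
        row["filename"] = name
        writer.writerow(row)
    return out.getvalue()

print(add_filename_column("id,city\n1,Knoxville\n", "/data/employees.csv"))
```

Inside NiFi the same effect is achieved declaratively, so the record schema handed to PutDatabaseRecord already contains the extra column.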
all,
I am trying to take a csv, add a column to it that contains the filename of
the csv, and then insert that record into postgres using the
putdatabaserecord processor. Any idea what would be the best way to go
about doing this?
--
Austin Duncan
*Researcher*
PYA Analytics
2220 Sutherland Av
Ah ok, I see that you have a deeper issue there that probably needs to be
addressed within the web-framework in the main project.
Afraid I can't help with that, but I wish you luck!
On Tue, Jan 30, 2018 at 6:09 PM Ryan H
wrote:
> Hi Dan,
>
> Thanks for the info on the changes you made to the Doc
Hi Dan,
Thanks for the info on the changes you made to the Docker image. I am
essentially doing the same thing as I have built our own Docker and have a
wrapper script that provides the ability to configure the nifi.properties
file based on env variables at run time. The problem I am facing is tha
Hi Ravi,
It is not an official part of the NiFi project, but I have been
collaborating with a few people on a community effort to provide Python
automation for NiFi - in our next release there is functionality to
reconfigure processors:
Here is the relevant function in NiPyApi
https://github.com/Ch
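Under the hood, reconfiguring a processor comes down to a PUT against the NiFi REST API. As a hedged sketch (the payload shape follows the REST API's ProcessorEntity, but verify the exact fields against your NiFi version), the request body can be built like this:

```python
import json

def processor_update_payload(processor_id, revision_version, properties):
    """Build the JSON body for PUT /nifi-api/processors/{id}.
    The structure mirrors the NiFi REST API's ProcessorEntity; treat
    the exact field names as an assumption to double-check."""
    return {
        "revision": {"version": revision_version},
        "component": {
            "id": processor_id,
            "config": {"properties": properties},
        },
    }

body = processor_update_payload("abc-123", 4, {"File Size": "2KB"})
print(json.dumps(body, indent=2))
```

NiPyApi wraps this kind of call so you work with Python objects instead of raw JSON and revision bookkeeping.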
Hi Ryan,
I have proposed a small change to the Docker image which may help you here
- https://github.com/apache/nifi/pull/2439
Essentially it exposes the port and hostname to be used within
nifi.properties as environment variables which you can pass in at runtime.
Perhaps the approach used will ass
Sounds great, will do.
Dan
On Tue, Jan 30, 2018 at 6:53 AM Joe Witt wrote:
> Dan
>
> I'd add that when in doubt, get a thread dump out. If ever the system
> seems to be behaving incorrectly, run
>
> bin/nifi.sh dump
>
> wait 30 seconds
>
> bin/nifi.sh dump
>
> And ideally send the full contents
Hello,
The timestamp in Kafka is separate from the headers; currently there
isn't a way to specify the timestamp from NiFi.
For PublishKafkaRecord, I could see having an option to take the value
of a specified field from each record and make that the timestamp,
assuming it can be converted to a l
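The conversion such an option would need is simple: whatever the record field holds has to end up as the epoch-millisecond long that Kafka uses for message timestamps. A hedged, hypothetical helper (not NiFi code) might look like:

```python
from datetime import datetime, timezone

def record_field_to_kafka_timestamp(value):
    """Convert a record field to epoch milliseconds, the long value
    Kafka expects as a message timestamp. Accepts an int already in
    millis, or an ISO-8601 string; naive times are assumed UTC."""
    if isinstance(value, int):
        return value
    dt = datetime.fromisoformat(value)
    if dt.tzinfo is None:
        dt = dt.replace(tzinfo=timezone.utc)
    return int(dt.timestamp() * 1000)
```

Fields that cannot be parsed this way would have to be routed to failure, which is one reason such an option needs careful design.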
Dan
I'd add that when in doubt, get a thread dump out. If ever the system
seems to be behaving incorrectly, run
bin/nifi.sh dump
wait 30 seconds
bin/nifi.sh dump
And ideally send the full contents of the logs directory in a tar.gz
tar czvf nifilogs.tar.gz logs
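The two-dumps-30-seconds-apart routine above is easy to script; taking two dumps lets you see which threads are actually stuck rather than merely busy at one instant. A minimal sketch, assuming nifi.sh is at its default location relative to the NiFi home directory:

```python
import subprocess
import time

def collect_thread_dumps(nifi_sh="bin/nifi.sh", interval_seconds=30):
    """Run `nifi.sh dump` twice, interval_seconds apart, and return
    the two exit codes. Comparing the two dumps shows which threads
    stayed in the same place between samples."""
    first = subprocess.run([nifi_sh, "dump"], check=True)
    time.sleep(interval_seconds)
    second = subprocess.run([nifi_sh, "dump"], check=True)
    return [first.returncode, second.returncode]
```

The dump output lands in NiFi's logs directory, which is why bundling the whole directory with tar afterwards captures everything needed.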
Thanks
Joe
On Tue, Jan 30, 2018
Hello Koji,
I don't see any OOM errors in the logs, I'll keep an eye on the avail.
thread count. Thank you.
Regards,
Dan
On Mon, Jan 29, 2018 at 10:49 PM Koji Kawamura
wrote:
> Hi Dan,
>
> If all available Timer Driven Threads are being used (or hang
> unexpectedly for some reason), then no pr
Hi,
I would like to join multiple columns to an input record using a single
lookup service. Is it possible with the LookupRecord processor?
For Example,
I have an input csv file with columns.
File 1:
*Emp_Id,Name,Address,Mobile No*
And another file which is used as input for lookup service,
File 2:
*E
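The enrichment being asked for can be sketched in plain Python: join every column of the lookup CSV onto each input record by a shared key. This is only an illustration of what LookupRecord with a CSV-backed lookup service achieves; the key column (Emp_Id) and the lookup columns are assumptions, since the second file's schema is truncated above.

```python
import csv
import io

def enrich_with_lookup(input_csv, lookup_csv, key="Emp_Id"):
    """Join all non-key columns of lookup_csv onto input_csv records
    that share the same key value; unmatched records get blanks."""
    lookup = {row[key]: row for row in csv.DictReader(io.StringIO(lookup_csv))}
    reader = csv.DictReader(io.StringIO(input_csv))
    extra = [c for c in next(iter(lookup.values())).keys() if c != key]
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=list(reader.fieldnames) + extra)
    writer.writeheader()
    for row in reader:
        match = lookup.get(row[key], {})
        for col in extra:
            row[col] = match.get(col, "")
        writer.writerow(row)
    return out.getvalue()
```

In NiFi the same join happens record-by-record inside LookupRecord, so the flow file never needs to be split.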