Hi Bryan,
I am trying to insert the following data into the database using the NiFi
processors ConvertJSONToSQL and PutSQL.
JSON object used:
{"index":"1", "num":"1", "len":"58", "caplen":"54", "timestamp":"Nov 4,
2015 00:42:15.0 CST"}
Kindly find the table description:
maddb=# \d
Hi,
Thank you very much for all the support.
I have written a custom processor to split json to multiple json.
Now I would like to route the flowfile based on its content.
I tried using RouteOnContent, but it did not work.
Could you please help me route the flowfile based
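For reference, RouteOnContent is driven by dynamic properties: each property
name becomes a relationship, and its value is the regular expression to match
against the content. A hypothetical configuration for the PDML data discussed
in this thread (the relationship names and regexes below are illustrative):

```
# RouteOnContent settings (values are assumptions for illustration)
Match Requirement: content must contain match
Character Set:     UTF-8
# Dynamic properties (relationship name = regex to match):
ip_packets  = proto name="ip"
eth_packets = proto name="eth"
```

Flowfiles whose content matches a regex are routed to the matching
relationship; non-matching flowfiles go to the unmatched relationship.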
Parul,
You can use SplitJson to take a large JSON document and split an array
element into individual documents. I took the json you attached and created
a flow like GetFile -> SplitJson -> SplitJson -> PutFile
In the first SplitJson the path I used was $.packet.proto and in the second
I used
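Since the attached JSON isn't shown in the archive, here is a hypothetical
document shape, inferred from the $.packet.proto path and the tshark PDML
output mentioned elsewhere in the thread, that the two SplitJson steps would
break apart:

```json
{
  "packet": {
    "proto": [
      {"name": "eth", "field": [{"name": "eth.dst"}, {"name": "eth.src"}]},
      {"name": "ip",  "field": [{"name": "ip.src"},  {"name": "ip.dst"}]}
    ]
  }
}
```

The first SplitJson with $.packet.proto would emit one flowfile per protocol
entry; the second could then split each entry's array (for example with a
path like $.field).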
Hi,
I tried with the above JSON element, but I am getting the following
error:
2015-10-12 23:53:39,209 ERROR [Timer-Driven Process Thread-9]
o.a.n.p.standard.ConvertJSONToSQL
ConvertJSONToSQL[id=0e964781-6914-486f-8bb7-214c6a1cd66e] Failed to parse
Hi,
Thank you very much for all the support.
I was able to convert the XML format to JSON using a custom Flume source.
Now I need the ConvertJSONToSQL processor to insert the data into SQL.
I am trying to get hands-on with this processor and will update you on my
progress. Meanwhile, if there is any example you could
I think ConvertJSONToSQL expects a flat document of key/value pairs, or an
array of flat documents. So I think your JSON would be:
[
{"firstname":"John", "lastname":"Doe"},
{"firstname":"Anna", "lastname":"Smith"}
]
The table name will come from the Table Name property.
Let us know if
Can you share the source code to the custom source?
-Joey
> On Oct 10, 2015, at 03:34, Parul Agrawal wrote:
>
> Hi,
>
> I added custom flume source and when flume source is sending the data to
> flume sink, below mentioned error is thrown at flume sink.
>
>
I've done something like this by wrapping the command in a shell script:
http://ingest.tips/2014/12/22/getting-started-with-apache-nifi/
My use case was slightly different, but I'm pretty sure you can adapt the same
idea.
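One way to adapt that idea for the FIFO case discussed in this thread, as a
minimal sketch: ExecuteProcess runs a single command, so the FIFO setup and
the reader go into one wrapper script. The path and message below are
illustrative, and echo stands in for the real writer (tshark) so the script
runs anywhere.

```shell
#!/bin/sh
# Wrapper script sketch for NiFi's ExecuteProcess: chain the FIFO
# setup and the long-running reader in a single command.
FIFO=/tmp/packet_demo.fifo
rm -f "$FIFO"
mkfifo "$FIFO"
# Writer side, backgrounded (in the real flow this would be:
#   tshark -i ens160 -T pdml > "$FIFO" &)
echo "sample packet data" > "$FIFO" &
# Reader side: whatever the script prints to stdout becomes the
# flowfile content in ExecuteProcess
result=$(cat "$FIFO")
echo "$result"
rm -f "$FIFO"
```

Pointing ExecuteProcess at the script (e.g. /bin/sh /path/to/wrapper.sh)
keeps the processor's single-command model while still running several
commands.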
-Joey
> On Oct 10, 2015, at 03:52, Parul Agrawal
Hi,
I actually need to get the data from a pipe.
So the actual command I would need is mkfifo /tmp/packet; tshark -i ens160
-T pdml >/tmp/packet.
Is it possible to use ExecuteProcess for multiple commands?
On Sat, Oct 10, 2015 at 1:04 PM, Parul Agrawal
wrote:
> Hi,
>
Hi,
I added custom flume source and when flume source is sending the data to
flume sink, below mentioned error is thrown at flume sink.
Administratively Yielded for 1 sec due to processing failure
2015-10-10 02:30:45,027 WARN [Timer-Driven Process Thread-9]
o.a.n.c.t.ContinuallyRunProcessorTask
Hi Parul,
It is possible to deploy a custom Flume source/sink to NiFi, but due to the
way the Flume processors load the classes for the sources and sinks, the
jar you deploy to the lib directory also needs to include the other
dependencies your source/sink needs (or they each need to individually
> If you plan to use NiFi for the long term, it might be worth investing in
> converting your custom Flume components to NiFi processors. We can help you
> get started if you need any guidance going that route.
+1. Running Flume sources/sinks is meant as a transition step. It's
really useful if
Hello,
The NiFi Flume processors are for running Flume sources and sinks within
NiFi. They don't communicate with an external Flume process.
In your example you would need an ExecuteFlumeSource configured to run the
netcat source, connected to an ExecuteFlumeSink configured with the logger.
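A minimal sketch of what that pairing might look like. The processor property
names and agent/source names below are assumptions modeled on standard Flume
netcat/logger configuration, so check the processors' documentation tab in
the NiFi UI for the exact names:

```
# ExecuteFlumeSource (assumed properties)
#   Source Type: netcat
#   Agent Name:  a1
#   Source Name: src1
#   Flume Configuration:
a1.sources.src1.type = netcat
a1.sources.src1.bind = 0.0.0.0
a1.sources.src1.port = 44444

# ExecuteFlumeSink (assumed properties)
#   Sink Type:  logger
#   Agent Name: a1
#   Sink Name:  snk1
```

Data emitted by the source processor flows through the NiFi connection to
the sink processor, rather than through Flume channels.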