Hi Sanjeet,
I am glad to hear you like NiPyAPI and have been finding it useful.
I did update the client to support the new API calls for Parameters in
1.10; they are in the nipyapi.nifi.* low-level client today.
I plan to work on some higher level calls in nipyapi.canvas.* soon, similar
to update_
Athena really isn't designed for single-record inserts: each insert creates
another file in S3, and the driver behaves much more like Hive than a
regular JDBC connection, so that processor probably won't ever work. To load
data into Athena from NiFi you can either use ConvertRecord to conve
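(The reply above is cut off, but the pattern it points toward is the usual one: have NiFi write record files to S3, e.g. ConvertRecord followed by PutS3Object, and let Athena query them through an external table. The DDL below is a hypothetical illustration; the table, column, and bucket names are placeholders, and the OpenX JSON SerDe is one option Athena supports for JSON data.)

```sql
-- Hypothetical Athena external table over JSON files written by NiFi.
-- Athena reads every object under the LOCATION prefix; no per-row inserts needed.
CREATE EXTERNAL TABLE events (
  id string,
  payload string
)
ROW FORMAT SERDE 'org.openx.data.jsonserde.JsonSerDe'
LOCATION 's3://my-bucket/events/';
```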
Hi,
I am relatively new to NiFi, and am trying to use PutDatabaseRecord with
the Athena JDBC 4.2 driver to populate an AWS Athena table. I have designed
the flow to generate JSON records out of a large XML file, and in order to
debug the flow I have split it into two PutDatabaseRecord
pro
Hi Mark,
in our test environment, the repository is only 1 GB. We waited for about an
hour; during this time we could not access the old entries, although new
entries were still being added. We always tested this via the global
(hamburger) menu -> Provenance.
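(For context: the behavior described, old provenance entries disappearing while new ones keep arriving, is consistent with the provenance repository hitting its retention limits and aging out the oldest events. These limits are set in nifi.properties; the values below are illustrative, not a recommendation.)

```
# nifi.properties — provenance repository retention limits (illustrative values)
nifi.provenance.repository.max.storage.time=24 hours
nifi.provenance.repository.max.storage.size=1 GB
```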
Thanks
Alec
Hi,
Previously I used nipyapi.canvas.update_variable_registry() to update my
process group variables. I have upgraded to NiFi 1.10, so I want to use the
new Parameter Context feature introduced in that version.
I am able to update these parameters, both sensitive and non-sensitive
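(Since the higher-level nipyapi.canvas helpers for parameters did not exist yet at the time of this thread, working with Parameter Contexts meant going through the low-level REST API. The sketch below only builds the JSON body for `POST /nifi-api/parameter-contexts`; the field names follow my reading of the NiFi 1.10 ParameterContextEntity, and the context and parameter names are made up for illustration. Note that *updating* an existing context goes through the asynchronous `/parameter-contexts/{id}/update-requests` endpoints rather than a plain PUT.)

```python
import json

def build_parameter_context_payload(name, parameters):
    """Build the request body for POST /nifi-api/parameter-contexts.

    `parameters` maps parameter name -> (value, sensitive).
    Field names follow the NiFi 1.10 REST API entity layout;
    treat this as a sketch, not a definitive client.
    """
    return {
        "revision": {"version": 0},
        "component": {
            "name": name,
            "parameters": [
                {
                    "parameter": {
                        "name": p_name,
                        "value": value,
                        "sensitive": sensitive,
                    }
                }
                for p_name, (value, sensitive) in parameters.items()
            ],
        },
    }

# Hypothetical context with one sensitive and one non-sensitive parameter.
payload = build_parameter_context_payload(
    "my-context",
    {
        "db.password": ("secret", True),
        "db.host": ("localhost", False),
    },
)
print(json.dumps(payload, indent=2))
```

You would then POST this payload (with an authenticated session) to `https://<nifi-host>/nifi-api/parameter-contexts` and assign the resulting context to a process group.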