[GitHub] [nifi] jtstorck commented on issue #3558: NIFI-6405 GetHDFSFileInfo ignores files in the root/start directory

2019-06-28 Thread GitBox
jtstorck commented on issue #3558: NIFI-6405 GetHDFSFileInfo ignores files in 
the root/start directory
URL: https://github.com/apache/nifi/pull/3558#issuecomment-506892312
 
 
   Initial tests run with the PR changes look good.  I will do a bit more 
review, and then most likely merge these changes tomorrow.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [nifi] kaHaleMaKai opened a new pull request #3559: NIFI-1624 Allow ExtractText processor to fail if max. capture group l…

2019-06-28 Thread GitBox
kaHaleMaKai opened a new pull request #3559: NIFI-1624 Allow ExtractText 
processor to fail if max. capture group l…
URL: https://github.com/apache/nifi/pull/3559
 
 
   …ength is exceeded.
   
   Thank you for submitting a contribution to Apache NiFi.
   
   Please provide a short description of the PR here:
   
    Description of PR
   
   _Allow ExtractText processor to fail if max. capture group length is 
exceeded._
   
   In order to streamline the review of the contribution we ask you
   to ensure the following steps have been taken:
   
   ### For all changes:
   - [ x ] Is there a JIRA ticket associated with this PR? Is it referenced 
in the commit message?
   
   - [ x ] Does your PR title start with **NIFI-XXXX** where XXXX is the JIRA 
number you are trying to resolve? Pay particular attention to the hyphen "-" 
character.
   
   - [ x ] Has your PR been rebased against the latest commit within the target 
branch (typically `master`)?
   
   - [ x ] Is your initial contribution a single, squashed commit? _Additional 
commits in response to PR reviewer feedback should be made on this branch and 
pushed to allow change tracking. Do not `squash` or use `--force` when pushing 
to allow for clean monitoring of changes._
   
   ### For code changes:
   - [ x ] Have you ensured that the full suite of tests is executed via `mvn 
-Pcontrib-check clean install` at the root `nifi` folder?
   - [ x ] Have you written or updated unit tests to verify your changes?
   - [ x ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)? 
   - [ x ] If applicable, have you updated the `LICENSE` file, including the 
main `LICENSE` file under `nifi-assembly`?
   - [ x ] If applicable, have you updated the `NOTICE` file, including the 
main `NOTICE` file found under `nifi-assembly`?
   - [ x ] If adding new Properties, have you added `.displayName` in addition 
to .name (programmatic access) for each of the new properties?
   
   ### For documentation related changes:
   - [ x ] Have you ensured that format looks appropriate for the output in 
which it is rendered?
   
   ### Note:
   Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.
   




[GitHub] [nifi] jtstorck commented on issue #3558: NIFI-6405 GetHDFSFileInfo ignores files in the root/start directory

2019-06-28 Thread GitBox
jtstorck commented on issue #3558: NIFI-6405 GetHDFSFileInfo ignores files in 
the root/start directory
URL: https://github.com/apache/nifi/pull/3558#issuecomment-506863907
 
 
   Reviewing...




[GitHub] [nifi-minifi-cpp] phrocker closed pull request #528: MINIFICPP-793: Allow SSL Context to be defined from properties

2019-06-28 Thread GitBox
phrocker closed pull request #528: MINIFICPP-793: Allow SSL Context to be 
defined from properties
URL: https://github.com/apache/nifi-minifi-cpp/pull/528
 
 
   




[GitHub] [nifi-minifi-cpp] phrocker commented on issue #528: MINIFICPP-793: Allow SSL Context to be defined from properties

2019-06-28 Thread GitBox
phrocker commented on issue #528: MINIFICPP-793: Allow SSL Context to be 
defined from properties
URL: https://github.com/apache/nifi-minifi-cpp/pull/528#issuecomment-506861110
 
 
   will re-open as a separate PR




[GitHub] [nifi] mcgilman commented on a change in pull request #3536: NIFI-6380: Introduced the notion of Parameters and Parameter Contexts…

2019-06-28 Thread GitBox
mcgilman commented on a change in pull request #3536: NIFI-6380: Introduced the 
notion of Parameters and Parameter Contexts…
URL: https://github.com/apache/nifi/pull/3536#discussion_r298638610
 
 

 ##
 File path: 
nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-web/nifi-web-api/src/main/java/org/apache/nifi/authorization/StandardAuthorizableLookup.java
 ##
 @@ -158,6 +159,18 @@ public Resource getResource() {
 }
 };
 
+private static final Authorizable PARAMETER_CONTEXTS_AUTHORIZABLE = new 
Authorizable() {
+@Override
+public Authorizable getParentAuthorizable() {
+return null;
 
 Review comment:
   I think the parameter context authorizable should have a parent authorizable 
of the controller authorizable. I believe the feature proposal indicates the 
parameter contexts would exist at the controller level. By having the 
controller be the parent authorizable, anyone with access to the controller 
will automatically have access to the parameter contexts (unless overridden by 
a specific parameter context policy). By also introducing the parameter context 
authorizable (which you've done), we provide the ability to allow someone to 
manage parameters without also giving them access to the controller settings.
   
   By implementing this, it would also make migration easier as the users will 
not have to manually build out the new policies (since seeding only happens on 
initial startup). Additionally, we could remove the code that seeds the 
parameter context policies as we are already seeding policies for the 
controller.
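
   The parent-chain fallback described above can be sketched as a toy model 
(all names here are illustrative, not the actual NiFi authorization framework):

   ```java
// Toy model of hierarchical authorization: a resource with no explicit policy
// of its own falls back to its parent authorizable, so a parameter-contexts
// resource parented by the controller is covered by controller policies unless
// a specific parameter-context policy overrides it. Names are illustrative.
import java.util.Map;
import java.util.Set;

public class AuthorizableDemo {
    interface Authorizable {
        Authorizable getParentAuthorizable();
        String getResource();
    }

    static final Authorizable CONTROLLER = new Authorizable() {
        public Authorizable getParentAuthorizable() { return null; }
        public String getResource() { return "/controller"; }
    };

    static final Authorizable PARAMETER_CONTEXTS = new Authorizable() {
        public Authorizable getParentAuthorizable() { return CONTROLLER; }
        public String getResource() { return "/parameter-contexts"; }
    };

    // Walk up the parent chain; the nearest explicit policy wins.
    static boolean isAuthorized(Authorizable resource,
                                Map<String, Set<String>> policies, String user) {
        for (Authorizable cur = resource; cur != null; cur = cur.getParentAuthorizable()) {
            Set<String> allowed = policies.get(cur.getResource());
            if (allowed != null) {
                return allowed.contains(user);
            }
        }
        return false; // no policy anywhere in the chain
    }
}
   ```

   With this shape, seeding a policy only on `/controller` would already cover 
parameter contexts, which is the migration simplification the comment mentions.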




[GitHub] [nifi] mcgilman commented on a change in pull request #3536: NIFI-6380: Introduced the notion of Parameters and Parameter Contexts…

2019-06-28 Thread GitBox
mcgilman commented on a change in pull request #3536: NIFI-6380: Introduced the 
notion of Parameters and Parameter Contexts…
URL: https://github.com/apache/nifi/pull/3536#discussion_r298703815
 
 

 ##
 File path: 
nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-file-authorizer/src/main/java/org/apache/nifi/authorization/FileAccessPolicyProvider.java
 ##
 @@ -624,6 +625,10 @@ private void populateInitialAdmin(final Authorizations 
authorizations) {
 // grant the user read/write access to the /controller resource
 addUserToAccessPolicy(authorizations, 
ResourceType.Controller.getValue(), initialAdmin.getIdentifier(), READ_CODE);
 addUserToAccessPolicy(authorizations, 
ResourceType.Controller.getValue(), initialAdmin.getIdentifier(), WRITE_CODE);
+
+// grant the user read/write access to the /parameter-contexts resource
+addUserToAccessPolicy(authorizations, 
ResourceType.ParameterContext.getValue(), initialAdmin.getIdentifier(), 
READ_CODE);
+addUserToAccessPolicy(authorizations, 
ResourceType.ParameterContext.getValue(), initialAdmin.getIdentifier(), 
WRITE_CODE);
 
 Review comment:
   Assuming we add controller as a parent resource to parameter contexts, we 
could remove these lines.




[GitHub] [nifi-minifi-cpp] achristianson commented on a change in pull request #603: MINIFICPP-929 mmap

2019-06-28 Thread GitBox
achristianson commented on a change in pull request #603: MINIFICPP-929 mmap
URL: https://github.com/apache/nifi-minifi-cpp/pull/603#discussion_r298711152
 
 

 ##
 File path: extensions/rocksdb-repos/DatabaseContentRepository.cpp
 ##
 @@ -38,8 +39,8 @@ bool DatabaseContentRepository::initialize(const 
std::shared_ptr


[jira] [Commented] (NIFI-1624) ExtractText - Add option to Throw Failure if Text is greater than Capture Group

2019-06-28 Thread Lars Winderling (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-1624?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16875147#comment-16875147
 ] 

Lars Winderling commented on NIFI-1624:
---

I have just started working on this issue. I will soon provide a PR on github.

> ExtractText - Add option to Throw Failure if Text is greater than Capture 
> Group
> ---
>
> Key: NIFI-1624
> URL: https://issues.apache.org/jira/browse/NIFI-1624
> Project: Apache NiFi
>  Issue Type: Improvement
>Affects Versions: 0.4.1, 1.5.0, 1.6.0, 1.7.0, 1.8.0, 1.9.0
>Reporter: Randy Bovay
>Priority: Major
>
> ExtractText allows us to specify the "Maximum Capture Group Length"
> Occasionally I will get a text string that is greater than what I've 
> specified.
> In these unexpected situations, I would LIKE to route this to something that 
> can handle it, or throw an error so I can look into why this is larger than 
> expected.
> The ask is to make the behavior configurable via a LengthExceededBehavior 
> property, with options of 'truncate' or 'failure'.
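
A rough sketch of the requested behavior (all names here, such as 
`LengthExceededBehavior` and `extract`, are hypothetical illustrations, not the 
actual ExtractText implementation):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Hypothetical sketch: when a capture group exceeds the configured maximum
// length, either truncate it (the existing behavior) or signal failure so the
// flow file can be routed to a failure relationship.
public class CaptureGroupLength {
    enum LengthExceededBehavior { TRUNCATE, FAIL }

    // Returns the (possibly truncated) first capture group, or null to
    // indicate the flow file should be routed to failure / no match.
    static String extract(String text, Pattern pattern, int maxLength,
                          LengthExceededBehavior behavior) {
        Matcher m = pattern.matcher(text);
        if (!m.find()) {
            return null;
        }
        String group = m.group(1);
        if (group.length() <= maxLength) {
            return group;
        }
        return behavior == LengthExceededBehavior.TRUNCATE
                ? group.substring(0, maxLength)
                : null; // FAIL: caller routes to failure
    }
}
```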



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] [nifi-minifi-cpp] achristianson opened a new pull request #603: MINIFICPP-929 mmap

2019-06-28 Thread GitBox
achristianson opened a new pull request #603: MINIFICPP-929 mmap
URL: https://github.com/apache/nifi-minifi-cpp/pull/603
 
 
   --DRAFT PR please review... lots of code changes, so many eyes are 
welcome--
   
   This PR adds a mmap() interface to allow processors to map FlowFile payloads 
to a memory address. This increases efficiency and performance significantly 
for some use cases, and benchmarks show no performance regression in nearly 
all other cases.
   
   Original/full reason/justification:
   
   "Currently, MiNiFi - C++ only supports stream-oriented I/O to FlowFile 
payloads. This can limit performance in cases where in-place access to the 
payload is desirable. In cases where data can be accessed randomly and 
in-place, a significant speedup can be realized by mapping the payload into 
the system memory address space. This is natively supported at the kernel level 
in Linux, MacOS, and Windows via the mmap() interface on files. Other 
repositories, such as the VolatileRepository, already store the entire payload 
in memory, so it is natural to pass through this memory block as if it were a 
memory-mapped file. While the DatabaseContentRepository does not appear to 
natively support a memory-map interface, accesses via an emulated memory-map 
interface should be possible with no performance degradation with respect to a 
full read via the streaming interface.
   
   Cases where in-place, random access is beneficial include, but are not 
limited to:
   
   - in-place parsing of JSON (e.g. RapidJSON supports parsing in-place, at 
least for strings)
   - access of payload via protocol buffers
   - random access of large files on disk, where it would otherwise require 
many seek() and read() syscalls
   
   The interface should be accessible by processors via a mmap() call on 
ProcessSession (adjacent to read() and write()). A MemoryMapCallback should be 
provided, which is called back via a process() call where the argument is an 
instance of BaseMemoryMap. The BaseMemoryMap is extended for each type of 
repository that MiNiFi - C++ supports, including: FileSystemRepository, 
VolatileRepository, and DatabaseContentRepository.
   
   As part of the change, in addition to extensive unit test coverage, 
benchmarks should be written such that the performance impact can be 
empirically measured and evaluated."
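   
   The underlying memory-mapping technique the PR builds on can be illustrated 
in Java with `FileChannel.map` (this shows only the general OS-level concept; 
it is not the MiNiFi C++ ProcessSession/MemoryMapCallback API described above):
   
   ```java
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

// Illustrates memory-mapped file access: random reads become plain memory
// loads instead of per-access seek()/read() syscall pairs.
public class MmapDemo {
    static byte byteAt(Path file, long offset) {
        try (FileChannel ch = FileChannel.open(file, StandardOpenOption.READ)) {
            MappedByteBuffer buf = ch.map(FileChannel.MapMode.READ_ONLY, 0, ch.size());
            return buf.get((int) offset); // in-place random access
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    // Helper for demonstration: write bytes to a temp file and return its path.
    static Path writeTempFile(byte[] data) {
        try {
            Path p = Files.createTempFile("mmap-demo", ".bin");
            Files.write(p, data);
            return p;
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }
}
   ```
   
   This is why the RandomRead_Large benchmark below shows the largest gap: each 
callback-based random read pays syscall overhead that the mapped access avoids.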
   
   Here is the full benchmark suite:
   
   ```
   ---------------------------------------------------------------------------------------------------
   Benchmark                                                                 Time          CPU  Iterations
   ---------------------------------------------------------------------------------------------------
   FSMemoryMapBMFixture/MemoryMap_FileSystemRepository_Read_Tiny          2956 ns      2923 ns      240558
   FSMemoryMapBMFixture/Callback_FileSystemRepository_Read_Tiny           4258 ns      4227 ns      164835
   FSMemoryMapBMFixture/MemoryMap_FileSystemRepository_WriteRead_Tiny     7764 ns      7665 ns       91078
   FSMemoryMapBMFixture/Callback_FileSystemRepository_WriteRead_Tiny     14152 ns     14022 ns       49870
   FSMemoryMapBMFixture/MemoryMap_FileSystemRepository_Read_Small        15671 ns     15631 ns       44870
   FSMemoryMapBMFixture/Callback_FileSystemRepository_Read_Small         21020 ns     20977 ns       33246
   FSMemoryMapBMFixture/MemoryMap_FileSystemRepository_WriteRead_Small   59944 ns     59772 ns       11701
   FSMemoryMapBMFixture/Callback_FileSystemRepository_WriteRead_Small    57354 ns     57152 ns       12237
   FSMemoryMapBMFixture/MemoryMap_FileSystemRepository_Read_Large      3592536 ns   3587026 ns         194
   FSMemoryMapBMFixture/Callback_FileSystemRepository_Read_Large      17014790 ns  16979026 ns          41
   FSMemoryMapBMFixture/MemoryMap_FileSystemRepository_WriteRead_Large 16578370 ns 16530633 ns          42
   FSMemoryMapBMFixture/Callback_FileSystemRepository_WriteRead_Large  26228637 ns 26159193 ns          27
   FSMemoryMapBMFixture/MemoryMap_FileSystemRepository_RandomRead_Large   53.7 ns      53.7 ns    13026678
   FSMemoryMapBMFixture/Callback_FileSystemRepository_RandomRead_Large  170905 ns    170829 ns        4074
   ```

[GitHub] [nifi-minifi-cpp] asfgit closed pull request #602: MINIFICPP-939: Install deps up front on certian systems. Change cento…

2019-06-28 Thread GitBox
asfgit closed pull request #602: MINIFICPP-939: Install deps up front on 
certian systems. Change cento…
URL: https://github.com/apache/nifi-minifi-cpp/pull/602
 
 
   




[jira] [Resolved] (NIFI-6406) ForkRecord Extract Incorrectly Forks Records

2019-06-28 Thread David Mollitor (JIRA)


 [ 
https://issues.apache.org/jira/browse/NIFI-6406?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

David Mollitor resolved NIFI-6406.
--
Resolution: Not A Problem

Woops.

 

The documentation says:

 

{quote}

assuming the Record Writer schema is correctly set

{quote}

 

I did not explicitly set a schema on the JSON Record Writer. I thought it could 
change the schema for me automatically to match the 'extract' requirement.  No 
such luck.  The docs should also include the example schema as part of the 
example.
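
For illustration, a writer schema for the extracted records might look 
something like this Avro sketch; the field list is inferred from the example 
output above and is not the schema from the NiFi docs:

```json
{
  "type": "record",
  "name": "AccountWithParentFields",
  "fields": [
    { "name": "id", "type": "int" },
    { "name": "balance", "type": "double" },
    { "name": "name", "type": "string" },
    { "name": "address", "type": "string" },
    { "name": "city", "type": "string" },
    { "name": "state", "type": "string" },
    { "name": "zipCode", "type": "string" },
    { "name": "country", "type": "string" }
  ]
}
```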

> ForkRecord Extract Incorrectly Forks Records
> 
>
> Key: NIFI-6406
> URL: https://issues.apache.org/jira/browse/NIFI-6406
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Core Framework
>Affects Versions: 1.9.2
>Reporter: David Mollitor
>Priority: Major
>
> I am looking at the {{ForkRecord}} processor and trying to reproduce the 
> example provided here:
> [https://nifi.apache.org/docs/nifi-docs/components/org.apache.nifi/nifi-standard-nar/1.9.2/org.apache.nifi.processors.standard.ForkRecord/additionalDetails.html]
> In particular, I am looking at _Example 2 - Extracting with parent fields_
> My input:
> {code}
> [{
> "id": 1,
> "name": "John Doe",
> "address": "123 My Street",
> "city": "My City",
> "state": "MS",
> "zipCode": "1",
> "country": "USA",
> "accounts": [{
> "id": 42,
> "balance": 4750.89
> }, {
> "id": 43,
> "balance": 48212.38
> }]
> },
> {
> "id": 2,
> "name": "Jane Doe",
> "address": "345 My Street",
> "city": "Her City",
> "state": "NY",
> "zipCode": "2",
> "country": "USA",
> "accounts": [{
> "id": 45,
> "balance": 6578.45
> }, {
> "id": 46,
> "balance": 34567.21
> }]
> }]
> {code}
> My output:
> {code}
> [ {
>   "id" : 42,
>   "name" : "John Doe",
>   "address" : "123 My Street",
>   "city" : "My City",
>   "state" : "MS",
>   "zipCode" : "1",
>   "country" : "USA",
>   "accounts" : [ {
> "id" : 42,
> "balance" : 4750.89
>   }, {
> "id" : 43,
> "balance" : 48212.38
>   } ]
> }, {
>   "id" : 43,
>   "name" : "John Doe",
>   "address" : "123 My Street",
>   "city" : "My City",
>   "state" : "MS",
>   "zipCode" : "1",
>   "country" : "USA",
>   "accounts" : [ {
> "id" : 42,
> "balance" : 4750.89
>   }, {
> "id" : 43,
> "balance" : 48212.38
>   } ]
> }, {
>   "id" : 45,
>   "name" : "Jane Doe",
>   "address" : "345 My Street",
>   "city" : "Her City",
>   "state" : "NY",
>   "zipCode" : "2",
>   "country" : "USA",
>   "accounts" : [ {
> "id" : 45,
> "balance" : 6578.45
>   }, {
> "id" : 46,
> "balance" : 34567.21
>   } ]
> }, {
>   "id" : 46,
>   "name" : "Jane Doe",
>   "address" : "345 My Street",
>   "city" : "Her City",
>   "state" : "NY",
>   "zipCode" : "2",
>   "country" : "USA",
>   "accounts" : [ {
> "id" : 45,
> "balance" : 6578.45
>   }, {
> "id" : 46,
> "balance" : 34567.21
>   } ]
> } ]
> {code}
> I expect there to be 4 records (2x2) and there are, but each record carries 
> all of the fields instead of extracting them (see the output example in the 
> docs).  It looks more like a cross-join than anything.
> The 'split' function worked as expected.





[jira] [Updated] (NIFI-6406) ForkRecord Extract Incorrectly Forks Records

2019-06-28 Thread David Mollitor (JIRA)


 [ 
https://issues.apache.org/jira/browse/NIFI-6406?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

David Mollitor updated NIFI-6406:
-
Summary: ForkRecord Extract Incorrectly Forks Records  (was: ForkReader 
Extract Not Working As Advertised)

> ForkRecord Extract Incorrectly Forks Records
> 
>
> Key: NIFI-6406
> URL: https://issues.apache.org/jira/browse/NIFI-6406
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Core Framework
>Affects Versions: 1.9.2
>Reporter: David Mollitor
>Priority: Major
>





[jira] [Updated] (NIFI-5176) NiFi needs to be buildable on Java 11

2019-06-28 Thread Jeff Storck (JIRA)


 [ 
https://issues.apache.org/jira/browse/NIFI-5176?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jeff Storck updated NIFI-5176:
--
Description: 
While retaining a source/target compatibility of 1.8, NiFi needs to be buildable 
on Java 11.

The following issues have been encountered while attempting to run a Java 
1.8-built NiFi on Java 11:
||Issue||Solution||
|groovy-eclipse-compiler not working with Java 10|-Switched to gmaven-plus- 
Updated to maven-compiler-plugin:3.8.0, groovy-eclipse-compiler:3.4.0-01, and 
groovy-eclipse-batch:2.5.4-01 (See NIFI-5254 for reference)|
|Antlr3 issue with ambiguous method calls|Explicit cast to ValidationContext 
needed in TestHL7Query.java|
|jaxb2-maven-plugin not compatible with Java 9|Switched to maven-jaxb-plugin|
|-nifi-enrich-processors uses package com.sun.jndi.dns, which does not 
exist-|-Required addition of- -add-modules=jdk.naming.dns 
--add-exports=jdk.naming.dns/com.sun.jndi.dns=ALL-UNNAMED, which prevents the 
usage of compiling with the --release option (to compile only against public 
APIs in the JDK) from being used. Not an optimal solution.-|
|groovy-eclipse-batch:2.4.13-01 could not find JDK base classes|Updated to 
groovy-eclipse-batch:2.5.4-01 and groovy-all:2.5.4 (See NIFI-5254 for 
reference)|
|maven-surefire-plugin:2.20.1 throws null pointer exceptions|Updated to 
maven-surefire-plugin:2.22.0|
|okhttp client builder requires X509TrustManager on Java 9+|Added methods to 
return TrustManager instances with the SSLContext created by SSLContextFactory 
and updated HttpNotificationService to use the paired TrustManager|
|nifi-runtime groovy tests aren't running|Added usage of 
build-helper-maven-plugin to explicitly add src/test/groovy to force groovy 
compilation of test sources. groovy-eclipse-compiler skips src/test/groovy if 
src/test/java doesn't exist, which is the case for nifi-runtime. (See NIFI-5341 
for reference)|
|hbase-client depends on jdk.tools:jdk.tools|Excluded this dependency 
{color:#f79232}*(needs live testing)* {color}|
|HBase client 1.1.2 does not allow running on Java 9+|Updated to HBase client 
1.1.11, passes unit tests (See HBASE-17944 for reference) 
*{color:#f79232}(needs live testing){color}*|
|powermock:1.6.5 does not support Java 10|Updated to powermock:2.0.2 and 
mockito:2.28.2 (See NIFI-6360 for reference)|
|com.yammer.metrics:metrics-core:2.2.0 does not support Java 10|Upgrading 
com.yammer.metrics:metrics-core:2.2.0 to 
io.dropwizard.metrics:metrics-jvm:4.0.0 (See NIFI-5373 for reference)|

  was:
While retaining a source/target compatibility of 1.8, NiFi needs to be buildable 
on Java 11.

The following issues have been encountered while attempting to run a Java 
1.8-built NiFi on Java 11:
||Issue||Solution||
|groovy-eclipse-compiler not working with Java 10|-Switched to gmaven-plus- 
Updated to maven-compiler-plugin:3.8.0, groovy-eclipse-compiler:3.4.0-01, and 
groovy-eclipse-batch:2.5.4-01|
|Antlr3 issue with ambiguous method calls|Explicit cast to ValidationContext 
needed in TestHL7Query.java|
|jaxb2-maven-plugin not compatible with Java 9|Switched to maven-jaxb-plugin|
|-nifi-enrich-processors uses package com.sun.jndi.dns, which does not 
exist-|-Required addition of- -add-modules=jdk.naming.dns 
--add-exports=jdk.naming.dns/com.sun.jndi.dns=ALL-UNNAMED, which prevents the 
usage of compiling with the --release option (to compile only against public 
APIs in the JDK) from being used. Not an optimal solution.-|
|groovy-eclipse-batch:2.4.13-01 could not find JDK base classes|Updated to 
groovy-eclipse-batch:2.5.4-01 and groovy-all:2.5.4|
|maven-surefire-plugin:2.20.1 throws null pointer exceptions|Updated to 
maven-surefire-plugin:2.22.0|
|okhttp client builder requires X509TrustManager on Java 9+|Added methods to 
return TrustManager instances with the SSLContext created by SSLContextFactory 
and updated HttpNotificationService to use the paired TrustManager|
|nifi-runtime groovy tests aren't running|Added usage of 
build-helper-maven-plugin to explicitly add src/test/groovy to force groovy 
compilation of test sources. groovy-eclipse-compiler skips src/test/groovy if 
src/test/java doesn't exist, which is the case for nifi-runtime. (See NIFI-5341 
for reference)|
|hbase-client depends on jdk.tools:jdk.tools|Excluded this dependency 
{color:#f79232}*(needs live testing)* {color}|
|HBase client 1.1.2 does not allow running on Java 9+|Updated to HBase client 
1.1.11, passes unit tests (See HBASE-17944 for reference) 
*{color:#f79232}(needs live testing){color}*|
|powermock:1.6.5 does not support Java 10|Updated to powermock:2.0.2 and 
mockito:2.28.2 (See NIFI-6360 for reference)|
|com.yammer.metrics:metrics-core:2.2.0 does not support Java 10|Upgrading 
com.yammer.metrics:metrics-core:2.2.0 to 
io.dropwizard.metrics:metrics-jvm:4.0.0 (See NIFI-5373 for reference)|
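
The hbase-client row above can be realized with a pom exclusion along these 
lines (the coordinates and version are illustrative; the actual NiFi pom may 
differ):

```xml
<!-- Illustrative sketch of excluding the jdk.tools dependency pulled in
     transitively by hbase-client, which does not exist on Java 9+. -->
<dependency>
  <groupId>org.apache.hbase</groupId>
  <artifactId>hbase-client</artifactId>
  <version>1.1.11</version>
  <exclusions>
    <exclusion>
      <groupId>jdk.tools</groupId>
      <artifactId>jdk.tools</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```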


> NiFi needs to be buildable on Java 11
> -
>
> 

[jira] [Created] (NIFI-6407) Support useAvroLogicalTypes in the PutBigQueryBatch Processor

2019-06-28 Thread John (JIRA)
John created NIFI-6407:
--

 Summary: Support useAvroLogicalTypes in the PutBigQueryBatch 
Processor
 Key: NIFI-6407
 URL: https://issues.apache.org/jira/browse/NIFI-6407
 Project: Apache NiFi
  Issue Type: Wish
Affects Versions: 1.9.2
Reporter: John


It would be great if PutBigQueryBatch supported the Google useAvroLogicalTypes 
option, similar to https://issues.apache.org/jira/browse/AIRFLOW-3541





[jira] [Updated] (NIFI-5176) NiFi needs to be buildable on Java 11

2019-06-28 Thread Jeff Storck (JIRA)


 [ 
https://issues.apache.org/jira/browse/NIFI-5176?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jeff Storck updated NIFI-5176:
--
Description: 
While retaining a source/target compatibility of 1.8, NiFi needs to be buildable 
on Java 11.

The following issues have been encountered while attempting to run a Java 
1.8-built NiFi on Java 11:
||Issue||Solution||
|groovy-eclipse-compiler not working with Java 10|-Switched to gmaven-plus- 
Updated to maven-compiler-plugin:3.8.0, groovy-eclipse-compiler:3.4.0-01, and 
groovy-eclipse-batch:2.5.4-01|
|Antlr3 issue with ambiguous method calls|Explicit cast to ValidationContext 
needed in TestHL7Query.java|
|jaxb2-maven-plugin not compatible with Java 9|Switched to maven-jaxb-plugin|
|-nifi-enrich-processors uses package com.sun.jndi.dns, which does not 
exist-|-Required addition of- -add-modules=jdk.naming.dns 
--add-exports=jdk.naming.dns/com.sun.jndi.dns=ALL-UNNAMED, which prevents the 
usage of compiling with the --release option (to compile only against public 
APIs in the JDK) from being used. Not an optimal solution.-|
|groovy-eclipse-batch:2.4.13-01 could not find JDK base classes|Updated to 
groovy-eclipse-batch:2.5.4-01 and groovy-all:2.5.4|
|maven-surefire-plugin:2.20.1 throws null pointer exceptions|Updated to 
maven-surefire-plugin:2.22.0|
|okhttp client builder requires X509TrustManager on Java 9+|Added methods to 
return TrustManager instances with the SSLContext created by SSLContextFactory 
and updated HttpNotificationService to use the paired TrustManager|
|nifi-runtime groovy tests aren't running|Added usage of 
build-helper-maven-plugin to explicitly add src/test/groovy to force groovy 
compilation of test sources. groovy-eclipse-compiler skips src/test/groovy if 
src/test/java doesn't exist, which is the case for nifi-runtime. (See NIFI-5341 
for reference)|
|hbase-client depends on jdk.tools:jdk.tools|Excluded this dependency 
{color:#f79232}*(needs live testing)* {color}|
|HBase client 1.1.2 does not allow running on Java 9+|Updated to HBase client 
1.1.11, passes unit tests (See HBASE-17944 for reference) 
*{color:#f79232}(needs live testing){color}*|
|powermock:1.6.5 does not support Java 10|Updated to powermock:2.0.2 and 
mockito:2.28.2 (See NIFI-6360 for reference)|
|com.yammer.metrics:metrics-core:2.2.0 does not support Java 10|Upgrading 
com.yammer.metrics:metrics-core:2.2.0 to 
io.dropwizard.metrics:metrics-jvm:4.0.0 (See NIFI-5373 for reference)|

  was:
While retaining a source/target compatibility of 1.8, NiFi needs to be buildable 
on Java 11.

The following issues have been encountered while attempting to run a Java 
1.8-built NiFi on Java 11:
||Issue||Solution||
|groovy-eclipse-compiler not working with Java 10|-Switched to gmaven-plus- 
Updated to maven-compiler-plugin:3.8.0, groovy-eclipse-compiler:3.4.0-01, and 
groovy-eclipse-batch:2.5.4-01|
|Antlr3 issue with ambiguous method calls|Explicit cast to ValidationContext 
needed in TestHL7Query.java|
|jaxb2-maven-plugin not compatible with Java 9|Switched to maven-jaxb-plugin|
|-nifi-enrich-processors uses package com.sun.jndi.dns, which does not 
exist-|-Required addition of- -add-modules=jdk.naming.dns 
--add-exports=jdk.naming.dns/com.sun.jndi.dns=ALL-UNNAMED, which prevents the 
usage of compiling with the --release option (to compile only against public 
APIs in the JDK) from being used. Not an optimal solution.-|
|groovy-eclipse-batch:2.4.13-01 could not find JDK base classes|Updated to 
groovy-eclipse-batch:2.5.4-01 and groovy-all:2.5.4|
|maven-surefire-plugin:2.20.1 throws null pointer exceptions|Updated to 
maven-surefire-plugin:2.22.0|
|okhttp client builder requires X509TrustManager on Java 9+|Added methods to 
return TrustManager instances with the SSLContext created by SSLContextFactory 
and updated HttpNotificationService to use the paired TrustManager|
|nifi-runtime groovy tests aren't running|Added usage of 
build-helper-maven-plugin to explicitly add src/test/groovy to force groovy 
compilation of test sources. groovy-eclipse-compiler skips src/test/groovy if 
src/test/java doesn't exist, which is the case for nifi-runtime. (See NIFI-5341 
for reference)|
|hbase-client depends on jdk.tools:jdk.tools|Excluded this dependency 
{color:#f79232}*(needs live testing)* {color}|
|HBase client 1.1.2 does not allow running on Java 9+|Updated to HBase client 
1.1.11, passes unit tests (See HBASE-17944 for reference) 
*{color:#f79232}(needs live testing){color}*|
|powermock:1.6.5 does not support Java 10|Updated to powermock:2.0.2 and 
mockito:2.28.2|
|com.yammer.metrics:metrics-core:2.2.0 does not support Java 10|Upgrading 
com.yammer.metrics:metrics-core:2.2.0 to 
io.dropwizard.metrics:metrics-jvm:4.0.0 (See NIFI-5373 for reference)|


> NiFi needs to be buildable on Java 11
> -
>
> Key: NIFI-5176
> URL: https://issues.apache.org/jira/browse/NIF

[jira] [Updated] (NIFI-5176) NiFi needs to be buildable on Java 11

2019-06-28 Thread Jeff Storck (JIRA)


 [ 
https://issues.apache.org/jira/browse/NIFI-5176?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jeff Storck updated NIFI-5176:
--
Description: 
While retaining a source/target compatibility of 1.8, NiFi needs to be buildable 
on Java 11.

The following issues have been encountered while attempting to run a Java 
1.8-built NiFi on Java 11:
||Issue||Solution||
|groovy-eclipse-compiler not working with Java 10|-Switched to gmaven-plus- 
Updated to maven-compiler-plugin:3.8.0, groovy-eclipse-compiler:3.4.0-01, and 
groovy-eclipse-batch:2.5.4-01|
|Antlr3 issue with ambiguous method calls|Explicit cast to ValidationContext 
needed in TestHL7Query.java|
|jaxb2-maven-plugin not compatible with Java 9|Switched to maven-jaxb-plugin|
|-nifi-enrich-processors uses package com.sun.jndi.dns, which does not 
exist-|-Required addition of- -add-modules=jdk.naming.dns 
--add-exports=jdk.naming.dns/com.sun.jndi.dns=ALL-UNNAMED, which prevents the 
usage of compiling with the --release option (to compile only against public 
APIs in the JDK) from being used. Not an optimal solution.-|
|groovy-eclipse-batch:2.4.13-01 could not find JDK base classes|Updated to 
groovy-eclipse-batch:2.5.4-01 and groovy-all:2.5.4|
|maven-surefire-plugin:2.20.1 throws null pointer exceptions|Updated to 
maven-surefire-plugin:2.22.0|
|okhttp client builder requires X509TrustManager on Java 9+|Added methods to 
return TrustManager instances with the SSLContext created by SSLContextFactory 
and updated HttpNotificationService to use the paired TrustManager|
|nifi-runtime groovy tests aren't running|Added usage of 
build-helper-maven-plugin to explicitly add src/test/groovy to force groovy 
compilation of test sources. groovy-eclipse-compiler skips src/test/groovy if 
src/test/java doesn't exist, which is the case for nifi-runtime. (See NIFI-5341 
for reference)|
|hbase-client depends on jdk.tools:jdk.tools|Excluded this dependency 
{color:#f79232}*(needs live testing)* {color}|
|HBase client 1.1.2 does not allow running on Java 9+|Updated to HBase client 
1.1.11, passes unit tests (See HBASE-17944 for reference) 
*{color:#f79232}(needs live testing){color}*|
|powermock:1.6.5 does not support Java 10|Updated to powermock:2.0.2 and 
mockito:2.28.2|
|com.yammer.metrics:metrics-core:2.2.0 does not support Java 10|Upgrading 
com.yammer.metrics:metrics-core:2.2.0 to 
io.dropwizard.metrics:metrics-jvm:4.0.0 (See NIFI-5373 for reference)|

  was:
While retaining a source/target compatibility of 1.8, NiFi needs to be buildable 
on Java 11.

The following issues have been encountered while attempting to run a Java 
1.8-built NiFi on Java 11:
||Issue||Solution||
|groovy-eclipse-compiler not working with Java 10|-Switched to gmaven-plus- 
Updated to maven-compiler-plugin:3.7.0, groovy-eclipse-compiler:2.9.3-01, and 
groovy-eclipse-batch:2.4.15-01|
|Antlr3 issue with ambiguous method calls|Explicit cast to ValidationContext 
needed in TestHL7Query.java|
|jaxb2-maven-plugin not compatible with Java 9|Switched to maven-jaxb-plugin|
|-nifi-enrich-processors uses package com.sun.jndi.dns, which does not 
exist-|-Required addition of- -add-modules=jdk.naming.dns 
--add-exports=jdk.naming.dns/com.sun.jndi.dns=ALL-UNNAMED, which prevents the 
usage of compiling with the --release option (to compile only against public 
APIs in the JDK) from being used. Not an optimal solution.-|
|groovy-eclipse-batch:2.4.13-01 could not find JDK base classes|Updated to 
groovy-eclipse-batch:2.4.15-01 and groovy-all:2.4.15|
|maven-surefire-plugin:2.20.1 throws null pointer exceptions|Updated to 
maven-surefire-plugin:2.22.0|
|okhttp client builder requires X509TrustManager on Java 9+|Added methods to 
return TrustManager instances with the SSLContext created by SSLContextFactory 
and updated HttpNotificationService to use the paired TrustManager|
|nifi-runtime groovy tests aren't running|Added usage of 
build-helper-maven-plugin to explicitly add src/test/groovy to force groovy 
compilation of test sources. groovy-eclipse-compiler skips src/test/groovy if 
src/test/java doesn't exist, which is the case for nifi-runtime. (See NIFI-5341 
for reference)|
|hbase-client depends on jdk.tools:jdk.tools|Excluded this dependency 
{color:#f79232}*(needs live testing)* {color}|
|HBase client 1.1.2 does not allow running on Java 9+|Updated to HBase client 
1.1.11, passes unit tests (See HBASE-17944 for reference) 
*{color:#f79232}(needs live testing){color}*|
|powermock:1.6.5 does not support Java 10|Updated to powermock:2.0.0-beta.5 and 
mockito:2.19|
|com.yammer.metrics:metrics-core:2.2.0 does not support Java 10|Upgrading 
com.yammer.metrics:metrics-core:2.2.0 to 
io.dropwizard.metrics:metrics-jvm:4.0.0 (See NIFI-5373 for reference)|


> NiFi needs to be buildable on Java 11
> -
>
> Key: NIFI-5176
> URL: https://issues.apache.org/jira/browse/NIFI-5176
> Pr

[jira] [Commented] (NIFI-6406) ForkReader Extract Not Working As Advertised

2019-06-28 Thread David Mollitor (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-6406?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16875104#comment-16875104
 ] 

David Mollitor commented on NIFI-6406:
--

May not be an issue... I think I see what's wrong...

> ForkReader Extract Not Working As Advertised
> 
>
> Key: NIFI-6406
> URL: https://issues.apache.org/jira/browse/NIFI-6406
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Core Framework
>Affects Versions: 1.9.2
>Reporter: David Mollitor
>Priority: Major
>
> I am looking at the {{ForkRecord}} processor and trying to reproduce the 
> example provided here:
> [https://nifi.apache.org/docs/nifi-docs/components/org.apache.nifi/nifi-standard-nar/1.9.2/org.apache.nifi.processors.standard.ForkRecord/additionalDetails.html]
> In particular, I am looking at _Example 2 - Extracting with parent fields_
> My input:
> {code}
> [{
> "id": 1,
> "name": "John Doe",
> "address": "123 My Street",
> "city": "My City",
> "state": "MS",
> "zipCode": "1",
> "country": "USA",
> "accounts": [{
> "id": 42,
> "balance": 4750.89
> }, {
> "id": 43,
> "balance": 48212.38
> }]
> },
> {
> "id": 2,
> "name": "Jane Doe",
> "address": "345 My Street",
> "city": "Her City",
> "state": "NY",
> "zipCode": "2",
> "country": "USA",
> "accounts": [{
> "id": 45,
> "balance": 6578.45
> }, {
> "id": 46,
> "balance": 34567.21
> }]
> }]
> {code}
> My output:
> {code}
> [ {
>   "id" : 42,
>   "name" : "John Doe",
>   "address" : "123 My Street",
>   "city" : "My City",
>   "state" : "MS",
>   "zipCode" : "1",
>   "country" : "USA",
>   "accounts" : [ {
> "id" : 42,
> "balance" : 4750.89
>   }, {
> "id" : 43,
> "balance" : 48212.38
>   } ]
> }, {
>   "id" : 43,
>   "name" : "John Doe",
>   "address" : "123 My Street",
>   "city" : "My City",
>   "state" : "MS",
>   "zipCode" : "1",
>   "country" : "USA",
>   "accounts" : [ {
> "id" : 42,
> "balance" : 4750.89
>   }, {
> "id" : 43,
> "balance" : 48212.38
>   } ]
> }, {
>   "id" : 45,
>   "name" : "Jane Doe",
>   "address" : "345 My Street",
>   "city" : "Her City",
>   "state" : "NY",
>   "zipCode" : "2",
>   "country" : "USA",
>   "accounts" : [ {
> "id" : 45,
> "balance" : 6578.45
>   }, {
> "id" : 46,
> "balance" : 34567.21
>   } ]
> }, {
>   "id" : 46,
>   "name" : "Jane Doe",
>   "address" : "345 My Street",
>   "city" : "Her City",
>   "state" : "NY",
>   "zipCode" : "2",
>   "country" : "USA",
>   "accounts" : [ {
> "id" : 45,
> "balance" : 6578.45
>   }, {
> "id" : 46,
> "balance" : 34567.21
>   } ]
> } ]
> {code}
> I expect there to be 4 records (2x2), and there are, but each record carries 
> all of the fields instead of extracting them (see the output example in the 
> docs). It looks more like a cross-join than anything.
> The 'split' function worked as expected.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (NIFI-6360) Update Mockito and PowerMock to 2.x

2019-06-28 Thread Jeff Storck (JIRA)


 [ 
https://issues.apache.org/jira/browse/NIFI-6360?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jeff Storck updated NIFI-6360:
--
Labels: Java11  (was: )

> Update Mockito and PowerMock to 2.x
> ---
>
> Key: NIFI-6360
> URL: https://issues.apache.org/jira/browse/NIFI-6360
> Project: Apache NiFi
>  Issue Type: Task
>  Components: Core Framework, Extensions, Tools and Build
>Affects Versions: 1.9.2
>Reporter: Jeff Storck
>Assignee: Jeff Storck
>Priority: Major
>  Labels: Java11
> Fix For: 1.10.0
>
>  Time Spent: 20m
>  Remaining Estimate: 0h
>
> Update to the most recent stable 2.x versions of Mockito and PowerMock.
> This is necessary for building NiFi on Java 11.





[jira] [Updated] (NIFI-6196) Upgrade version of Jetty

2019-06-28 Thread Jeff Storck (JIRA)


 [ 
https://issues.apache.org/jira/browse/NIFI-6196?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jeff Storck updated NIFI-6196:
--
Labels: Java11  (was: )

> Upgrade version of Jetty
> 
>
> Key: NIFI-6196
> URL: https://issues.apache.org/jira/browse/NIFI-6196
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Core Framework
>Affects Versions: 1.9.2
>Reporter: Jeff Storck
>Assignee: Jeff Storck
>Priority: Major
>  Labels: Java11
>  Time Spent: 1h
>  Remaining Estimate: 0h
>
> Upgrade version of Jetty to 9.4.15.v20190215 from 9.4.11.v20180605.
> \\
> \\
> This upgrade is needed for building NiFi with Java 11.
> \\
> \\
> ||Issues encountered during upgrade||Resolution||
> |As of Jetty 9.4.15.v20190215, certificate verification has changed.  
> Previous to version 9.4.15.v20190215, 
> {{org.eclipse.jetty.util.ssl.SslContextFactory.getEndpointIdentificationAlgorithm()}}
>  returned {{null}}. As of version 9.4.15.v20190215, that method returns 
> {{"HTTPS"}}. This causes the {{SslContextFactory}} to verify the hostname on 
> the other end of the connection, regardless of being used by a client or 
> server. This works correctly for clients but results in a 
> {{CertificateException}} on the server if the client cert does not contain 
> the correct SAN. The following Jetty Github issues reference this scenario:
>  * [https://github.com/eclipse/jetty.project/issues/3154]
>  * [https://github.com/eclipse/jetty.project/issues/3454]
>  * [https://github.com/eclipse/jetty.project/issues/3464]
>  * [https://github.com/eclipse/jetty.project/issues/3466]|Update server 
> SslContextFactory instances to use 
> {{org.eclipse.jetty.util.ssl.SslContextFactory.setEndpointIdentificationAlgorithm(null)}}|
> |Several tests use the same keystore between client and server:
>  * ITestHandleHttpRequest
>  * TestInvokeHttpSSL
>  * TestInvokeHttpTwoWaySSL
>  * TestListenHTTP|Update tests to use a separate keystore for clients|
>  
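The client/server asymmetry described above can be illustrated with Python's standard `ssl` module, which exposes the equivalent switch as `check_hostname`. This is only an analogy to the Jetty behavior change, not Jetty code: a server-side context should not attempt hostname verification of the peer, while a client context should verify the server certificate's SAN.

```python
import ssl

# A freshly constructed server-side context does not verify the peer's
# hostname; Jetty 9.4.15's change is analogous to flipping this on for servers.
server_ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
print(server_ctx.check_hostname)   # False: servers don't verify client hostnames

# The default client context verifies the server certificate against the
# hostname, which is the correct behavior for clients.
client_ctx = ssl.create_default_context()
print(client_ctx.check_hostname)   # True: clients verify the server's SAN
```

Jetty's `setEndpointIdentificationAlgorithm(null)` on server-side `SslContextFactory` instances restores exactly this server-side default.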





[jira] [Created] (NIFI-6406) ForkReader Extract Not Working As Advertised

2019-06-28 Thread David Mollitor (JIRA)
David Mollitor created NIFI-6406:


 Summary: ForkReader Extract Not Working As Advertised
 Key: NIFI-6406
 URL: https://issues.apache.org/jira/browse/NIFI-6406
 Project: Apache NiFi
  Issue Type: Improvement
  Components: Core Framework
Affects Versions: 1.9.2
Reporter: David Mollitor


I am looking at the {{ForkRecord}} processor and trying to reproduce the 
example provided here:

[https://nifi.apache.org/docs/nifi-docs/components/org.apache.nifi/nifi-standard-nar/1.9.2/org.apache.nifi.processors.standard.ForkRecord/additionalDetails.html]

In particular, I am looking at _Example 2 - Extracting with parent fields_

My input:
{code}
[{
"id": 1,
"name": "John Doe",
"address": "123 My Street",
"city": "My City",
"state": "MS",
"zipCode": "1",
"country": "USA",
"accounts": [{
"id": 42,
"balance": 4750.89
}, {
"id": 43,
"balance": 48212.38
}]
},
{
"id": 2,
"name": "Jane Doe",
"address": "345 My Street",
"city": "Her City",
"state": "NY",
"zipCode": "2",
"country": "USA",
"accounts": [{
"id": 45,
"balance": 6578.45
}, {
"id": 46,
"balance": 34567.21
}]
}]
{code}
My output:
{code}
[ {
  "id" : 42,
  "name" : "John Doe",
  "address" : "123 My Street",
  "city" : "My City",
  "state" : "MS",
  "zipCode" : "1",
  "country" : "USA",
  "accounts" : [ {
"id" : 42,
"balance" : 4750.89
  }, {
"id" : 43,
"balance" : 48212.38
  } ]
}, {
  "id" : 43,
  "name" : "John Doe",
  "address" : "123 My Street",
  "city" : "My City",
  "state" : "MS",
  "zipCode" : "1",
  "country" : "USA",
  "accounts" : [ {
"id" : 42,
"balance" : 4750.89
  }, {
"id" : 43,
"balance" : 48212.38
  } ]
}, {
  "id" : 45,
  "name" : "Jane Doe",
  "address" : "345 My Street",
  "city" : "Her City",
  "state" : "NY",
  "zipCode" : "2",
  "country" : "USA",
  "accounts" : [ {
"id" : 45,
"balance" : 6578.45
  }, {
"id" : 46,
"balance" : 34567.21
  } ]
}, {
  "id" : 46,
  "name" : "Jane Doe",
  "address" : "345 My Street",
  "city" : "Her City",
  "state" : "NY",
  "zipCode" : "2",
  "country" : "USA",
  "accounts" : [ {
"id" : 45,
"balance" : 6578.45
  }, {
"id" : 46,
"balance" : 34567.21
  } ]
} ]
{code}
I expect there to be 4 records (2x2), and there are, but each record carries 
all of the fields instead of extracting them (see the output example in the 
docs). It looks more like a cross-join than anything.

The 'split' function worked as expected.
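The documented "extract with parent fields" behavior can be sketched outside NiFi. In the following Python sketch, `fork_extract` is a hypothetical helper (not a NiFi API) whose merge rule is an assumption modeled on the additionalDetails example, run on an abbreviated version of the input above: the expectation is one record per account, enriched with the parent's scalar fields, with child fields winning name collisions and the `accounts` array itself dropped.

```python
import json

# Illustrative sketch of ForkRecord's documented "extract with parent fields"
# mode; fork_extract is a hypothetical helper, not NiFi's implementation.
def fork_extract(records, array_field):
    forked = []
    for parent in records:
        for child in parent.get(array_field, []):
            merged = dict(child)  # child fields win on collisions (e.g. "id")
            for key, value in parent.items():
                if key != array_field and key not in merged:
                    merged[key] = value
            forked.append(merged)
    return forked

people = json.loads("""
[{"id": 1, "name": "John Doe",
  "accounts": [{"id": 42, "balance": 4750.89}, {"id": 43, "balance": 48212.38}]},
 {"id": 2, "name": "Jane Doe",
  "accounts": [{"id": 45, "balance": 6578.45}, {"id": 46, "balance": 34567.21}]}]
""")

result = fork_extract(people, "accounts")
# 4 records (2 parents x 2 accounts), none of which retains the "accounts" array
print(len(result))  # 4
```

The reported output differs from this expectation in that every forked record still carries the full `accounts` array and the parent's `id` is replaced rather than preserved.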





[jira] [Updated] (NIFI-5820) NiFi built with Java 1.8 needs to run on Java 11

2019-06-28 Thread Jeff Storck (JIRA)


 [ 
https://issues.apache.org/jira/browse/NIFI-5820?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jeff Storck updated NIFI-5820:
--
Labels: Java11  (was: )

> NiFi built with Java 1.8 needs to run on Java 11
> 
>
> Key: NIFI-5820
> URL: https://issues.apache.org/jira/browse/NIFI-5820
> Project: Apache NiFi
>  Issue Type: Sub-task
>Reporter: Jeff Storck
>Assignee: Jeff Storck
>Priority: Major
>  Labels: Java11
> Fix For: 1.10.0
>
>  Time Spent: 1h 40m
>  Remaining Estimate: 0h
>






[jira] [Updated] (NIFI-5254) Upgrade to Groovy 2.5.0

2019-06-28 Thread Jeff Storck (JIRA)


 [ 
https://issues.apache.org/jira/browse/NIFI-5254?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jeff Storck updated NIFI-5254:
--
Labels: Java11  (was: )

> Upgrade to Groovy 2.5.0
> ---
>
> Key: NIFI-5254
> URL: https://issues.apache.org/jira/browse/NIFI-5254
> Project: Apache NiFi
>  Issue Type: Improvement
>Reporter: Mike Thomsen
>Assignee: Mike Thomsen
>Priority: Major
>  Labels: Java11
>  Time Spent: 4h 10m
>  Remaining Estimate: 0h
>
> Groovy 2.5 has been released and support for it should be added.





[GitHub] [nifi] kaHaleMaKai commented on a change in pull request #3543: NIFI-6388 Add dynamic relationships to the ExecuteScript processor.

2019-06-28 Thread GitBox
kaHaleMaKai commented on a change in pull request #3543: NIFI-6388 Add dynamic 
relationships to the ExecuteScript processor.
URL: https://github.com/apache/nifi/pull/3543#discussion_r298681084
 
 

 ##
 File path: 
nifi-nar-bundles/nifi-scripting-bundle/nifi-scripting-processors/src/main/java/org/apache/nifi/processors/script/ExecuteScript.java
 ##
 @@ -133,13 +167,57 @@
      */
     @Override
     protected PropertyDescriptor getSupportedDynamicPropertyDescriptor(final String propertyDescriptorName) {
-        return new PropertyDescriptor.Builder()
+        final boolean isRelationship = propertyDescriptorName != null && propertyDescriptorName.startsWith(DYNAMIC_RELATIONSHIP_PREFIX);
+        if (isRelationship) {
+            if (!DYNAMIC_RELATIONSHIP_PATTERN.matcher(propertyDescriptorName).matches()) {
+                log.warn("dyn. property for relationship is invalid: '{}'. accepted patterns: '{}'", new Object[]{propertyDescriptorName, DYNAMIC_RELATIONSHIP_PATTERN_AS_STRING});
+                return new PropertyDescriptor.Builder()
+                        .addValidator(new RelationshipInvalidator())
+                        .dynamic(true)
+                        .required(false)
+                        .name(propertyDescriptorName)
+                        .build();
+            }
+        }
+        final Validator validator = isRelationship
+                ? Validator.VALID
+                : StandardValidators.NON_EMPTY_VALIDATOR;
+        final PropertyDescriptor.Builder builder = new PropertyDescriptor.Builder()
                 .name(propertyDescriptorName)
                 .required(false)
-                .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+                .addValidator(validator)
                 .expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES)
-                .dynamic(true)
-                .build();
+                .dynamic(true);
+        if (isRelationship) {
+            builder.description(String.format(
 
 Review comment:
   fixed in 56a97e38c7de86daf56d2488ddbff211bd9ab5cc


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [nifi] jtstorck commented on issue #3547: [WIP] NIFI-5254 Update Groovy dependecies to version 2.5.4

2019-06-28 Thread GitBox
jtstorck commented on issue #3547: [WIP] NIFI-5254 Update Groovy dependecies to 
version 2.5.4
URL: https://github.com/apache/nifi/pull/3547#issuecomment-506819978
 
 
   @MikeThomsen This most recent update to the PR should resolve the 
compilation issues, and the tests are passing.
   
   `groovy-all` is now leveraged in several places, along with some new 
properties to specify the Groovy version used by several modules.  The idea 
here is that the `nifi-groovyx-*` and `nifi-scripting-*` modules can be upgraded to new 
versions of Groovy without having to use the same version in NiFi or NiFi 
Toolkit until we're ready to update the tests/utils in NiFi and NiFi Toolkit 
written in Groovy.  This change also makes it a bit easier to manage a Groovy 
upgrade in NiFi or NiFi Toolkit.  Rather than copying all the modules specified 
in `groovy-all`'s POM individually to NiFi module POMs, `groovy-all` can be 
upgraded, along with the few Groovy modules that are referenced individually, 
by changing the appropriate `groovy.version` property.




[GitHub] [nifi] kaHaleMaKai commented on a change in pull request #3543: NIFI-6388 Add dynamic relationships to the ExecuteScript processor.

2019-06-28 Thread GitBox
kaHaleMaKai commented on a change in pull request #3543: NIFI-6388 Add dynamic 
relationships to the ExecuteScript processor.
URL: https://github.com/apache/nifi/pull/3543#discussion_r298682586
 
 

 ##
 File path: 
nifi-nar-bundles/nifi-scripting-bundle/nifi-scripting-processors/src/main/java/org/apache/nifi/processors/script/ExecuteScript.java
 ##
 @@ -133,13 +167,57 @@
      */
     @Override
     protected PropertyDescriptor getSupportedDynamicPropertyDescriptor(final String propertyDescriptorName) {
-        return new PropertyDescriptor.Builder()
+        final boolean isRelationship = propertyDescriptorName != null && propertyDescriptorName.startsWith(DYNAMIC_RELATIONSHIP_PREFIX);
+        if (isRelationship) {
+            if (!DYNAMIC_RELATIONSHIP_PATTERN.matcher(propertyDescriptorName).matches()) {
+                log.warn("dyn. property for relationship is invalid: '{}'. accepted patterns: '{}'", new Object[]{propertyDescriptorName, DYNAMIC_RELATIONSHIP_PATTERN_AS_STRING});
+                return new PropertyDescriptor.Builder()
+                        .addValidator(new RelationshipInvalidator())
+                        .dynamic(true)
+                        .required(false)
+                        .name(propertyDescriptorName)
 
 Review comment:
   I tried that, but it didn't work out. We need to keep the full property name 
(including the `REL_` prefix, previously `rel.`), so that the 
`ExecuteScript#onPropertyModified()` method can detect the new property 
indicating a relationship. If we prematurely remove the prefix, the respective 
method cannot tell if it's just a regular dynamic property, or if the user 
wants to add a new dynamic relationship.
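The naming convention under discussion can be illustrated in isolation. In this Python sketch the prefix and pattern are assumptions mirroring the review thread (`REL_`, previously `rel.`); the point is that the full, prefixed property name must survive until `onPropertyModified()` so a relationship property can be distinguished from an ordinary dynamic property.

```python
import re

# Assumed convention from the PR discussion: dynamic properties whose name
# starts with "REL_" request a dynamic relationship. The exact pattern below
# is illustrative, not copied from the PR.
RELATIONSHIP_PREFIX = "REL_"
RELATIONSHIP_PATTERN = re.compile(r"REL_[A-Za-z][A-Za-z0-9_]*$")

def classify_property(name):
    """Return how the processor would treat a dynamic property name."""
    if not name.startswith(RELATIONSHIP_PREFIX):
        return "plain dynamic property"
    if RELATIONSHIP_PATTERN.match(name):
        return "dynamic relationship"
    return "invalid relationship name"

print(classify_property("myAttribute"))   # plain dynamic property
print(classify_property("REL_success"))   # dynamic relationship
print(classify_property("REL_*bad*"))     # invalid relationship name
```

Stripping the prefix before classification would collapse the first two cases into one, which is exactly the ambiguity the reviewer describes.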




[GitHub] [nifi] kaHaleMaKai commented on a change in pull request #3543: NIFI-6388 Add dynamic relationships to the ExecuteScript processor.

2019-06-28 Thread GitBox
kaHaleMaKai commented on a change in pull request #3543: NIFI-6388 Add dynamic 
relationships to the ExecuteScript processor.
URL: https://github.com/apache/nifi/pull/3543#discussion_r298681254
 
 

 ##
 File path: 
nifi-nar-bundles/nifi-scripting-bundle/nifi-scripting-processors/src/main/java/org/apache/nifi/processors/script/ExecuteScript.java
 ##
 @@ -264,4 +343,16 @@ public void onTrigger(ProcessContext context, ProcessSessionFactory sessionFactory
     public void stop() {
         scriptingComponentHelper.stop();
     }
+
+    private static class RelationshipInvalidator implements Validator {
+        @Override
+        public ValidationResult validate(String subject, String input, ValidationContext validationContext) {
+            return new ValidationResult.Builder()
+                    .subject(subject)
+                    .input(input)
+                    .explanation("invalid dyn. relationship specified")
 
 Review comment:
   fixed in 0b13c41acfe3f354db8dd739d929188ddb625a6a




[GitHub] [nifi] kaHaleMaKai commented on a change in pull request #3543: NIFI-6388 Add dynamic relationships to the ExecuteScript processor.

2019-06-28 Thread GitBox
kaHaleMaKai commented on a change in pull request #3543: NIFI-6388 Add dynamic 
relationships to the ExecuteScript processor.
URL: https://github.com/apache/nifi/pull/3543#discussion_r298681084
 
 

 ##
 File path: 
nifi-nar-bundles/nifi-scripting-bundle/nifi-scripting-processors/src/main/java/org/apache/nifi/processors/script/ExecuteScript.java
 ##
 @@ -133,13 +167,57 @@
      */
     @Override
     protected PropertyDescriptor getSupportedDynamicPropertyDescriptor(final String propertyDescriptorName) {
-        return new PropertyDescriptor.Builder()
+        final boolean isRelationship = propertyDescriptorName != null && propertyDescriptorName.startsWith(DYNAMIC_RELATIONSHIP_PREFIX);
+        if (isRelationship) {
+            if (!DYNAMIC_RELATIONSHIP_PATTERN.matcher(propertyDescriptorName).matches()) {
+                log.warn("dyn. property for relationship is invalid: '{}'. accepted patterns: '{}'", new Object[]{propertyDescriptorName, DYNAMIC_RELATIONSHIP_PATTERN_AS_STRING});
+                return new PropertyDescriptor.Builder()
+                        .addValidator(new RelationshipInvalidator())
+                        .dynamic(true)
+                        .required(false)
+                        .name(propertyDescriptorName)
+                        .build();
+            }
+        }
+        final Validator validator = isRelationship
+                ? Validator.VALID
+                : StandardValidators.NON_EMPTY_VALIDATOR;
+        final PropertyDescriptor.Builder builder = new PropertyDescriptor.Builder()
                 .name(propertyDescriptorName)
                 .required(false)
-                .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+                .addValidator(validator)
                 .expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES)
-                .dynamic(true)
-                .build();
+                .dynamic(true);
+        if (isRelationship) {
+            builder.description(String.format(
 
 Review comment:
   fixed in 5963a8e5576c527f035e0db15e0c95c07e413789




[GitHub] [nifi] kaHaleMaKai commented on a change in pull request #3543: NIFI-6388 Add dynamic relationships to the ExecuteScript processor.

2019-06-28 Thread GitBox
kaHaleMaKai commented on a change in pull request #3543: NIFI-6388 Add dynamic 
relationships to the ExecuteScript processor.
URL: https://github.com/apache/nifi/pull/3543#discussion_r298680698
 
 

 ##
 File path: 
nifi-nar-bundles/nifi-scripting-bundle/nifi-scripting-processors/src/main/java/org/apache/nifi/processors/script/ExecuteScript.java
 ##
 @@ -133,13 +167,57 @@
      */
     @Override
     protected PropertyDescriptor getSupportedDynamicPropertyDescriptor(final String propertyDescriptorName) {
-        return new PropertyDescriptor.Builder()
+        final boolean isRelationship = propertyDescriptorName != null && propertyDescriptorName.startsWith(DYNAMIC_RELATIONSHIP_PREFIX);
+        if (isRelationship) {
+            if (!DYNAMIC_RELATIONSHIP_PATTERN.matcher(propertyDescriptorName).matches()) {
+                log.warn("dyn. property for relationship is invalid: '{}'. accepted patterns: '{}'", new Object[]{propertyDescriptorName, DYNAMIC_RELATIONSHIP_PATTERN_AS_STRING});
 
 Review comment:
   fixed in 0b13c41acfe3f354db8dd739d929188ddb625a6a




[GitHub] [nifi] kaHaleMaKai commented on a change in pull request #3543: NIFI-6388 Add dynamic relationships to the ExecuteScript processor.

2019-06-28 Thread GitBox
kaHaleMaKai commented on a change in pull request #3543: NIFI-6388 Add dynamic 
relationships to the ExecuteScript processor.
URL: https://github.com/apache/nifi/pull/3543#discussion_r298680420
 
 

 ##
 File path: 
nifi-nar-bundles/nifi-scripting-bundle/nifi-scripting-processors/src/test/groovy/org/apache/nifi/processors/script/ExecuteScriptGroovyTest.groovy
 ##
 @@ -20,19 +20,17 @@ import org.apache.nifi.script.ScriptingComponentUtils
 import org.apache.nifi.util.MockFlowFile
 import org.apache.nifi.util.StopWatch
 import org.apache.nifi.util.TestRunners
-import org.junit.After
-import org.junit.Before
-import org.junit.BeforeClass
-import org.junit.Ignore
-import org.junit.Test
+import org.junit.*
 
 Review comment:
   fixed in 4614d29010d390d6ed5e50d56ec9fd3a29d55805




[GitHub] [nifi] jtstorck commented on a change in pull request #3547: [WIP] NIFI-5254 Update Groovy dependecies to version 2.5.4

2019-06-28 Thread GitBox
jtstorck commented on a change in pull request #3547: [WIP] NIFI-5254 Update 
Groovy dependecies to version 2.5.4
URL: https://github.com/apache/nifi/pull/3547#discussion_r298665659
 
 

 ##
 File path: pom.xml
 ##
 @@ -442,7 +339,8 @@
                 <groupId>org.codehaus.groovy</groupId>
-                <artifactId>groovy-test</artifactId>
+                <artifactId>groovy-all</artifactId>
 
 Review comment:
   That commit is mid-progress (noted in the commit message).  I'll be pushing 
another change today to resolve all that and shift around some of the groovy-* 
dependencies.




[GitHub] [nifi] MikeThomsen commented on a change in pull request #3547: [WIP] NIFI-5254 Update Groovy dependecies to version 2.5.4

2019-06-28 Thread GitBox
MikeThomsen commented on a change in pull request #3547: [WIP] NIFI-5254 Update 
Groovy dependecies to version 2.5.4
URL: https://github.com/apache/nifi/pull/3547#discussion_r298664594
 
 

 ##
 File path: pom.xml
 ##
 @@ -442,7 +339,8 @@
                 <groupId>org.codehaus.groovy</groupId>
-                <artifactId>groovy-test</artifactId>
+                <artifactId>groovy-all</artifactId>
 
 Review comment:
   Looks like this caused a cascade of breaking references in about 12 
projects. Here's the output from the logs:
   
   ```
    [ERROR] The build could not read 12 projects -> [Help 1]
   [ERROR]   
   [ERROR]   The project  
(/home/travis/build/apache/nifi/nifi-nar-bundles/nifi-scripting-bundle/nifi-scripting-nar/pom.xml)
 has 1 error
   [ERROR] Non-parseable POM 
/home/travis/build/apache/nifi/nifi-nar-bundles/nifi-scripting-bundle/nifi-scripting-nar/pom.xml:
 end tag name  must match start tag name  from line 32 
(position: TEXT seen ...\n... @51:11)  @ line 51, column 
11 -> [Help 2]
   [ERROR]   
   [ERROR]   The project org.apache.nifi:nifi-json-utils:1.10.0-SNAPSHOT 
(/home/travis/build/apache/nifi/nifi-commons/nifi-json-utils/pom.xml) has 1 
error
   [ERROR] 'dependencies.dependency.version' for 
org.codehaus.groovy:groovy-json:jar is missing. @ line 38, column 21
   [ERROR]   
   [ERROR]   The project org.apache.nifi:nifi-web-security:1.10.0-SNAPSHOT 
(/home/travis/build/apache/nifi/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-web/nifi-web-security/pom.xml)
 has 1 error
   [ERROR] 'dependencies.dependency.version' for 
org.codehaus.groovy:groovy-json:jar is missing. @ 
org.apache.nifi:nifi-web-security:[unknown-version], 
/home/travis/build/apache/nifi/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-web/nifi-web-security/pom.xml,
 line 180, column 21
   [ERROR]   
   [ERROR]   The project 
org.apache.nifi:nifi-standard-processors:1.10.0-SNAPSHOT 
(/home/travis/build/apache/nifi/nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/pom.xml)
 has 2 errors
   [ERROR] 'dependencies.dependency.version' for 
org.codehaus.groovy:groovy-json:jar is missing. @ 
org.apache.nifi:nifi-standard-processors:[unknown-version], 
/home/travis/build/apache/nifi/nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/pom.xml,
 line 374, column 21
   [ERROR] 'dependencies.dependency.version' for 
org.codehaus.groovy:groovy-servlet:jar is missing. @ 
org.apache.nifi:nifi-standard-processors:[unknown-version], 
/home/travis/build/apache/nifi/nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/pom.xml,
 line 379, column 21
   [ERROR]   
   [ERROR]   The project org.apache.nifi:nifi-lookup-services:1.10.0-SNAPSHOT 
(/home/travis/build/apache/nifi/nifi-nar-bundles/nifi-standard-services/nifi-lookup-services-bundle/nifi-lookup-services/pom.xml)
 has 1 error
   [ERROR] 'dependencies.dependency.version' for 
org.codehaus.groovy:groovy-json:jar is missing. @ 
org.apache.nifi:nifi-lookup-services:[unknown-version], 
/home/travis/build/apache/nifi/nifi-nar-bundles/nifi-standard-services/nifi-lookup-services-bundle/nifi-lookup-services/pom.xml,
 line 166, column 21
   [ERROR]   
   [ERROR]   The project 
org.apache.nifi:nifi-mongodb-processors:1.10.0-SNAPSHOT 
(/home/travis/build/apache/nifi/nifi-nar-bundles/nifi-mongodb-bundle/nifi-mongodb-processors/pom.xml)
 has 1 error
   [ERROR] 'dependencies.dependency.version' for 
org.codehaus.groovy:groovy-json:jar is missing. @ 
org.apache.nifi:nifi-mongodb-processors:[unknown-version], 
/home/travis/build/apache/nifi/nifi-nar-bundles/nifi-mongodb-bundle/nifi-mongodb-processors/pom.xml,
 line 113, column 21
   [ERROR]   
   [ERROR]   The project org.apache.nifi:nifi-scripting-bundle:1.10.0-SNAPSHOT 
(/home/travis/build/apache/nifi/nifi-nar-bundles/nifi-scripting-bundle/pom.xml) 
has 1 error
   [ERROR] 'dependencies.dependency.version' for 
org.codehaus.groovy:groovy-all:pom is missing. @ 
org.apache.nifi:nifi:1.10.0-SNAPSHOT, /home/travis/build/apache/nifi/pom.xml, 
line 340, column 21
   [ERROR]   
   [ERROR]   The project 
org.apache.nifi:nifi-elasticsearch-client-service:1.10.0-SNAPSHOT 
(/home/travis/build/apache/nifi/nifi-nar-bundles/nifi-elasticsearch-bundle/nifi-elasticsearch-client-service/pom.xml)
 has 1 error
   [ERROR] 'dependencies.dependency.version' for 
org.codehaus.groovy:groovy-json:jar is missing. @ 
org.apache.nifi:nifi-elasticsearch-client-service:[unknown-version], 
/home/travis/build/apache/nifi/nifi-nar-bundles/nifi-elasticsearch-bundle/nifi-elasticsearch-client-service/pom.xml,
 line 163, column 21
   [ERROR]   
   [ERROR]   The project org.apache.nifi:nifi-groovyx-bundle:1.10.0-SNAPSHOT 
(/home/travis/build/apache/nifi/nifi-nar-bundles/nifi-groovyx-bundle/pom.xml) 
has 1 error
   [ERROR] 'dependencies.dependency.version' for 
org.codehaus.groovy:groovy-all:pom is missing. @ 
org.apache.nifi:nifi:1.10.0-SNAPSHOT, /home/travis/build/apache/nifi/pom.xml, 
line 340, column 21
   [ERROR]   
   [ERROR]   The 

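Every failure above has the same root cause: a `<dependency>` on a Groovy artifact with no `<version>` and no matching entry in any parent `<dependencyManagement>`. A minimal sketch of the usual fix, assuming the versions are meant to be managed centrally (the version number below is illustrative, not the one NiFi actually pins):

```xml
<!-- Sketch: in the root pom.xml, manage the Groovy artifact versions once -->
<!-- so that child modules may declare the dependency without a <version>. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>org.codehaus.groovy</groupId>
      <artifactId>groovy-json</artifactId>
      <version>2.5.7</version> <!-- illustrative version -->
    </dependency>
    <dependency>
      <groupId>org.codehaus.groovy</groupId>
      <artifactId>groovy-servlet</artifactId>
      <version>2.5.7</version> <!-- illustrative version -->
    </dependency>
  </dependencies>
</dependencyManagement>
```

Alternatively, each failing module can declare the `<version>` explicitly at the flagged line and column; either way, the `'dependencies.dependency.version' ... is missing` errors go away once every Groovy dependency resolves to a version.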
[GitHub] [nifi-minifi-cpp] phrocker commented on a change in pull request #602: MINIFICPP-939: Install deps up front on certian systems. Change cento…

2019-06-28 Thread GitBox
phrocker commented on a change in pull request #602: MINIFICPP-939: Install 
deps up front on certian systems. Change cento…
URL: https://github.com/apache/nifi-minifi-cpp/pull/602#discussion_r298619303
 
 

 ##
 File path: docker/centos/Dockerfile
 ##
 @@ -35,7 +35,7 @@ ENV MINIFI_BASE_DIR /opt/minifi
 RUN mkdir -p $MINIFI_BASE_DIR 
 USER $USER
 
-RUN yum -y install java-1.8.0-openjdk java-1.8.0-openjdk-devel sudo git which 
maven
+RUN yum -y install java-1.8.0-openjdk java-1.8.0-openjdk-devel gcc g++ sudo 
git which maven
 
 Review comment:
   not entirely necessary as bootstrap will install it but this is easier to 
see in the output. 


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [nifi-minifi-cpp] phrocker commented on a change in pull request #602: MINIFICPP-939: Install deps up front on certian systems. Change cento…

2019-06-28 Thread GitBox
phrocker commented on a change in pull request #602: MINIFICPP-939: Install 
deps up front on certian systems. Change cento…
URL: https://github.com/apache/nifi-minifi-cpp/pull/602#discussion_r298619125
 
 

 ##
 File path: bootstrap.sh
 ##
 @@ -270,7 +270,7 @@ add_dependency GPS_ENABLED "gpsd"
 
 add_disabled_option AWS_ENABLED ${TRUE} "ENABLE_AWS"
 
-add_disabled_option KAFKA_ENABLED ${FALSE} "ENABLE_LIBRDKAFKA" "3.4.0"
 
 Review comment:
   remove the cmake dependency version as it is no longer valid




[GitHub] [nifi-minifi-cpp] phrocker commented on a change in pull request #602: MINIFICPP-939: Install deps up front on certian systems. Change cento…

2019-06-28 Thread GitBox
phrocker commented on a change in pull request #602: MINIFICPP-939: Install 
deps up front on certian systems. Change cento…
URL: https://github.com/apache/nifi-minifi-cpp/pull/602#discussion_r298619358
 
 

 ##
 File path: extensions/librdkafka/CMakeLists.txt
 ##
 @@ -66,20 +67,6 @@ add_dependencies(minifi-rdkafka-extensions kafka-external)
 include_directories(${ZLIB_INCLUDE_DIRS})
 include_directories(${KAFKA_INCLUDE})
 target_link_libraries (minifi-rdkafka-extensions ${BYPRODUCT})
-if (WIN32)
 
 Review comment:
   unnecessary cruft




[GitHub] [nifi-minifi-cpp] phrocker commented on issue #602: MINIFICPP-939: Install deps up front on certian systems. Change cento…

2019-06-28 Thread GitBox
phrocker commented on issue #602: MINIFICPP-939: Install deps up front on 
certian systems. Change cento…
URL: https://github.com/apache/nifi-minifi-cpp/pull/602#issuecomment-506740879
 
 
   "This is some work I forgot to submit as a pr last month. doing so before I 
forget. also have a few other items with JNI that I need to push" -- have a few 
PRs I excised off of the windows stuff that are related to this and forgot to 
submit




[GitHub] [nifi-minifi-cpp] phrocker opened a new pull request #602: MINIFICPP-939: Install deps up front on certian systems. Change cento…

2019-06-28 Thread GitBox
phrocker opened a new pull request #602: MINIFICPP-939: Install deps up front 
on certian systems. Change cento…
URL: https://github.com/apache/nifi-minifi-cpp/pull/602
 
 
   …s LIBDIR as the version change of cmake enforced a different dir
   
   This is some work I forgot to submit as a PR last month. Doing so before I 
forget. I also have a few other items with JNI that I need to push.
   
   Thank you for submitting a contribution to Apache NiFi - MiNiFi C++.
   
   In order to streamline the review of the contribution we ask you
   to ensure the following steps have been taken:
   
   ### For all changes:
   - [ ] Is there a JIRA ticket associated with this PR? Is it referenced
in the commit message?
   
   - [ ] Does your PR title start with MINIFICPP- where  is the JIRA 
number you are trying to resolve? Pay particular attention to the hyphen "-" 
character.
   
   - [ ] Has your PR been rebased against the latest commit within the target 
branch (typically master)?
   
   - [ ] Is your initial contribution a single, squashed commit?
   
   ### For code changes:
   - [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)?
   - [ ] If applicable, have you updated the LICENSE file?
   - [ ] If applicable, have you updated the NOTICE file?
   
   ### For documentation related changes:
   - [ ] Have you ensured that format looks appropriate for the output in which 
it is rendered?
   
   ### Note:
   Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.
   




[GitHub] [nifi-minifi-cpp] arpadboda commented on a change in pull request #601: MINIFICPP-937: resolve state file issues and rollover issues. Add tes…

2019-06-28 Thread GitBox
arpadboda commented on a change in pull request #601: MINIFICPP-937: resolve 
state file issues and rollover issues. Add tes…
URL: https://github.com/apache/nifi-minifi-cpp/pull/601#discussion_r298573237
 
 

 ##
 File path: extensions/standard-processors/tests/unit/TailFileTests.cpp
 ##
 @@ -78,7 +78,245 @@ TEST_CASE("TailFileWithDelimiter", "[tailfiletest2]") {
 
   // Delete the test and state file.
   remove(TMP_FILE);
-  remove(STATE_FILE);
+  remove(std::string(std::string(STATE_FILE) + "." + id).c_str());
+}
+
+TEST_CASE("TestNewContent", "[tailFileWithDelimiterState]") {
 
 Review comment:
   There were already test cases in this file that reset the plan and run 
TailFile multiple times; the only thing they were missing to verify this 
behaviour is the new flag you just added to clear configured processors. 
   
   More tests don't hurt anyway, so let's leave it as is, although modifying 
some of those test cases to verify state storing as well might be good, too. 




[GitHub] [nifi-minifi-cpp] arpadboda commented on a change in pull request #601: MINIFICPP-937: resolve state file issues and rollover issues. Add tes…

2019-06-28 Thread GitBox
arpadboda commented on a change in pull request #601: MINIFICPP-937: resolve 
state file issues and rollover issues. Add tes…
URL: https://github.com/apache/nifi-minifi-cpp/pull/601#discussion_r298570258
 
 

 ##
 File path: extensions/standard-processors/processors/TailFile.cpp
 ##
 @@ -250,6 +259,24 @@ bool TailFile::recoverState() {
   for (file.getline(buf, BUFFER_SIZE); file.good(); file.getline(buf, 
BUFFER_SIZE)) {
 parseStateFileLine(buf);
   }
+
+  /**
+   * recover times and validate that we have paths
+   */
+
+  for (auto &state : tail_states_) {
+std::string fileLocation, fileName;
+if 
(!utils::file::PathUtils::getFileNameAndPath(state.second.current_file_name_, 
fileLocation, fileName) && state.second.path_.empty()) {
+  throw minifi::Exception(ExceptionType::PROCESSOR_EXCEPTION, "State file 
does not contain a full path and file name");
 
 Review comment:
   I feel a bit of inconsistency here.
   This works well, but in case the file cannot even be opened, we just print 
a log statement; in this case, however, an exception is thrown, which means 
giving up doing work in the given "onTrigger" call. 
   
   Is it intended to be done this way? 




[jira] [Updated] (NIFI-6405) GetHDFSFileInfo ignores files in the root/start directory

2019-06-28 Thread Ferenc Szabo (JIRA)


 [ 
https://issues.apache.org/jira/browse/NIFI-6405?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ferenc Szabo updated NIFI-6405:
---
Status: Patch Available  (was: In Progress)

> GetHDFSFileInfo ignores files in the root/start directory
> -
>
> Key: NIFI-6405
> URL: https://issues.apache.org/jira/browse/NIFI-6405
> Project: Apache NiFi
>  Issue Type: Bug
>Affects Versions: 1.9.2, 1.7.1, 1.8.0
>Reporter: Ferenc Szabo
>Assignee: Ferenc Szabo
>Priority: Major
>  Time Spent: 10m
>  Remaining Estimate: 0h
>
> The processor is not listing any file in the directory set in the "Full path" 
> (gethdfsfileinfo-full-path) property.
> Only files in subdirectories are listed.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] [nifi] szaboferee opened a new pull request #3558: NIFI-6405 GetHDFSFileInfo ignores files in the root/start directory

2019-06-28 Thread GitBox
szaboferee opened a new pull request #3558: NIFI-6405 GetHDFSFileInfo ignores 
files in the root/start directory
URL: https://github.com/apache/nifi/pull/3558
 
 
   modifying test cases, fixing file processing.
   
   Testing done:
   - Unit tests
   - Manual run against a hadoop instance
   
   Thank you for submitting a contribution to Apache NiFi.
   
   Please provide a short description of the PR here:
   
    Description of PR
   
   _Enables X functionality; fixes bug NIFI-._
   
   In order to streamline the review of the contribution we ask you
   to ensure the following steps have been taken:
   
   ### For all changes:
   - [ ] Is there a JIRA ticket associated with this PR? Is it referenced 
in the commit message?
   
   - [ ] Does your PR title start with **NIFI-** where  is the JIRA 
number you are trying to resolve? Pay particular attention to the hyphen "-" 
character.
   
   - [ ] Has your PR been rebased against the latest commit within the target 
branch (typically `master`)?
   
   - [ ] Is your initial contribution a single, squashed commit? _Additional 
commits in response to PR reviewer feedback should be made on this branch and 
pushed to allow change tracking. Do not `squash` or use `--force` when pushing 
to allow for clean monitoring of changes._
   
   ### For code changes:
   - [ ] Have you ensured that the full suite of tests is executed via `mvn 
-Pcontrib-check clean install` at the root `nifi` folder?
   - [ ] Have you written or updated unit tests to verify your changes?
   - [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)? 
   - [ ] If applicable, have you updated the `LICENSE` file, including the main 
`LICENSE` file under `nifi-assembly`?
   - [ ] If applicable, have you updated the `NOTICE` file, including the main 
`NOTICE` file found under `nifi-assembly`?
   - [ ] If adding new Properties, have you added `.displayName` in addition to 
.name (programmatic access) for each of the new properties?
   
   ### For documentation related changes:
   - [ ] Have you ensured that format looks appropriate for the output in which 
it is rendered?
   
   ### Note:
   Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.
   




[jira] [Created] (NIFI-6405) GetHDFSFileInfo ignores files in the root/start directory

2019-06-28 Thread Ferenc Szabo (JIRA)
Ferenc Szabo created NIFI-6405:
--

 Summary: GetHDFSFileInfo ignores files in the root/start directory
 Key: NIFI-6405
 URL: https://issues.apache.org/jira/browse/NIFI-6405
 Project: Apache NiFi
  Issue Type: Bug
Affects Versions: 1.9.2, 1.7.1, 1.8.0
Reporter: Ferenc Szabo
Assignee: Ferenc Szabo


The processor is not listing any file in the directory set in the "Full path" 
(gethdfsfileinfo-full-path) property.

Only files in subdirectories are listed.





[jira] [Resolved] (MINIFICPP-935) SFTP extension fails to compile on Windows

2019-06-28 Thread Arpad Boda (JIRA)


 [ 
https://issues.apache.org/jira/browse/MINIFICPP-935?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Arpad Boda resolved MINIFICPP-935.
--
Resolution: Fixed

> SFTP extension fails to compile on Windows
> --
>
> Key: MINIFICPP-935
> URL: https://issues.apache.org/jira/browse/MINIFICPP-935
> Project: Apache NiFi MiNiFi C++
>  Issue Type: Bug
>Reporter: Daniel Bakai
>Assignee: Daniel Bakai
>Priority: Blocker
> Fix For: 0.7.0
>
>  Time Spent: 20m
>  Remaining Estimate: 0h
>
> #define stat _stat defines leak and mess up SFPTClient::stat
> We should remove these defines as they can cause problems elsewhere too.





[jira] [Updated] (NIFI-6404) PutElasticsearchHttp: Remove _type as being compulsory

2019-06-28 Thread David Vassallo (JIRA)


 [ 
https://issues.apache.org/jira/browse/NIFI-6404?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

David Vassallo updated NIFI-6404:
-
Description: 
In ES 7.x and above, document "type" is no longer compulsory and is in fact 
deprecated. When using the 1.9.2 version of PutElasticsearchHttp with ES v7.2, 
it still works; however, you'll see the following HTTP warning in the response:

 

{{HTTP/1.1 200 OK}}
 *{{Warning: 299 Elasticsearch-7.2.0-508c38a "[types removal] Specifying types 
in bulk requests is deprecated."}}*
 {{content-type: application/json; charset=UTF-8}}

 

The fix is relatively straightforward:
 * In *PutElasticsearchHttp.java*, remove the requirement of a compulsory 
"Type" property:

{code:java}
public static final PropertyDescriptor TYPE = new PropertyDescriptor.Builder()
 .name("put-es-type")
 .displayName("Type")
 .description("The type of this document (used by Elasticsearch < 7.0 for 
indexing and searching). Leave empty for ES >= 7.0") // <-
 .required(false) // <- CHANGE
 .expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES)
 .addValidator(StandardValidators.NON_EMPTY_EL_VALIDATOR)
 .build();
{code}
 
 * In *AbstractElasticsearchHttpProcessor.java*, check for the presence of 
"docType". If not present, assume elasticsearch 7.x or above and omit from bulk 
API URL:

 
{code:java}
protected void buildBulkCommand(StringBuilder sb, String index, String docType, 
String indexOp, String id, String jsonString) {
if (indexOp.equalsIgnoreCase("index")) {
sb.append("{\"index\": { \"_index\": \"");
sb.append(StringEscapeUtils.escapeJson(index));
if (!(StringUtils.isEmpty(docType) | docType == null)){ // <- 
CHANGE START
sb.append("\", \"_type\": \"");
sb.append(StringEscapeUtils.escapeJson(docType));
sb.append("\"");
}// <- CHANGE END
if (!StringUtils.isEmpty(id)) { 
sb.append(", \"_id\": \"");
sb.append(StringEscapeUtils.escapeJson(id));
sb.append("\"");
} 
sb.append("}}\n");
sb.append(jsonString);
sb.append("\n");
} else if (indexOp.equalsIgnoreCase("upsert") || 
indexOp.equalsIgnoreCase("update")) {
sb.append("{\"update\": { \"_index\": \"");
sb.append(StringEscapeUtils.escapeJson(index));
sb.append("\", \"_type\": \"");
sb.append(StringEscapeUtils.escapeJson(docType));
sb.append("\", \"_id\": \"");
sb.append(StringEscapeUtils.escapeJson(id));
sb.append("\" }\n");
sb.append("{\"doc\": ");
sb.append(jsonString);
sb.append(", \"doc_as_upsert\": ");
sb.append(indexOp.equalsIgnoreCase("upsert"));
sb.append(" }\n");
} else if (indexOp.equalsIgnoreCase("delete")) {
sb.append("{\"delete\": { \"_index\": \"");
sb.append(StringEscapeUtils.escapeJson(index));
sb.append("\", \"_type\": \"");
sb.append(StringEscapeUtils.escapeJson(docType));
sb.append("\", \"_id\": \"");
sb.append(StringEscapeUtils.escapeJson(id));
sb.append("\" }\n");
}
}
{code}
 
 * The *TestPutElasticsearchHttp.java* test file needs to be updated to reflect 
that a request without a type is now valid (it's currently marked as invalid)
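The change proposed above can be sketched as a small standalone class (names here are illustrative, not the NiFi source; the real buildBulkCommand also JSON-escapes values and handles the upsert/update/delete branches):

```java
// Minimal sketch of the proposed behaviour: emit the "_type" field of a
// bulk "index" action only when a document type is supplied, so requests
// against Elasticsearch 7.x can omit it entirely.
public class BulkIndexLineSketch {

    // Builds the metadata line of a bulk "index" action.
    static String indexLine(String index, String docType, String id) {
        StringBuilder sb = new StringBuilder();
        // Close the _index value's quote unconditionally, so the JSON stays
        // well-formed whether or not a type follows.
        sb.append("{\"index\": {\"_index\": \"").append(index).append("\"");
        if (docType != null && !docType.isEmpty()) {
            sb.append(", \"_type\": \"").append(docType).append("\"");
        }
        if (id != null && !id.isEmpty()) {
            sb.append(", \"_id\": \"").append(id).append("\"");
        }
        return sb.append("}}").toString();
    }

    public static void main(String[] args) {
        // ES < 7.0: a type is supplied, so "_type" is present
        System.out.println(indexLine("logs", "doc", "1"));
        // prints {"index": {"_index": "logs", "_type": "doc", "_id": "1"}}

        // ES >= 7.0: type left empty, so "_type" is omitted
        System.out.println(indexLine("logs", null, "1"));
        // prints {"index": {"_index": "logs", "_id": "1"}}
    }
}
```

Note that in the snippet quoted above, the closing quote of the `_index` value is only appended inside the docType branch, which would leave malformed JSON when the type is empty, and `|` is used where the short-circuit `||` is conventional (harmless here, since `StringUtils.isEmpty` is already null-safe); the sketch closes the quote unconditionally and uses a plain null-safe emptiness check instead.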

  was:
In ES 7.x and above, document "type" is no longer compulsory and in fact is 
deprecated. When using the 1.9.2 version of PutElasticsearchHttp with ES v7.2, 
it still works however you'll see the following HTTP in the response:

 

{{HTTP/1.1 200 OK}}
 {{Warning: 299 Elasticsearch-7.2.0-508c38a "[types removal] Specifying types 
in bulk requests is deprecated."}}
 {{content-type: application/json; charset=UTF-8}}

 

The fix is relatively straightforward:
 * In *PutElasticserachHttp.java*, remove the requirement of a compulsory 
"Type" property:

{code:java}
public static final PropertyDescriptor TYPE = new PropertyDescriptor.Builder()
 .name("put-es-type")
 .displayName("Type")
 .description("The type of this document (used by Elasticsearch < 7.0 for 
indexing and searching). Leave empty for ES >= 7.0") // <-
 .required(false) // <- CHANGE
 .expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES)
 .addValidator(StandardValidators.NON_EMPTY_EL_VALIDATOR)
 .build();
{code}
 
 * In *AbstractElasticsearchHttpProcessor.java*, check for the presence of 
"docType". If not present, assume elasticsearch 7.x or above and omit from bulk 
API URL:

 
{code:java}
protected void buildBulkCommand(StringBuilder sb, String index, String docType, 
String indexOp, String id, String jsonString) {
if (indexOp.equalsIgnoreCase("index")) {
sb.append("{\"index\": { \"_index\": \"");
sb.append(StringEscapeUtils.escapeJson(index));
if (!(StringUtils.isEmpty(docType) | docType == null)){ // <- 
CHANGE START
sb.ap

[jira] [Updated] (NIFI-6404) PutElasticsearchHttp: Remove _type as being compulsory

2019-06-28 Thread David Vassallo (JIRA)


 [ 
https://issues.apache.org/jira/browse/NIFI-6404?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

David Vassallo updated NIFI-6404:
-
Description: 
In ES 7.x and above, document "type" is no longer compulsory and is in fact 
deprecated. When using the 1.9.2 version of PutElasticsearchHttp with ES v7.2, 
it still works; however, you'll see the following HTTP warning in the response:

 

{{HTTP/1.1 200 OK}}
 {{Warning: 299 Elasticsearch-7.2.0-508c38a "[types removal] Specifying types 
in bulk requests is deprecated."}}
 {{content-type: application/json; charset=UTF-8}}

 

The fix is relatively straightforward:
 * In *PutElasticsearchHttp.java*, remove the requirement of a compulsory 
"Type" property:

{code:java}
public static final PropertyDescriptor TYPE = new PropertyDescriptor.Builder()
 .name("put-es-type")
 .displayName("Type")
 .description("The type of this document (used by Elasticsearch < 7.0 for 
indexing and searching). Leave empty for ES >= 7.0") // <-
 .required(false) // <- CHANGE
 .expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES)
 .addValidator(StandardValidators.NON_EMPTY_EL_VALIDATOR)
 .build();
{code}
 
 * In *AbstractElasticsearchHttpProcessor.java*, check for the presence of 
"docType". If not present, assume elasticsearch 7.x or above and omit from bulk 
API URL:

 
{code:java}
protected void buildBulkCommand(StringBuilder sb, String index, String docType, 
String indexOp, String id, String jsonString) {
if (indexOp.equalsIgnoreCase("index")) {
sb.append("{\"index\": { \"_index\": \"");
sb.append(StringEscapeUtils.escapeJson(index));
if (!(StringUtils.isEmpty(docType) | docType == null)){ // <- 
CHANGE START
sb.append("\", \"_type\": \"");
sb.append(StringEscapeUtils.escapeJson(docType));
sb.append("\"");
}// <- CHANGE END
if (!StringUtils.isEmpty(id)) { 
sb.append(", \"_id\": \"");
sb.append(StringEscapeUtils.escapeJson(id));
sb.append("\"");
} 
sb.append("}}\n");
sb.append(jsonString);
sb.append("\n");
} else if (indexOp.equalsIgnoreCase("upsert") || 
indexOp.equalsIgnoreCase("update")) {
sb.append("{\"update\": { \"_index\": \"");
sb.append(StringEscapeUtils.escapeJson(index));
sb.append("\", \"_type\": \"");
sb.append(StringEscapeUtils.escapeJson(docType));
sb.append("\", \"_id\": \"");
sb.append(StringEscapeUtils.escapeJson(id));
sb.append("\" }\n");
sb.append("{\"doc\": ");
sb.append(jsonString);
sb.append(", \"doc_as_upsert\": ");
sb.append(indexOp.equalsIgnoreCase("upsert"));
sb.append(" }\n");
} else if (indexOp.equalsIgnoreCase("delete")) {
sb.append("{\"delete\": { \"_index\": \"");
sb.append(StringEscapeUtils.escapeJson(index));
sb.append("\", \"_type\": \"");
sb.append(StringEscapeUtils.escapeJson(docType));
sb.append("\", \"_id\": \"");
sb.append(StringEscapeUtils.escapeJson(id));
sb.append("\" }\n");
}
}
{code}
 
 * The *TestPutElasticsearchHttp.java* test file needs to be updated to reflect 
that a request without a type is now valid (it's currently marked as invalid)

  was:
In ES 7.x and above, document "type" is no longer compulsory and in fact is 
deprecated. When using the 1.9.2 version of PutElasticsearchHttp with ES v7.2, 
it still works however you'll see the following HTTP in the response:

 

{{HTTP/1.1 200 OK}}
 {{Warning: 299 Elasticsearch-7.2.0-508c38a "[types removal] Specifying types 
in bulk requests is deprecated."}}
 {{content-type: application/json; charset=UTF-8}}

 

The fix is relatively straightforward:
 * In *PutElasticserachHttp.java*, remove the requirement of a compulsory 
"Type" property:

{code:java}
public static final PropertyDescriptor TYPE = new PropertyDescriptor.Builder()
 .name("put-es-type")
 .displayName("Type")
 .description("The type of this document (used by Elasticsearch < 7.0 for 
indexing and searching). Leave empty for ES >= 7.0") // <-
 .required(false) // <- CHANGE
 .expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES)
 .addValidator(StandardValidators.NON_EMPTY_EL_VALIDATOR)
 .build();
{code}
 
 * In *AbstractElasticsearchHttpProcessor.java*, check for the presence of 
"docType". If not present, assume elasticsearch 7.x or above and omit from bulk 
API URL:

 
{code:java}
protected void buildBulkCommand(StringBuilder sb, String index, String docType, 
String indexOp, String id, String jsonString) {
if (indexOp.equalsIgnoreCase("index")) {
sb.append("{\"index\": { \"_index\": \"");
sb.append(StringEscapeUtils.escapeJson(index));
if (!(StringUtils.isEmpty(docType) | docType == null)){
sb.append("\", \"_type\": \"");
   

[jira] [Updated] (NIFI-6404) PutElasticsearchHttp: Remove _type as being compulsory

2019-06-28 Thread David Vassallo (JIRA)


 [ 
https://issues.apache.org/jira/browse/NIFI-6404?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

David Vassallo updated NIFI-6404:
-
Description: 
In ES 7.x and above, document "type" is no longer compulsory and is in fact 
deprecated. When using the 1.9.2 version of PutElasticsearchHttp with ES v7.2, 
it still works; however, you'll see the following HTTP warning in the response:

 

{{HTTP/1.1 200 OK}}
 {{Warning: 299 Elasticsearch-7.2.0-508c38a "[types removal] Specifying types 
in bulk requests is deprecated."}}
 {{content-type: application/json; charset=UTF-8}}

 

The fix is relatively straightforward:
 * In *PutElasticsearchHttp.java*, remove the requirement of a compulsory 
"Type" property:

{code:java}
public static final PropertyDescriptor TYPE = new PropertyDescriptor.Builder()
 .name("put-es-type")
 .displayName("Type")
 .description("The type of this document (used by Elasticsearch < 7.0 for 
indexing and searching). Leave empty for ES >= 7.0") // <-
 .required(false) // <- CHANGE
 .expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES)
 .addValidator(StandardValidators.NON_EMPTY_EL_VALIDATOR)
 .build();
{code}
 
 * In *AbstractElasticsearchHttpProcessor.java*, check for the presence of 
"docType". If not present, assume elasticsearch 7.x or above and omit from bulk 
API URL:

 
{code:java}
protected void buildBulkCommand(StringBuilder sb, String index, String docType, 
String indexOp, String id, String jsonString) {
if (indexOp.equalsIgnoreCase("index")) {
sb.append("{\"index\": { \"_index\": \"");
sb.append(StringEscapeUtils.escapeJson(index));
if (!(StringUtils.isEmpty(docType) | docType == null)){
sb.append("\", \"_type\": \"");
sb.append(StringEscapeUtils.escapeJson(docType));
sb.append("\"");
}
if (!StringUtils.isEmpty(id)) {  // <- CHANGE START
sb.append(", \"_id\": \"");
sb.append(StringEscapeUtils.escapeJson(id));
sb.append("\"");
} // <- CHANGE END
sb.append("}}\n");
sb.append(jsonString);
sb.append("\n");
} else if (indexOp.equalsIgnoreCase("upsert") || 
indexOp.equalsIgnoreCase("update")) {
sb.append("{\"update\": { \"_index\": \"");
sb.append(StringEscapeUtils.escapeJson(index));
sb.append("\", \"_type\": \"");
sb.append(StringEscapeUtils.escapeJson(docType));
sb.append("\", \"_id\": \"");
sb.append(StringEscapeUtils.escapeJson(id));
sb.append("\" }\n");
sb.append("{\"doc\": ");
sb.append(jsonString);
sb.append(", \"doc_as_upsert\": ");
sb.append(indexOp.equalsIgnoreCase("upsert"));
sb.append(" }\n");
} else if (indexOp.equalsIgnoreCase("delete")) {
sb.append("{\"delete\": { \"_index\": \"");
sb.append(StringEscapeUtils.escapeJson(index));
sb.append("\", \"_type\": \"");
sb.append(StringEscapeUtils.escapeJson(docType));
sb.append("\", \"_id\": \"");
sb.append(StringEscapeUtils.escapeJson(id));
sb.append("\" }\n");
}
}
{code}
 
 * The *TestPutElasticsearchHttp.java* test file needs to be updated to reflect 
that a request without a type is now valid (it's currently marked as invalid)

  was:
In ES 7.x and above, document "type" is no longer compulsory and in fact is 
deprecated. When using the 1.9.2 version of PutElasticsearchHttp with ES v7.2, 
it still works however you'll see the following HTTP in the response:

 

{{HTTP/1.1 200 OK}}
{{Warning: 299 Elasticsearch-7.2.0-508c38a "[types removal] Specifying types in 
bulk requests is deprecated."}}
{{content-type: application/json; charset=UTF-8}}

 

The fix is relatively straightforward:
 * In *PutElasticserachHttp.java*, remove the requirement of a compulsory 
"Type" property:

{{public static final PropertyDescriptor TYPE = new 
PropertyDescriptor.Builder()}}
{{ .name("put-es-type")}}
{{ .displayName("Type")}}
{{ .description(*"The type of this document (used by Elasticsearch < 7.0 for 
indexing and searching). Leave empty for ES >= 7.0"*)}}
{{ *.required(false)*}}
{{ .expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES)}}
{{ .addValidator(StandardValidators.NON_EMPTY_EL_VALIDATOR)}}
{{ .build();}}

 
 * In *AbstractElasticsearchHttpProcessor.java*, check for the presence of 
"docType". If not present, assume elasticsearch 7.x or above and omit from bulk 
API URL:

 

{{protected void buildBulkCommand(StringBuilder sb, String index, String 
docType, String indexOp, String id, String jsonString) {}}
{{ if (indexOp.equalsIgnoreCase("index")) {}}
{{   sb.append("{\"index\": { \"_index\": \"");}}
{{   sb.append(StringEscapeUtils.escapeJson(index));}}
{{ *if (!(StringUtils.isEmpty(docType) | docType == null)){*}}
{{   *sb.append("\", \"_type\": \"");*}}
{{   *sb.append(StringEscapeUtils.escapeJson(docType));*}}

[jira] [Created] (NIFI-6404) PutElasticsearchHttp: Remove _type as being compulsory

2019-06-28 Thread David Vassallo (JIRA)
David Vassallo created NIFI-6404:


 Summary: PutElasticsearchHttp: Remove _type as being compulsory
 Key: NIFI-6404
 URL: https://issues.apache.org/jira/browse/NIFI-6404
 Project: Apache NiFi
  Issue Type: Improvement
  Components: Extensions
Affects Versions: 1.9.2
 Environment: Elasticsearch 7.x
Reporter: David Vassallo


In ES 7.x and above, document "type" is no longer compulsory and is in fact 
deprecated. When using the 1.9.2 version of PutElasticsearchHttp with ES v7.2, 
it still works; however, you'll see the following HTTP warning in the response:

 

{{HTTP/1.1 200 OK}}
{{Warning: 299 Elasticsearch-7.2.0-508c38a "[types removal] Specifying types in 
bulk requests is deprecated."}}
{{content-type: application/json; charset=UTF-8}}

 

The fix is relatively straightforward:
 * In *PutElasticsearchHttp.java*, remove the requirement of a compulsory 
"Type" property:

{{public static final PropertyDescriptor TYPE = new 
PropertyDescriptor.Builder()}}
{{ .name("put-es-type")}}
{{ .displayName("Type")}}
{{ .description(*"The type of this document (used by Elasticsearch < 7.0 for 
indexing and searching). Leave empty for ES >= 7.0"*)}}
{{ *.required(false)*}}
{{ .expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES)}}
{{ .addValidator(StandardValidators.NON_EMPTY_EL_VALIDATOR)}}
{{ .build();}}

 
 * In *AbstractElasticsearchHttpProcessor.java*, check for the presence of 
"docType". If not present, assume elasticsearch 7.x or above and omit from bulk 
API URL:

 

{{protected void buildBulkCommand(StringBuilder sb, String index, String 
docType, String indexOp, String id, String jsonString) {}}
{{ if (indexOp.equalsIgnoreCase("index")) {}}
{{   sb.append("{\"index\": { \"_index\": \"");}}
{{   sb.append(StringEscapeUtils.escapeJson(index));}}
{{ *if (!(StringUtils.isEmpty(docType) | docType == null)){*}}
{{   *sb.append("\", \"_type\": \"");*}}
{{   *sb.append(StringEscapeUtils.escapeJson(docType));*}}
{{   *sb.append("\"");*}}
{{ *}*}}
{{ if (!StringUtils.isEmpty(id)) {}}
{{   sb.append(", \"_id\": \"");}}
{{   sb.append(StringEscapeUtils.escapeJson(id));}}
{{   sb.append("\"");}}
{{ }}}
{{   sb.append("}}\n");}}
{{   sb.append(jsonString);}}
{{   sb.append("\n");}}
{{ } else if (indexOp.equalsIgnoreCase("upsert") || 
indexOp.equalsIgnoreCase("update")) {}}
{{   sb.append("{\"update\": { \"_index\": \"");}}
{{   sb.append(StringEscapeUtils.escapeJson(index));}}
{{   sb.append("\", \"_type\": \"");}}
{{   sb.append(StringEscapeUtils.escapeJson(docType));}}
{{   sb.append("\", \"_id\": \"");}}
{{   sb.append(StringEscapeUtils.escapeJson(id));}}
{{   sb.append("\" }\n");}}
{{   sb.append("{\"doc\": ");}}
{{   sb.append(jsonString);}}
{{   sb.append(", \"doc_as_upsert\": ");}}
{{   sb.append(indexOp.equalsIgnoreCase("upsert"));}}
{{   sb.append(" }\n");}}
{{ } else if (indexOp.equalsIgnoreCase("delete")) {}}
{{   sb.append("{\"delete\": { \"_index\": \"");}}
{{   sb.append(StringEscapeUtils.escapeJson(index));}}
{{   sb.append("\", \"_type\": \"");}}
{{  sb.append(StringEscapeUtils.escapeJson(docType));}}
{{  sb.append("\", \"_id\": \"");}}
{{  sb.append(StringEscapeUtils.escapeJson(id));}}
{{  sb.append("\" }\n");}}
{{ }}}
{{}}}
 * The *TestPutElasticsearchHttp.java* test file needs to be updated to reflect 
that a request without a type is now valid (it's currently marked as invalid)


