[jira] [Commented] (NIFI-5282) GCPProcessor with HTTP Proxy with Authentication

2018-08-30 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5282?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16598278#comment-16598278
 ] 

ASF GitHub Bot commented on NIFI-5282:
--

Github user ijokarumawak commented on the issue:

https://github.com/apache/nifi/pull/2943
  
Hello @jugi92 , thank you very much for your contribution. I stumbled upon 
the same need and wanted to get this merged urgently. Currently, GCP processors 
do not work if they run on a server that has no direct internet connection and 
requires a forward proxy to reach the outside, because 
GCPCredentialControllerService doesn't support proxy configuration.

I am going to create my own PR based on your commit to fix the conflict. I 
will review your change at the same time and make changes if necessary. I hope 
this approach helps the PR get merged quickly. I will update you once the new 
PR is ready. Thanks!


> GCPProcessor with HTTP Proxy with Authentication
> 
>
> Key: NIFI-5282
> URL: https://issues.apache.org/jira/browse/NIFI-5282
> Project: Apache NiFi
>  Issue Type: Improvement
>Affects Versions: 1.6.0
>Reporter: Julian Gimbel
>Assignee: Sivaprasanna Sethuraman
>Priority: Minor
>
> The [AbstractGCPProcessor|https://github.com/apache/nifi/blob/master/nifi-nar-bundles/nifi-gcp-bundle/nifi-gcp-processors/src/main/java/org/apache/nifi/processors/gcp/AbstractGCPProcessor.java]
>  already accepts HTTP proxy settings, but it would be even better if it 
> accepted authenticated proxies with user and password as well.
> Ideally it would support the ProxyService introduced in 
> [NIFI-4199|https://issues.apache.org/jira/projects/NIFI/issues/NIFI-4199] and 
> all of its options.
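The kind of authenticated proxy support requested here can be sketched with JDK classes alone; the names below (`ProxyDemo`, `buildProxy`, `registerProxyCredentials`) are illustrative and not part of NiFi's actual API:

```java
import java.net.Authenticator;
import java.net.InetSocketAddress;
import java.net.PasswordAuthentication;
import java.net.Proxy;

public class ProxyDemo {

    // Build an HTTP proxy handle; createUnresolved avoids a DNS lookup at
    // configuration time, mirroring how a processor would validate properties.
    static Proxy buildProxy(String host, int port) {
        return new Proxy(Proxy.Type.HTTP, InetSocketAddress.createUnresolved(host, port));
    }

    // Register user/password for the proxy at the JDK level; a NiFi processor
    // would instead read these from its (hypothetical) proxy properties or
    // from a shared proxy configuration service.
    static void registerProxyCredentials(String user, String password) {
        Authenticator.setDefault(new Authenticator() {
            @Override
            protected PasswordAuthentication getPasswordAuthentication() {
                return getRequestorType() == RequestorType.PROXY
                        ? new PasswordAuthentication(user, password.toCharArray())
                        : null;
            }
        });
    }

    public static void main(String[] args) {
        registerProxyCredentials("proxy-user", "proxy-pass");
        Proxy proxy = buildProxy("proxy.example.com", 3128);
        System.out.println(proxy.type()); // prints HTTP
    }
}
```

Client libraries that honor the default `Authenticator`, and many that accept a `java.net.Proxy`, will then supply the credentials when the forward proxy challenges the connection.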



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5147) Improve HashAttribute processor

2018-08-30 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5147?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16598101#comment-16598101
 ] 

ASF GitHub Bot commented on NIFI-5147:
--

Github user alopresto commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2836#discussion_r214223024
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/test/java/org/apache/nifi/processors/standard/TestCalculateAttributeHash.java
 ---
@@ -0,0 +1,178 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard;
+
+import org.apache.commons.codec.binary.Hex;
+import org.apache.commons.codec.digest.DigestUtils;
+import org.apache.nifi.util.MockFlowFile;
+import org.apache.nifi.util.TestRunner;
+import org.apache.nifi.util.TestRunners;
+import org.junit.Assert;
+import org.junit.Test;
+
+import java.nio.charset.Charset;
+import java.nio.charset.StandardCharsets;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+
+public class TestCalculateAttributeHash {
+
+    private static final Charset UTF8 = StandardCharsets.UTF_8;
+
+    @Test
+    public void testMD2() throws Exception {
+        testAllAlgorithm("MD2");
+        testParitalAlgorithm("MD2");
+        testMissingAlgorithm("MD2");
+    }
+
+    @Test
+    public void testMD5() throws Exception {
+        testAllAlgorithm("MD5");
+        testParitalAlgorithm("MD5");
+        testMissingAlgorithm("MD5");
+    }
+
+    @Test
+    public void testSHA1() throws Exception {
+        testAllAlgorithm("SHA-1");
+        testParitalAlgorithm("SHA-1");
+        testMissingAlgorithm("SHA-1");
+    }
+
+    @Test
+    public void testSHA256() throws Exception {
+        testAllAlgorithm("SHA-256");
+        testParitalAlgorithm("SHA-256");
+        testMissingAlgorithm("SHA-256");
+    }
+
+    @Test
+    public void testSHA384() throws Exception {
+        testAllAlgorithm("SHA-384");
+        testParitalAlgorithm("SHA-384");
+        testMissingAlgorithm("SHA-384");
+    }
+
+    @Test
+    public void testSHA512() throws Exception {
+        testAllAlgorithm("SHA-512");
+        testParitalAlgorithm("SHA-512");
+        testMissingAlgorithm("SHA-512");
+    }
+
+    public void testAllAlgorithm(String algorithm) {
+        final TestRunner runner = TestRunners.newTestRunner(new CalculateAttributeHash());
+        runner.setProperty(CalculateAttributeHash.HASH_ALGORITHM.getName(), algorithm);
+        runner.setProperty("name", String.format("%s_%s", "name", algorithm));
+        runner.setProperty("value", String.format("%s_%s", "value", algorithm));
+
+        final Map<String, String> attributeMap = new HashMap<>();
+        attributeMap.put("name", "abcdefg");
+        attributeMap.put("value", "hijklmnop");
+        runner.enqueue(new byte[0], attributeMap);
+
+        runner.run(1);
+
+        runner.assertTransferCount(HashAttribute.REL_FAILURE, 0);
+        runner.assertTransferCount(HashAttribute.REL_SUCCESS, 1);
+
+        final List<MockFlowFile> success = runner.getFlowFilesForRelationship(HashAttribute.REL_SUCCESS);
+
+        for (final MockFlowFile flowFile : success) {
+            Assert.assertEquals(Hex.encodeHexString(DigestUtils.getDigest(algorithm).digest("abcdefg".getBytes(UTF8))),
+                flowFile.getAttribute(String.format("%s_%s", "name", algorithm)));
+            Assert.assertEquals(Hex.encodeHexString(DigestUtils.getDigest(algorithm).digest("hijklmnop".getBytes(UTF8))),
+                flowFile.getAttribute(String.format("%s_%s", "value", algorithm)));
+        }
+    }
+
+    public void testParitalAlgorithm(String algorithm) {
+        final TestRunner runner = TestRunners.newTestRunner(new Calcul
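For context, the expected values in the quoted test come from hex-encoding a message digest of each attribute value. The same expectation can be reproduced with only the JDK; the class and method names below are illustrative, not part of the PR:

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class AttributeHashDemo {

    // Lowercase hex digest of a value, equivalent to the test's
    // Hex.encodeHexString(DigestUtils.getDigest(algorithm).digest(...)).
    static String hashAttribute(String algorithm, String value) {
        try {
            byte[] digest = MessageDigest.getInstance(algorithm)
                    .digest(value.getBytes(StandardCharsets.UTF_8));
            StringBuilder hex = new StringBuilder(digest.length * 2);
            for (byte b : digest) {
                hex.append(String.format("%02x", b)); // %02x renders each byte as two hex chars
            }
            return hex.toString();
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalArgumentException("Unsupported algorithm: " + algorithm, e);
        }
    }

    public static void main(String[] args) {
        // Same input the test enqueues for the "name" attribute.
        System.out.println(hashAttribute("SHA-256", "abcdefg"));
    }
}
```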

[GitHub] nifi pull request #2836: NIFI-5147 Calculate hash attribute redux

2018-08-30 Thread alopresto
Github user alopresto commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2836#discussion_r214222959
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/test/java/org/apache/nifi/processors/standard/TestCalculateAttributeHash.java
 ---
+public void testParitalAlgorithm(String algorithm) {
--- End diff --

Typo: `testPartialAlgorithm`


---


[jira] [Commented] (NIFI-5555) Add visual indicators for deprecated components

2018-08-30 Thread Joseph Witt (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5555?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16598023#comment-16598023
 ] 

Joseph Witt commented on NIFI-5555:
---

A visual indicator on the flow graph, and everywhere we show the processor in 
listings, is good. In a way it would be a permanent bulletin indicator (for as 
long as that component is used).

> Add visual indicators for deprecated components
> ---
>
> Key: NIFI-5555
> URL: https://issues.apache.org/jira/browse/NIFI-5555
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Core UI
>Reporter: Pierre Villard
>Priority: Major
>
> Since NiFi 1.3.0 and NIFI-391, we can add a
> {noformat}
> @DeprecationNotice{noformat}
> annotation to components that we want to deprecate.
> As of now, this deprecation notice is only reflected in the Usage 
> documentation. But it can be completely missed by users and it can also be 
> completely unnoticed on existing workflows.
> I suggest the two following improvements:
>  * In the component listing, strikethrough the line displaying the component 
> and prefix the description with "DEPRECATED (see usage for more information)".
>  * Just like we have a specific icon for processors running on the primary 
> node only, it could be a good idea to have an icon with something like an 
> exclamation mark instead of the "P".
> Thoughts?
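How a UI layer could detect such an annotation can be sketched with plain reflection; the nested `DeprecationNotice` below is a stand-in for NiFi's real `org.apache.nifi.annotation.documentation.DeprecationNotice`, and `DeprecatedProcessor` is only an illustrative class:

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;

public class DeprecationScanDemo {

    // Stand-in for NiFi's @DeprecationNotice (the real annotation also
    // carries references to alternative components).
    @Retention(RetentionPolicy.RUNTIME)
    @interface DeprecationNotice {
        String reason() default "";
    }

    @DeprecationNotice(reason = "Superseded by a newer processor")
    static class DeprecatedProcessor { }

    // The component listing could call this per type to decide whether to
    // strike through the row or swap in a warning icon.
    static boolean isDeprecated(Class<?> componentClass) {
        return componentClass.isAnnotationPresent(DeprecationNotice.class);
    }

    public static void main(String[] args) {
        System.out.println(isDeprecated(DeprecatedProcessor.class)); // prints true
    }
}
```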





[jira] [Commented] (NIFI-5555) Add visual indicators for deprecated components

2018-08-30 Thread Ashmeet Kandhari (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5555?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16598009#comment-16598009
 ] 

Ashmeet Kandhari commented on NIFI-5555:


Sounds good.
What do you think of changing the default colour of the processor to red, or 
something similar, along with the icon?






[jira] [Commented] (NIFI-5333) Create GetMongoRecord processor

2018-08-30 Thread Joseph Witt (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5333?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16597999#comment-16597999
 ] 

Joseph Witt commented on NIFI-5333:
---

There is already a GetMongo processor to pull data from Mongo. Mike appears to 
be pointing out that pulling the data in the form of format/schema-aware 
records, leveraging NiFi's record reader/writer capabilities, will be far more 
powerful, efficient, and performant.

> Create GetMongoRecord processor
> ---
>
> Key: NIFI-5333
> URL: https://issues.apache.org/jira/browse/NIFI-5333
> Project: Apache NiFi
>  Issue Type: New Feature
>Reporter: Mike Thomsen
>Priority: Major
>
> A processor similar to GetMongo that uses the record API should be created.





[jira] [Commented] (NIFI-5333) Create GetMongoRecord processor

2018-08-30 Thread Ashmeet Kandhari (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5333?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16597998#comment-16597998
 ] 

Ashmeet Kandhari commented on NIFI-5333:


Hi,

Could you explain your use case?






[GitHub] nifi pull request #2979: [WIP] An experiment that I am looking at to elimina...

2018-08-30 Thread markap14
GitHub user markap14 opened a pull request:

https://github.com/apache/nifi/pull/2979

[WIP] An experiment that I am looking at to eliminate what appears to be 
unneeded synchronization in the WriteAheadProvenanceRepository. Not ready to be 
merged to master yet.

…. Also, unclear if the addEvents(Iterable) is the 
big difference maker or if it's the change to tryLease, etc.

Thank you for submitting a contribution to Apache NiFi.

In order to streamline the review of the contribution we ask you
to ensure the following steps have been taken:

### For all changes:
- [ ] Is there a JIRA ticket associated with this PR? Is it referenced 
 in the commit message?

- [ ] Does your PR title start with NIFI-XXXX where XXXX is the JIRA number 
you are trying to resolve? Pay particular attention to the hyphen "-" character.

- [ ] Has your PR been rebased against the latest commit within the target 
branch (typically master)?

- [ ] Is your initial contribution a single, squashed commit?

### For code changes:
- [ ] Have you ensured that the full suite of tests is executed via mvn 
-Pcontrib-check clean install at the root nifi folder?
- [ ] Have you written or updated unit tests to verify your changes?
- [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)? 
- [ ] If applicable, have you updated the LICENSE file, including the main 
LICENSE file under nifi-assembly?
- [ ] If applicable, have you updated the NOTICE file, including the main 
NOTICE file found under nifi-assembly?
- [ ] If adding new Properties, have you added .displayName in addition to 
.name (programmatic access) for each of the new properties?

### For documentation related changes:
- [ ] Have you ensured that format looks appropriate for the output in 
which it is rendered?

### Note:
Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/markap14/nifi prov-experiment

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/2979.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2979


commit 19d68cee3c4ecc01a4b54eba151d6471c392e45d
Author: Mark Payne 
Date:   2018-08-28T16:45:03Z

Checkpoint. Code is a little hacky and needs to be tested much better. 
Also, unclear if the addEvents(Iterable) is the big 
difference maker or if it's the change to tryLease, etc.




---


[jira] [Commented] (NIFI-5565) Alter the Nifi Developer's Guide to better reference the existing Confluence documentation

2018-08-30 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5565?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16597894#comment-16597894
 ] 

ASF GitHub Bot commented on NIFI-5565:
--

GitHub user nalewis opened a pull request:

https://github.com/apache/nifi/pull/2978

NIFI-5565 Added reference to confluence documentation from the developers 
guide

Thank you for submitting a contribution to Apache NiFi.

In order to streamline the review of the contribution we ask you
to ensure the following steps have been taken:

### For all changes:
- [X] Is there a JIRA ticket associated with this PR? Is it referenced 
 in the commit message?

- [X] Does your PR title start with NIFI-XXXX where XXXX is the JIRA number 
you are trying to resolve? Pay particular attention to the hyphen "-" character.

- [X] Has your PR been rebased against the latest commit within the target 
branch (typically master)?

- [X] Is your initial contribution a single, squashed commit?

### For code changes:
- [ ] Have you ensured that the full suite of tests is executed via mvn 
-Pcontrib-check clean install at the root nifi folder?
- [ ] Have you written or updated unit tests to verify your changes?
- [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)? 
- [ ] If applicable, have you updated the LICENSE file, including the main 
LICENSE file under nifi-assembly?
- [ ] If applicable, have you updated the NOTICE file, including the main 
NOTICE file found under nifi-assembly?
- [ ] If adding new Properties, have you added .displayName in addition to 
.name (programmatic access) for each of the new properties?

### For documentation related changes:
- [X] Have you ensured that format looks appropriate for the output in 
which it is rendered?

### Note:
Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/nalewis/nifi NIFI-5565

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/2978.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2978


commit 4808a7344dffa40afb8a66561a14b81cfabae106
Author: Nick Lewis 
Date:   2018-08-30T20:15:14Z

Added reference to confluence documentation from the developers guide




> Alter the Nifi Developer's Guide to better reference the existing Confluence 
> documentation
> --
>
> Key: NIFI-5565
> URL: https://issues.apache.org/jira/browse/NIFI-5565
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Documentation & Website
>Reporter: Nicholas Lewis
>Priority: Minor
>  Labels: beginner, documentation
>
> It was noted that the [confluence 
> documentation|https://cwiki.apache.org/confluence/display/NIFI/Contributor+Guide]
>  could get more traffic by being better referenced from the [developer's 
> guide|https://nifi.apache.org/docs/nifi-docs/html/developer-guide.html#how-to-contribute-to-apache-nifi].
>  At the moment it is just linked to in the community drop down menu.






[jira] [Commented] (NIFI-4426) Remove custom jBCrypt implementation because Java 7 is no longer supported

2018-08-30 Thread Andy LoPresto (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-4426?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16597875#comment-16597875
 ] 

Andy LoPresto commented on NIFI-4426:
-

I removed it in an additional commit 
(8f37b5ee10650fb36a4ea85f8ff817e4b745c45c). Thanks Michael. 

> Remove custom jBCrypt implementation because Java 7 is no longer supported
> --
>
> Key: NIFI-4426
> URL: https://issues.apache.org/jira/browse/NIFI-4426
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Core Framework
>Affects Versions: 1.4.0
>Reporter: Andy LoPresto
>Assignee: Nathan Gough
>Priority: Minor
>  Labels: security
> Fix For: 1.8.0
>
> Attachments: Screen Shot 2018-08-28 at 6.38.16 PM.png, Screen Shot 
> 2018-08-28 at 6.38.31 PM.png
>
>
> The {{jBCrypt}} library is included and slightly modified in order to provide 
> Java 7 compatibility because the external module is compiled for Java 8. Now 
> that NiFi doesn't support Java 7, this modification can be removed and the 
> standalone module can be depended upon via Maven as per normal. 
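Depending on the standalone module via Maven would look something like this (coordinates taken from the public jBCrypt artifact on Maven Central; the exact groupId/version NiFi adopts may differ):

```xml
<!-- Assumed coordinates for the standalone jBCrypt artifact -->
<dependency>
    <groupId>org.mindrot</groupId>
    <artifactId>jbcrypt</artifactId>
    <version>0.4</version>
</dependency>
```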





[jira] [Commented] (NIFI-4426) Remove custom jBCrypt implementation because Java 7 is no longer supported

2018-08-30 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-4426?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16597874#comment-16597874
 ] 

ASF subversion and git services commented on NIFI-4426:
---

Commit 8f37b5ee10650fb36a4ea85f8ff817e4b745c45c in nifi's branch 
refs/heads/master from [~alopresto]
[ https://git-wip-us.apache.org/repos/asf?p=nifi.git;h=8f37b5e ]

NIFI-4426 Removed exclude from 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/pom.xml.


> Remove custom jBCrypt implementation because Java 7 is no longer supported
> --
>
> Key: NIFI-4426
> URL: https://issues.apache.org/jira/browse/NIFI-4426
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Core Framework
>Affects Versions: 1.4.0
>Reporter: Andy LoPresto
>Assignee: Nathan Gough
>Priority: Minor
>  Labels: security
> Fix For: 1.8.0
>
> Attachments: Screen Shot 2018-08-28 at 6.38.16 PM.png, Screen Shot 
> 2018-08-28 at 6.38.31 PM.png
>
>
> The {{jBCrypt}} library is included and slightly modified in order to provide 
> Java 7 compatibility because the external module is compiled for Java 8. Now 
> that NiFi doesn't support Java 7, this modification can be removed and the 
> standalone module can be depended upon via Maven as per normal. 





[jira] [Commented] (NIFI-4426) Remove custom jBCrypt implementation because Java 7 is no longer supported

2018-08-30 Thread Michael Moser (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-4426?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16597852#comment-16597852
 ] 

Michael Moser commented on NIFI-4426:
-

Very minor nit, but is there another BCrypt.java exclude to take 
care of in 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/pom.xml?

> Remove custom jBCrypt implementation because Java 7 is no longer supported
> --
>
> Key: NIFI-4426
> URL: https://issues.apache.org/jira/browse/NIFI-4426
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Core Framework
>Affects Versions: 1.4.0
>Reporter: Andy LoPresto
>Assignee: Nathan Gough
>Priority: Minor
>  Labels: security
> Fix For: 1.8.0
>
> Attachments: Screen Shot 2018-08-28 at 6.38.16 PM.png, Screen Shot 
> 2018-08-28 at 6.38.31 PM.png
>
>
> The {{jBCrypt}} library is included and slightly modified in order to provide 
> Java 7 compatibility because the external module is compiled for Java 8. Now 
> that NiFi doesn't support Java 7, this modification can be removed and the 
> standalone module can be depended upon via Maven as per normal. 





[jira] [Commented] (NIFI-5456) PutKinesisStream - Fails to work with AWS Private Link endpoint

2018-08-30 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5456?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16597848#comment-16597848
 ] 

ASF GitHub Bot commented on NIFI-5456:
--

Github user Mermadi commented on the issue:

https://github.com/apache/nifi/pull/2968
  
Great, I'll give this patch a try this weekend. Thanks.


> PutKinesisStream - Fails to work with AWS Private Link endpoint
> ---
>
> Key: NIFI-5456
> URL: https://issues.apache.org/jira/browse/NIFI-5456
> Project: Apache NiFi
>  Issue Type: Bug
>Affects Versions: 1.6.0, 1.7.1
> Environment: RedHat 6
>Reporter: Ariel Godinez
>Assignee: Sivaprasanna Sethuraman
>Priority: Major
>  Labels: easyfix
> Attachments: 
> 0001-NIFI-5456-AWS-clients-now-work-with-private-link-end.patch
>
>
> NiFi version: 1.6.0
> PutKinesisStream fails to put due to invalid signing information when using 
> an AWS Private Link as the endpoint override URL. The endpoint override URL 
> pattern for private links is like below along with the error that NiFi 
> outputs when we attempt to use this type of URL as the 'Endpoint Override 
> URL' property value.
> Endpoint Override URL: 
> [https://vpce-|https://vpce-/].kinesis.us-east-2.vpce.amazonaws.com
> ERROR [Timer-Driven Process Thread-11] "o.a.n.p.a.k.stream.PutKinesisStream" 
> PutKinesisStream[id=4c314e25-0164-1000--9bd79c77] Failed to publish 
> due to exception com.amazonaws.services.kinesis.model.AmazonKinesisException: 
> Credential should be scoped to a valid region, not 'vpce'.  (Service: 
> AmazonKinesis; Status Code: 400; Error Code: InvalidSignatureException; 
> Request ID: 6330b83c-a64e-4acf-b892-a505621cf78e) flowfiles 
> [StandardFlowFileRecord[uuid=ba299cec-7cbf-4750-a766-c348b5cd9c73,claim=StandardContentClaim
>  [resourceClaim=StandardResourceClaim[id=1532469012962-1, 
> container=content002, section=1], offset=2159750, 
> length=534625],offset=0,name=900966573101260,size=534625]]
>  
> It looks like 'vpce' is being extracted from the url as the region name when 
> it should be getting 'us-east-2'. We were able to get this processor to work 
> correctly by explicitly passing in the region and service using 
> 'setEndpoint(String endpoint, String serviceName, String regionId)' instead 
> of 'setEndpoint(String endpoint)' in 
> 'nifi/nifi-nar-bundles/nifi-aws-bundle/nifi-aws-abstract-processors/src/main/java/org/apache/nifi/processors/aws/AbstractAWSProcessor.java'
>  line 289
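A quick illustration of the reporter's analysis: in a private-link hostname, the real region sits between the service label and the "vpce" label, so a client that guesses the region from the first hostname token gets "vpce" and mis-scopes the request signature. The helper below is hypothetical (not NiFi or AWS SDK code) and only demonstrates where the region lives in such a hostname; the actual fix is to call the three-argument `setEndpoint(endpoint, serviceName, regionId)` form.

```java
public class VpceRegion {
    // Hypothetical helper: pull the AWS region out of a Kinesis private-link
    // endpoint host, whose labels run:
    //   vpce-0123abcd . kinesis . us-east-2 . vpce . amazonaws . com
    // The region is the third label, not the first ("vpce-...").
    static String regionFromPrivateLinkHost(String host) {
        String[] labels = host.split("\\.");
        return labels.length >= 3 ? labels[2] : null;
    }

    public static void main(String[] args) {
        System.out.println(regionFromPrivateLinkHost(
                "vpce-0123abcd.kinesis.us-east-2.vpce.amazonaws.com")); // us-east-2
    }
}
```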





[GitHub] nifi issue #2968: NIFI-5456: AWS clients now work with private link endpoint...

2018-08-30 Thread Mermadi
Github user Mermadi commented on the issue:

https://github.com/apache/nifi/pull/2968
  
Great, I'll give this patch a try this weekend. Thanks.


---


[jira] [Updated] (NIFI-4426) Remove custom jBCrypt implementation because Java 7 is no longer supported

2018-08-30 Thread Andy LoPresto (JIRA)


 [ 
https://issues.apache.org/jira/browse/NIFI-4426?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Andy LoPresto updated NIFI-4426:

   Resolution: Fixed
Fix Version/s: 1.8.0
   Status: Resolved  (was: Patch Available)

> Remove custom jBCrypt implementation because Java 7 is no longer supported
> --
>
> Key: NIFI-4426
> URL: https://issues.apache.org/jira/browse/NIFI-4426
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Core Framework
>Affects Versions: 1.4.0
>Reporter: Andy LoPresto
>Assignee: Nathan Gough
>Priority: Minor
>  Labels: security
> Fix For: 1.8.0
>
> Attachments: Screen Shot 2018-08-28 at 6.38.16 PM.png, Screen Shot 
> 2018-08-28 at 6.38.31 PM.png
>
>
> The {{jBCrypt}} library is included and slightly modified in order to provide 
> Java 7 compatibility because the external module is compiled for Java 8. Now 
> that NiFi doesn't support Java 7, this modification can be removed and the 
> standalone module can be depended upon via Maven as per normal. 





[jira] [Updated] (NIFI-4426) Remove custom jBCrypt implementation because Java 7 is no longer supported

2018-08-30 Thread Andy LoPresto (JIRA)


 [ 
https://issues.apache.org/jira/browse/NIFI-4426?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Andy LoPresto updated NIFI-4426:

Status: Patch Available  (was: Open)

> Remove custom jBCrypt implementation because Java 7 is no longer supported
> --
>
> Key: NIFI-4426
> URL: https://issues.apache.org/jira/browse/NIFI-4426
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Core Framework
>Affects Versions: 1.4.0
>Reporter: Andy LoPresto
>Assignee: Nathan Gough
>Priority: Minor
>  Labels: security
> Attachments: Screen Shot 2018-08-28 at 6.38.16 PM.png, Screen Shot 
> 2018-08-28 at 6.38.31 PM.png
>
>
> The {{jBCrypt}} library is included and slightly modified in order to provide 
> Java 7 compatibility because the external module is compiled for Java 8. Now 
> that NiFi doesn't support Java 7, this modification can be removed and the 
> standalone module can be depended upon via Maven as per normal. 





[jira] [Assigned] (NIFI-4426) Remove custom jBCrypt implementation because Java 7 is no longer supported

2018-08-30 Thread Andy LoPresto (JIRA)


 [ 
https://issues.apache.org/jira/browse/NIFI-4426?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Andy LoPresto reassigned NIFI-4426:
---

Assignee: Nathan Gough  (was: Andy LoPresto)

> Remove custom jBCrypt implementation because Java 7 is no longer supported
> --
>
> Key: NIFI-4426
> URL: https://issues.apache.org/jira/browse/NIFI-4426
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Core Framework
>Affects Versions: 1.4.0
>Reporter: Andy LoPresto
>Assignee: Nathan Gough
>Priority: Minor
>  Labels: security
> Attachments: Screen Shot 2018-08-28 at 6.38.16 PM.png, Screen Shot 
> 2018-08-28 at 6.38.31 PM.png
>
>
> The {{jBCrypt}} library is included and slightly modified in order to provide 
> Java 7 compatibility because the external module is compiled for Java 8. Now 
> that NiFi doesn't support Java 7, this modification can be removed and the 
> standalone module can be depended upon via Maven as per normal. 





[jira] [Commented] (NIFI-4426) Remove custom jBCrypt implementation because Java 7 is no longer supported

2018-08-30 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-4426?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16597812#comment-16597812
 ] 

ASF GitHub Bot commented on NIFI-4426:
--

Github user alopresto commented on the issue:

https://github.com/apache/nifi/pull/2976
  
Ran `contrib-check` and all tests pass. +1, merging. 


> Remove custom jBCrypt implementation because Java 7 is no longer supported
> --
>
> Key: NIFI-4426
> URL: https://issues.apache.org/jira/browse/NIFI-4426
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Core Framework
>Affects Versions: 1.4.0
>Reporter: Andy LoPresto
>Assignee: Andy LoPresto
>Priority: Minor
>  Labels: security
> Attachments: Screen Shot 2018-08-28 at 6.38.16 PM.png, Screen Shot 
> 2018-08-28 at 6.38.31 PM.png
>
>
> The {{jBCrypt}} library is included and slightly modified in order to provide 
> Java 7 compatibility because the external module is compiled for Java 8. Now 
> that NiFi doesn't support Java 7, this modification can be removed and the 
> standalone module can be depended upon via Maven as per normal. 





[GitHub] nifi issue #2976: NIFI-4426 - Replaced Java7 jBCrypt implementation which wa...

2018-08-30 Thread alopresto
Github user alopresto commented on the issue:

https://github.com/apache/nifi/pull/2976
  
Ran `contrib-check` and all tests pass. +1, merging. 


---


[jira] [Commented] (NIFI-4426) Remove custom jBCrypt implementation because Java 7 is no longer supported

2018-08-30 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-4426?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16597810#comment-16597810
 ] 

ASF GitHub Bot commented on NIFI-4426:
--

Github user asfgit closed the pull request at:

https://github.com/apache/nifi/pull/2976


> Remove custom jBCrypt implementation because Java 7 is no longer supported
> --
>
> Key: NIFI-4426
> URL: https://issues.apache.org/jira/browse/NIFI-4426
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Core Framework
>Affects Versions: 1.4.0
>Reporter: Andy LoPresto
>Assignee: Andy LoPresto
>Priority: Minor
>  Labels: security
> Attachments: Screen Shot 2018-08-28 at 6.38.16 PM.png, Screen Shot 
> 2018-08-28 at 6.38.31 PM.png
>
>
> The {{jBCrypt}} library is included and slightly modified in order to provide 
> Java 7 compatibility because the external module is compiled for Java 8. Now 
> that NiFi doesn't support Java 7, this modification can be removed and the 
> standalone module can be depended upon via Maven as per normal. 





[jira] [Commented] (NIFI-4426) Remove custom jBCrypt implementation because Java 7 is no longer supported

2018-08-30 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-4426?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16597809#comment-16597809
 ] 

ASF subversion and git services commented on NIFI-4426:
---

Commit c9267347edf6664a65317cfe4498d190ac9e7a28 in nifi's branch 
refs/heads/master from thenatog
[ https://git-wip-us.apache.org/repos/asf?p=nifi.git;h=c926734 ]

NIFI-4426 - Replaced Java7 jBCrypt implementation which was made for Java7 
backwards compatibility. It now uses a normal maven import to provide jBCrypt.

This closes #2976.

Signed-off-by: Andy LoPresto 


> Remove custom jBCrypt implementation because Java 7 is no longer supported
> --
>
> Key: NIFI-4426
> URL: https://issues.apache.org/jira/browse/NIFI-4426
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Core Framework
>Affects Versions: 1.4.0
>Reporter: Andy LoPresto
>Assignee: Andy LoPresto
>Priority: Minor
>  Labels: security
> Attachments: Screen Shot 2018-08-28 at 6.38.16 PM.png, Screen Shot 
> 2018-08-28 at 6.38.31 PM.png
>
>
> The {{jBCrypt}} library is included and slightly modified in order to provide 
> Java 7 compatibility because the external module is compiled for Java 8. Now 
> that NiFi doesn't support Java 7, this modification can be removed and the 
> standalone module can be depended upon via Maven as per normal. 





[GitHub] nifi pull request #2976: NIFI-4426 - Replaced Java7 jBCrypt implementation w...

2018-08-30 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/nifi/pull/2976


---


[jira] [Updated] (NIFI-5561) Add component name filtering to S2S Provenance Reporting Task

2018-08-30 Thread Andy LoPresto (JIRA)


 [ 
https://issues.apache.org/jira/browse/NIFI-5561?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Andy LoPresto updated NIFI-5561:

   Resolution: Fixed
Fix Version/s: 1.8.0
   Status: Resolved  (was: Patch Available)

> Add component name filtering to S2S Provenance Reporting Task
> -
>
> Key: NIFI-5561
> URL: https://issues.apache.org/jira/browse/NIFI-5561
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Reporter: Pierre Villard
>Assignee: Pierre Villard
>Priority: Major
> Fix For: 1.8.0
>
>
> I'd like to add component name as a way to filter events sent by the 
> SiteToSite Provenance Reporting task so that, for example, all events 
> generated by components containing "Prov" in the name are picked.
> This will be much easier to manage rather than component IDs as the ID of a 
> component could change when a workflow is promoted from one environment to 
> another in a CI/CD pipeline.





[jira] [Commented] (NIFI-5561) Add component name filtering to S2S Provenance Reporting Task

2018-08-30 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5561?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16597765#comment-16597765
 ] 

ASF GitHub Bot commented on NIFI-5561:
--

Github user asfgit closed the pull request at:

https://github.com/apache/nifi/pull/2973


> Add component name filtering to S2S Provenance Reporting Task
> -
>
> Key: NIFI-5561
> URL: https://issues.apache.org/jira/browse/NIFI-5561
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Reporter: Pierre Villard
>Assignee: Pierre Villard
>Priority: Major
>
> I'd like to add component name as a way to filter events sent by the 
> SiteToSite Provenance Reporting task so that, for example, all events 
> generated by components containing "Prov" in the name are picked.
> This will be much easier to manage rather than component IDs as the ID of a 
> component could change when a workflow is promoted from one environment to 
> another in a CI/CD pipeline.





[GitHub] nifi pull request #2971: NIFI-5557: handling expired ticket by rollback and ...

2018-08-30 Thread jtstorck
Github user jtstorck commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2971#discussion_r214136451
  
--- Diff: 
nifi-nar-bundles/nifi-hadoop-bundle/nifi-hdfs-processors/src/main/java/org/apache/nifi/processors/hadoop/PutHDFS.java
 ---
@@ -266,6 +268,13 @@ public Object run() {
 throw new 
IOException(configuredRootDirPath.toString() + " could not be created");
 }
 changeOwner(context, hdfs, configuredRootDirPath, 
flowFile);
+} catch (IOException e) {
+if (!Strings.isNullOrEmpty(e.getMessage()) && 
e.getMessage().contains(String.format("Couldn't setup connection for %s", 
ugi.getUserName()))) {
--- End diff --

@ekovacs I think we should be more selective in this check.  I don't think 
there's a better way to detect this error scenario than string matching at this 
point, but the exception stack should be inspected to see if you can find the 
GSSException as the root cause:
`Caused by: org.ietf.jgss.GSSException: No valid credentials provided 
(Mechanism level: Failed to find any Kerberos tgt) `
If you iterate through the causes when PutHDFS encounters an IOException, 
and see that GSSException, we can do a penalize with a session rollback.
Otherwise, we'd want to pass the flowfile to the failure relationship.
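The cause-chain walk suggested above could be sketched roughly like this (a minimal illustration, not the actual PR code; the method name and surrounding handling are assumptions):

```java
import org.ietf.jgss.GSSException;

public class GssCauseCheck {
    // Walk the exception's cause chain looking for a GSSException root cause.
    // True  -> treat as expired/missing Kerberos ticket: penalize + session rollback.
    // False -> any other IOException: route the flowfile to the failure relationship.
    static boolean hasGssCause(Throwable t) {
        for (Throwable c = t; c != null; c = c.getCause()) {
            if (c instanceof GSSException) {
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        Throwable kerberosFailure = new java.io.IOException(
                "Couldn't setup connection",
                new GSSException(GSSException.NO_CRED));
        System.out.println(hasGssCause(kerberosFailure));                      // true
        System.out.println(hasGssCause(new java.io.IOException("disk full"))); // false
    }
}
```

This avoids matching on the exception message string, which is what the review comment cautions against.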


---


[jira] [Commented] (NIFI-5557) PutHDFS "GSSException: No valid credentials provided" when krb ticket expires

2018-08-30 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5557?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16597766#comment-16597766
 ] 

ASF GitHub Bot commented on NIFI-5557:
--

Github user jtstorck commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2971#discussion_r214136451
  
--- Diff: 
nifi-nar-bundles/nifi-hadoop-bundle/nifi-hdfs-processors/src/main/java/org/apache/nifi/processors/hadoop/PutHDFS.java
 ---
@@ -266,6 +268,13 @@ public Object run() {
 throw new 
IOException(configuredRootDirPath.toString() + " could not be created");
 }
 changeOwner(context, hdfs, configuredRootDirPath, 
flowFile);
+} catch (IOException e) {
+if (!Strings.isNullOrEmpty(e.getMessage()) && 
e.getMessage().contains(String.format("Couldn't setup connection for %s", 
ugi.getUserName()))) {
--- End diff --

@ekovacs I think we should be more selective in this check.  I don't think 
there's a better way to detect this error scenario than string matching at this 
point, but the exception stack should be inspected to see if you can find the 
GSSException as the root cause:
`Caused by: org.ietf.jgss.GSSException: No valid credentials provided 
(Mechanism level: Failed to find any Kerberos tgt) `
If you iterate through the causes when PutHDFS encounters an IOException, 
and see that GSSException, we can do a penalize with a session rollback.
Otherwise, we'd want to pass the flowfile to the failure relationship.


> PutHDFS "GSSException: No valid credentials provided" when krb ticket expires
> -
>
> Key: NIFI-5557
> URL: https://issues.apache.org/jira/browse/NIFI-5557
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 1.5.0
>Reporter: Endre Kovacs
>Assignee: Endre Kovacs
>Priority: Major
>
> when using *PutHDFS* processor in a kerberized environment, with a flow 
> "traffic" which approximately matches or less frequent then the lifetime of 
> the ticket of the principal, we see this in the log:
> {code:java}
> INFO [Timer-Driven Process Thread-4] o.a.h.io.retry.RetryInvocationHandler 
> Exception while invoking getFileInfo of class 
> ClientNamenodeProtocolTranslatorPB over host2/ip2:8020 after 13 fail over 
> attempts. Trying to fail over immediately.
> java.io.IOException: Failed on local exception: java.io.IOException: Couldn't 
> setup connection for princi...@example.com to host2.example.com/ip2:8020; 
> Host Details : local host is: "host1.example.com/ip1"; destination host is: 
> "host2.example.com":8020; 
> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:776)
> at org.apache.hadoop.ipc.Client.call(Client.java:1479)
> at org.apache.hadoop.ipc.Client.call(Client.java:1412)
> at 
> org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
> at com.sun.proxy.$Proxy134.getFileInfo(Unknown Source)
> at 
> org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:771)
> at sun.reflect.GeneratedMethodAccessor344.invoke(Unknown Source)
> at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at 
> org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
> at 
> org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
> at com.sun.proxy.$Proxy135.getFileInfo(Unknown Source)
> at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2108)
> at 
> org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1305)
> at 
> org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1301)
> at 
> org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
> at 
> org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1317)
> at org.apache.nifi.processors.hadoop.PutHDFS$1.run(PutHDFS.java:254)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:360)
> at 
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1678)
> at org.apache.nifi.processors.hadoop.PutHDFS.onTrigger(PutHDFS.java:222)
> {code}
> and the flowfile is routed to failure relationship.
> *To reproduce:*
> Create a principal in your KDC with two minutes ticket lifetime,
> and set up a similar flow:
> {code:java}
> GetFile => putHDFS - success- -> logAttributes
> \
>  fail
>   

[GitHub] nifi pull request #2973: NIFI-5561 - Add component name filtering to S2S Pro...

2018-08-30 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/nifi/pull/2973


---


[jira] [Commented] (NIFI-5561) Add component name filtering to S2S Provenance Reporting Task

2018-08-30 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5561?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16597764#comment-16597764
 ] 

ASF subversion and git services commented on NIFI-5561:
---

Commit cbd942df10440405927f80bc71040f6218610ebc in nifi's branch 
refs/heads/master from [~pvillard]
[ https://git-wip-us.apache.org/repos/asf?p=nifi.git;h=cbd942d ]

NIFI-5561 - Add component name filtering to S2S Provenance Reporting Task.
Added regression test for ProvenanceEventConsumer#isFilteringEnabled().
Changed isFilteringEnabled implementation to be expandable as other attributes 
are added using Streams.
EL + indentation.

This closes #2973.

Co-authored-by: Andy LoPresto 
Signed-off-by: Andy LoPresto 
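The "expandable using Streams" shape the commit message describes might look something like the following (a hypothetical sketch; the real ProvenanceEventConsumer signature and filter lists may differ). Filtering is enabled when any configured filter list is non-empty, and supporting a new filter attribute means adding one more element to the Stream:

```java
import java.util.List;
import java.util.regex.Pattern;
import java.util.stream.Stream;

public class FilterCheck {
    // Hypothetical Stream-based isFilteringEnabled(): true if any filter is set.
    static boolean isFilteringEnabled(List<Pattern> componentNameRegexes,
                                      List<String> componentTypes,
                                      List<String> componentIds) {
        return Stream.of(componentNameRegexes, componentTypes, componentIds)
                .anyMatch(list -> !list.isEmpty());
    }

    public static void main(String[] args) {
        // A single name filter such as ".*Prov.*" enables filtering:
        System.out.println(isFilteringEnabled(
                List.of(Pattern.compile(".*Prov.*")), List.of(), List.of())); // true
        // No filters configured -> all events are sent:
        System.out.println(isFilteringEnabled(
                List.of(), List.of(), List.of()));                            // false
    }
}
```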


> Add component name filtering to S2S Provenance Reporting Task
> -
>
> Key: NIFI-5561
> URL: https://issues.apache.org/jira/browse/NIFI-5561
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Reporter: Pierre Villard
>Assignee: Pierre Villard
>Priority: Major
>
> I'd like to add component name as a way to filter events sent by the 
> SiteToSite Provenance Reporting task so that, for example, all events 
> generated by components containing "Prov" in the name are picked.
> This will be much easier to manage rather than component IDs as the ID of a 
> component could change when a workflow is promoted from one environment to 
> another in a CI/CD pipeline.





[jira] [Commented] (NIFI-5552) CSVReader Can't Derive Schema from Quoted Headers

2018-08-30 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5552?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16597726#comment-16597726
 ] 

ASF GitHub Bot commented on NIFI-5552:
--

Github user markap14 commented on the issue:

https://github.com/apache/nifi/pull/2966
  
@pvillard31 we don't want to change `recordField.getFieldName()` - that is 
the NiFi class and has nothing to do with Avro. We really want to avoid 
enforcing any Avro naming peculiarities on NiFi - we should only enforce them 
for Avro, so I think the `AvroTypeUtil` is the right place to put this. You 
should not need to worry about breaking backward compatibility, because the 
`normalizeNameForAvro` method should make no changes to the name unless the 
name is invalid - in which case it would have thrown an Exception previously.
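To make the constraint concrete: Avro names must start with a letter or underscore and may contain only letters, digits, and underscores. A normalization along the lines being discussed could look like this (method name taken from the comment above; the actual AvroTypeUtil logic may differ):

```java
public class AvroNames {
    // Replace characters that are illegal in Avro names with '_', and prefix
    // with '_' if the result would start with a digit. Valid names pass
    // through unchanged, which is why backward compatibility is preserved.
    static String normalizeNameForAvro(String fieldName) {
        String normalized = fieldName.replaceAll("[^A-Za-z0-9_]", "_");
        if (Character.isDigit(normalized.charAt(0))) {
            normalized = "_" + normalized;
        }
        return normalized;
    }

    public static void main(String[] args) {
        // The quoted CSV header from the ticket, "Flood", becomes a legal name:
        System.out.println(normalizeNameForAvro("\"Flood\"")); // _Flood_
        // An already-valid name is returned unchanged:
        System.out.println(normalizeNameForAvro("Flood"));     // Flood
    }
}
```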


> CSVReader Can't Derive Schema from Quoted Headers
> -
>
> Key: NIFI-5552
> URL: https://issues.apache.org/jira/browse/NIFI-5552
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Extensions
>Affects Versions: 1.7.1
>Reporter: Shawn Weeks
>Assignee: Pierre Villard
>Priority: Minor
>
> When deriving the schema from a CSV File Header NiFi is unable to generate a 
> valid schema if the Header Columns are Double Quoted even though the 
> CSVReader is set to handle quotes. Using the nile.csv sample file from 
> https://people.sc.fsu.edu/~jburkardt/data/csv/csv.html results in an Illegal 
> initial character exception in the Avro Schema generator. In this specific 
> case the header did not contain any spaces or special characters though it 
> was case sensitive.
> {code:java}
> org.apache.avro.SchemaParseException: Illegal initial character: "Flood"
> at org.apache.avro.Schema.validateName(Schema.java:1147)
> at org.apache.avro.Schema.access$200(Schema.java:81)
> at org.apache.avro.Schema$Field.<init>(Schema.java:403)
> at org.apache.avro.Schema$Field.<init>(Schema.java:423)
> at org.apache.avro.Schema$Field.<init>(Schema.java:415)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroField(AvroTypeUtil.java:123)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:114)
> at org.apache.nifi.avro.AvroTypeUtil.extractAvroSchema(AvroTypeUtil.java:94)
> at 
> org.apache.nifi.schema.access.WriteAvroSchemaAttributeStrategy.getAttributes(WriteAvroSchemaAttributeStrategy.java:58)
> at org.apache.nifi.json.WriteJsonResult.writeRecord(WriteJsonResult.java:137)
> at 
> org.apache.nifi.serialization.AbstractRecordSetWriter.write(AbstractRecordSetWriter.java:59)
> at 
> org.apache.nifi.processors.standard.AbstractRecordProcessor$1.process(AbstractRecordProcessor.java:122)
> at 
> org.apache.nifi.controller.repository.StandardProcessSession.write(StandardProcessSession.java:2885)
> at 
> org.apache.nifi.processors.standard.AbstractRecordProcessor.onTrigger(AbstractRecordProcessor.java:109)
> at 
> org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27)
> at 
> org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1165)
> at 
> org.apache.nifi.controller.tasks.ConnectableTask.invoke(ConnectableTask.java:203)
> at 
> org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:117)
> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
> at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
> at 
> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
> at 
> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
> at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> at java.lang.Thread.run(Thread.java:748){code}





[GitHub] nifi issue #2966: NIFI-5552 - Add option to normalize header column names in...

2018-08-30 Thread markap14
Github user markap14 commented on the issue:

https://github.com/apache/nifi/pull/2966
  
@pvillard31 we don't want to change `recordField.getFieldName()` - that is 
the NiFi class and has nothing to do with Avro. We really want to avoid 
enforcing any Avro naming peculiarities on NiFi - we should only enforce them 
for Avro, so I think the `AvroTypeUtil` is the right place to put this. You 
should not need to worry about breaking backward compatibility, because the 
`normalizeNameForAvro` method should make no changes to the name unless the 
name is invalid - in which case it would have thrown an Exception previously.


---


[jira] [Commented] (NIFI-5561) Add component name filtering to S2S Provenance Reporting Task

2018-08-30 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5561?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16597680#comment-16597680
 ] 

ASF GitHub Bot commented on NIFI-5561:
--

Github user alopresto commented on the issue:

https://github.com/apache/nifi/pull/2973
  
Thanks Pierre. Running a final check and will merge. 


> Add component name filtering to S2S Provenance Reporting Task
> -
>
> Key: NIFI-5561
> URL: https://issues.apache.org/jira/browse/NIFI-5561
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Reporter: Pierre Villard
>Assignee: Pierre Villard
>Priority: Major
>
> I'd like to add component name as a way to filter events sent by the 
> SiteToSite Provenance Reporting task so that, for example, all events 
> generated by components containing "Prov" in the name are picked.
> This will be much easier to manage rather than component IDs as the ID of a 
> component could change when a workflow is promoted from one environment to 
> another in a CI/CD pipeline.
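The filtering described above can be sketched as a simple regex match over component names. This is an illustrative stand-in, not the actual patch in PR #2973 (class and method names here are hypothetical); to pick components "containing" a substring with `matches()`, the configured expression would wrap it as `.*Prov.*`:

```java
import java.util.List;
import java.util.regex.Pattern;
import java.util.stream.Collectors;

class ProvenanceNameFilter {
    private final Pattern pattern;

    ProvenanceNameFilter(String regex) {
        this.pattern = Pattern.compile(regex);
    }

    // Keep only events whose component name matches the configured regex.
    List<String> matching(List<String> componentNames) {
        return componentNames.stream()
                .filter(name -> pattern.matcher(name).matches())
                .collect(Collectors.toList());
    }
}
```

Filtering by name rather than ID survives environment promotion because the regex is stable even when component UUIDs change.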







[jira] [Commented] (NIFI-5561) Add component name filtering to S2S Provenance Reporting Task

2018-08-30 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5561?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16597676#comment-16597676
 ] 

ASF GitHub Bot commented on NIFI-5561:
--

Github user alopresto commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2973#discussion_r214106940
  
--- Diff: 
nifi-nar-bundles/nifi-site-to-site-reporting-bundle/nifi-site-to-site-reporting-task/src/main/java/org/apache/nifi/reporting/SiteToSiteProvenanceReportingTask.java
 ---
@@ -151,6 +151,25 @@
 .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
 .build();
 
+static final PropertyDescriptor FILTER_COMPONENT_NAME = new 
PropertyDescriptor.Builder()
+.name("s2s-prov-task-name-filter")
+.displayName("Component Name to Include")
+.description("Regular expression to filter the provenance events 
based on the component name. Only the events matching the regular "
++ "expression will be sent. If no filter is set, all the 
events are sent. If multiple filters are set, the filters are cumulative.")
+.required(false)
+.addValidator(StandardValidators.REGULAR_EXPRESSION_VALIDATOR)
--- End diff --

That's an excellent point about the VR scoping that I forgot. I think this 
is nominally more "consistent" across the app, but you're right that it's not 
as valuable as I expected. Thanks. 


> Add component name filtering to S2S Provenance Reporting Task
> -
>
> Key: NIFI-5561
> URL: https://issues.apache.org/jira/browse/NIFI-5561
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Reporter: Pierre Villard
>Assignee: Pierre Villard
>Priority: Major
>
> I'd like to add component name as a way to filter events sent by the 
> SiteToSite Provenance Reporting task so that, for example, all events 
> generated by components containing "Prov" in the name are picked.
> This will be much easier to manage rather than component IDs as the ID of a 
> component could change when a workflow is promoted from one environment to 
> another in a CI/CD pipeline.







[jira] [Commented] (NIFI-4426) Remove custom jBCrypt implementation because Java 7 is no longer supported

2018-08-30 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-4426?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16597635#comment-16597635
 ] 

ASF GitHub Bot commented on NIFI-4426:
--

Github user alopresto commented on the issue:

https://github.com/apache/nifi/pull/2976
  
Reviewing...


> Remove custom jBCrypt implementation because Java 7 is no longer supported
> --
>
> Key: NIFI-4426
> URL: https://issues.apache.org/jira/browse/NIFI-4426
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Core Framework
>Affects Versions: 1.4.0
>Reporter: Andy LoPresto
>Assignee: Andy LoPresto
>Priority: Minor
>  Labels: security
> Attachments: Screen Shot 2018-08-28 at 6.38.16 PM.png, Screen Shot 
> 2018-08-28 at 6.38.31 PM.png
>
>
> The {{jBCrypt}} library is included and slightly modified in order to provide 
> Java 7 compatibility because the external module is compiled for Java 8. Now 
> that NiFi doesn't support Java 7, this modification can be removed and the 
> standalone module can be depended upon via Maven as per normal. 







[jira] [Created] (NIFI-5565) Alter the Nifi Developer's Guide to better reference the existing Confluence documentation

2018-08-30 Thread Nicholas Lewis (JIRA)
Nicholas Lewis created NIFI-5565:


 Summary: Alter the Nifi Developer's Guide to better reference the 
existing Confluence documentation
 Key: NIFI-5565
 URL: https://issues.apache.org/jira/browse/NIFI-5565
 Project: Apache NiFi
  Issue Type: Improvement
  Components: Documentation & Website
Reporter: Nicholas Lewis


It was noted that the [confluence 
documentation|https://cwiki.apache.org/confluence/display/NIFI/Contributor+Guide]
 could get more traffic by being better referenced from the [developer's 
guide|https://nifi.apache.org/docs/nifi-docs/html/developer-guide.html#how-to-contribute-to-apache-nifi].
 At the moment it is just linked to in the community drop down menu.





[jira] [Commented] (NIFI-5562) Upgrade Guava dependencies

2018-08-30 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5562?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16597545#comment-16597545
 ] 

ASF GitHub Bot commented on NIFI-5562:
--

GitHub user thenatog opened a pull request:

https://github.com/apache/nifi/pull/2977

NIFI-5562 - Upgraded guava versions from v18.0 to v25.1. Verified all…

… tests work as expected except for 1.

NIFI-5562 - Upgraded to Guava 26.0-jre which fixes a cache eviction bug for 
Cache.asMap.compute* method. This caused a failing test in NiFi's 
TestRouteText.java.

NIFI-5562 - Cleaning up some stuff.

Thank you for submitting a contribution to Apache NiFi.

In order to streamline the review of the contribution we ask you
to ensure the following steps have been taken:

### For all changes:
- [x] Is there a JIRA ticket associated with this PR? Is it referenced 
 in the commit message?

- [x] Does your PR title start with NIFI-XXXX where XXXX is the JIRA number 
you are trying to resolve? Pay particular attention to the hyphen "-" character.

- [x] Has your PR been rebased against the latest commit within the target 
branch (typically master)?

- [x] Is your initial contribution a single, squashed commit?

### For code changes:
- [x] Have you ensured that the full suite of tests is executed via mvn 
-Pcontrib-check clean install at the root nifi folder?
- [ ] Have you written or updated unit tests to verify your changes?
- [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)? 
- [ ] If applicable, have you updated the LICENSE file, including the main 
LICENSE file under nifi-assembly?
- [ ] If applicable, have you updated the NOTICE file, including the main 
NOTICE file found under nifi-assembly?
- [ ] If adding new Properties, have you added .displayName in addition to 
.name (programmatic access) for each of the new properties?

### For documentation related changes:
- [ ] Have you ensured that format looks appropriate for the output in 
which it is rendered?

### Note:
Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/thenatog/nifi NIFI-5562-rebased

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/2977.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2977


commit 43604833545bde87b8f32c20751b6235194747e0
Author: thenatog 
Date:   2018-07-24T20:28:29Z

NIFI-5562 - Upgraded guava versions from v18.0 to v25.1. Verified all tests 
work as expected except for 1.

NIFI-5562 - Upgraded to Guava 26.0-jre which fixes a cache eviction bug for 
Cache.asMap.compute* method. This caused a failing test in NiFi's 
TestRouteText.java.

NIFI-5562 - Cleaning up some stuff.




> Upgrade Guava dependencies
> --
>
> Key: NIFI-5562
> URL: https://issues.apache.org/jira/browse/NIFI-5562
> Project: Apache NiFi
>  Issue Type: Improvement
>Reporter: Nathan Gough
>Assignee: Nathan Gough
>Priority: Major
>
> A lot of the current Guava dependency versions are v18. Upgrade dependencies 
> from v18 to 25.1-jre and test.
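The `Cache.asMap().compute*` behavior at the center of this upgrade follows the standard `ConcurrentMap.compute` contract, since Guava's `Cache.asMap()` exposes a `ConcurrentMap` view of the cache. The sketch below illustrates that contract with the JDK's `ConcurrentHashMap` (the buggy interaction with size-based eviction, fixed per the commit message in Guava 26.0-jre, is specific to Guava's cache view and not reproduced here):

```java
import java.util.concurrent.ConcurrentMap;

class ComputeDemo {
    // Atomic read-modify-write via ConcurrentMap.compute: the remapping
    // function sees null on first access and the current value afterwards.
    // Guava's Cache.asMap() view offers the same compute/computeIfPresent/
    // computeIfAbsent contract on cache entries.
    static int increment(ConcurrentMap<String, Integer> counters, String key) {
        return counters.compute(key, (k, v) -> v == null ? 1 : v + 1);
    }
}
```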







[jira] [Commented] (NIFI-5542) Add support for node groups to FileAccessPolicyProvider

2018-08-30 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5542?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16597482#comment-16597482
 ] 

ASF GitHub Bot commented on NIFI-5542:
--

Github user achristianson commented on the issue:

https://github.com/apache/nifi/pull/2970
  
The group id lookup logic was running even if group name was not specified. 
Fixed that. All tests are passing locally.


> Add support for node groups to FileAccessPolicyProvider
> ---
>
> Key: NIFI-5542
> URL: https://issues.apache.org/jira/browse/NIFI-5542
> Project: Apache NiFi
>  Issue Type: Improvement
>Reporter: Andrew Christianson
>Priority: Major
>
> Currently in FileAccessPolicyProvider, it is possible to specify a set of 
> node identities, which are given access to /proxy. This works well for static 
> clusters, but does not work so well for dynamic clusters (scaling up/down # 
> of nodes) because we don't know in advance what the node identities will be 
> or how many there will be.
> In order to support dynamic sets of node identities, add support for 
> specifying a "Node Group," for which all identities in the group will be 
> granted access to /proxy. A UserGroupProvider can then be implemented to 
> gather node identities dynamically from the cluster environment.
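The core of the idea above is resolving one configured group name to its current members, instead of enumerating node identities up front. A minimal sketch, with assumed names (this is not the code in PR #2970; in NiFi the membership lookup would go through a `UserGroupProvider`):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.Set;

class NodeGroupPolicySketch {
    // Resolve the configured node-group name to its member identities; each
    // member would then be granted the /proxy policy. Because membership is
    // looked up at configuration time, the cluster can scale up or down
    // without the policy file listing individual node identities.
    static List<String> proxyIdentities(String groupName,
                                        Map<String, Set<String>> groupsByName) {
        Set<String> members = groupsByName.get(groupName);
        if (members == null) {
            return List.of(); // group not configured or unknown: grant nothing extra
        }
        return new ArrayList<>(members);
    }
}
```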







[jira] [Commented] (NIFI-5542) Add support for node groups to FileAccessPolicyProvider

2018-08-30 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5542?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16597473#comment-16597473
 ] 

ASF GitHub Bot commented on NIFI-5542:
--

Github user pepov commented on the issue:

https://github.com/apache/nifi/pull/2970
  
Some unit tests are failing according to the travis build that are relevant:
```
[ERROR] 
testOnConfiguredWhenInitialAdminAndLegacyUsersProvided(org.apache.nifi.authorization.FileAccessPolicyProviderTest)
  Time elapsed: 0.114 s  <<< ERROR!
java.lang.Exception: Unexpected exception, 
expected 
but was
at 
org.apache.nifi.authorization.FileAccessPolicyProviderTest.testOnConfiguredWhenInitialAdminAndLegacyUsersProvided(FileAccessPolicyProviderTest.java:523)

[ERROR] 
testOnConfiguredWhenBadLegacyUsersFileProvided(org.apache.nifi.authorization.FileAccessPolicyProviderTest)
  Time elapsed: 0.036 s  <<< ERROR!
java.lang.Exception: Unexpected exception, 
expected 
but was
at 
org.apache.nifi.authorization.FileAccessPolicyProviderTest.testOnConfiguredWhenBadLegacyUsersFileProvided(FileAccessPolicyProviderTest.java:509)
```
https://api.travis-ci.org/v3/job/422545351/log.txt


> Add support for node groups to FileAccessPolicyProvider
> ---
>
> Key: NIFI-5542
> URL: https://issues.apache.org/jira/browse/NIFI-5542
> Project: Apache NiFi
>  Issue Type: Improvement
>Reporter: Andrew Christianson
>Priority: Major
>
> Currently in FileAccessPolicyProvider, it is possible to specify a set of 
> node identities, which are given access to /proxy. This works well for static 
> clusters, but does not work so well for dynamic clusters (scaling up/down # 
> of nodes) because we don't know in advance what the node identities will be 
> or how many there will be.
> In order to support dynamic sets of node identities, add support for 
> specifying a "Node Group," for which all identities in the group will be 
> granted access to /proxy. A UserGroupProvider can then be implemented to 
> gather node identities dynamically from the cluster environment.







[jira] [Commented] (NIFI-4426) Remove custom jBCrypt implementation because Java 7 is no longer supported

2018-08-30 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-4426?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16597467#comment-16597467
 ] 

ASF GitHub Bot commented on NIFI-4426:
--

GitHub user thenatog opened a pull request:

https://github.com/apache/nifi/pull/2976

NIFI-4426 - Replaced Java7 jBCrypt implementation which was made for …

…Java7 backwards compatibility. It now uses a normal maven import to 
provide jBCrypt.

Thank you for submitting a contribution to Apache NiFi.

In order to streamline the review of the contribution we ask you
to ensure the following steps have been taken:

### For all changes:
- [x] Is there a JIRA ticket associated with this PR? Is it referenced 
 in the commit message?

- [x] Does your PR title start with NIFI-XXXX where XXXX is the JIRA number 
you are trying to resolve? Pay particular attention to the hyphen "-" character.

- [x] Has your PR been rebased against the latest commit within the target 
branch (typically master)?

- [x] Is your initial contribution a single, squashed commit?

### For code changes:
- [x] Have you ensured that the full suite of tests is executed via mvn 
-Pcontrib-check clean install at the root nifi folder?
- [ ] Have you written or updated unit tests to verify your changes?
- [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)? 
- [ ] If applicable, have you updated the LICENSE file, including the main 
LICENSE file under nifi-assembly?
- [ ] If applicable, have you updated the NOTICE file, including the main 
NOTICE file found under nifi-assembly?
- [ ] If adding new Properties, have you added .displayName in addition to 
.name (programmatic access) for each of the new properties?

### For documentation related changes:
- [ ] Have you ensured that format looks appropriate for the output in 
which it is rendered?

### Note:
Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/thenatog/nifi NIFI-4426

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/2976.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2976


commit 42a98b23a6b5703ee3c59a88cd254ed0c26906dc
Author: thenatog 
Date:   2018-08-29T16:35:21Z

NIFI-4426 - Replaced Java7 jBCrypt implementation which was made for Java7 
backwards compatibility. It now uses a normal maven import to provide jBCrypt.




> Remove custom jBCrypt implementation because Java 7 is no longer supported
> --
>
> Key: NIFI-4426
> URL: https://issues.apache.org/jira/browse/NIFI-4426
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Core Framework
>Affects Versions: 1.4.0
>Reporter: Andy LoPresto
>Assignee: Andy LoPresto
>Priority: Minor
>  Labels: security
> Attachments: Screen Shot 2018-08-28 at 6.38.16 PM.png, Screen Shot 
> 2018-08-28 at 6.38.31 PM.png
>
>
> The {{jBCrypt}} library is included and slightly modified in order to provide 
> Java 7 compatibility because the external module is compiled for Java 8. Now 
> that NiFi doesn't support Java 7, this modification can be removed and the 
> standalone module can be depended upon via Maven as per normal. 







[jira] [Commented] (NIFI-5248) Create new put processors that use the ElasticSearch client service

2018-08-30 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5248?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16597457#comment-16597457
 ] 

ASF GitHub Bot commented on NIFI-5248:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2861#discussion_r214032109
  
--- Diff: 
nifi-nar-bundles/nifi-elasticsearch-bundle/nifi-elasticsearch-restapi-processors/src/main/java/org/apache/nifi/processors/elasticsearch/PutElasticsearchJson.java
 ---
@@ -0,0 +1,321 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.processors.elasticsearch;
+
+import com.fasterxml.jackson.core.JsonProcessingException;
+import com.fasterxml.jackson.databind.ObjectMapper;
+import com.jayway.jsonpath.JsonPath;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.ValidationContext;
+import org.apache.nifi.components.ValidationResult;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.elasticsearch.ElasticSearchClientService;
+import org.apache.nifi.elasticsearch.ElasticSearchError;
+import org.apache.nifi.elasticsearch.IndexOperationRequest;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import 
org.apache.nifi.processors.elasticsearch.put.FlowFileJsonDescription;
+import org.apache.nifi.processors.elasticsearch.put.JsonProcessingError;
+import org.apache.nifi.util.StringUtils;
+
+import java.io.ByteArrayOutputStream;
+import java.io.IOException;
+import java.util.ArrayList;
+import java.util.Collection;
+import java.util.Collections;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import java.util.stream.Collectors;
+
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
+@Tags({"json", "elasticsearch", "5x", "6x", "put", "index"})
--- End diff --

Fixed.


> Create new put processors that use the ElasticSearch client service
> ---
>
> Key: NIFI-5248
> URL: https://issues.apache.org/jira/browse/NIFI-5248
> Project: Apache NiFi
>  Issue Type: New Feature
>Reporter: Mike Thomsen
>Assignee: Mike Thomsen
>Priority: Major
>
> Two new processors:
>  * PutElasticsearchJson - put raw JSON.
>  * PutElasticsearchRecord - put records.
> Both of them should support the general bulk load API and be able to do 
> things like insert into multiple indexes from one payload.







[jira] [Created] (NIFI-5564) Add support for decompression when using ListenTCPRecord

2018-08-30 Thread Bryan Bende (JIRA)
Bryan Bende created NIFI-5564:
-

 Summary: Add support for decompression when using ListenTCPRecord
 Key: NIFI-5564
 URL: https://issues.apache.org/jira/browse/NIFI-5564
 Project: Apache NiFi
  Issue Type: Improvement
Affects Versions: 1.7.1
Reporter: Bryan Bende


We have CompressContent processor which can be used before a processor like 
PutTCP, but on the receiving side ListenTCPRecord will not be able to read 
records since the data is still compressed. ListenTCPRecord should support 
decompressing data.
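
The decompression support described above essentially amounts to wrapping the incoming socket stream before the record reader consumes it. A stdlib-only sketch of that idea (the class, method, and "Compression Format" property names are hypothetical, not NiFi API):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.charset.StandardCharsets;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class DecompressSketch {
    // Wrap the raw stream with a GZIP decoder; a real processor would
    // switch on a configured compression-format property (NONE, GZIP, ...).
    public static InputStream wrap(InputStream raw, String format) throws IOException {
        return "GZIP".equals(format) ? new GZIPInputStream(raw) : raw;
    }

    // Helper used below: gzip a string so wrap() can be exercised end to end.
    public static byte[] gzip(String s) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (OutputStream gz = new GZIPOutputStream(bos)) {
            gz.write(s.getBytes(StandardCharsets.UTF_8));
        }
        return bos.toByteArray();
    }

    public static String readAll(InputStream in) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[1024];
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
        }
        return out.toString("UTF-8");
    }

    public static void main(String[] args) throws IOException {
        byte[] compressed = gzip("record data");
        InputStream wrapped = wrap(new ByteArrayInputStream(compressed), "GZIP");
        System.out.println(readAll(wrapped)); // prints "record data"
    }
}
```

The same wrapper pattern would let the record reader stay unaware of the wire compression.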



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5542) Add support for node groups to FileAccessPolicyProvider

2018-08-30 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5542?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16597431#comment-16597431
 ] 

ASF GitHub Bot commented on NIFI-5542:
--

Github user achristianson commented on the issue:

https://github.com/apache/nifi/pull/2970
  
@kevdoran @pepov it's now looking up the group ID from the given name. The unit 
test has been updated accordingly, and the docs have been updated to cover the new property.


> Add support for node groups to FileAccessPolicyProvider
> ---
>
> Key: NIFI-5542
> URL: https://issues.apache.org/jira/browse/NIFI-5542
> Project: Apache NiFi
>  Issue Type: Improvement
>Reporter: Andrew Christianson
>Priority: Major
>
> Currently in FileAccessPolicyProvider, it is possible to specify a set of 
> node identities, which are given access to /proxy. This works well for static 
> clusters, but does not work so well for dynamic clusters (scaling up/down # 
> of nodes) because we don't know in advance what the node identities will be 
> or how many there will be.
> In order to support dynamic sets of node identities, add support for 
> specifying a "Node Group," for which all identities in the group will be 
> granted access to /proxy. A UserGroupProvider can then be implemented to 
> gather node identities dynamically from the cluster environment.
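
The group-based grant described above can be sketched with plain collections; the map lookup here is hypothetical and stands in for what a real UserGroupProvider would report:

```java
import java.util.Arrays;
import java.util.Collections;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

public class NodeGroupPolicySketch {
    // Instead of enumerating node identities up front, grant /proxy to
    // every identity that the provider reports as a member of the
    // configured node group.
    public static Set<String> proxyIdentities(Map<String, Set<String>> groupMembers,
                                              String nodeGroupName) {
        return groupMembers.getOrDefault(nodeGroupName, Collections.emptySet());
    }

    public static void main(String[] args) {
        Map<String, Set<String>> groups = new HashMap<>();
        groups.put("cluster-nodes", new HashSet<>(Arrays.asList(
                "CN=node-1, OU=NIFI", "CN=node-2, OU=NIFI")));
        System.out.println(proxyIdentities(groups, "cluster-nodes").size()); // prints 2
    }
}
```

Scaling the cluster then only changes what the provider returns, not the policy configuration.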






[jira] [Commented] (NIFI-5544) Refactor GetMongo processor codebase

2018-08-30 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5544?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16597425#comment-16597425
 ] 

ASF GitHub Bot commented on NIFI-5544:
--

Github user zenfenan commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2958#discussion_r214022342
  
--- Diff: 
nifi-nar-bundles/nifi-mongodb-bundle/nifi-mongodb-processors/src/main/java/org/apache/nifi/processors/mongodb/GetMongo.java
 ---
@@ -204,144 +212,145 @@ private ObjectWriter getObjectWriter(ObjectMapper 
mapper, String ppSetting) {
 @Override
 public void onTrigger(final ProcessContext context, final 
ProcessSession session) throws ProcessException {
 FlowFile input = null;
+logger = getLogger();
+
 if (context.hasIncomingConnection()) {
 input = session.get();
-
 if (input == null && context.hasNonLoopConnection()) {
 return;
 }
 }
 
-final ComponentLog logger = getLogger();
+final Document query = getQuery(context, session, input );
 
-Map<String, String> attributes = new HashMap<>();
-attributes.put(CoreAttributes.MIME_TYPE.key(), "application/json");
+if (query == null) {
+return;
+}
 
-final Document query;
+final String jsonTypeSetting = 
context.getProperty(JSON_TYPE).getValue();
+final String usePrettyPrint  = 
context.getProperty(USE_PRETTY_PRINTING).getValue();
 final Charset charset = 
Charset.forName(context.getProperty(CHARSET).evaluateAttributeExpressions(input).getValue());
+final Map<String, String> attributes = new HashMap<>();
 
-String queryStr;
-if (context.getProperty(QUERY).isSet()) {
-queryStr = 
context.getProperty(QUERY).evaluateAttributeExpressions(input).getValue();
-query = Document.parse(queryStr);
-} else if (!context.getProperty(QUERY).isSet() && input == null) {
-queryStr = "{}";
-query = Document.parse("{}");
-} else {
-try {
-ByteArrayOutputStream out = new ByteArrayOutputStream();
-session.exportTo(input, out);
-out.close();
-queryStr = new String(out.toByteArray());
-query = Document.parse(queryStr);
-} catch (Exception ex) {
-getLogger().error("Error reading flowfile", ex);
-if (input != null) { //Likely culprit is a bad query
-session.transfer(input, REL_FAILURE);
-return;
-} else {
-throw new ProcessException(ex);
-}
-}
-}
+attributes.put(CoreAttributes.MIME_TYPE.key(), "application/json");
 
 if (context.getProperty(QUERY_ATTRIBUTE).isSet()) {
 final String queryAttr = 
context.getProperty(QUERY_ATTRIBUTE).evaluateAttributeExpressions(input).getValue();
-attributes.put(queryAttr, queryStr);
+attributes.put(queryAttr, query.toJson());
 }
 
 final Document projection = context.getProperty(PROJECTION).isSet()
 ? 
Document.parse(context.getProperty(PROJECTION).evaluateAttributeExpressions(input).getValue())
 : null;
 final Document sort = context.getProperty(SORT).isSet()
 ? 
Document.parse(context.getProperty(SORT).evaluateAttributeExpressions(input).getValue())
 : null;
-final String jsonTypeSetting = 
context.getProperty(JSON_TYPE).getValue();
-final String usePrettyPrint  = 
context.getProperty(USE_PRETTY_PRINTING).getValue();
-configureMapper(jsonTypeSetting);
 
+final MongoCollection<Document> collection = getCollection(context, input);
+final FindIterable<Document> it = collection.find(query);
 
-try {
-final MongoCollection<Document> collection = getCollection(context, input);
+attributes.put(DB_NAME, 
collection.getNamespace().getDatabaseName());
+attributes.put(COL_NAME, 
collection.getNamespace().getCollectionName());
 
-attributes.put(DB_NAME, 
collection.getNamespace().getDatabaseName());
-attributes.put(COL_NAME, 
collection.getNamespace().getCollectionName());
+if (projection != null) {
+it.projection(projection);
+}
+if (sort != null) {
+it.sort(sort);
+}
+if (context.getProperty(LIMIT).isSet()) {
+
it.limit(context.getProperty(LIMIT).evaluateAttributeExpressions(input).asInteger());
+ 

[GitHub] nifi pull request #2958: NIFI-5544: GetMongo refactored

2018-08-30 Thread zenfenan
Github user zenfenan commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2958#discussion_r214022342
  
--- Diff: 
nifi-nar-bundles/nifi-mongodb-bundle/nifi-mongodb-processors/src/main/java/org/apache/nifi/processors/mongodb/GetMongo.java
 ---
@@ -204,144 +212,145 @@ private ObjectWriter getObjectWriter(ObjectMapper 
mapper, String ppSetting) {
 @Override
 public void onTrigger(final ProcessContext context, final 
ProcessSession session) throws ProcessException {
 FlowFile input = null;
+logger = getLogger();
+
 if (context.hasIncomingConnection()) {
 input = session.get();
-
 if (input == null && context.hasNonLoopConnection()) {
 return;
 }
 }
 
-final ComponentLog logger = getLogger();
+final Document query = getQuery(context, session, input );
 
-Map<String, String> attributes = new HashMap<>();
-attributes.put(CoreAttributes.MIME_TYPE.key(), "application/json");
+if (query == null) {
+return;
+}
 
-final Document query;
+final String jsonTypeSetting = 
context.getProperty(JSON_TYPE).getValue();
+final String usePrettyPrint  = 
context.getProperty(USE_PRETTY_PRINTING).getValue();
 final Charset charset = 
Charset.forName(context.getProperty(CHARSET).evaluateAttributeExpressions(input).getValue());
+final Map<String, String> attributes = new HashMap<>();
 
-String queryStr;
-if (context.getProperty(QUERY).isSet()) {
-queryStr = 
context.getProperty(QUERY).evaluateAttributeExpressions(input).getValue();
-query = Document.parse(queryStr);
-} else if (!context.getProperty(QUERY).isSet() && input == null) {
-queryStr = "{}";
-query = Document.parse("{}");
-} else {
-try {
-ByteArrayOutputStream out = new ByteArrayOutputStream();
-session.exportTo(input, out);
-out.close();
-queryStr = new String(out.toByteArray());
-query = Document.parse(queryStr);
-} catch (Exception ex) {
-getLogger().error("Error reading flowfile", ex);
-if (input != null) { //Likely culprit is a bad query
-session.transfer(input, REL_FAILURE);
-return;
-} else {
-throw new ProcessException(ex);
-}
-}
-}
+attributes.put(CoreAttributes.MIME_TYPE.key(), "application/json");
 
 if (context.getProperty(QUERY_ATTRIBUTE).isSet()) {
 final String queryAttr = 
context.getProperty(QUERY_ATTRIBUTE).evaluateAttributeExpressions(input).getValue();
-attributes.put(queryAttr, queryStr);
+attributes.put(queryAttr, query.toJson());
 }
 
 final Document projection = context.getProperty(PROJECTION).isSet()
 ? 
Document.parse(context.getProperty(PROJECTION).evaluateAttributeExpressions(input).getValue())
 : null;
 final Document sort = context.getProperty(SORT).isSet()
 ? 
Document.parse(context.getProperty(SORT).evaluateAttributeExpressions(input).getValue())
 : null;
-final String jsonTypeSetting = 
context.getProperty(JSON_TYPE).getValue();
-final String usePrettyPrint  = 
context.getProperty(USE_PRETTY_PRINTING).getValue();
-configureMapper(jsonTypeSetting);
 
+final MongoCollection<Document> collection = getCollection(context, input);
+final FindIterable<Document> it = collection.find(query);
 
-try {
-final MongoCollection<Document> collection = getCollection(context, input);
+attributes.put(DB_NAME, 
collection.getNamespace().getDatabaseName());
+attributes.put(COL_NAME, 
collection.getNamespace().getCollectionName());
 
-attributes.put(DB_NAME, 
collection.getNamespace().getDatabaseName());
-attributes.put(COL_NAME, 
collection.getNamespace().getCollectionName());
+if (projection != null) {
+it.projection(projection);
+}
+if (sort != null) {
+it.sort(sort);
+}
+if (context.getProperty(LIMIT).isSet()) {
+
it.limit(context.getProperty(LIMIT).evaluateAttributeExpressions(input).asInteger());
+}
+if (context.getProperty(BATCH_SIZE).isSet()) {
+
it.batchSize(context.getProperty(BATCH_SIZE).evaluateAttributeExpressions(input).asInteger());
+}
 
-final FindIterable it = query != n

[jira] [Commented] (NIFI-5518) Add processors for interacting with Apache Kafka 2.0

2018-08-30 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5518?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16597363#comment-16597363
 ] 

ASF GitHub Bot commented on NIFI-5518:
--

Github user pvillard31 commented on the issue:

https://github.com/apache/nifi/pull/2948
  
Now that #2915 is merged, the same change should probably be included in this PR.


> Add processors for interacting with Apache Kafka 2.0
> 
>
> Key: NIFI-5518
> URL: https://issues.apache.org/jira/browse/NIFI-5518
> Project: Apache NiFi
>  Issue Type: New Feature
>  Components: Extensions
>Reporter: Mark Payne
>Assignee: Mark Payne
>Priority: Major
> Fix For: 1.8.0
>
>







[jira] [Commented] (NIFI-5561) Add component name filtering to S2S Provenance Reporting Task

2018-08-30 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5561?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16597336#comment-16597336
 ] 

ASF GitHub Bot commented on NIFI-5561:
--

Github user pvillard31 commented on the issue:

https://github.com/apache/nifi/pull/2973
  
Added VR scope on the properties and fixed the indentation to be consistent 
everywhere. To review the changes with the indentation-only diffs hidden: 
https://github.com/apache/nifi/pull/2973/files?w=1


> Add component name filtering to S2S Provenance Reporting Task
> -
>
> Key: NIFI-5561
> URL: https://issues.apache.org/jira/browse/NIFI-5561
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Reporter: Pierre Villard
>Assignee: Pierre Villard
>Priority: Major
>
> I'd like to add component name as a way to filter events sent by the 
> SiteToSite Provenance Reporting task so that, for example, all events 
> generated by components containing "Prov" in the name are picked.
> This will be much easier to manage rather than component IDs as the ID of a 
> component could change when a workflow is promoted from one environment to 
> another in a CI/CD pipeline.
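
The cumulative name filtering described above can be sketched as follows (the class and method names are illustrative, not the actual reporting task code):

```java
import java.util.regex.Pattern;

public class ProvenanceNameFilterSketch {
    // If no pattern is configured, every event passes; otherwise the
    // component name must match. "Cumulative" means this check would be
    // AND-ed with the existing component-id and component-type filters.
    public static boolean matchesNameFilter(String componentName, Pattern nameFilter) {
        return nameFilter == null || nameFilter.matcher(componentName).matches();
    }

    public static void main(String[] args) {
        Pattern p = Pattern.compile(".*Prov.*");
        System.out.println(matchesNameFilter("MyProvReporter", p)); // prints true
        System.out.println(matchesNameFilter("PutFile", p));        // prints false
    }
}
```

A pattern like `.*Prov.*` would select every component whose name contains "Prov", regardless of its ID in any given environment.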






[jira] [Commented] (NIFI-5561) Add component name filtering to S2S Provenance Reporting Task

2018-08-30 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5561?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16597322#comment-16597322
 ] 

ASF GitHub Bot commented on NIFI-5561:
--

Github user pvillard31 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2973#discussion_r213990560
  
--- Diff: 
nifi-nar-bundles/nifi-site-to-site-reporting-bundle/nifi-site-to-site-reporting-task/src/main/java/org/apache/nifi/reporting/SiteToSiteProvenanceReportingTask.java
 ---
@@ -151,6 +151,25 @@
 .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
 .build();
 
+static final PropertyDescriptor FILTER_COMPONENT_NAME = new 
PropertyDescriptor.Builder()
+.name("s2s-prov-task-name-filter")
+.displayName("Component Name to Include")
+.description("Regular expression to filter the provenance events 
based on the component name. Only the events matching the regular "
++ "expression will be sent. If no filter is set, all the 
events are sent. If multiple filters are set, the filters are cumulative.")
+.required(false)
+.addValidator(StandardValidators.REGULAR_EXPRESSION_VALIDATOR)
--- End diff --

Fair enough, I'll add it to the properties. On a related note, I hope to 
find time to start exploring NIFI-5367 to enable VR scope by default in NiFi.


> Add component name filtering to S2S Provenance Reporting Task
> -
>
> Key: NIFI-5561
> URL: https://issues.apache.org/jira/browse/NIFI-5561
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Reporter: Pierre Villard
>Assignee: Pierre Villard
>Priority: Major
>
> I'd like to add component name as a way to filter events sent by the 
> SiteToSite Provenance Reporting task so that, for example, all events 
> generated by components containing "Prov" in the name are picked.
> This will be much easier to manage rather than component IDs as the ID of a 
> component could change when a workflow is promoted from one environment to 
> another in a CI/CD pipeline.





[jira] [Resolved] (NIFI-5388) Enable EL support for dynamic properties of Kafka processors

2018-08-30 Thread Mike Thomsen (JIRA)


 [ 
https://issues.apache.org/jira/browse/NIFI-5388?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mike Thomsen resolved NIFI-5388.

   Resolution: Fixed
Fix Version/s: 1.8.0

> Enable EL support for dynamic properties of Kafka processors
> 
>
> Key: NIFI-5388
> URL: https://issues.apache.org/jira/browse/NIFI-5388
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Affects Versions: 1.7.0
>Reporter: Corey Fritz
>Priority: Minor
>  Labels: kafka
> Fix For: 1.8.0
>
> Attachments: NIFI-5388.mp4
>
>
> We have a flow committed to a NiFi Registry that includes a Kafka consumer, 
> and ideally the processor configuration would include some dynamic consumer 
> properties where the values would differ depending on which environment the 
> flow is deployed to. Unfortunately, dynamic properties in the Kafka 
> processors do not currently support EL, which would allow us to use 
> environment variables to pass in those environment-specific values. So I'm 
> proposing EL support be added to the dynamic properties of all Kafka 
> processors.
> I am planning on tackling this ticket myself, unless someone thinks this is a 
> bad idea for some reason.
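
Expression Language support here means a dynamic property value is resolved (for instance from the variable registry) before it is handed to the Kafka client configuration. A stdlib-only analogue of that substitution step, with hypothetical names:

```java
import java.util.Collections;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class ElSketch {
    // Resolve ${name} references from a variable map, mimicking what an
    // evaluateAttributeExpressions() call does for variable-registry scope.
    public static String resolve(String value, Map<String, String> vars) {
        Matcher m = Pattern.compile("\\$\\{([^}]+)\\}").matcher(value);
        StringBuffer sb = new StringBuffer();
        while (m.find()) {
            // Unknown variables resolve to the empty string in this sketch.
            m.appendReplacement(sb, Matcher.quoteReplacement(
                    vars.getOrDefault(m.group(1), "")));
        }
        m.appendTail(sb);
        return sb.toString();
    }

    public static void main(String[] args) {
        Map<String, String> vars = Collections.singletonMap("env", "prod");
        System.out.println(resolve("client.id-${env}", vars)); // prints "client.id-prod"
    }
}
```

With this in place, the same versioned flow can carry `${env}`-style dynamic Kafka properties across environments.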






[jira] [Commented] (NIFI-5388) Enable EL support for dynamic properties of Kafka processors

2018-08-30 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5388?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16597320#comment-16597320
 ] 

ASF subversion and git services commented on NIFI-5388:
---

Commit 89186fb96d0bd1367f234d36f8c45e130e6aad4b in nifi's branch 
refs/heads/master from [~snagafritz]
[ https://git-wip-us.apache.org/repos/asf?p=nifi.git;h=89186fb ]

NIFI-5388: enabled EL support for dynamic properties of Kafka 1.0 processors

This closes #2915

Signed-off-by: Mike Thomsen 


> Enable EL support for dynamic properties of Kafka processors
> 
>
> Key: NIFI-5388
> URL: https://issues.apache.org/jira/browse/NIFI-5388
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Affects Versions: 1.7.0
>Reporter: Corey Fritz
>Priority: Minor
>  Labels: kafka
> Attachments: NIFI-5388.mp4
>
>
> We have a flow committed to a NiFi Registry that includes a Kafka consumer, 
> and ideally the processor configuration would include some dynamic consumer 
> properties where the values would differ depending on which environment the 
> flow is deployed to. Unfortunately, dynamic properties in the Kafka 
> processors do not currently support EL, which would allow us to use 
> environment variables to pass in those environment-specific values. So I'm 
> proposing EL support be added to the dynamic properties of all Kafka 
> processors.
> I am planning on tackling this ticket myself, unless someone thinks this is a 
> bad idea for some reason.





[jira] [Commented] (NIFI-5388) Enable EL support for dynamic properties of Kafka processors

2018-08-30 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5388?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16597321#comment-16597321
 ] 

ASF GitHub Bot commented on NIFI-5388:
--

Github user asfgit closed the pull request at:

https://github.com/apache/nifi/pull/2915


> Enable EL support for dynamic properties of Kafka processors
> 
>
> Key: NIFI-5388
> URL: https://issues.apache.org/jira/browse/NIFI-5388
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Affects Versions: 1.7.0
>Reporter: Corey Fritz
>Priority: Minor
>  Labels: kafka
> Attachments: NIFI-5388.mp4
>
>
> We have a flow committed to a NiFi Registry that includes a Kafka consumer, 
> and ideally the processor configuration would include some dynamic consumer 
> properties where the values would differ depending on which environment the 
> flow is deployed to. Unfortunately, dynamic properties in the Kafka 
> processors do not currently support EL, which would allow us to use 
> environment variables to pass in those environment-specific values. So I'm 
> proposing EL support be added to the dynamic properties of all Kafka 
> processors.
> I am planning on tackling this ticket myself, unless someone thinks this is a 
> bad idea for some reason.






[jira] [Commented] (NIFI-5388) Enable EL support for dynamic properties of Kafka processors

2018-08-30 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5388?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16597306#comment-16597306
 ] 

ASF GitHub Bot commented on NIFI-5388:
--

Github user MikeThomsen commented on the issue:

https://github.com/apache/nifi/pull/2915
  
@zenfenan the consumer one has input forbidden on it, so you won't get 
flowfiles there. For the publisher, my guess is it's probably being 
conservative about reconfiguring the producers, which isn't a bad call IMO.


> Enable EL support for dynamic properties of Kafka processors
> 
>
> Key: NIFI-5388
> URL: https://issues.apache.org/jira/browse/NIFI-5388
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Affects Versions: 1.7.0
>Reporter: Corey Fritz
>Priority: Minor
>  Labels: kafka
> Attachments: NIFI-5388.mp4
>
>
> We have a flow committed to a NiFi Registry that includes a Kafka consumer, 
> and ideally the processor configuration would include some dynamic consumer 
> properties where the values would differ depending on which environment the 
> flow is deployed to. Unfortunately, dynamic properties in the Kafka 
> processors do not currently support EL, which would allow us to use 
> environment variables to pass in those environment-specific values. So I'm 
> proposing EL support be added to the dynamic properties of all Kafka 
> processors.
> I am planning on tackling this ticket myself, unless someone thinks this is a 
> bad idea for some reason.






[jira] [Commented] (NIFI-5561) Add component name filtering to S2S Provenance Reporting Task

2018-08-30 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5561?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16597299#comment-16597299
 ] 

ASF GitHub Bot commented on NIFI-5561:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2973#discussion_r213980541
  
--- Diff: 
nifi-nar-bundles/nifi-site-to-site-reporting-bundle/nifi-site-to-site-reporting-task/src/main/java/org/apache/nifi/reporting/SiteToSiteProvenanceReportingTask.java
 ---
@@ -151,6 +151,25 @@
 .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
 .build();
 
+static final PropertyDescriptor FILTER_COMPONENT_NAME = new 
PropertyDescriptor.Builder()
+.name("s2s-prov-task-name-filter")
+.displayName("Component Name to Include")
+.description("Regular expression to filter the provenance events 
based on the component name. Only the events matching the regular "
++ "expression will be sent. If no filter is set, all the 
events are sent. If multiple filters are set, the filters are cumulative.")
+.required(false)
+.addValidator(StandardValidators.REGULAR_EXPRESSION_VALIDATOR)
--- End diff --

Our CM story is still developing at this point, so anything that makes it 
easier for admins to configure NiFi in a standardized way is going to make 
them a little happier, even if it's just "copy and paste this EL string."


> Add component name filtering to S2S Provenance Reporting Task
> -
>
> Key: NIFI-5561
> URL: https://issues.apache.org/jira/browse/NIFI-5561
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Reporter: Pierre Villard
>Assignee: Pierre Villard
>Priority: Major
>
> I'd like to add component name as a way to filter events sent by the 
> SiteToSite Provenance Reporting task so that, for example, all events 
> generated by components containing "Prov" in the name are picked.
> This will be much easier to manage rather than component IDs as the ID of a 
> component could change when a workflow is promoted from one environment to 
> another in a CI/CD pipeline.






[jira] [Commented] (NIFI-5544) Refactor GetMongo processor codebase

2018-08-30 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5544?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16597297#comment-16597297
 ] 

ASF GitHub Bot commented on NIFI-5544:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2958#discussion_r213979035
  
--- Diff: 
nifi-nar-bundles/nifi-mongodb-bundle/nifi-mongodb-processors/src/main/java/org/apache/nifi/processors/mongodb/GetMongo.java
 ---
@@ -204,144 +212,145 @@ private ObjectWriter getObjectWriter(ObjectMapper 
mapper, String ppSetting) {
 @Override
 public void onTrigger(final ProcessContext context, final 
ProcessSession session) throws ProcessException {
 FlowFile input = null;
+logger = getLogger();
+
 if (context.hasIncomingConnection()) {
 input = session.get();
-
 if (input == null && context.hasNonLoopConnection()) {
 return;
 }
 }
 
-final ComponentLog logger = getLogger();
+final Document query = getQuery(context, session, input );
 
-Map<String, String> attributes = new HashMap<>();
-attributes.put(CoreAttributes.MIME_TYPE.key(), "application/json");
+if (query == null) {
+return;
+}
 
-final Document query;
+final String jsonTypeSetting = 
context.getProperty(JSON_TYPE).getValue();
+final String usePrettyPrint  = 
context.getProperty(USE_PRETTY_PRINTING).getValue();
 final Charset charset = 
Charset.forName(context.getProperty(CHARSET).evaluateAttributeExpressions(input).getValue());
+final Map<String, String> attributes = new HashMap<>();
 
-String queryStr;
-if (context.getProperty(QUERY).isSet()) {
-queryStr = 
context.getProperty(QUERY).evaluateAttributeExpressions(input).getValue();
-query = Document.parse(queryStr);
-} else if (!context.getProperty(QUERY).isSet() && input == null) {
-queryStr = "{}";
-query = Document.parse("{}");
-} else {
-try {
-ByteArrayOutputStream out = new ByteArrayOutputStream();
-session.exportTo(input, out);
-out.close();
-queryStr = new String(out.toByteArray());
-query = Document.parse(queryStr);
-} catch (Exception ex) {
-getLogger().error("Error reading flowfile", ex);
-if (input != null) { //Likely culprit is a bad query
-session.transfer(input, REL_FAILURE);
-return;
-} else {
-throw new ProcessException(ex);
-}
-}
-}
+attributes.put(CoreAttributes.MIME_TYPE.key(), "application/json");
 
 if (context.getProperty(QUERY_ATTRIBUTE).isSet()) {
 final String queryAttr = 
context.getProperty(QUERY_ATTRIBUTE).evaluateAttributeExpressions(input).getValue();
-attributes.put(queryAttr, queryStr);
+attributes.put(queryAttr, query.toJson());
 }
 
 final Document projection = context.getProperty(PROJECTION).isSet()
 ? 
Document.parse(context.getProperty(PROJECTION).evaluateAttributeExpressions(input).getValue())
 : null;
 final Document sort = context.getProperty(SORT).isSet()
 ? 
Document.parse(context.getProperty(SORT).evaluateAttributeExpressions(input).getValue())
 : null;
-final String jsonTypeSetting = 
context.getProperty(JSON_TYPE).getValue();
-final String usePrettyPrint  = 
context.getProperty(USE_PRETTY_PRINTING).getValue();
-configureMapper(jsonTypeSetting);
 
+final MongoCollection<Document> collection = getCollection(context, input);
+final FindIterable<Document> it = collection.find(query);
 
-try {
-final MongoCollection<Document> collection = getCollection(context, input);
+attributes.put(DB_NAME, 
collection.getNamespace().getDatabaseName());
+attributes.put(COL_NAME, 
collection.getNamespace().getCollectionName());
 
-attributes.put(DB_NAME, 
collection.getNamespace().getDatabaseName());
-attributes.put(COL_NAME, 
collection.getNamespace().getCollectionName());
+if (projection != null) {
+it.projection(projection);
+}
+if (sort != null) {
+it.sort(sort);
+}
+if (context.getProperty(LIMIT).isSet()) {
+
it.limit(context.getProperty(LIMIT).evaluateAttributeExpressions(input).asInteger());
   

[jira] [Commented] (NIFI-5544) Refactor GetMongo processor codebase

2018-08-30 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5544?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16597296#comment-16597296
 ] 

ASF GitHub Bot commented on NIFI-5544:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2958#discussion_r213977393
  
--- Diff: nifi-nar-bundles/nifi-mongodb-bundle/nifi-mongodb-processors/src/main/java/org/apache/nifi/processors/mongodb/GetMongo.java ---
@@ -204,144 +212,145 @@ private ObjectWriter getObjectWriter(ObjectMapper mapper, String ppSetting) {
 @Override
 public void onTrigger(final ProcessContext context, final ProcessSession session) throws ProcessException {
 FlowFile input = null;
+logger = getLogger();
+
 if (context.hasIncomingConnection()) {
 input = session.get();
-
 if (input == null && context.hasNonLoopConnection()) {
 return;
 }
 }
 
-final ComponentLog logger = getLogger();
+final Document query = getQuery(context, session, input );
--- End diff --

I think I actually refactored this in another one of my Mongo commits... :-D


> Refactor GetMongo processor codebase
> 
>
> Key: NIFI-5544
> URL: https://issues.apache.org/jira/browse/NIFI-5544
> Project: Apache NiFi
>  Issue Type: Improvement
>Reporter: Sivaprasanna Sethuraman
>Assignee: Sivaprasanna Sethuraman
>Priority: Minor
>
> Current codebase of the GetMongo processor can be improved.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)




[GitHub] nifi pull request #2958: NIFI-5544: GetMongo refactored

2018-08-30 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2958#discussion_r213979035
  
--- Diff: nifi-nar-bundles/nifi-mongodb-bundle/nifi-mongodb-processors/src/main/java/org/apache/nifi/processors/mongodb/GetMongo.java ---
@@ -204,144 +212,145 @@ private ObjectWriter getObjectWriter(ObjectMapper mapper, String ppSetting) {
 @Override
 public void onTrigger(final ProcessContext context, final ProcessSession session) throws ProcessException {
 FlowFile input = null;
+logger = getLogger();
+
 if (context.hasIncomingConnection()) {
 input = session.get();
-
 if (input == null && context.hasNonLoopConnection()) {
 return;
 }
 }
 
-final ComponentLog logger = getLogger();
+final Document query = getQuery(context, session, input );
 
-Map<String, String> attributes = new HashMap<>();
-attributes.put(CoreAttributes.MIME_TYPE.key(), "application/json");
+if (query == null) {
+return;
+}
 
-final Document query;
+final String jsonTypeSetting = context.getProperty(JSON_TYPE).getValue();
+final String usePrettyPrint  = context.getProperty(USE_PRETTY_PRINTING).getValue();
 final Charset charset = Charset.forName(context.getProperty(CHARSET).evaluateAttributeExpressions(input).getValue());
+final Map<String, String> attributes = new HashMap<>();
 
-String queryStr;
-if (context.getProperty(QUERY).isSet()) {
-queryStr = context.getProperty(QUERY).evaluateAttributeExpressions(input).getValue();
-query = Document.parse(queryStr);
-} else if (!context.getProperty(QUERY).isSet() && input == null) {
-queryStr = "{}";
-query = Document.parse("{}");
-} else {
-try {
-ByteArrayOutputStream out = new ByteArrayOutputStream();
-session.exportTo(input, out);
-out.close();
-queryStr = new String(out.toByteArray());
-query = Document.parse(queryStr);
-} catch (Exception ex) {
-getLogger().error("Error reading flowfile", ex);
-if (input != null) { //Likely culprit is a bad query
-session.transfer(input, REL_FAILURE);
-return;
-} else {
-throw new ProcessException(ex);
-}
-}
-}
+attributes.put(CoreAttributes.MIME_TYPE.key(), "application/json");
 
 if (context.getProperty(QUERY_ATTRIBUTE).isSet()) {
 final String queryAttr = context.getProperty(QUERY_ATTRIBUTE).evaluateAttributeExpressions(input).getValue();
-attributes.put(queryAttr, queryStr);
+attributes.put(queryAttr, query.toJson());
 }
 
 final Document projection = context.getProperty(PROJECTION).isSet()
 ? Document.parse(context.getProperty(PROJECTION).evaluateAttributeExpressions(input).getValue())
 : null;
 final Document sort = context.getProperty(SORT).isSet()
 ? Document.parse(context.getProperty(SORT).evaluateAttributeExpressions(input).getValue())
 : null;
-final String jsonTypeSetting = context.getProperty(JSON_TYPE).getValue();
-final String usePrettyPrint  = context.getProperty(USE_PRETTY_PRINTING).getValue();
-configureMapper(jsonTypeSetting);
 
+final MongoCollection<Document> collection = getCollection(context, input);
+final FindIterable<Document> it = collection.find(query);
 
-try {
-final MongoCollection<Document> collection = getCollection(context, input);
+attributes.put(DB_NAME, collection.getNamespace().getDatabaseName());
+attributes.put(COL_NAME, collection.getNamespace().getCollectionName());
 
-attributes.put(DB_NAME, collection.getNamespace().getDatabaseName());
-attributes.put(COL_NAME, collection.getNamespace().getCollectionName());
+if (projection != null) {
+it.projection(projection);
+}
+if (sort != null) {
+it.sort(sort);
+}
+if (context.getProperty(LIMIT).isSet()) {
+it.limit(context.getProperty(LIMIT).evaluateAttributeExpressions(input).asInteger());
+}
+if (context.getProperty(BATCH_SIZE).isSet()) {
+it.batchSize(context.getProperty(BATCH_SIZE).evaluateAttributeExpressions(input).asInteger());
+}
 
-final FindIterable<Document> it = query !
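The quoted hunk moves the query-selection logic into a `getQuery(...)` helper. The precedence the old inline code implemented (an explicit Query property wins; with no property and no incoming flowfile, fall back to the match-all query `{}`; otherwise treat the flowfile body as the query) can be sketched independently of NiFi and the Mongo driver. The class and method names below are illustrative, not NiFi's actual API:

```java
// Hypothetical sketch of the query-selection precedence that GetMongo's
// refactor consolidates into a getQuery(...) helper. Names are illustrative.
public class QuerySelectionSketch {

    // Pick the query JSON: an explicit Query property wins; with no property
    // and no incoming flowfile, fall back to "{}" (match all documents);
    // otherwise the flowfile content is treated as the query.
    static String selectQueryJson(String queryProperty, String flowFileContent) {
        if (queryProperty != null) {
            return queryProperty;
        }
        if (flowFileContent == null) {
            return "{}";
        }
        return flowFileContent;
    }

    public static void main(String[] args) {
        System.out.println(selectQueryJson("{\"a\": 1}", null)); // property wins
        System.out.println(selectQueryJson(null, null));         // no input: match all
        System.out.println(selectQueryJson(null, "{\"b\": 2}")); // flowfile body
    }
}
```

Keeping this decision in one helper is what lets the refactored `onTrigger` bail out early on a null query instead of interleaving parsing and error handling.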

[jira] [Commented] (NIFI-5552) CSVReader Can't Derive Schema from Quoted Headers

2018-08-30 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5552?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16597250#comment-16597250
 ] 

ASF GitHub Bot commented on NIFI-5552:
--

Github user pvillard31 commented on the issue:

https://github.com/apache/nifi/pull/2966
  
Hey @markap14 - thanks for the review. You're right, it's much better to 
address it when creating the Avro schema. That would prevent this error from 
happening again if one day we add a Reader for a format that accepts non-Avro-compatible 
field names.

Just a quick remark. Initially in ``AvroTypeUtil`` I just wanted to change L123 from:
```java
final Field field = new Field(recordField.getFieldName(), schema, null, recordField.getDefaultValue());
```
to
```java
final Field field = new Field(normalizeNameForAvro(recordField.getFieldName()), schema, null, recordField.getDefaultValue());
```

But I'm not sure changing the name just here is a good idea, or whether it would 
break things in other places. In the end, I changed all the occurrences of 
``recordField.getFieldName()`` to normalize the name. Thoughts?


> CSVReader Can't Derive Schema from Quoted Headers
> -
>
> Key: NIFI-5552
> URL: https://issues.apache.org/jira/browse/NIFI-5552
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Extensions
>Affects Versions: 1.7.1
>Reporter: Shawn Weeks
>Assignee: Pierre Villard
>Priority: Minor
>
> When deriving the schema from a CSV File Header NiFi is unable to generate a 
> valid schema if the Header Columns are Double Quoted even though the 
> CSVReader is set to handle quotes. Using the nile.csv sample file from 
> https://people.sc.fsu.edu/~jburkardt/data/csv/csv.html results in an Illegal 
> initial character exception in the Avro Schema generator. In this specific 
> case the header did not contain any spaces or special characters though it 
> was case sensitive.
> {code:java}
> org.apache.avro.SchemaParseException: Illegal initial character: "Flood"
> at org.apache.avro.Schema.validateName(Schema.java:1147)
> at org.apache.avro.Schema.access$200(Schema.java:81)
> at org.apache.avro.Schema$Field.<init>(Schema.java:403)
> at org.apache.avro.Schema$Field.<init>(Schema.java:423)
> at org.apache.avro.Schema$Field.<init>(Schema.java:415)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroField(AvroTypeUtil.java:123)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:114)
> at org.apache.nifi.avro.AvroTypeUtil.extractAvroSchema(AvroTypeUtil.java:94)
> at 
> org.apache.nifi.schema.access.WriteAvroSchemaAttributeStrategy.getAttributes(WriteAvroSchemaAttributeStrategy.java:58)
> at org.apache.nifi.json.WriteJsonResult.writeRecord(WriteJsonResult.java:137)
> at 
> org.apache.nifi.serialization.AbstractRecordSetWriter.write(AbstractRecordSetWriter.java:59)
> at 
> org.apache.nifi.processors.standard.AbstractRecordProcessor$1.process(AbstractRecordProcessor.java:122)
> at 
> org.apache.nifi.controller.repository.StandardProcessSession.write(StandardProcessSession.java:2885)
> at 
> org.apache.nifi.processors.standard.AbstractRecordProcessor.onTrigger(AbstractRecordProcessor.java:109)
> at 
> org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27)
> at 
> org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1165)
> at 
> org.apache.nifi.controller.tasks.ConnectableTask.invoke(ConnectableTask.java:203)
> at 
> org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:117)
> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
> at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
> at 
> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
> at 
> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
> at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> at java.lang.Thread.run(Thread.java:748){code}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
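The normalization discussed above has to map arbitrary CSV headers onto Avro's name rule (`[A-Za-z_][A-Za-z0-9_]*`, which is what `Schema.validateName` enforces in the stack trace). A minimal standalone sketch of such a normalizer follows; the method name mirrors the one mentioned in the comment, but the body is illustrative, not NiFi's actual implementation:

```java
import java.util.regex.Pattern;

// Illustrative sketch of an Avro-compatible field-name normalizer;
// not NiFi's actual normalizeNameForAvro implementation.
public class AvroNameSketch {

    // Avro names must match [A-Za-z_][A-Za-z0-9_]*.
    private static final Pattern ILLEGAL_CHARS = Pattern.compile("[^A-Za-z0-9_]");

    static String normalizeNameForAvro(String fieldName) {
        // Replace every character Avro rejects (quotes, spaces, dashes, ...).
        String normalized = ILLEGAL_CHARS.matcher(fieldName).replaceAll("_");
        // A leading digit is also illegal; prefix an underscore.
        if (!normalized.isEmpty() && Character.isDigit(normalized.charAt(0))) {
            normalized = "_" + normalized;
        }
        return normalized;
    }

    public static void main(String[] args) {
        // The quoted header from the NIFI-5552 report becomes a legal Avro name.
        System.out.println(normalizeNameForAvro("\"Flood\"")); // _Flood_
        System.out.println(normalizeNameForAvro("2nd col"));   // _2nd_col
    }
}
```

Note the trade-off raised in the review: normalizing in one place (`buildAvroField`) fixes schema creation, while normalizing every `recordField.getFieldName()` use keeps the schema and record access consistent.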




[jira] [Commented] (NIFI-5552) CSVReader Can't Derive Schema from Quoted Headers

2018-08-30 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5552?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16597202#comment-16597202
 ] 

ASF GitHub Bot commented on NIFI-5552:
--

GitHub user pvillard31 reopened a pull request:

https://github.com/apache/nifi/pull/2966

NIFI-5552 - Add option to normalize header column names in CSVRecordReader

Thank you for submitting a contribution to Apache NiFi.

In order to streamline the review of the contribution we ask you
to ensure the following steps have been taken:

### For all changes:
- [x] Is there a JIRA ticket associated with this PR? Is it referenced 
 in the commit message?

- [x] Does your PR title start with NIFI-XXXX where XXXX is the JIRA number 
you are trying to resolve? Pay particular attention to the hyphen "-" character.

- [x] Has your PR been rebased against the latest commit within the target 
branch (typically master)?

- [ ] Is your initial contribution a single, squashed commit?

### For code changes:
- [x] Have you ensured that the full suite of tests is executed via mvn 
-Pcontrib-check clean install at the root nifi folder?
- [x] Have you written or updated unit tests to verify your changes?
- [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)? 
- [ ] If applicable, have you updated the LICENSE file, including the main 
LICENSE file under nifi-assembly?
- [ ] If applicable, have you updated the NOTICE file, including the main 
NOTICE file found under nifi-assembly?
- [x] If adding new Properties, have you added .displayName in addition to 
.name (programmatic access) for each of the new properties?

### For documentation related changes:
- [ ] Have you ensured that format looks appropriate for the output in 
which it is rendered?

### Note:
Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/pvillard31/nifi NIFI-5552

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/2966.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2966


commit c3ee52ed7e061a01a12332f36a3374b69345eef2
Author: Pierre Villard 
Date:   2018-08-30T08:48:27Z

NIFI-5552 - Normalize field names to ensure avro compatible schemas







[jira] [Commented] (NIFI-5552) CSVReader Can't Derive Schema from Quoted Headers

2018-08-30 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5552?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16597172#comment-16597172
 ] 

ASF GitHub Bot commented on NIFI-5552:
--

Github user pvillard31 closed the pull request at:

https://github.com/apache/nifi/pull/2966





--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi pull request #2966: NIFI-5552 - Add option to normalize header column n...

2018-08-30 Thread pvillard31
Github user pvillard31 closed the pull request at:

https://github.com/apache/nifi/pull/2966


---


[jira] [Commented] (NIFI-5561) Add component name filtering to S2S Provenance Reporting Task

2018-08-30 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5561?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16597158#comment-16597158
 ] 

ASF GitHub Bot commented on NIFI-5561:
--

Github user pvillard31 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2973#discussion_r213933585
  
--- Diff: 
nifi-nar-bundles/nifi-site-to-site-reporting-bundle/nifi-site-to-site-reporting-task/src/main/java/org/apache/nifi/reporting/SiteToSiteProvenanceReportingTask.java
 ---
@@ -151,6 +151,25 @@
 .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
 .build();
 
+static final PropertyDescriptor FILTER_COMPONENT_NAME = new PropertyDescriptor.Builder()
+.name("s2s-prov-task-name-filter")
+.displayName("Component Name to Include")
+.description("Regular expression to filter the provenance events based on the component name. Only the events matching the regular "
++ "expression will be sent. If no filter is set, all the events are sent. If multiple filters are set, the filters are cumulative.")
+.required(false)
+.addValidator(StandardValidators.REGULAR_EXPRESSION_VALIDATOR)
--- End diff --

Since it's a reporting task, it won't access the variable registries 
defined at UI-level. I believe this will only access the global file-based 
variable registry (defined in nifi.properties) and the environment/jvm 
variables. Not sure this provides much value but on the other hand it's easy to 
add. Thoughts?


> Add component name filtering to S2S Provenance Reporting Task
> -
>
> Key: NIFI-5561
> URL: https://issues.apache.org/jira/browse/NIFI-5561
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Reporter: Pierre Villard
>Assignee: Pierre Villard
>Priority: Major
>
> I'd like to add component name as a way to filter events sent by the 
> SiteToSite Provenance Reporting task so that, for example, all events 
> generated by components containing "Prov" in the name are picked.
> This will be much easier to manage rather than component IDs as the ID of a 
> component could change when a workflow is promoted from one environment to 
> another in a CI/CD pipeline.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
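The cumulative include/exclude filtering described in the property documentation above can be sketched as a plain predicate over `java.util.regex` patterns. This is an illustration of the documented behavior under the assumption that exclusion takes precedence over inclusion, not the reporting task's actual code:

```java
import java.util.regex.Pattern;

// Illustrative predicate for the include/exclude component-name filters;
// not the actual SiteToSiteProvenanceReportingTask implementation.
public class NameFilterSketch {

    // Assumed semantics: an exclude match always drops the event; with no
    // include filter set, everything not excluded is sent ("if no filter
    // is set, all the events are sent").
    static boolean isIncluded(String componentName, Pattern include, Pattern exclude) {
        if (exclude != null && exclude.matcher(componentName).matches()) {
            return false;
        }
        return include == null || include.matcher(componentName).matches();
    }

    public static void main(String[] args) {
        // The JIRA example: pick all components containing "Prov" in the name.
        Pattern include = Pattern.compile(".*Prov.*");
        System.out.println(isIncluded("MyProvReporter", include, null)); // true
        System.out.println(isIncluded("GetFile", include, null));        // false
    }
}
```

Filtering on names rather than component IDs is what makes the filter portable when a flow is promoted between environments, since IDs change while names usually survive.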




[jira] [Commented] (NIFI-5561) Add component name filtering to S2S Provenance Reporting Task

2018-08-30 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5561?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16597157#comment-16597157
 ] 

ASF GitHub Bot commented on NIFI-5561:
--

Github user pvillard31 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2973#discussion_r213933052
  
--- Diff: 
nifi-nar-bundles/nifi-extension-utils/nifi-reporting-utils/src/main/java/org/apache/nifi/reporting/util/provenance/ProvenanceEventConsumer.java
 ---
@@ -241,20 +255,24 @@ private long updateLastEventId(final List<ProvenanceEventRecord> events, final S
 
 private boolean isFilteringEnabled() {
 return componentTypeRegex != null || !eventTypes.isEmpty() || !componentIds.isEmpty()
-|| componentTypeRegexExclude != null || !eventTypesExclude.isEmpty() || !componentIdsExclude.isEmpty();
+|| componentTypeRegexExclude != null || !eventTypesExclude.isEmpty() || !componentIdsExclude.isEmpty()
+|| componentNameRegex != null || componentNameRegexExclude != null;
--- End diff --

Yeah agree, I added your two commits on this pull request. Thanks!





--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

