[GitHub] nifi issue #2682: NIFI-4731: BQ Processors and GCP library update.

2018-08-06 Thread danieljimenez
Github user danieljimenez commented on the issue:

https://github.com/apache/nifi/pull/2682
  
I'm not sure about the RAT plugin, but this is ready otherwise.


---


[jira] [Commented] (NIFI-4731) BigQuery processors

2018-08-06 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-4731?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16570191#comment-16570191
 ] 

ASF GitHub Bot commented on NIFI-4731:
--

Github user danieljimenez commented on the issue:

https://github.com/apache/nifi/pull/2682
  
I'm not sure about the RAT plugin, but this is ready otherwise.


> BigQuery processors
> ---
>
> Key: NIFI-4731
> URL: https://issues.apache.org/jira/browse/NIFI-4731
> Project: Apache NiFi
>  Issue Type: New Feature
>  Components: Extensions
>Reporter: Mikhail Sosonkin
>Priority: Major
>
> NIFI should have processors for putting data into BigQuery (Streaming and 
> Batch).
> Initial working processors can be found in this repository: 
> https://github.com/nologic/nifi/tree/NIFI-4731/nifi-nar-bundles/nifi-gcp-bundle/nifi-gcp-processors/src/main/java/org/apache/nifi/processors/gcp/bigquery
> I'd like to get them into NiFi proper.
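For context, here is a minimal sketch of a BigQuery batch load using the google-cloud-bigquery Java 
client that such processors would wrap; the dataset, table, and newline-delimited JSON payload are 
hypothetical, and this is not the code from the linked branch.

{code:java}
import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.FormatOptions;
import com.google.cloud.bigquery.Job;
import com.google.cloud.bigquery.TableDataWriteChannel;
import com.google.cloud.bigquery.TableId;
import com.google.cloud.bigquery.WriteChannelConfiguration;

import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

public class BigQueryBatchLoadSketch {
    public static void main(String[] args) throws Exception {
        // Uses application default credentials; dataset and table names are placeholders.
        BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();
        WriteChannelConfiguration config = WriteChannelConfiguration
                .newBuilder(TableId.of("my_dataset", "my_table"))
                .setFormatOptions(FormatOptions.json()) // newline-delimited JSON load
                .build();

        // Write the payload through a write channel; closing it submits the load job.
        TableDataWriteChannel channel = bigquery.writer(config);
        try {
            channel.write(ByteBuffer.wrap("{\"name\":\"example\"}\n".getBytes(StandardCharsets.UTF_8)));
        } finally {
            channel.close();
        }

        // The load job becomes available once the channel is closed.
        Job job = channel.getJob().waitFor();
        System.out.println("Load job finished with status: " + job.getStatus());
    }
}
{code}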



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi issue #2682: NIFI-4731: BQ Processors and GCP library update.

2018-08-06 Thread pvillard31
Github user pvillard31 commented on the issue:

https://github.com/apache/nifi/pull/2682
  
@danieljimenez 
Based on

[WARNING] Files with unapproved licenses:
  
/home/travis/build/apache/nifi/nifi-nar-bundles/nifi-gcp-bundle/nifi-gcp-processors/src/test/java/org/apache/nifi/processors/gcp/bigquery/AbstractBigQueryIT.java
  
/home/travis/build/apache/nifi/nifi-nar-bundles/nifi-gcp-bundle/nifi-gcp-processors/src/test/java/org/apache/nifi/processors/gcp/bigquery/PutBigQueryBatchIT.java

The two files are missing the Apache license header at the top. You 
can just copy/paste the header from another file.
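For reference, the header the RAT check looks for is the standard ASF license header used across 
the NiFi Java sources, e.g.:

{code:java}
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements.  See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License.  You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
{code}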


---


[jira] [Commented] (NIFI-4731) BigQuery processors

2018-08-06 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-4731?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16570221#comment-16570221
 ] 

ASF GitHub Bot commented on NIFI-4731:
--

Github user pvillard31 commented on the issue:

https://github.com/apache/nifi/pull/2682
  
@danieljimenez 
Based on

[WARNING] Files with unapproved licenses:
  
/home/travis/build/apache/nifi/nifi-nar-bundles/nifi-gcp-bundle/nifi-gcp-processors/src/test/java/org/apache/nifi/processors/gcp/bigquery/AbstractBigQueryIT.java
  
/home/travis/build/apache/nifi/nifi-nar-bundles/nifi-gcp-bundle/nifi-gcp-processors/src/test/java/org/apache/nifi/processors/gcp/bigquery/PutBigQueryBatchIT.java

The two files are missing the Apache license header at the top. You 
can just copy/paste the header from another file.


> BigQuery processors
> ---
>
> Key: NIFI-4731
> URL: https://issues.apache.org/jira/browse/NIFI-4731
> Project: Apache NiFi
>  Issue Type: New Feature
>  Components: Extensions
>Reporter: Mikhail Sosonkin
>Priority: Major
>
> NIFI should have processors for putting data into BigQuery (Streaming and 
> Batch).
> Initial working processors can be found in this repository: 
> https://github.com/nologic/nifi/tree/NIFI-4731/nifi-nar-bundles/nifi-gcp-bundle/nifi-gcp-processors/src/main/java/org/apache/nifi/processors/gcp/bigquery
> I'd like to get them into NiFi proper.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi issue #2682: NIFI-4731: BQ Processors and GCP library update.

2018-08-06 Thread danieljimenez
Github user danieljimenez commented on the issue:

https://github.com/apache/nifi/pull/2682
  
@pvillard31 thanks, I've added that and force pushed an update.


---


[jira] [Commented] (NIFI-4731) BigQuery processors

2018-08-06 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-4731?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16570254#comment-16570254
 ] 

ASF GitHub Bot commented on NIFI-4731:
--

Github user danieljimenez commented on the issue:

https://github.com/apache/nifi/pull/2682
  
@pvillard31 thanks, I've added that and force pushed an update.


> BigQuery processors
> ---
>
> Key: NIFI-4731
> URL: https://issues.apache.org/jira/browse/NIFI-4731
> Project: Apache NiFi
>  Issue Type: New Feature
>  Components: Extensions
>Reporter: Mikhail Sosonkin
>Priority: Major
>
> NIFI should have processors for putting data into BigQuery (Streaming and 
> Batch).
> Initial working processors can be found in this repository: 
> https://github.com/nologic/nifi/tree/NIFI-4731/nifi-nar-bundles/nifi-gcp-bundle/nifi-gcp-processors/src/main/java/org/apache/nifi/processors/gcp/bigquery
> I'd like to get them into NiFi proper.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5465) HandleHttpRequest times out requests if it goes more than 30 seconds without receiving data, regardless of the configured Request Timeout

2018-08-06 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5465?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16570280#comment-16570280
 ] 

ASF subversion and git services commented on NIFI-5465:
---

Commit d1ab17580fe02766318e4a2d0ee5a76b0742d137 in nifi's branch 
refs/heads/master from [~markap14]
[ https://git-wip-us.apache.org/repos/asf?p=nifi.git;h=d1ab175 ]

NIFI-5465: Set the Idle Timeout on jetty connectors to the same as the Request 
Timeout.
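A minimal sketch of the idea behind the fix (not the NiFi source itself): the Jetty connector's idle 
timeout must be at least the configured Request Timeout, otherwise Jetty's default 30-second idle 
timeout aborts slow-but-active uploads. "requestTimeoutMillis" here stands in for the value of the 
HTTP Context Map's "Request Timeout" property.

{code:java}
import org.eclipse.jetty.server.Server;
import org.eclipse.jetty.server.ServerConnector;

public class IdleTimeoutSketch {
    static Server buildServer(int port, long requestTimeoutMillis) {
        Server server = new Server();
        ServerConnector connector = new ServerConnector(server);
        connector.setPort(port);
        // Align the idle timeout with the request timeout instead of Jetty's 30 s default.
        connector.setIdleTimeout(requestTimeoutMillis);
        server.addConnector(connector);
        return server;
    }
}
{code}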


> HandleHttpRequest times out requests if it goes more than 30 seconds without 
> receiving data, regardless of the configured Request Timeout
> -
>
> Key: NIFI-5465
> URL: https://issues.apache.org/jira/browse/NIFI-5465
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Extensions
>Reporter: Mark Payne
>Assignee: Mark Payne
>Priority: Major
> Fix For: 1.8.0
>
>
> The HTTP Context Map allows the user to configure the "Request Timeout", but 
> if data is written to the HTTP request, then the sender pauses for more than 
> 30 seconds before writing more data, the request will time out. We should 
> allow the request to go up to the configured "Request Timeout" before throwing 
> a TimeoutException. The stack trace seen is:
> {code:java}
> org.apache.nifi.processor.exception.FlowFileAccessException: Failed to import 
> data from 
> HttpInputOverHTTP@5f73086c[c=16384,q=0,[0]=null,s=ERROR:java.util.concurrent.TimeoutException:
>  Idle timeout expired: 30003/3 ms] for 
> StandardFlowFileRecord[uuid=2c04929e-cd59-46b6-8612-c437c0230591,claim=,offset=0,name=84593375189600,size=0]
>  due to org.apache.nifi.processor.exception.FlowFileAccessException: Unable 
> to create ContentClaim due to java.io.IOException: 
> java.util.concurrent.TimeoutException: Idle timeout expired: 30003/3 ms; 
> rolling back session: {} 
> org.apache.nifi.processor.exception.FlowFileAccessException: Failed to import 
> data from 
> HttpInputOverHTTP@5f73086c[c=16384,q=0,[0]=null,s=ERROR:java.util.concurrent.TimeoutException:
>  Idle timeout expired: 30003/3 ms] for 
> StandardFlowFileRecord[uuid=2c04929e-cd59-46b6-8612-c437c0230591,claim=,offset=0,name=84593375189600,size=0]
>  due to org.apache.nifi.processor.exception.FlowFileAccessException: Unable 
> to create ContentClaim due to java.io.IOException: 
> java.util.concurrent.TimeoutException: Idle timeout expired: 30003/3 ms 
> at 
> org.apache.nifi.controller.repository.StandardProcessSession.importFrom(StandardProcessSession.java:2942)
>  
> at 
> org.apache.nifi.processors.standard.HandleHttpRequest.onTrigger(HandleHttpRequest.java:507)
>  
> at 
> org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27)
>  
> at 
> org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1122)
>  
> at 
> org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:147)
>  
> at 
> org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:47)
>  
> at 
> org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:128)
>  
> at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source) 
> at java.util.concurrent.FutureTask.runAndReset(Unknown Source) 
> at 
> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(Unknown
>  Source) 
> at 
> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(Unknown
>  Source) 
> at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source) 
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source) 
> at java.lang.Thread.run(Unknown Source) 
> Caused by: org.apache.nifi.processor.exception.FlowFileAccessException: 
> Unable to create ContentClaim due to java.io.IOException: 
> java.util.concurrent.TimeoutException: Idle timeout expired: 30003/3 ms 
> at 
> org.apache.nifi.controller.repository.StandardProcessSession.importFrom(StandardProcessSession.java:2935)
>  
> ... 13 common frames omitted 
> Caused by: java.io.IOException: java.util.concurrent.TimeoutException: Idle 
> timeout expired: 30003/3 ms 
> at 
> org.eclipse.jetty.server.HttpInput$ErrorState.noContent(HttpInput.java:1047) 
> at org.eclipse.jetty.server.HttpInput.read(HttpInput.java:307) 
> at java.io.InputStream.read(Unknown Source) 
> at org.apache.nifi.stream.io.StreamUtils.copy(StreamUtils.java:35) 
> at 
> org.apache.nifi.controller.repository.FileSystemRepository.importFrom(FileSystemRepository.java:734)
>  
> at 
> org.apache.nifi.controller.repository.StandardProcessSession.importFrom(StandardProcessSession.java:2932)
>  
> ... 13 common frames omitted 
> Cau

[GitHub] nifi pull request #2918: NIFI-5465: Set the Idle Timeout on jetty connectors...

2018-08-06 Thread markap14
Github user markap14 closed the pull request at:

https://github.com/apache/nifi/pull/2918


---


[jira] [Commented] (NIFI-5465) HandleHttpRequest times out requests if it goes more than 30 seconds without receiving data, regardless of the configured Request Timeout

2018-08-06 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5465?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16570281#comment-16570281
 ] 

ASF GitHub Bot commented on NIFI-5465:
--

Github user markap14 closed the pull request at:

https://github.com/apache/nifi/pull/2918


> HandleHttpRequest times out requests if it goes more than 30 seconds without 
> receiving data, regardless of the configured Request Timeout
> -
>
> Key: NIFI-5465
> URL: https://issues.apache.org/jira/browse/NIFI-5465
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Extensions
>Reporter: Mark Payne
>Assignee: Mark Payne
>Priority: Major
> Fix For: 1.8.0
>
>
> The HTTP Context Map allows the user to configure the "Request Timeout", but 
> if data is written to the HTTP request, then the sender pauses for more than 
> 30 seconds before writing more data, the request will time out. We should 
> allow the request to go up to the configured "Request Timeout" before throwing 
> a TimeoutException. The stack trace seen is:
> {code:java}
> org.apache.nifi.processor.exception.FlowFileAccessException: Failed to import 
> data from 
> HttpInputOverHTTP@5f73086c[c=16384,q=0,[0]=null,s=ERROR:java.util.concurrent.TimeoutException:
>  Idle timeout expired: 30003/3 ms] for 
> StandardFlowFileRecord[uuid=2c04929e-cd59-46b6-8612-c437c0230591,claim=,offset=0,name=84593375189600,size=0]
>  due to org.apache.nifi.processor.exception.FlowFileAccessException: Unable 
> to create ContentClaim due to java.io.IOException: 
> java.util.concurrent.TimeoutException: Idle timeout expired: 30003/3 ms; 
> rolling back session: {} 
> org.apache.nifi.processor.exception.FlowFileAccessException: Failed to import 
> data from 
> HttpInputOverHTTP@5f73086c[c=16384,q=0,[0]=null,s=ERROR:java.util.concurrent.TimeoutException:
>  Idle timeout expired: 30003/3 ms] for 
> StandardFlowFileRecord[uuid=2c04929e-cd59-46b6-8612-c437c0230591,claim=,offset=0,name=84593375189600,size=0]
>  due to org.apache.nifi.processor.exception.FlowFileAccessException: Unable 
> to create ContentClaim due to java.io.IOException: 
> java.util.concurrent.TimeoutException: Idle timeout expired: 30003/3 ms 
> at 
> org.apache.nifi.controller.repository.StandardProcessSession.importFrom(StandardProcessSession.java:2942)
>  
> at 
> org.apache.nifi.processors.standard.HandleHttpRequest.onTrigger(HandleHttpRequest.java:507)
>  
> at 
> org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27)
>  
> at 
> org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1122)
>  
> at 
> org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:147)
>  
> at 
> org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:47)
>  
> at 
> org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:128)
>  
> at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source) 
> at java.util.concurrent.FutureTask.runAndReset(Unknown Source) 
> at 
> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(Unknown
>  Source) 
> at 
> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(Unknown
>  Source) 
> at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source) 
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source) 
> at java.lang.Thread.run(Unknown Source) 
> Caused by: org.apache.nifi.processor.exception.FlowFileAccessException: 
> Unable to create ContentClaim due to java.io.IOException: 
> java.util.concurrent.TimeoutException: Idle timeout expired: 30003/3 ms 
> at 
> org.apache.nifi.controller.repository.StandardProcessSession.importFrom(StandardProcessSession.java:2935)
>  
> ... 13 common frames omitted 
> Caused by: java.io.IOException: java.util.concurrent.TimeoutException: Idle 
> timeout expired: 30003/3 ms 
> at 
> org.eclipse.jetty.server.HttpInput$ErrorState.noContent(HttpInput.java:1047) 
> at org.eclipse.jetty.server.HttpInput.read(HttpInput.java:307) 
> at java.io.InputStream.read(Unknown Source) 
> at org.apache.nifi.stream.io.StreamUtils.copy(StreamUtils.java:35) 
> at 
> org.apache.nifi.controller.repository.FileSystemRepository.importFrom(FileSystemRepository.java:734)
>  
> at 
> org.apache.nifi.controller.repository.StandardProcessSession.importFrom(StandardProcessSession.java:2932)
>  
> ... 13 common frames omitted 
> Caused by: java.util.concurrent.TimeoutException: Idle timeout expired: 
> 30003/3 ms 
> at org.eclipse.jetty.io.IdleTimeout.checkIdleTimeout(IdleTimeout.java:166) 
> at org.eclipse.jetty.io.Idl

[jira] [Created] (NIFI-5490) Export lineage as JSON file

2018-08-06 Thread Pierre Villard (JIRA)
Pierre Villard created NIFI-5490:


 Summary: Export lineage as JSON file
 Key: NIFI-5490
 URL: https://issues.apache.org/jira/browse/NIFI-5490
 Project: Apache NiFi
  Issue Type: Improvement
  Components: Core UI
Reporter: Pierre Villard


At the moment, from the UI, when displaying the lineage for a provenance event, 
it's possible to download it as an SVG file. It'd be interesting to also allow 
a JSON export for offline analysis.

Note that it's already possible to do this using the browser's dev tools or by 
manually querying the provenance repository via the REST API.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (MINIFICPP-592) Restful lookups in RPG should be bypassed when cURL is disabled.

2018-08-06 Thread Mr TheSegfault (JIRA)
Mr TheSegfault created MINIFICPP-592:


 Summary: Restful lookups in RPG should be bypassed when cURL is 
disabled.
 Key: MINIFICPP-592
 URL: https://issues.apache.org/jira/browse/MINIFICPP-592
 Project: NiFi MiNiFi C++
  Issue Type: Test
Reporter: Mr TheSegfault
Assignee: Mr TheSegfault


Changes to the RPG that perform a lookup against the NiFi REST API should be 
bypassed, and the YAML config used instead, when cURL support is disabled in a client. 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi issue #2936: NIFI-5489: Add expression language support to AMQP process...

2018-08-06 Thread danieljimenez
Github user danieljimenez commented on the issue:

https://github.com/apache/nifi/pull/2936
  
After rerunning, the Travis failures are gone.


---


[jira] [Commented] (NIFI-5489) Support Attribute Expressions with AMQP Processors

2018-08-06 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5489?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16570313#comment-16570313
 ] 

ASF GitHub Bot commented on NIFI-5489:
--

Github user danieljimenez commented on the issue:

https://github.com/apache/nifi/pull/2936
  
After rerunning, the Travis failures are gone.


> Support Attribute Expressions with AMQP Processors
> --
>
> Key: NIFI-5489
> URL: https://issues.apache.org/jira/browse/NIFI-5489
> Project: Apache NiFi
>  Issue Type: Improvement
>Affects Versions: 1.7.1
>Reporter: Daniel
>Priority: Major
>
> Particularly the fields: host, virtualhost and username.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (NIFI-5491) PutHive3Streaming incorrectly handles bytes, shorts, and nested structs

2018-08-06 Thread Matt Burgess (JIRA)
Matt Burgess created NIFI-5491:
--

 Summary: PutHive3Streaming incorrectly handles bytes, shorts, and 
nested structs
 Key: NIFI-5491
 URL: https://issues.apache.org/jira/browse/NIFI-5491
 Project: Apache NiFi
  Issue Type: Bug
  Components: Extensions
Reporter: Matt Burgess


When trying to insert a record into a Hive table using PutHive3Streaming, if 
the table contains columns of types byte, short, or struct, then an error 
occurs and the records cannot be written.

This is due to a mismatch between the data types used in the NiFi Record API 
and the Hive ORC writer and StructObjectInspector. For byte and short, NiFi 
currently maintains an Integer value but Hive expects a Byte or Short, 
respectively. For structs, NiFi maintains a Map value but Hive expects a List 
or array.

NiFiRecordSerDe should handle the conversion of values for use by Hive 
Streaming.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Assigned] (NIFI-5491) PutHive3Streaming incorrectly handles bytes, shorts, and nested structs

2018-08-06 Thread Matt Burgess (JIRA)


 [ 
https://issues.apache.org/jira/browse/NIFI-5491?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Matt Burgess reassigned NIFI-5491:
--

Assignee: Matt Burgess

> PutHive3Streaming incorrectly handles bytes, shorts, and nested structs
> ---
>
> Key: NIFI-5491
> URL: https://issues.apache.org/jira/browse/NIFI-5491
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Extensions
>Reporter: Matt Burgess
>Assignee: Matt Burgess
>Priority: Major
>
> When trying to insert a record into a Hive table using PutHive3Streaming, if 
> the table contains columns of types byte, short, or struct, then an error 
> occurs and the records cannot be written.
> This is due to a mismatch between the data types used in the NiFi Record API 
> and the Hive ORC writer and StructObjectInspector. For byte and short, NiFi 
> currently maintains an Integer value but Hive expects a Byte or Short, 
> respectively. For structs, NiFi maintains a Map value but Hive expects a List 
> or array.
> NiFiRecordSerDe should handle the conversion of values for use by Hive 
> Streaming.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi pull request #2930: NIFI-4434 Fixed recursive listing with a custom reg...

2018-08-06 Thread jtstorck
Github user jtstorck closed the pull request at:

https://github.com/apache/nifi/pull/2930


---


[jira] [Commented] (NIFI-4434) ListHDFS applies File Filter also to subdirectory names in recursive search

2018-08-06 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-4434?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16570494#comment-16570494
 ] 

ASF GitHub Bot commented on NIFI-4434:
--

Github user jtstorck closed the pull request at:

https://github.com/apache/nifi/pull/2930


> ListHDFS applies File Filter also to subdirectory names in recursive search
> ---
>
> Key: NIFI-4434
> URL: https://issues.apache.org/jira/browse/NIFI-4434
> Project: Apache NiFi
>  Issue Type: Bug
>Affects Versions: 1.3.0
>Reporter: Holger Frydrych
>Assignee: Jeff Storck
>Priority: Major
>
> The File Filter regex configured in the ListHDFS processor is applied not 
> just to files found, but also to subdirectories. 
> If you try to set up a recursive search to list e.g. all csv files in a 
> directory hierarchy via a regex like ".*\.csv", it will only pick up csv 
> files in the base directory, not in any subdirectory. This is because 
> subdirectories don't typically match that regex pattern.
> To fix this, either subdirectories should not be matched against the file 
> filter, or the file filter should be applied to the full path of all files 
> (relative to the base directory). The GetHDFS processor offers both options 
> via a switch.
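A minimal sketch of the two behaviours described above, assuming Hadoop's PathFilter interface (not 
the actual ListHDFS code): the difference is simply which string the regex is tested against.

{code:java}
import java.util.regex.Pattern;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.PathFilter;

final class ListFilterSketch {

    // Matches only the final path component, so directory names never have to
    // satisfy the file regex (e.g. ".*\.csv" still descends into subdirectories).
    static PathFilter filenameOnlyFilter(Pattern filePattern) {
        return path -> filePattern.matcher(path.getName()).matches();
    }

    // Matches the whole path, so the regex must account for directory segments,
    // similar to the full-path option GetHDFS exposes via its switch.
    static PathFilter fullPathFilter(Pattern filePattern) {
        return path -> filePattern
                .matcher(Path.getPathWithoutSchemeAndAuthority(path).toString())
                .matches();
    }
}
{code}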



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi issue #2930: NIFI-4434 Fixed recursive listing with a custom regex filt...

2018-08-06 Thread jtstorck
Github user jtstorck commented on the issue:

https://github.com/apache/nifi/pull/2930
  
Accidentally deleted my remote branch while updating this PR.  It doesn't 
look like I can reopen this PR.  I'll create another PR with the updated code.


---


[jira] [Commented] (NIFI-4434) ListHDFS applies File Filter also to subdirectory names in recursive search

2018-08-06 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-4434?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16570498#comment-16570498
 ] 

ASF GitHub Bot commented on NIFI-4434:
--

Github user jtstorck commented on the issue:

https://github.com/apache/nifi/pull/2930
  
Accidentally deleted my remote branch while updating this PR.  It doesn't 
look like I can reopen this PR.  I'll create another PR with the updated code.


> ListHDFS applies File Filter also to subdirectory names in recursive search
> ---
>
> Key: NIFI-4434
> URL: https://issues.apache.org/jira/browse/NIFI-4434
> Project: Apache NiFi
>  Issue Type: Bug
>Affects Versions: 1.3.0
>Reporter: Holger Frydrych
>Assignee: Jeff Storck
>Priority: Major
>
> The File Filter regex configured in the ListHDFS processor is applied not 
> just to files found, but also to subdirectories. 
> If you try to set up a recursive search to list e.g. all csv files in a 
> directory hierarchy via a regex like ".*\.csv", it will only pick up csv 
> files in the base directory, not in any subdirectory. This is because 
> subdirectories don't typically match that regex pattern.
> To fix this, either subdirectories should not be matched against the file 
> filter, or the file filter should be applied to the full path of all files 
> (relative to the base directory). The GetHDFS processor offers both options 
> via a switch.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi pull request #2937: NIFI-4434 Fixed recursive listing with a custom reg...

2018-08-06 Thread jtstorck
GitHub user jtstorck opened a pull request:

https://github.com/apache/nifi/pull/2937

NIFI-4434 Fixed recursive listing with a custom regex filter.

Filter modes are now supported to perform listings based on directory and 
file names, file-names only, and full path.

Thank you for submitting a contribution to Apache NiFi.

In order to streamline the review of the contribution we ask you
to ensure the following steps have been taken:

### For all changes:
- [x] Is there a JIRA ticket associated with this PR? Is it referenced 
 in the commit message?

- [x] Does your PR title start with NIFI-XXXX where XXXX is the JIRA number 
you are trying to resolve? Pay particular attention to the hyphen "-" character.

- [x] Has your PR been rebased against the latest commit within the target 
branch (typically master)?

- [x] Is your initial contribution a single, squashed commit?

### For code changes:
- [x] Have you ensured that the full suite of tests is executed via mvn 
-Pcontrib-check clean install at the root nifi folder?
- [x] Have you written or updated unit tests to verify your changes?
- [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)? 
- [ ] If applicable, have you updated the LICENSE file, including the main 
LICENSE file under nifi-assembly?
- [ ] If applicable, have you updated the NOTICE file, including the main 
NOTICE file found under nifi-assembly?
- [ ] If adding new Properties, have you added .displayName in addition to 
.name (programmatic access) for each of the new properties?

### For documentation related changes:
- [ ] Have you ensured that format looks appropriate for the output in 
which it is rendered?

### Note:
Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/jtstorck/nifi NIFI-4434

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/2937.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2937


commit 6f525bf2b84603f10fd52141e7bff6af68c61f6f
Author: Jeff Storck 
Date:   2018-08-01T17:13:40Z

NIFI-4434 Fixed recursive listing with a custom regex filter.
Filter modes are now supported to perform listings based on directory and 
file names, file-names only, and full path.




---


[jira] [Commented] (NIFI-4434) ListHDFS applies File Filter also to subdirectory names in recursive search

2018-08-06 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-4434?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16570501#comment-16570501
 ] 

ASF GitHub Bot commented on NIFI-4434:
--

GitHub user jtstorck opened a pull request:

https://github.com/apache/nifi/pull/2937

NIFI-4434 Fixed recursive listing with a custom regex filter.

Filter modes are now supported to perform listings based on directory and 
file names, file-names only, and full path.

Thank you for submitting a contribution to Apache NiFi.

In order to streamline the review of the contribution we ask you
to ensure the following steps have been taken:

### For all changes:
- [x] Is there a JIRA ticket associated with this PR? Is it referenced 
 in the commit message?

- [x] Does your PR title start with NIFI-XXXX where XXXX is the JIRA number 
you are trying to resolve? Pay particular attention to the hyphen "-" character.

- [x] Has your PR been rebased against the latest commit within the target 
branch (typically master)?

- [x] Is your initial contribution a single, squashed commit?

### For code changes:
- [x] Have you ensured that the full suite of tests is executed via mvn 
-Pcontrib-check clean install at the root nifi folder?
- [x] Have you written or updated unit tests to verify your changes?
- [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)? 
- [ ] If applicable, have you updated the LICENSE file, including the main 
LICENSE file under nifi-assembly?
- [ ] If applicable, have you updated the NOTICE file, including the main 
NOTICE file found under nifi-assembly?
- [ ] If adding new Properties, have you added .displayName in addition to 
.name (programmatic access) for each of the new properties?

### For documentation related changes:
- [ ] Have you ensured that format looks appropriate for the output in 
which it is rendered?

### Note:
Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/jtstorck/nifi NIFI-4434

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/2937.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2937


commit 6f525bf2b84603f10fd52141e7bff6af68c61f6f
Author: Jeff Storck 
Date:   2018-08-01T17:13:40Z

NIFI-4434 Fixed recursive listing with a custom regex filter.
Filter modes are now supported to perform listings based on directory and 
file names, file-names only, and full path.




> ListHDFS applies File Filter also to subdirectory names in recursive search
> ---
>
> Key: NIFI-4434
> URL: https://issues.apache.org/jira/browse/NIFI-4434
> Project: Apache NiFi
>  Issue Type: Bug
>Affects Versions: 1.3.0
>Reporter: Holger Frydrych
>Assignee: Jeff Storck
>Priority: Major
>
> The File Filter regex configured in the ListHDFS processor is applied not 
> just to files found, but also to subdirectories. 
> If you try to set up a recursive search to list e.g. all csv files in a 
> directory hierarchy via a regex like ".*\.csv", it will only pick up csv 
> files in the base directory, not in any subdirectory. This is because 
> subdirectories don't typically match that regex pattern.
> To fix this, either subdirectories should not be matched against the file 
> filter, or the file filter should be applied to the full path of all files 
> (relative to the base directory). The GetHDFS processor offers both options 
> via a switch.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi issue #2937: NIFI-4434 Fixed recursive listing with a custom regex filt...

2018-08-06 Thread jtstorck
Github user jtstorck commented on the issue:

https://github.com/apache/nifi/pull/2937
  
[PR 2930](https://github.com/apache/nifi/pull/2930) was closed due to the 
branch in my fork being removed before adding the new filter-mode-based 
changes.  @bbende @ottobackwards, this PR implements the use cases discussed in 
the previous PR:
- filename only
- filename and directory name
- full path


---


[jira] [Commented] (NIFI-4434) ListHDFS applies File Filter also to subdirectory names in recursive search

2018-08-06 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-4434?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16570505#comment-16570505
 ] 

ASF GitHub Bot commented on NIFI-4434:
--

Github user jtstorck commented on the issue:

https://github.com/apache/nifi/pull/2937
  
[PR 2930](https://github.com/apache/nifi/pull/2930) was closed due to the 
branch in my fork being removed before adding the new filter-mode-based 
changes.  @bbende @ottobackwards, this PR implements the use cases discussed in 
the previous PR:
- filename only
- filename and directory name
- full path


> ListHDFS applies File Filter also to subdirectory names in recursive search
> ---
>
> Key: NIFI-4434
> URL: https://issues.apache.org/jira/browse/NIFI-4434
> Project: Apache NiFi
>  Issue Type: Bug
>Affects Versions: 1.3.0
>Reporter: Holger Frydrych
>Assignee: Jeff Storck
>Priority: Major
>
> The File Filter regex configured in the ListHDFS processor is applied not 
> just to files found, but also to subdirectories. 
> If you try to set up a recursive search to list e.g. all csv files in a 
> directory hierarchy via a regex like ".*\.csv", it will only pick up csv 
> files in the base directory, not in any subdirectory. This is because 
> subdirectories don't typically match that regex pattern.
> To fix this, either subdirectories should not be matched against the file 
> filter, or the file filter should be applied to the full path of all files 
> (relative to the base directory). The GetHDFS processor offers both options 
> via a switch.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi pull request #2938: NIFI-5491: Fixed PutHive3Streaming handling of Byte...

2018-08-06 Thread mattyb149
GitHub user mattyb149 opened a pull request:

https://github.com/apache/nifi/pull/2938

NIFI-5491: Fixed PutHive3Streaming handling of Byte, Short, and Struct

Thank you for submitting a contribution to Apache NiFi.

In order to streamline the review of the contribution we ask you
to ensure the following steps have been taken:

### For all changes:
- [x] Is there a JIRA ticket associated with this PR? Is it referenced 
 in the commit message?

- [x] Does your PR title start with NIFI-XXXX where XXXX is the JIRA number 
you are trying to resolve? Pay particular attention to the hyphen "-" character.

- [x] Has your PR been rebased against the latest commit within the target 
branch (typically master)?

- [x] Is your initial contribution a single, squashed commit?

### For code changes:
- [x] Have you ensured that the full suite of tests is executed via mvn 
-Pcontrib-check clean install at the root nifi folder?
- [x] Have you written or updated unit tests to verify your changes?
- [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)? 
- [ ] If applicable, have you updated the LICENSE file, including the main 
LICENSE file under nifi-assembly?
- [ ] If applicable, have you updated the NOTICE file, including the main 
NOTICE file found under nifi-assembly?
- [ ] If adding new Properties, have you added .displayName in addition to 
.name (programmatic access) for each of the new properties?

### For documentation related changes:
- [ ] Have you ensured that format looks appropriate for the output in 
which it is rendered?

### Note:
Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/mattyb149/nifi NIFI-5491

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/2938.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2938


commit 68f83ac1e2987e008c6b5ef7d5fcf6e18cc21c0c
Author: Matthew Burgess 
Date:   2018-08-06T17:34:39Z

NIFI-5491: Fixed PutHive3Streaming handling of Byte, Short, and Struct




---


[jira] [Commented] (NIFI-5491) PutHive3Streaming incorrectly handles bytes, shorts, and nested structs

2018-08-06 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5491?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16570531#comment-16570531
 ] 

ASF GitHub Bot commented on NIFI-5491:
--

GitHub user mattyb149 opened a pull request:

https://github.com/apache/nifi/pull/2938

NIFI-5491: Fixed PutHive3Streaming handling of Byte, Short, and Struct

Thank you for submitting a contribution to Apache NiFi.

In order to streamline the review of the contribution we ask you
to ensure the following steps have been taken:

### For all changes:
- [x] Is there a JIRA ticket associated with this PR? Is it referenced 
 in the commit message?

- [x] Does your PR title start with NIFI-XXXX where XXXX is the JIRA number 
you are trying to resolve? Pay particular attention to the hyphen "-" character.

- [x] Has your PR been rebased against the latest commit within the target 
branch (typically master)?

- [x] Is your initial contribution a single, squashed commit?

### For code changes:
- [x] Have you ensured that the full suite of tests is executed via mvn 
-Pcontrib-check clean install at the root nifi folder?
- [x] Have you written or updated unit tests to verify your changes?
- [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)? 
- [ ] If applicable, have you updated the LICENSE file, including the main 
LICENSE file under nifi-assembly?
- [ ] If applicable, have you updated the NOTICE file, including the main 
NOTICE file found under nifi-assembly?
- [ ] If adding new Properties, have you added .displayName in addition to 
.name (programmatic access) for each of the new properties?

### For documentation related changes:
- [ ] Have you ensured that format looks appropriate for the output in 
which it is rendered?

### Note:
Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/mattyb149/nifi NIFI-5491

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/2938.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2938


commit 68f83ac1e2987e008c6b5ef7d5fcf6e18cc21c0c
Author: Matthew Burgess 
Date:   2018-08-06T17:34:39Z

NIFI-5491: Fixed PutHive3Streaming handling of Byte, Short, and Struct




> PutHive3Streaming incorrectly handles bytes, shorts, and nested structs
> ---
>
> Key: NIFI-5491
> URL: https://issues.apache.org/jira/browse/NIFI-5491
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Extensions
>Reporter: Matt Burgess
>Assignee: Matt Burgess
>Priority: Major
>
> When trying to insert a record into a Hive table using PutHive3Streaming, if 
> the table contains columns of types byte, short, or struct, then an error 
> occurs and the records cannot be written.
> This is due to a mismatch between the data types used in the NiFi Record API 
> and the Hive ORC writer and StructObjectInspector. For byte and short, NiFi 
> currently maintains an Integer value but Hive expects a Byte or Short, 
> respectively. For structs, NiFi maintains a Map value but Hive expects a List 
> or array.
> NiFiRecordSerDe should handle the conversion of values for use by Hive 
> Streaming.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (NIFI-5491) PutHive3Streaming incorrectly handles bytes, shorts, and nested structs

2018-08-06 Thread Matt Burgess (JIRA)


 [ 
https://issues.apache.org/jira/browse/NIFI-5491?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Matt Burgess updated NIFI-5491:
---
Status: Patch Available  (was: In Progress)

> PutHive3Streaming incorrectly handles bytes, shorts, and nested structs
> ---
>
> Key: NIFI-5491
> URL: https://issues.apache.org/jira/browse/NIFI-5491
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Extensions
>Reporter: Matt Burgess
>Assignee: Matt Burgess
>Priority: Major
>
> When trying to insert a record into a Hive table using PutHive3Streaming, if 
> the table contains columns of types byte, short, or struct, then an error 
> occurs and the records cannot be written.
> This is due to a mismatch between the data types used in the NiFi Record API 
> and the Hive ORC writer and StructObjectInspector. For byte and short, NiFi 
> currently maintains an Integer value but Hive expects a Byte or Short, 
> respectively. For structs, NiFi maintains a Map value but Hive expects a List 
> or array.
> NiFiRecordSerDe should handle the conversion of values for use by Hive 
> Streaming.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-4434) ListHDFS applies File Filter also to subdirectory names in recursive search

2018-08-06 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-4434?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16570533#comment-16570533
 ] 

ASF GitHub Bot commented on NIFI-4434:
--

Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2937
  
@jtstorck will review.
First quick thing is to ask if you have considered adding an 
additionalDetails page talking about why you would choose one strategy over 
another, maybe with simple examples?


> ListHDFS applies File Filter also to subdirectory names in recursive search
> ---
>
> Key: NIFI-4434
> URL: https://issues.apache.org/jira/browse/NIFI-4434
> Project: Apache NiFi
>  Issue Type: Bug
>Affects Versions: 1.3.0
>Reporter: Holger Frydrych
>Assignee: Jeff Storck
>Priority: Major
>
> The File Filter regex configured in the ListHDFS processor is applied not 
> just to files found, but also to subdirectories. 
> If you try to set up a recursive search to list e.g. all csv files in a 
> directory hierarchy via a regex like ".*\.csv", it will only pick up csv 
> files in the base directory, not in any subdirectory. This is because 
> subdirectories don't typically match that regex pattern.
> To fix this, either subdirectories should not be matched against the file 
> filter, or the file filter should be applied to the full path of all files 
> (relative to the base directory). The GetHDFS processor offers both options 
> via a switch.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi issue #2937: NIFI-4434 Fixed recursive listing with a custom regex filt...

2018-08-06 Thread ottobackwards
Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2937
  
@jtstorck will review.
First quick thing is to ask if you have considered adding an 
additionalDetails page talking about why you would choose one strategy over 
another, maybe with simple examples?


---


[GitHub] nifi pull request #2925: NIFI-5469 Additional italics and code formatting co...

2018-08-06 Thread zenfenan
Github user zenfenan commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2925#discussion_r207899982
  
--- Diff: nifi-docs/src/main/asciidoc/administration-guide.adoc ---
@@ -3256,9 +3361,9 @@ stream {
 
 image:s2s-rproxy-portnumber.svg["Port number to Node mapping"]
--- End diff --

I think this is a good opportunity to fix the screenshots as well. They 
aren't loading properly. The URL path has changed; all the image URLs have to be 
changed from `src/main/asciidoc/` to 
`src/main/asciidoc/images/`.


---


[GitHub] nifi pull request #2925: NIFI-5469 Additional italics and code formatting co...

2018-08-06 Thread zenfenan
Github user zenfenan commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2925#discussion_r207975946
  
--- Diff: nifi-docs/src/main/asciidoc/administration-guide.adoc ---
@@ -1486,35 +1588,35 @@ If no administrator action is taken, the 
configuration values remain unencrypted
 [[encrypt-config_tool]]
 === Encrypt-Config Tool
 
-The `encrypt-config` command line tool (invoked as 
`./bin/encrypt-config.sh` or `bin\encrypt-config.bat`) reads from a 
'nifi.properties' file with plaintext sensitive configuration values, prompts 
for a master password or raw hexadecimal key, and encrypts each value. It 
replaces the plain values with the protected value in the same file, or writes 
to a new 'nifi.properties' file if specified.
+The `encrypt-config` command line tool (invoked as 
`./bin/encrypt-config.sh` or `bin\encrypt-config.bat`) reads from a 
_nifi.properties_ file with plaintext sensitive configuration values, prompts 
for a master password or raw hexadecimal key, and encrypts each value. It 
replaces the plain values with the protected value in the same file, or writes 
to a new _nifi.properties_ file if specified.
--- End diff --

L1454 The link for `Bcrypt Spring Security` is broken. It has to be updated 
to 
https://docs.spring.io/spring-security/site/docs/current/api/org/springframework/security/crypto/bcrypt/BCrypt.html


---


[jira] [Commented] (NIFI-5469) Edits needed for LDAP and Kerberos login identity provider sections in Admin Guide

2018-08-06 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5469?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16570545#comment-16570545
 ] 

ASF GitHub Bot commented on NIFI-5469:
--

Github user zenfenan commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2925#discussion_r207899982
  
--- Diff: nifi-docs/src/main/asciidoc/administration-guide.adoc ---
@@ -3256,9 +3361,9 @@ stream {
 
 image:s2s-rproxy-portnumber.svg["Port number to Node mapping"]
--- End diff --

I think this is a good opportunity to fix the screenshots as well. They 
aren't loading properly. The URL path has changed; all the image URLs have to be 
changed from `src/main/asciidoc/` to 
`src/main/asciidoc/images/`.


> Edits needed for LDAP and Kerberos login identity provider sections in Admin 
> Guide
> --
>
> Key: NIFI-5469
> URL: https://issues.apache.org/jira/browse/NIFI-5469
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Documentation & Website
>Reporter: Andrew Lim
>Assignee: Andrew Lim
>Priority: Minor
>
> Going through the Authentication and Authorization sections of the Admin 
> Guide, I noticed the following improvements could be made:
>  * Removed “Kerberos Config File” property from kerberos-provider login 
> identity provider (this was done because the same property exists in 
> nifi.properties)
>  * Corrected the "LDAP-based Users/Groups Referencing User Attribute” login 
> identity provider example to refer to “member uid"
>  * Added titles to login identity provider examples for improved 
> readability/search
>  * Changed UserGroupProvider property examples from bulleted lists to tables
> Also, text formatting for references to config files, directories, etc. needs 
> to be made consistent.  For example, config files like _nifi.properties_, 
> _authorizers.xml_ should be italicized.  Directories, properties and default 
> values for properties should be monospaced.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5469) Edits needed for LDAP and Kerberos login identity provider sections in Admin Guide

2018-08-06 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5469?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16570546#comment-16570546
 ] 

ASF GitHub Bot commented on NIFI-5469:
--

Github user zenfenan commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2925#discussion_r207975946
  
--- Diff: nifi-docs/src/main/asciidoc/administration-guide.adoc ---
@@ -1486,35 +1588,35 @@ If no administrator action is taken, the 
configuration values remain unencrypted
 [[encrypt-config_tool]]
 === Encrypt-Config Tool
 
-The `encrypt-config` command line tool (invoked as 
`./bin/encrypt-config.sh` or `bin\encrypt-config.bat`) reads from a 
'nifi.properties' file with plaintext sensitive configuration values, prompts 
for a master password or raw hexadecimal key, and encrypts each value. It 
replaces the plain values with the protected value in the same file, or writes 
to a new 'nifi.properties' file if specified.
+The `encrypt-config` command line tool (invoked as 
`./bin/encrypt-config.sh` or `bin\encrypt-config.bat`) reads from a 
_nifi.properties_ file with plaintext sensitive configuration values, prompts 
for a master password or raw hexadecimal key, and encrypts each value. It 
replaces the plain values with the protected value in the same file, or writes 
to a new _nifi.properties_ file if specified.
--- End diff --

L1454 The link for `Bcrypt Spring Security` is broken. It has to be updated 
to 
https://docs.spring.io/spring-security/site/docs/current/api/org/springframework/security/crypto/bcrypt/BCrypt.html


> Edits needed for LDAP and Kerberos login identity provider sections in Admin 
> Guide
> --
>
> Key: NIFI-5469
> URL: https://issues.apache.org/jira/browse/NIFI-5469
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Documentation & Website
>Reporter: Andrew Lim
>Assignee: Andrew Lim
>Priority: Minor
>
> Going through the Authentication and Authorization sections of the Admin 
> Guide, I noticed the following improvements could be made:
>  * Removed “Kerberos Config File” property from kerberos-provider login 
> identity provider (this was done because the same property exists in 
> nifi.properties)
>  * Corrected the "LDAP-based Users/Groups Referencing User Attribute” login 
> identity provider example to refer to “member uid"
>  * Added titles to login identity provider examples for improved 
> readability/search
>  * Changed UserGroupProvider property examples from bulleted lists to tables
> Also, text formatting for references to config files, directories, etc. needs 
> to be made consistent.  For example, config files like _nifi.properties_, 
> _authorizers.xml_ should be italicized.  Directories, properties and default 
> values for properties should be monospaced.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi pull request #2937: NIFI-4434 Fixed recursive listing with a custom reg...

2018-08-06 Thread ottobackwards
Github user ottobackwards commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2937#discussion_r207983251
  
--- Diff: 
nifi-nar-bundles/nifi-hadoop-bundle/nifi-hdfs-processors/src/main/java/org/apache/nifi/processors/hadoop/ListHDFS.java
 ---
@@ -462,11 +523,15 @@ private String getPerms(final FsAction action) {
 
 private PathFilter createPathFilter(final ProcessContext context) {
 final Pattern filePattern = 
Pattern.compile(context.getProperty(FILE_FILTER).getValue());
--- End diff --

Does this need to support expression language?
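For reference, a hedged sketch of what expression-language support for such a property typically 
looks like in a NiFi processor; the property name, scope, and validator here are illustrative, not 
the actual ListHDFS definitions.

{code:java}
import java.util.regex.Pattern;
import org.apache.nifi.components.PropertyDescriptor;
import org.apache.nifi.expression.ExpressionLanguageScope;
import org.apache.nifi.processor.ProcessContext;
import org.apache.nifi.processor.util.StandardValidators;

final class FileFilterElSketch {

    static final PropertyDescriptor FILE_FILTER = new PropertyDescriptor.Builder()
            .name("file-filter")
            .displayName("File Filter")
            .description("Regular expression applied when listing files")
            .required(true)
            // Declares that ${...} expressions are allowed in this property's value.
            .expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY)
            .addValidator(StandardValidators.REGULAR_EXPRESSION_VALIDATOR)
            .build();

    static Pattern compileFilter(ProcessContext context) {
        // evaluateAttributeExpressions() resolves ${...} references before compiling the regex.
        return Pattern.compile(context.getProperty(FILE_FILTER).evaluateAttributeExpressions().getValue());
    }
}
{code}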


---


[jira] [Commented] (NIFI-4434) ListHDFS applies File Filter also to subdirectory names in recursive search

2018-08-06 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-4434?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16570588#comment-16570588
 ] 

ASF GitHub Bot commented on NIFI-4434:
--

Github user ottobackwards commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2937#discussion_r207983251
  
--- Diff: 
nifi-nar-bundles/nifi-hadoop-bundle/nifi-hdfs-processors/src/main/java/org/apache/nifi/processors/hadoop/ListHDFS.java
 ---
@@ -462,11 +523,15 @@ private String getPerms(final FsAction action) {
 
 private PathFilter createPathFilter(final ProcessContext context) {
 final Pattern filePattern = 
Pattern.compile(context.getProperty(FILE_FILTER).getValue());
--- End diff --

Does this need to support expression language?


> ListHDFS applies File Filter also to subdirectory names in recursive search
> ---
>
> Key: NIFI-4434
> URL: https://issues.apache.org/jira/browse/NIFI-4434
> Project: Apache NiFi
>  Issue Type: Bug
>Affects Versions: 1.3.0
>Reporter: Holger Frydrych
>Assignee: Jeff Storck
>Priority: Major
>
> The File Filter regex configured in the ListHDFS processor is applied not 
> just to files found, but also to subdirectories. 
> If you try to set up a recursive search to list e.g. all csv files in a 
> directory hierarchy via a regex like ".*\.csv", it will only pick up csv 
> files in the base directory, not in any subdirectory. This is because 
> subdirectories don't typically match that regex pattern.
> To fix this, either subdirectories should not be matched against the file 
> filter, or the file filter should be applied to the full path of all files 
> (relative to the base directory). The GetHDFS processor offers both options 
> via a switch.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5491) PutHive3Streaming incorrectly handles bytes, shorts, and nested structs

2018-08-06 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5491?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16570595#comment-16570595
 ] 

ASF GitHub Bot commented on NIFI-5491:
--

Github user markap14 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2938#discussion_r207985881
  
--- Diff: 
nifi-nar-bundles/nifi-hive-bundle/nifi-hive3-processors/src/main/java/org/apache/hive/streaming/NiFiRecordSerDe.java
 ---
@@ -205,7 +212,9 @@ private Object extractCurrentField(Record record, 
RecordField field, TypeInfo fi
 val = record.getAsString(fieldName);
 break;
 case BINARY:
-val = 
AvroTypeUtil.convertByteArray(record.getAsArray(fieldName)).array();
+Object[] array = record.getAsArray(fieldName);
+if (array == null) return null;
--- End diff --

I'm surprised this passes checkstyle. I thought we required the {} with the 
body starting on the next line... that's the preferred syntax anyway, I think.


> PutHive3Streaming incorrectly handles bytes, shorts, and nested structs
> ---
>
> Key: NIFI-5491
> URL: https://issues.apache.org/jira/browse/NIFI-5491
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Extensions
>Reporter: Matt Burgess
>Assignee: Matt Burgess
>Priority: Major
>
> When trying to insert a record into a Hive table using PutHive3Streaming, if 
> the table contains columns of types byte, short, or struct, then an error 
> occurs and the records cannot be written.
> This is due to a mismatch between the data types used in the NiFi Record API 
> and the Hive ORC writer and StructObjectInspector. For byte and short, NiFi 
> currently maintains an Integer value but Hive expects a Byte or Short, 
> respectively. For structs, NiFi maintains a Map value but Hive expects a List 
> or array.
> NiFiRecordSerDe should handle the conversion of values for use by Hive 
> Streaming.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5491) PutHive3Streaming incorrectly handles bytes, shorts, and nested structs

2018-08-06 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5491?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16570597#comment-16570597
 ] 

ASF GitHub Bot commented on NIFI-5491:
--

Github user markap14 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2938#discussion_r207985525
  
--- Diff: 
nifi-nar-bundles/nifi-hive-bundle/nifi-hive3-processors/src/main/java/org/apache/hive/streaming/NiFiRecordSerDe.java
 ---
@@ -227,8 +236,32 @@ private Object extractCurrentField(Record record, 
RecordField field, TypeInfo fi
 val = 
DataTypeUtils.convertRecordFieldtoObject(record.getValue(fieldName), 
field.getDataType());
 break;
 case STRUCT:
-val = 
DataTypeUtils.convertRecordFieldtoObject(record.getValue(fieldName), 
field.getDataType());
-break;
+// For some reason the Hive StandardStructObjectInspector 
expects the object corresponding to a "struct" to be an array or List rather 
than a Map.
+// Do the conversion here, calling extractCurrentField 
recursively to traverse any nested structs.
+Record r = (Record) record.getValue(fieldName);
--- End diff --

We typically like to avoid single-letter variable names :)


> PutHive3Streaming incorrectly handles bytes, shorts, and nested structs
> ---
>
> Key: NIFI-5491
> URL: https://issues.apache.org/jira/browse/NIFI-5491
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Extensions
>Reporter: Matt Burgess
>Assignee: Matt Burgess
>Priority: Major
>
> When trying to insert a record into a Hive table using PutHive3Streaming, if 
> the table contains columns of types byte, short, or struct, then an error 
> occurs and the records cannot be written.
> This is due to a mismatch between the data types used in the NiFi Record API 
> and the Hive ORC writer and StructObjectInspector. For byte and short, NiFi 
> currently maintains an Integer value but Hive expects a Byte or Short, 
> respectively. For structs, NiFi maintains a Map value but Hive expects a List 
> or array.
> NiFiRecordSerDe should handle the conversion of values for use by Hive 
> Streaming.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi pull request #2938: NIFI-5491: Fixed PutHive3Streaming handling of Byte...

2018-08-06 Thread markap14
Github user markap14 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2938#discussion_r207986292
  
--- Diff: 
nifi-nar-bundles/nifi-hive-bundle/nifi-hive3-processors/src/main/java/org/apache/hive/streaming/NiFiRecordSerDe.java
 ---
@@ -227,8 +236,32 @@ private Object extractCurrentField(Record record, 
RecordField field, TypeInfo fi
 val = 
DataTypeUtils.convertRecordFieldtoObject(record.getValue(fieldName), 
field.getDataType());
 break;
 case STRUCT:
-val = 
DataTypeUtils.convertRecordFieldtoObject(record.getValue(fieldName), 
field.getDataType());
-break;
+// For some reason the Hive StandardStructObjectInspector 
expects the object corresponding to a "struct" to be an array or List rather 
than a Map.
+// Do the conversion here, calling extractCurrentField 
recursively to traverse any nested structs.
+Record r = (Record) record.getValue(fieldName);
+if (r == null) {
+return null;
+}
+try {
+RecordSchema recordSchema = r.getSchema();
+List<RecordField> recordFields = recordSchema.getFields();
+if (recordFields == null || recordFields.isEmpty()) {
+return new ArrayList<>(0);
+}
+// This List will hold the values of the entries in 
the Map
+List<Object> structList = new ArrayList<>(recordFields.size());
+StructTypeInfo typeInfo = (StructTypeInfo) 
schema.getStructFieldTypeInfo(fieldName);
+for (RecordField f : recordFields) {
+String fName = f.getFieldName();
+String normalizedFieldName = fName.toLowerCase();
+structList.add(extractCurrentField(r, f, 
typeInfo.getStructFieldTypeInfo(normalizedFieldName)));
+}
+return structList;
+} catch (Exception e) {
+log.warn("Error [{}] parsing Record [{}].", new 
Object[]{e.getLocalizedMessage(), r}, e);
--- End diff --

I would generally recommend using e.toString() instead of 
e.getLocalizedMessage() simply because many Exceptions don't include a message 
text, such as IllegalArgumentException, etc., so the message in that case comes 
across as "Error [] parsing Record..." or "Error [null] parsing Record..." 
which is confusing.
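
To make the point concrete, here is a tiny self-contained Java example (names invented for the demo) showing why the null-message case arises:

    public class ExceptionMessageDemo {
        public static void main(String[] args) {
            Exception e = new IllegalArgumentException(); // constructed without a message
            // Prints "null", which is what ends up in "Error [null] parsing Record..."
            System.out.println(e.getLocalizedMessage());
            // Prints "java.lang.IllegalArgumentException", which at least names the problem
            System.out.println(e.toString());
        }
    }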


---


[GitHub] nifi pull request #2938: NIFI-5491: Fixed PutHive3Streaming handling of Byte...

2018-08-06 Thread markap14
Github user markap14 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2938#discussion_r207981236
  
--- Diff: 
nifi-nar-bundles/nifi-hive-bundle/nifi-hive3-processors/src/main/java/org/apache/hive/streaming/NiFiRecordSerDe.java
 ---
@@ -227,8 +236,32 @@ private Object extractCurrentField(Record record, 
RecordField field, TypeInfo fi
 val = 
DataTypeUtils.convertRecordFieldtoObject(record.getValue(fieldName), 
field.getDataType());
 break;
 case STRUCT:
-val = 
DataTypeUtils.convertRecordFieldtoObject(record.getValue(fieldName), 
field.getDataType());
-break;
+// For some reason the Hive StandardStructObjectInspector 
expects the object corresponding to a "struct" to be an array or List rather 
than a Map.
+// Do the conversion here, calling extractCurrentField 
recursively to traverse any nested structs.
+Record r = (Record) record.getValue(fieldName);
+if (r == null) {
+return null;
+}
+try {
+RecordSchema recordSchema = r.getSchema();
+List<RecordField> recordFields = recordSchema.getFields();
+if (recordFields == null || recordFields.isEmpty()) {
+return new ArrayList<>(0);
--- End diff --

should just return Collections.emptyList()
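
For reference, a small self-contained sketch of the suggested change (illustrative only; note that Collections.emptyList() returns a shared immutable list, which is fine here because the empty result is never modified):

    import java.util.Collections;
    import java.util.List;

    public class EmptyListDemo {
        // Returning the shared singleton avoids allocating a new ArrayList
        // every time a record has no fields.
        static List<Object> noFields() {
            return Collections.emptyList();
        }

        public static void main(String[] args) {
            System.out.println(noFields().isEmpty()); // true
        }
    }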


---


[GitHub] nifi pull request #2938: NIFI-5491: Fixed PutHive3Streaming handling of Byte...

2018-08-06 Thread markap14
Github user markap14 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2938#discussion_r207985881
  
--- Diff: 
nifi-nar-bundles/nifi-hive-bundle/nifi-hive3-processors/src/main/java/org/apache/hive/streaming/NiFiRecordSerDe.java
 ---
@@ -205,7 +212,9 @@ private Object extractCurrentField(Record record, 
RecordField field, TypeInfo fi
 val = record.getAsString(fieldName);
 break;
 case BINARY:
-val = 
AvroTypeUtil.convertByteArray(record.getAsArray(fieldName)).array();
+Object[] array = record.getAsArray(fieldName);
+if (array == null) return null;
--- End diff --

I'm surprised this passes checkstyle. I thought we required the {} with the 
body to start on the next line... which is the preferred syntax anyway, I think.
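
A short standalone example of the brace style being suggested (the method name and arguments are invented for the demo):

    public class BraceStyleExample {
        // Preferred: keep the braces and put the body on its own line,
        // instead of "if (array == null) return null;" on a single line.
        static Object firstOrNull(Object[] array) {
            if (array == null) {
                return null;
            }
            return array.length > 0 ? array[0] : null;
        }

        public static void main(String[] args) {
            System.out.println(firstOrNull(null));                 // null
            System.out.println(firstOrNull(new Object[]{"nifi"})); // nifi
        }
    }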


---


[jira] [Commented] (NIFI-5491) PutHive3Streaming incorrectly handles bytes, shorts, and nested structs

2018-08-06 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5491?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16570594#comment-16570594
 ] 

ASF GitHub Bot commented on NIFI-5491:
--

Github user markap14 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2938#discussion_r207981236
  
--- Diff: 
nifi-nar-bundles/nifi-hive-bundle/nifi-hive3-processors/src/main/java/org/apache/hive/streaming/NiFiRecordSerDe.java
 ---
@@ -227,8 +236,32 @@ private Object extractCurrentField(Record record, 
RecordField field, TypeInfo fi
 val = 
DataTypeUtils.convertRecordFieldtoObject(record.getValue(fieldName), 
field.getDataType());
 break;
 case STRUCT:
-val = 
DataTypeUtils.convertRecordFieldtoObject(record.getValue(fieldName), 
field.getDataType());
-break;
+// For some reason the Hive StandardStructObjectInspector 
expects the object corresponding to a "struct" to be an array or List rather 
than a Map.
+// Do the conversion here, calling extractCurrentField 
recursively to traverse any nested structs.
+Record r = (Record) record.getValue(fieldName);
+if (r == null) {
+return null;
+}
+try {
+RecordSchema recordSchema = r.getSchema();
+List<RecordField> recordFields = recordSchema.getFields();
+if (recordFields == null || recordFields.isEmpty()) {
+return new ArrayList<>(0);
--- End diff --

should just return Collections.emptyList()


> PutHive3Streaming incorrectly handles bytes, shorts, and nested structs
> ---
>
> Key: NIFI-5491
> URL: https://issues.apache.org/jira/browse/NIFI-5491
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Extensions
>Reporter: Matt Burgess
>Assignee: Matt Burgess
>Priority: Major
>
> When trying to insert a record into a Hive table using PutHive3Streaming, if 
> the table contains columns of types byte, short, or struct, then an error 
> occurs and the records cannot be written.
> This is due to a mismatch between the data types used in the NiFi Record API 
> and the Hive ORC writer and StructObjectInspector. For byte and short, NiFi 
> currently maintains an Integer value but Hive expects a Byte or Short, 
> respectively. For structs, NiFi maintains a Map value but Hive expects a List 
> or array.
> NiFiRecordSerDe should handle the conversion of values for use by Hive 
> Streaming.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5491) PutHive3Streaming incorrectly handles bytes, shorts, and nested structs

2018-08-06 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5491?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16570596#comment-16570596
 ] 

ASF GitHub Bot commented on NIFI-5491:
--

Github user markap14 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2938#discussion_r207986292
  
--- Diff: 
nifi-nar-bundles/nifi-hive-bundle/nifi-hive3-processors/src/main/java/org/apache/hive/streaming/NiFiRecordSerDe.java
 ---
@@ -227,8 +236,32 @@ private Object extractCurrentField(Record record, 
RecordField field, TypeInfo fi
 val = 
DataTypeUtils.convertRecordFieldtoObject(record.getValue(fieldName), 
field.getDataType());
 break;
 case STRUCT:
-val = 
DataTypeUtils.convertRecordFieldtoObject(record.getValue(fieldName), 
field.getDataType());
-break;
+// For some reason the Hive StandardStructObjectInspector 
expects the object corresponding to a "struct" to be an array or List rather 
than a Map.
+// Do the conversion here, calling extractCurrentField 
recursively to traverse any nested structs.
+Record r = (Record) record.getValue(fieldName);
+if (r == null) {
+return null;
+}
+try {
+RecordSchema recordSchema = r.getSchema();
+List<RecordField> recordFields = recordSchema.getFields();
+if (recordFields == null || recordFields.isEmpty()) {
+return new ArrayList<>(0);
+}
+// This List will hold the values of the entries in 
the Map
+List<Object> structList = new ArrayList<>(recordFields.size());
+StructTypeInfo typeInfo = (StructTypeInfo) 
schema.getStructFieldTypeInfo(fieldName);
+for (RecordField f : recordFields) {
+String fName = f.getFieldName();
+String normalizedFieldName = fName.toLowerCase();
+structList.add(extractCurrentField(r, f, 
typeInfo.getStructFieldTypeInfo(normalizedFieldName)));
+}
+return structList;
+} catch (Exception e) {
+log.warn("Error [{}] parsing Record [{}].", new 
Object[]{e.getLocalizedMessage(), r}, e);
--- End diff --

I would generally recommend using e.toString() instead of 
e.getLocalizedMessage() simply because many Exceptions don't include a message 
text, such as IllegalArgumentException, etc., so the message in that case comes 
across as "Error [] parsing Record..." or "Error [null] parsing Record..." 
which is confusing.


> PutHive3Streaming incorrectly handles bytes, shorts, and nested structs
> ---
>
> Key: NIFI-5491
> URL: https://issues.apache.org/jira/browse/NIFI-5491
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Extensions
>Reporter: Matt Burgess
>Assignee: Matt Burgess
>Priority: Major
>
> When trying to insert a record into a Hive table using PutHive3Streaming, if 
> the table contains columns of types byte, short, or struct, then an error 
> occurs and the records cannot be written.
> This is due to a mismatch between the data types used in the NiFi Record API 
> and the Hive ORC writer and StructObjectInspector. For byte and short, NiFi 
> currently maintains an Integer value but Hive expects a Byte or Short, 
> respectively. For structs, NiFi maintains a Map value but Hive expects a List 
> or array.
> NiFiRecordSerDe should handle the conversion of values for use by Hive 
> Streaming.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi pull request #2938: NIFI-5491: Fixed PutHive3Streaming handling of Byte...

2018-08-06 Thread markap14
Github user markap14 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2938#discussion_r207985525
  
--- Diff: 
nifi-nar-bundles/nifi-hive-bundle/nifi-hive3-processors/src/main/java/org/apache/hive/streaming/NiFiRecordSerDe.java
 ---
@@ -227,8 +236,32 @@ private Object extractCurrentField(Record record, 
RecordField field, TypeInfo fi
 val = 
DataTypeUtils.convertRecordFieldtoObject(record.getValue(fieldName), 
field.getDataType());
 break;
 case STRUCT:
-val = 
DataTypeUtils.convertRecordFieldtoObject(record.getValue(fieldName), 
field.getDataType());
-break;
+// For some reason the Hive StandardStructObjectInspector 
expects the object corresponding to a "struct" to be an array or List rather 
than a Map.
+// Do the conversion here, calling extractCurrentField 
recursively to traverse any nested structs.
+Record r = (Record) record.getValue(fieldName);
--- End diff --

We typically like to avoid single-letter variable names :)


---


[GitHub] nifi issue #2231: NIFI-4521 MS SQL CDC Processor

2018-08-06 Thread patricker
Github user patricker commented on the issue:

https://github.com/apache/nifi/pull/2231
  
@mattyb149 Ready when you are. I have plans to add in new functionality to 
support 'SQL Server Change Tracking', which is a simpler form of change 
tracking. Would really like to see this PR merged in prior to making any future 
changes.


---


[jira] [Commented] (NIFI-4521) MS SQL CDC Processor

2018-08-06 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-4521?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16570598#comment-16570598
 ] 

ASF GitHub Bot commented on NIFI-4521:
--

Github user patricker commented on the issue:

https://github.com/apache/nifi/pull/2231
  
@mattyb149 Ready when you are. I have plans to add in new functionality to 
support 'SQL Server Change Tracking', which is a simpler form of change 
tracking. Would really like to see this PR merged in prior to making any future 
changes.


> MS SQL CDC Processor
> 
>
> Key: NIFI-4521
> URL: https://issues.apache.org/jira/browse/NIFI-4521
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Reporter: Peter Wicks
>Assignee: Peter Wicks
>Priority: Major
>
> Creation of a new processor that reads Change Data Capture details from 
> Microsoft SQL Server and outputs the changes as Records.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi pull request #2938: NIFI-5491: Fixed PutHive3Streaming handling of Byte...

2018-08-06 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/nifi/pull/2938


---


[jira] [Commented] (NIFI-5491) PutHive3Streaming incorrectly handles bytes, shorts, and nested structs

2018-08-06 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5491?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16570609#comment-16570609
 ] 

ASF GitHub Bot commented on NIFI-5491:
--

Github user asfgit closed the pull request at:

https://github.com/apache/nifi/pull/2938


> PutHive3Streaming incorrectly handles bytes, shorts, and nested structs
> ---
>
> Key: NIFI-5491
> URL: https://issues.apache.org/jira/browse/NIFI-5491
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Extensions
>Reporter: Matt Burgess
>Assignee: Matt Burgess
>Priority: Major
> Fix For: 1.8.0
>
>
> When trying to insert a record into a Hive table using PutHive3Streaming, if 
> the table contains columns of types byte, short, or struct, then an error 
> occurs and the records cannot be written.
> This is due to a mismatch between the data types used in the NiFi Record API 
> and the Hive ORC writer and StructObjectInspector. For byte and short, NiFi 
> currently maintains an Integer value but Hive expects a Byte or Short, 
> respectively. For structs, NiFi maintains a Map value but Hive expects a List 
> or array.
> NiFiRecordSerDe should handle the conversion of values for use by Hive 
> Streaming.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (NIFI-5491) PutHive3Streaming incorrectly handles bytes, shorts, and nested structs

2018-08-06 Thread Mark Payne (JIRA)


 [ 
https://issues.apache.org/jira/browse/NIFI-5491?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mark Payne updated NIFI-5491:
-
   Resolution: Fixed
Fix Version/s: 1.8.0
   Status: Resolved  (was: Patch Available)

> PutHive3Streaming incorrectly handles bytes, shorts, and nested structs
> ---
>
> Key: NIFI-5491
> URL: https://issues.apache.org/jira/browse/NIFI-5491
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Extensions
>Reporter: Matt Burgess
>Assignee: Matt Burgess
>Priority: Major
> Fix For: 1.8.0
>
>
> When trying to insert a record into a Hive table using PutHive3Streaming, if 
> the table contains columns of types byte, short, or struct, then an error 
> occurs and the records cannot be written.
> This is due to a mismatch between the data types used in the NiFi Record API 
> and the Hive ORC writer and StructObjectInspector. For byte and short, NiFi 
> currently maintains an Integer value but Hive expects a Byte or Short, 
> respectively. For structs, NiFi maintains a Map value but Hive expects a List 
> or array.
> NiFiRecordSerDe should handle the conversion of values for use by Hive 
> Streaming.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5491) PutHive3Streaming incorrectly handles bytes, shorts, and nested structs

2018-08-06 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5491?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16570607#comment-16570607
 ] 

ASF subversion and git services commented on NIFI-5491:
---

Commit 9ee2316ff676acc413f28f86e8d9924947cb63ea in nifi's branch 
refs/heads/master from [~ca9mbu]
[ https://git-wip-us.apache.org/repos/asf?p=nifi.git;h=9ee2316 ]

NIFI-5491: Fixed PutHive3Streaming handling of Byte, Short, and Struct

This closes #2938.

Signed-off-by: Mark Payne 


> PutHive3Streaming incorrectly handles bytes, shorts, and nested structs
> ---
>
> Key: NIFI-5491
> URL: https://issues.apache.org/jira/browse/NIFI-5491
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Extensions
>Reporter: Matt Burgess
>Assignee: Matt Burgess
>Priority: Major
> Fix For: 1.8.0
>
>
> When trying to insert a record into a Hive table using PutHive3Streaming, if 
> the table contains columns of types byte, short, or struct, then an error 
> occurs and the records cannot be written.
> This is due to a mismatch between the data types used in the NiFi Record API 
> and the Hive ORC writer and StructObjectInspector. For byte and short, NiFi 
> currently maintains an Integer value but Hive expects a Byte or Short, 
> respectively. For structs, NiFi maintains a Map value but Hive expects a List 
> or array.
> NiFiRecordSerDe should handle the conversion of values for use by Hive 
> Streaming.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi pull request #2846: NIFI-5381 Add GetSFTP and PutSFTP Unit Tests

2018-08-06 Thread joewitt
Github user joewitt commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2846#discussion_r207997339
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-nar/src/main/resources/META-INF/NOTICE
 ---
@@ -226,6 +226,11 @@ The following binary components are provided under the 
Apache Software License v
 
 Copyright 2018 simple-syslog-5424 authors.
 
+  (ASLv2) Apache MINA
--- End diff --

don't need this - test changes/deps don't impact nar contents


---


[jira] [Commented] (NIFI-5381) Add SFTP Unit Tests using SFTP Server

2018-08-06 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5381?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16570659#comment-16570659
 ] 

ASF GitHub Bot commented on NIFI-5381:
--

Github user joewitt commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2846#discussion_r207997339
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-nar/src/main/resources/META-INF/NOTICE
 ---
@@ -226,6 +226,11 @@ The following binary components are provided under the 
Apache Software License v
 
 Copyright 2018 simple-syslog-5424 authors.
 
+  (ASLv2) Apache MINA
--- End diff --

don't need this - test changes/deps don't impact nar contents


> Add SFTP Unit Tests using SFTP Server
> -
>
> Key: NIFI-5381
> URL: https://issues.apache.org/jira/browse/NIFI-5381
> Project: Apache NiFi
>  Issue Type: Sub-task
>Reporter: Peter Wicks
>Assignee: Peter Wicks
>Priority: Minor
>
> Unit Tests only. Create Unit Tests for SFTP using local/in-memory SFTP server.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (MINIFICPP-593) Create test for non-cURL support in docker

2018-08-06 Thread Mr TheSegfault (JIRA)
Mr TheSegfault created MINIFICPP-593:


 Summary: Create test for non-cURL support in docker
 Key: MINIFICPP-593
 URL: https://issues.apache.org/jira/browse/MINIFICPP-593
 Project: NiFi MiNiFi C++
  Issue Type: Sub-task
Reporter: Mr TheSegfault
Assignee: Mr TheSegfault


Create test for non-cURL support in docker



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi-minifi-cpp pull request #389: MINIFICPP-592: Update RPG to fall back wh...

2018-08-06 Thread phrocker
GitHub user phrocker opened a pull request:

https://github.com/apache/nifi-minifi-cpp/pull/389

MINIFICPP-592: Update RPG to fall back when cURL is not enable

MINIFICPP-593 was created to facilitate some docker testing. 


Thank you for submitting a contribution to Apache NiFi - MiNiFi C++.

In order to streamline the review of the contribution we ask you
to ensure the following steps have been taken:

### For all changes:
- [ ] Is there a JIRA ticket associated with this PR? Is it referenced
 in the commit message?

- [ ] Does your PR title start with MINIFI- where  is the JIRA 
number you are trying to resolve? Pay particular attention to the hyphen "-" 
character.

- [ ] Has your PR been rebased against the latest commit within the target 
branch (typically master)?

- [ ] Is your initial contribution a single, squashed commit?

### For code changes:
- [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)?
- [ ] If applicable, have you updated the LICENSE file?
- [ ] If applicable, have you updated the NOTICE file?

### For documentation related changes:
- [ ] Have you ensured that format looks appropriate for the output in 
which it is rendered?

### Note:
Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/phrocker/nifi-minifi-cpp MINIFICPP-592

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi-minifi-cpp/pull/389.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #389


commit 0283c00edb797e7f60c50d0e6dda9b62e9b909e9
Author: Marc Parisi 
Date:   2018-08-06T19:49:04Z

MINIFICPP-592: Update RPG to fall back when cURL is not enable




---


[GitHub] nifi-minifi-cpp issue #389: MINIFICPP-592: Update RPG to fall back when cURL...

2018-08-06 Thread phrocker
Github user phrocker commented on the issue:

https://github.com/apache/nifi-minifi-cpp/pull/389
  
MINIFICPP-593 was created to facilitate docker testing. Happy to merge that 
effort into this one. I don't like the lack of testing but we can't really do 
this type of test without some containerization. 


---


[jira] [Commented] (MINIFICPP-592) Restful lookups in RPG should be bypassed when cURL is disabled.

2018-08-06 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/MINIFICPP-592?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16570697#comment-16570697
 ] 

ASF GitHub Bot commented on MINIFICPP-592:
--

GitHub user phrocker opened a pull request:

https://github.com/apache/nifi-minifi-cpp/pull/389

MINIFICPP-592: Update RPG to fall back when cURL is not enable

MINIFICPP-593 was created to facilitate some docker testing. 


Thank you for submitting a contribution to Apache NiFi - MiNiFi C++.

In order to streamline the review of the contribution we ask you
to ensure the following steps have been taken:

### For all changes:
- [ ] Is there a JIRA ticket associated with this PR? Is it referenced
 in the commit message?

- [ ] Does your PR title start with MINIFI- where  is the JIRA 
number you are trying to resolve? Pay particular attention to the hyphen "-" 
character.

- [ ] Has your PR been rebased against the latest commit within the target 
branch (typically master)?

- [ ] Is your initial contribution a single, squashed commit?

### For code changes:
- [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)?
- [ ] If applicable, have you updated the LICENSE file?
- [ ] If applicable, have you updated the NOTICE file?

### For documentation related changes:
- [ ] Have you ensured that format looks appropriate for the output in 
which it is rendered?

### Note:
Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/phrocker/nifi-minifi-cpp MINIFICPP-592

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi-minifi-cpp/pull/389.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #389


commit 0283c00edb797e7f60c50d0e6dda9b62e9b909e9
Author: Marc Parisi 
Date:   2018-08-06T19:49:04Z

MINIFICPP-592: Update RPG to fall back when cURL is not enable




> Restful lookups in RPG should be bypassed when cURL is disabled.
> 
>
> Key: MINIFICPP-592
> URL: https://issues.apache.org/jira/browse/MINIFICPP-592
> Project: NiFi MiNiFi C++
>  Issue Type: Test
>Reporter: Mr TheSegfault
>Assignee: Mr TheSegfault
>Priority: Major
>
> Changes to the RPG that performed a lookup on the NiFi Rest API should be 
> bypassed and the YAML config used when cURL support is disabled in a client. 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (MINIFICPP-592) Restful lookups in RPG should be bypassed when cURL is disabled.

2018-08-06 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/MINIFICPP-592?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16570698#comment-16570698
 ] 

ASF GitHub Bot commented on MINIFICPP-592:
--

Github user phrocker commented on the issue:

https://github.com/apache/nifi-minifi-cpp/pull/389
  
MINIFICPP-593 was created to facilitate docker testing. Happy to merge that 
effort into this one. I don't like the lack of testing but we can't really do 
this type of test without some containerization. 


> Restful lookups in RPG should be bypassed when cURL is disabled.
> 
>
> Key: MINIFICPP-592
> URL: https://issues.apache.org/jira/browse/MINIFICPP-592
> Project: NiFi MiNiFi C++
>  Issue Type: Test
>Reporter: Mr TheSegfault
>Assignee: Mr TheSegfault
>Priority: Major
>
> Changes to the RPG that performed a lookup on the NiFi Rest API should be 
> bypassed and the YAML config used when cURL support is disabled in a client. 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi pull request #2925: NIFI-5469 Additional italics and code formatting co...

2018-08-06 Thread andrewmlim
Github user andrewmlim commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2925#discussion_r208012302
  
--- Diff: nifi-docs/src/main/asciidoc/administration-guide.adoc ---
@@ -3256,9 +3361,9 @@ stream {
 
 image:s2s-rproxy-portnumber.svg["Port number to Node mapping"]
--- End diff --

Hi @zenfenan,
Can you clarify what you mean by "aren't loading properly"?  I'm trying to 
understand what is broken since the screenshots are properly displayed in the 
generated docs.


---


[GitHub] nifi pull request #2925: NIFI-5469 Additional italics and code formatting co...

2018-08-06 Thread andrewmlim
Github user andrewmlim commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2925#discussion_r208012356
  
--- Diff: nifi-docs/src/main/asciidoc/administration-guide.adoc ---
@@ -1486,35 +1588,35 @@ If no administrator action is taken, the 
configuration values remain unencrypted
 [[encrypt-config_tool]]
 === Encrypt-Config Tool
 
-The `encrypt-config` command line tool (invoked as 
`./bin/encrypt-config.sh` or `bin\encrypt-config.bat`) reads from a 
'nifi.properties' file with plaintext sensitive configuration values, prompts 
for a master password or raw hexadecimal key, and encrypts each value. It 
replaces the plain values with the protected value in the same file, or writes 
to a new 'nifi.properties' file if specified.
+The `encrypt-config` command line tool (invoked as 
`./bin/encrypt-config.sh` or `bin\encrypt-config.bat`) reads from a 
_nifi.properties_ file with plaintext sensitive configuration values, prompts 
for a master password or raw hexadecimal key, and encrypts each value. It 
replaces the plain values with the protected value in the same file, or writes 
to a new _nifi.properties_ file if specified.
--- End diff --

Good catch.  Will update.


---


[jira] [Commented] (NIFI-5469) Edits needed for LDAP and Kerberos login identity provider sections in Admin Guide

2018-08-06 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5469?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16570708#comment-16570708
 ] 

ASF GitHub Bot commented on NIFI-5469:
--

Github user andrewmlim commented on the issue:

https://github.com/apache/nifi/pull/2925
  
Thanks for reviewing @zenfenan !


> Edits needed for LDAP and Kerberos login identity provider sections in Admin 
> Guide
> --
>
> Key: NIFI-5469
> URL: https://issues.apache.org/jira/browse/NIFI-5469
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Documentation & Website
>Reporter: Andrew Lim
>Assignee: Andrew Lim
>Priority: Minor
>
> Going through the Authentication and Authorization sections of the Admin 
> Guide, I noticed the following improvements could be made:
>  * Removed “Kerberos Config File” property from kerberos-provider login 
> identity provider (this was done because the same property exists in 
> nifi.properties)
>  * Corrected the "LDAP-based Users/Groups Referencing User Attribute” login 
> identity provider example to refer to “member uid"
>  * Added titles to login identity provider examples for improved 
> readability/search
>  * Changed UserGroupProvider property examples from bulleted lists to tables
> Also, text formatting for references to config files, directories, etc.  need 
> to be made consistent.  For example, config files like _nifi.properties_, 
> _authorizers.xml_ should be italicized.  Directories, properties and default 
> values for properties should be monospaced.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi issue #2925: NIFI-5469 Additional italics and code formatting correctio...

2018-08-06 Thread andrewmlim
Github user andrewmlim commented on the issue:

https://github.com/apache/nifi/pull/2925
  
Thanks for reviewing @zenfenan !


---


[jira] [Commented] (NIFI-5469) Edits needed for LDAP and Kerberos login identity provider sections in Admin Guide

2018-08-06 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5469?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16570706#comment-16570706
 ] 

ASF GitHub Bot commented on NIFI-5469:
--

Github user andrewmlim commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2925#discussion_r208012302
  
--- Diff: nifi-docs/src/main/asciidoc/administration-guide.adoc ---
@@ -3256,9 +3361,9 @@ stream {
 
 image:s2s-rproxy-portnumber.svg["Port number to Node mapping"]
--- End diff --

Hi @zenfenan,
Can you clarify what you mean by "aren't loading properly"?  I'm trying to 
understand what is broken since the screenshots are properly displayed in the 
generated docs.


> Edits needed for LDAP and Kerberos login identity provider sections in Admin 
> Guide
> --
>
> Key: NIFI-5469
> URL: https://issues.apache.org/jira/browse/NIFI-5469
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Documentation & Website
>Reporter: Andrew Lim
>Assignee: Andrew Lim
>Priority: Minor
>
> Going through the Authentication and Authorization sections of the Admin 
> Guide, I noticed the following improvements could be made:
>  * Removed “Kerberos Config File” property from kerberos-provider login 
> identity provider (this was done because the same property exists in 
> nifi.properties)
>  * Corrected the "LDAP-based Users/Groups Referencing User Attribute” login 
> identity provider example to refer to “member uid"
>  * Added titles to login identity provider examples for improved 
> readability/search
>  * Changed UserGroupProvider property examples from bulleted lists to tables
> Also, text formatting for references to config files, directories, etc.  need 
> to be made consistent.  For example, config files like _nifi.properties_, 
> _authorizers.xml_ should be italicized.  Directories, properties and default 
> values for properties should be monospaced.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5469) Edits needed for LDAP and Kerberos login identity provider sections in Admin Guide

2018-08-06 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5469?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16570707#comment-16570707
 ] 

ASF GitHub Bot commented on NIFI-5469:
--

Github user andrewmlim commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2925#discussion_r208012356
  
--- Diff: nifi-docs/src/main/asciidoc/administration-guide.adoc ---
@@ -1486,35 +1588,35 @@ If no administrator action is taken, the 
configuration values remain unencrypted
 [[encrypt-config_tool]]
 === Encrypt-Config Tool
 
-The `encrypt-config` command line tool (invoked as 
`./bin/encrypt-config.sh` or `bin\encrypt-config.bat`) reads from a 
'nifi.properties' file with plaintext sensitive configuration values, prompts 
for a master password or raw hexadecimal key, and encrypts each value. It 
replaces the plain values with the protected value in the same file, or writes 
to a new 'nifi.properties' file if specified.
+The `encrypt-config` command line tool (invoked as 
`./bin/encrypt-config.sh` or `bin\encrypt-config.bat`) reads from a 
_nifi.properties_ file with plaintext sensitive configuration values, prompts 
for a master password or raw hexadecimal key, and encrypts each value. It 
replaces the plain values with the protected value in the same file, or writes 
to a new _nifi.properties_ file if specified.
--- End diff --

Good catch.  Will update.


> Edits needed for LDAP and Kerberos login identity provider sections in Admin 
> Guide
> --
>
> Key: NIFI-5469
> URL: https://issues.apache.org/jira/browse/NIFI-5469
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Documentation & Website
>Reporter: Andrew Lim
>Assignee: Andrew Lim
>Priority: Minor
>
> Going through the Authentication and Authorization sections of the Admin 
> Guide, I noticed the following improvements could be made:
>  * Removed “Kerberos Config File” property from kerberos-provider login 
> identity provider (this was done because the same property exists in 
> nifi.properties)
>  * Corrected the "LDAP-based Users/Groups Referencing User Attribute” login 
> identity provider example to refer to “member uid"
>  * Added titles to login identity provider examples for improved 
> readability/search
>  * Changed UserGroupProvider property examples from bulleted lists to tables
> Also, text formatting for references to config files, directories, etc.  need 
> to be made consistent.  For example, config files like _nifi.properties_, 
> _authorizers.xml_ should be italicized.  Directories, properties and default 
> values for properties should be monospaced.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi-minifi-cpp pull request #388: MINIFICPP-590: Fix zero length files at s...

2018-08-06 Thread apiri
Github user apiri commented on a diff in the pull request:

https://github.com/apache/nifi-minifi-cpp/pull/388#discussion_r207919579
  
--- Diff: extensions/pcap/CapturePacket.cpp ---
@@ -54,13 +54,15 @@ namespace processors {
 std::shared_ptr<utils::IdGenerator> CapturePacket::id_generator_ = utils::IdGenerator::getIdGenerator();
 core::Property CapturePacket::BaseDir("Base Directory", "Scratch directory 
for PCAP files", "/tmp/");
 core::Property CapturePacket::BatchSize("Batch Size", "The number of 
packets to combine within a given PCAP", "50");
+core::Property CapturePacket::NetworkControllers("Network Controllers", 
"List of network controllers to attach to -- each may be a regex", ".*");
--- End diff --

The way this is worded is a little awkward. I understand this could 
resolve to multiple interfaces with the regex, but it would make me think I 
could supply a CSV of regexes, although it seems that this would be done by having 
multiple instances of that property. Is the intent to have dynamic properties 
for this?


---


[GitHub] nifi-minifi-cpp pull request #388: MINIFICPP-590: Fix zero length files at s...

2018-08-06 Thread apiri
Github user apiri commented on a diff in the pull request:

https://github.com/apache/nifi-minifi-cpp/pull/388#discussion_r207762481
  
--- Diff: extensions/pcap/CapturePacket.cpp ---
@@ -54,13 +54,15 @@ namespace processors {
 std::shared_ptr<utils::IdGenerator> CapturePacket::id_generator_ = utils::IdGenerator::getIdGenerator();
 core::Property CapturePacket::BaseDir("Base Directory", "Scratch directory 
for PCAP files", "/tmp/");
 core::Property CapturePacket::BatchSize("Batch Size", "The number of 
packets to combine within a given PCAP", "50");
+core::Property CapturePacket::NetworkControllers("Network Controllers", 
"List of network controllers to attach to -- each may be a regex", ".*");
 core::Property CapturePacket::CaptureBluetooth("Capture Bluetooth", "True 
indicates that we support bluetooth interfaces", "false");
 
 const char *CapturePacket::ProcessorName = "CapturePacket";
 
 std::string CapturePacket::generate_new_pcap(const std::string &base_path) 
{
   std::string path = base_path;
   // can use relaxed for a counter
+  //int cnt = num_.fetch_add(1, std::memory_order_relaxed);
--- End diff --

we can nix this


---


[jira] [Commented] (MINIFICPP-590) Resolve zero length flow files at startup of pcap processor.

2018-08-06 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/MINIFICPP-590?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16570762#comment-16570762
 ] 

ASF GitHub Bot commented on MINIFICPP-590:
--

Github user apiri commented on a diff in the pull request:

https://github.com/apache/nifi-minifi-cpp/pull/388#discussion_r207762481
  
--- Diff: extensions/pcap/CapturePacket.cpp ---
@@ -54,13 +54,15 @@ namespace processors {
 std::shared_ptr<utils::IdGenerator> CapturePacket::id_generator_ = utils::IdGenerator::getIdGenerator();
 core::Property CapturePacket::BaseDir("Base Directory", "Scratch directory 
for PCAP files", "/tmp/");
 core::Property CapturePacket::BatchSize("Batch Size", "The number of 
packets to combine within a given PCAP", "50");
+core::Property CapturePacket::NetworkControllers("Network Controllers", 
"List of network controllers to attach to -- each may be a regex", ".*");
 core::Property CapturePacket::CaptureBluetooth("Capture Bluetooth", "True 
indicates that we support bluetooth interfaces", "false");
 
 const char *CapturePacket::ProcessorName = "CapturePacket";
 
 std::string CapturePacket::generate_new_pcap(const std::string &base_path) 
{
   std::string path = base_path;
   // can use relaxed for a counter
+  //int cnt = num_.fetch_add(1, std::memory_order_relaxed);
--- End diff --

we can nix this


> Resolve zero length flow files at startup of pcap processor. 
> -
>
> Key: MINIFICPP-590
> URL: https://issues.apache.org/jira/browse/MINIFICPP-590
> Project: NiFi MiNiFi C++
>  Issue Type: Bug
>Reporter: Mr TheSegfault
>Assignee: Mr TheSegfault
>Priority: Major
> Attachments: Screen Shot 2018-08-03 at 1.56.42 PM.png
>
>
> Batching feature in PCAP appears to break at some points. Investigate this. 
> May be a memory issue on certain versions of GCC



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (MINIFICPP-590) Resolve zero length flow files at startup of pcap processor.

2018-08-06 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/MINIFICPP-590?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16570763#comment-16570763
 ] 

ASF GitHub Bot commented on MINIFICPP-590:
--

Github user apiri commented on a diff in the pull request:

https://github.com/apache/nifi-minifi-cpp/pull/388#discussion_r207919579
  
--- Diff: extensions/pcap/CapturePacket.cpp ---
@@ -54,13 +54,15 @@ namespace processors {
 std::shared_ptr<utils::IdGenerator> CapturePacket::id_generator_ = utils::IdGenerator::getIdGenerator();
 core::Property CapturePacket::BaseDir("Base Directory", "Scratch directory 
for PCAP files", "/tmp/");
 core::Property CapturePacket::BatchSize("Batch Size", "The number of 
packets to combine within a given PCAP", "50");
+core::Property CapturePacket::NetworkControllers("Network Controllers", 
"List of network controllers to attach to -- each may be a regex", ".*");
--- End diff --

The way this is worded is a little awkward. I understand this could 
resolve to multiple interfaces with the regex, but it would make me think I 
could supply a CSV of regexes, although it seems that this would be done by having 
multiple instances of that property. Is the intent to have dynamic properties 
for this?


> Resolve zero length flow files at startup of pcap processor. 
> -
>
> Key: MINIFICPP-590
> URL: https://issues.apache.org/jira/browse/MINIFICPP-590
> Project: NiFi MiNiFi C++
>  Issue Type: Bug
>Reporter: Mr TheSegfault
>Assignee: Mr TheSegfault
>Priority: Major
> Attachments: Screen Shot 2018-08-03 at 1.56.42 PM.png
>
>
> Batching feature in PCAP appears to break at some points. Investigate this. 
> May be a memory issue on certain versions of GCC



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi-minifi-cpp pull request #388: MINIFICPP-590: Fix zero length files at s...

2018-08-06 Thread phrocker
Github user phrocker commented on a diff in the pull request:

https://github.com/apache/nifi-minifi-cpp/pull/388#discussion_r208026845
  
--- Diff: extensions/pcap/CapturePacket.cpp ---
@@ -54,13 +54,15 @@ namespace processors {
 std::shared_ptr<utils::IdGenerator> CapturePacket::id_generator_ = utils::IdGenerator::getIdGenerator();
 core::Property CapturePacket::BaseDir("Base Directory", "Scratch directory 
for PCAP files", "/tmp/");
 core::Property CapturePacket::BatchSize("Batch Size", "The number of 
packets to combine within a given PCAP", "50");
+core::Property CapturePacket::NetworkControllers("Network Controllers", 
"List of network controllers to attach to -- each may be a regex", ".*");
--- End diff --

This is meant to convey a YAML list ( 
https://docs.ansible.com/ansible/latest/reference_appendices/YAMLSyntax.html ); 
however, as this isn't intended to be YAML-specific, I will clarify this. I 
thought about making it dynamic but did not see great value. Do you think this 
should be dynamic?


---


[jira] [Commented] (MINIFICPP-590) Resolve zero length flow files at startup of pcap processor.

2018-08-06 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/MINIFICPP-590?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16570765#comment-16570765
 ] 

ASF GitHub Bot commented on MINIFICPP-590:
--

Github user phrocker commented on a diff in the pull request:

https://github.com/apache/nifi-minifi-cpp/pull/388#discussion_r208026845
  
--- Diff: extensions/pcap/CapturePacket.cpp ---
@@ -54,13 +54,15 @@ namespace processors {
 std::shared_ptr<utils::IdGenerator> CapturePacket::id_generator_ = utils::IdGenerator::getIdGenerator();
 core::Property CapturePacket::BaseDir("Base Directory", "Scratch directory 
for PCAP files", "/tmp/");
 core::Property CapturePacket::BatchSize("Batch Size", "The number of 
packets to combine within a given PCAP", "50");
+core::Property CapturePacket::NetworkControllers("Network Controllers", 
"List of network controllers to attach to -- each may be a regex", ".*");
--- End diff --

This is meant to convey a YAML list ( 
https://docs.ansible.com/ansible/latest/reference_appendices/YAMLSyntax.html ); 
however, as this isn't intended to be YAML-specific, I will clarify this. I 
thought about making it dynamic but did not see great value. Do you think this 
should be dynamic?


> Resolve zero length flow files at startup of pcap processor. 
> -
>
> Key: MINIFICPP-590
> URL: https://issues.apache.org/jira/browse/MINIFICPP-590
> Project: NiFi MiNiFi C++
>  Issue Type: Bug
>Reporter: Mr TheSegfault
>Assignee: Mr TheSegfault
>Priority: Major
> Attachments: Screen Shot 2018-08-03 at 1.56.42 PM.png
>
>
> Batching feature in PCAP appears to break at some points. Investigate this. 
> May be a memory issue on certain versions of GCC



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi-minifi-cpp pull request #388: MINIFICPP-590: Fix zero length files at s...

2018-08-06 Thread phrocker
Github user phrocker commented on a diff in the pull request:

https://github.com/apache/nifi-minifi-cpp/pull/388#discussion_r208027287
  
--- Diff: extensions/pcap/CapturePacket.cpp ---
@@ -54,13 +54,15 @@ namespace processors {
 std::shared_ptr<utils::IdGenerator> CapturePacket::id_generator_ = utils::IdGenerator::getIdGenerator();
 core::Property CapturePacket::BaseDir("Base Directory", "Scratch directory 
for PCAP files", "/tmp/");
 core::Property CapturePacket::BatchSize("Batch Size", "The number of 
packets to combine within a given PCAP", "50");
+core::Property CapturePacket::NetworkControllers("Network Controllers", 
"List of network controllers to attach to -- each may be a regex", ".*");
 core::Property CapturePacket::CaptureBluetooth("Capture Bluetooth", "True 
indicates that we support bluetooth interfaces", "false");
 
 const char *CapturePacket::ProcessorName = "CapturePacket";
 
 std::string CapturePacket::generate_new_pcap(const std::string &base_path) 
{
   std::string path = base_path;
   // can use relaxed for a counter
+  //int cnt = num_.fetch_add(1, std::memory_order_relaxed);
--- End diff --

err thanks can't believe I missed that. 


---


[jira] [Commented] (MINIFICPP-590) Resolve zero length flow files at startup of pcap processor.

2018-08-06 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/MINIFICPP-590?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16570766#comment-16570766
 ] 

ASF GitHub Bot commented on MINIFICPP-590:
--

Github user phrocker commented on a diff in the pull request:

https://github.com/apache/nifi-minifi-cpp/pull/388#discussion_r208027287
  
--- Diff: extensions/pcap/CapturePacket.cpp ---
@@ -54,13 +54,15 @@ namespace processors {
 std::shared_ptr<utils::IdGenerator> CapturePacket::id_generator_ = utils::IdGenerator::getIdGenerator();
 core::Property CapturePacket::BaseDir("Base Directory", "Scratch directory 
for PCAP files", "/tmp/");
 core::Property CapturePacket::BatchSize("Batch Size", "The number of 
packets to combine within a given PCAP", "50");
+core::Property CapturePacket::NetworkControllers("Network Controllers", 
"List of network controllers to attach to -- each may be a regex", ".*");
 core::Property CapturePacket::CaptureBluetooth("Capture Bluetooth", "True 
indicates that we support bluetooth interfaces", "false");
 
 const char *CapturePacket::ProcessorName = "CapturePacket";
 
 std::string CapturePacket::generate_new_pcap(const std::string &base_path) 
{
   std::string path = base_path;
   // can use relaxed for a counter
+  //int cnt = num_.fetch_add(1, std::memory_order_relaxed);
--- End diff --

err thanks can't believe I missed that. 


> Resolve zero length flow files at startup of pcap processor. 
> -
>
> Key: MINIFICPP-590
> URL: https://issues.apache.org/jira/browse/MINIFICPP-590
> Project: NiFi MiNiFi C++
>  Issue Type: Bug
>Reporter: Mr TheSegfault
>Assignee: Mr TheSegfault
>Priority: Major
> Attachments: Screen Shot 2018-08-03 at 1.56.42 PM.png
>
>
> Batching feature in PCAP appears to break at some points. Investigate this. 
> May be a memory issue on certain versions of GCC



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi issue #2823: NIFI-5350 Add a way to provide arbitrary Java options in s...

2018-08-06 Thread mosermw
Github user mosermw commented on the issue:

https://github.com/apache/nifi/pull/2823
  
Seems like this could be useful, and certainly doesn't hurt.  +1 from me.


---


[jira] [Commented] (NIFI-5350) Add a way to provide arbitrary Java options in shell scripts

2018-08-06 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5350?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16570775#comment-16570775
 ] 

ASF subversion and git services commented on NIFI-5350:
---

Commit a19134f32560ae044cb898a7654ada31338e20c2 in nifi's branch 
refs/heads/master from [~lars_francke]
[ https://git-wip-us.apache.org/repos/asf?p=nifi.git;h=a19134f ]

NIFI-5350 Add a way to provide arbitrary Java options in shell scripts

Signed-off-by: Mike Moser 

This closes #2823


> Add a way to provide arbitrary Java options in shell scripts
> 
>
> Key: NIFI-5350
> URL: https://issues.apache.org/jira/browse/NIFI-5350
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Tools and Build
>Affects Versions: 1.7.0
>Reporter: Lars Francke
>Assignee: Lars Francke
>Priority: Minor
>
> I wanted to change the location of the bootstrap config file which can be 
> done using the System property {{org.apache.nifi.bootstrap.config.file}}. 
> Unfortunately there's no easy way to set that using the default {{nifi.sh}} 
> script.
> It can be done using the {{BOOTSTRAP_DEBUG_PARAMS}} environment variable but 
> that doesn't feel right and can break if anyone actually uses that variable.
> I suggest adding an optional environment variable {{BOOTSTRAP_JAVA_OPTS}} 
> that can be used to pass in extra options to Java



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5350) Add a way to provide arbitrary Java options in shell scripts

2018-08-06 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5350?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16570772#comment-16570772
 ] 

ASF GitHub Bot commented on NIFI-5350:
--

Github user mosermw commented on the issue:

https://github.com/apache/nifi/pull/2823
  
Seems like this could be useful, and certainly doesn't hurt.  +1 from me.


> Add a way to provide arbitrary Java options in shell scripts
> 
>
> Key: NIFI-5350
> URL: https://issues.apache.org/jira/browse/NIFI-5350
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Tools and Build
>Affects Versions: 1.7.0
>Reporter: Lars Francke
>Assignee: Lars Francke
>Priority: Minor
>
> I wanted to change the location of the bootstrap config file which can be 
> done using the System property {{org.apache.nifi.bootstrap.config.file}}. 
> Unfortunately there's no easy way to set that using the default {{nifi.sh}} 
> script.
> It can be done using the {{BOOTSTRAP_DEBUG_PARAMS}} environment variable but 
> that doesn't feel right and can break if anyone actually uses that variable.
> I suggest adding an optional environment variable {{BOOTSTRAP_JAVA_OPTS}} 
> that can be used to pass in extra options to Java



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi pull request #2823: NIFI-5350 Add a way to provide arbitrary Java optio...

2018-08-06 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/nifi/pull/2823


---


[jira] [Commented] (NIFI-5350) Add a way to provide arbitrary Java options in shell scripts

2018-08-06 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5350?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16570777#comment-16570777
 ] 

ASF GitHub Bot commented on NIFI-5350:
--

Github user asfgit closed the pull request at:

https://github.com/apache/nifi/pull/2823


> Add a way to provide arbitrary Java options in shell scripts
> 
>
> Key: NIFI-5350
> URL: https://issues.apache.org/jira/browse/NIFI-5350
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Tools and Build
>Affects Versions: 1.7.0
>Reporter: Lars Francke
>Assignee: Lars Francke
>Priority: Minor
> Fix For: 1.8.0
>
>
> I wanted to change the location of the bootstrap config file which can be 
> done using the System property {{org.apache.nifi.bootstrap.config.file}}. 
> Unfortunately there's no easy way to set that using the default {{nifi.sh}} 
> script.
> It can be done using the {{BOOTSTRAP_DEBUG_PARAMS}} environment variable but 
> that doesn't feel right and can break if anyone actually uses that variable.
> I suggest adding an optional environment variable {{BOOTSTRAP_JAVA_OPTS}} 
> that can be used to pass in extra options to Java



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Resolved] (NIFI-5350) Add a way to provide arbitrary Java options in shell scripts

2018-08-06 Thread Michael Moser (JIRA)


 [ 
https://issues.apache.org/jira/browse/NIFI-5350?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Michael Moser resolved NIFI-5350.
-
   Resolution: Fixed
Fix Version/s: 1.8.0

> Add a way to provide arbitrary Java options in shell scripts
> 
>
> Key: NIFI-5350
> URL: https://issues.apache.org/jira/browse/NIFI-5350
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Tools and Build
>Affects Versions: 1.7.0
>Reporter: Lars Francke
>Assignee: Lars Francke
>Priority: Minor
> Fix For: 1.8.0
>
>
> I wanted to change the location of the bootstrap config file which can be 
> done using the System property {{org.apache.nifi.bootstrap.config.file}}. 
> Unfortunately there's no easy way to set that using the default {{nifi.sh}} 
> script.
> It can be done using the {{BOOTSTRAP_DEBUG_PARAMS}} environment variable but 
> that doesn't feel right and can break if anyone actually uses that variable.
> I suggest adding an optional environment variable {{BOOTSTRAP_JAVA_OPTS}} 
> that can be used to pass in extra options to Java



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi issue #2823: NIFI-5350 Add a way to provide arbitrary Java options in s...

2018-08-06 Thread lfrancke
Github user lfrancke commented on the issue:

https://github.com/apache/nifi/pull/2823
  
Thanks for taking a look & committing!


---


[jira] [Commented] (NIFI-5350) Add a way to provide arbitrary Java options in shell scripts

2018-08-06 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5350?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16570808#comment-16570808
 ] 

ASF GitHub Bot commented on NIFI-5350:
--

Github user lfrancke commented on the issue:

https://github.com/apache/nifi/pull/2823
  
Thanks for taking a look & committing!


> Add a way to provide arbitrary Java options in shell scripts
> 
>
> Key: NIFI-5350
> URL: https://issues.apache.org/jira/browse/NIFI-5350
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Tools and Build
>Affects Versions: 1.7.0
>Reporter: Lars Francke
>Assignee: Lars Francke
>Priority: Minor
> Fix For: 1.8.0
>
>
> I wanted to change the location of the bootstrap config file which can be 
> done using the System property {{org.apache.nifi.bootstrap.config.file}}. 
> Unfortunately there's no easy way to set that using the default {{nifi.sh}} 
> script.
> It can be done using the {{BOOTSTRAP_DEBUG_PARAMS}} environment variable but 
> that doesn't feel right and can break if anyone actually uses that variable.
> I suggest adding an optional environment variable {{BOOTSTRAP_JAVA_OPTS}} 
> that can be used to pass in extra options to Java



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi issue #2846: NIFI-5381 Add GetSFTP and PutSFTP Unit Tests

2018-08-06 Thread joewitt
Github user joewitt commented on the issue:

https://github.com/apache/nifi/pull/2846
  
+1 if the change to the NOTICE is removed.  Not needed in nar notice since 
those are only included in test scope (not in nar)


---


[jira] [Commented] (NIFI-5381) Add SFTP Unit Tests using SFTP Server

2018-08-06 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5381?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16570818#comment-16570818
 ] 

ASF GitHub Bot commented on NIFI-5381:
--

Github user joewitt commented on the issue:

https://github.com/apache/nifi/pull/2846
  
+1 if the change to the NOTICE is removed.  Not needed in nar notice since 
those are only included in test scope (not in nar)


> Add SFTP Unit Tests using SFTP Server
> -
>
> Key: NIFI-5381
> URL: https://issues.apache.org/jira/browse/NIFI-5381
> Project: Apache NiFi
>  Issue Type: Sub-task
>Reporter: Peter Wicks
>Assignee: Peter Wicks
>Priority: Minor
>
> Unit Tests only. Create Unit Tests for SFTP using local/in-memory SFTP server.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi-minifi-cpp pull request #388: MINIFICPP-590: Fix zero length files at s...

2018-08-06 Thread apiri
Github user apiri commented on a diff in the pull request:

https://github.com/apache/nifi-minifi-cpp/pull/388#discussion_r208041393
  
--- Diff: extensions/pcap/CapturePacket.cpp ---
@@ -54,13 +54,15 @@ namespace processors {
 std::shared_ptr<utils::IdGenerator> CapturePacket::id_generator_ = utils::IdGenerator::getIdGenerator();
 core::Property CapturePacket::BaseDir("Base Directory", "Scratch directory 
for PCAP files", "/tmp/");
 core::Property CapturePacket::BatchSize("Batch Size", "The number of 
packets to combine within a given PCAP", "50");
+core::Property CapturePacket::NetworkControllers("Network Controllers", 
"List of network controllers to attach to -- each may be a regex", ".*");
--- End diff --

I would agree that dynamic properties don't really seem like the right fit and 
are arguably a misuse of what they are supposed to represent. The clarification 
about the expectation of the YAML list is definitely helpful, though. 
Admittedly this comment stemmed from my NiFi-leaning perspective, where all 
property values are inherently just strings.
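
For readers following along, here is a small illustrative Java sketch (the processor itself is C++) of how a single property value could be interpreted as a list of regexes. The comma delimiter and the interface names are assumptions made only for this example, not the processor's documented format.

```java
import java.util.Arrays;
import java.util.List;
import java.util.regex.Pattern;
import java.util.stream.Collectors;

public class NetworkControllerMatcher {
    public static void main(String[] args) {
        // Hypothetical "Network Controllers" value; the comma delimiter is an
        // assumption for this sketch.
        final String propertyValue = "eth.*, en0";
        final List<Pattern> patterns = Arrays.stream(propertyValue.split(","))
                .map(String::trim)
                .map(Pattern::compile)
                .collect(Collectors.toList());

        // Attach to every controller whose name matches at least one pattern.
        final List<String> controllers = Arrays.asList("eth0", "eth1", "en0", "lo");
        controllers.stream()
                .filter(name -> patterns.stream().anyMatch(p -> p.matcher(name).matches()))
                .forEach(name -> System.out.println("attach to " + name));
    }
}
```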


---


[jira] [Commented] (MINIFICPP-590) Resolve zero length flow files at startup of pcap processor.

2018-08-06 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/MINIFICPP-590?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16570831#comment-16570831
 ] 

ASF GitHub Bot commented on MINIFICPP-590:
--

Github user apiri commented on a diff in the pull request:

https://github.com/apache/nifi-minifi-cpp/pull/388#discussion_r208041393
  
--- Diff: extensions/pcap/CapturePacket.cpp ---
@@ -54,13 +54,15 @@ namespace processors {
 std::shared_ptr<utils::IdGenerator> CapturePacket::id_generator_ = 
utils::IdGenerator::getIdGenerator();
 core::Property CapturePacket::BaseDir("Base Directory", "Scratch directory 
for PCAP files", "/tmp/");
 core::Property CapturePacket::BatchSize("Batch Size", "The number of 
packets to combine within a given PCAP", "50");
+core::Property CapturePacket::NetworkControllers("Network Controllers", 
"List of network controllers to attach to -- each may be a regex", ".*");
--- End diff --

I would agree that dynamic properties don't really seem like the right fit and 
are arguably a misuse of what they are supposed to represent. The clarification 
about the expectation of the YAML list is definitely helpful, though. 
Admittedly this comment stemmed from my NiFi-leaning perspective, where all 
property values are inherently just strings.


> Resolve zero length flow files at startup of pcap processor. 
> -
>
> Key: MINIFICPP-590
> URL: https://issues.apache.org/jira/browse/MINIFICPP-590
> Project: NiFi MiNiFi C++
>  Issue Type: Bug
>Reporter: Mr TheSegfault
>Assignee: Mr TheSegfault
>Priority: Major
> Attachments: Screen Shot 2018-08-03 at 1.56.42 PM.png
>
>
> Batching feature in PCAP appears to break at some points. Investigate this. 
> May be a memory issue on certain versions of GCC



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (MINIFICPP-594) CapturePCAP predates PROCESSORS.md and thus was never added

2018-08-06 Thread Mr TheSegfault (JIRA)
Mr TheSegfault created MINIFICPP-594:


 Summary: CapturePCAP predates PROCESSORS.md and thus was never 
added
 Key: MINIFICPP-594
 URL: https://issues.apache.org/jira/browse/MINIFICPP-594
 Project: NiFi MiNiFi C++
  Issue Type: Improvement
Reporter: Mr TheSegfault
Assignee: Mr TheSegfault


NetworkControllers



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi-minifi-cpp issue #388: MINIFICPP-590: Fix zero length files at startup

2018-08-06 Thread phrocker
Github user phrocker commented on the issue:

https://github.com/apache/nifi-minifi-cpp/pull/388
  
@apiri also created https://issues.apache.org/jira/browse/MINIFICPP-594 
since I think the documentation was created after the CapturePacket processor. 
Happy to combine the PRs if you prefer, but otherwise I will submit that as a 
separate PR tomorrow.


---


[jira] [Commented] (MINIFICPP-590) Resolve zero length flow files at startup of pcap processor.

2018-08-06 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/MINIFICPP-590?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16570913#comment-16570913
 ] 

ASF GitHub Bot commented on MINIFICPP-590:
--

Github user phrocker commented on the issue:

https://github.com/apache/nifi-minifi-cpp/pull/388
  
@apiri also created https://issues.apache.org/jira/browse/MINIFICPP-594 
since I think the documentation was created after the CapturePacket processor. 
Happy to combine the PRs if you prefer, but otherwise I will submit that as a 
separate PR tomorrow.


> Resolve zero length flow files at startup of pcap processor. 
> -
>
> Key: MINIFICPP-590
> URL: https://issues.apache.org/jira/browse/MINIFICPP-590
> Project: NiFi MiNiFi C++
>  Issue Type: Bug
>Reporter: Mr TheSegfault
>Assignee: Mr TheSegfault
>Priority: Major
> Attachments: Screen Shot 2018-08-03 at 1.56.42 PM.png
>
>
> Batching feature in PCAP appears to break at some points. Investigate this. 
> May be a memory issue on certain versions of GCC



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi pull request #2937: NIFI-4434 Fixed recursive listing with a custom reg...

2018-08-06 Thread jtstorck
Github user jtstorck commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2937#discussion_r208072541
  
--- Diff: 
nifi-nar-bundles/nifi-hadoop-bundle/nifi-hdfs-processors/src/main/java/org/apache/nifi/processors/hadoop/ListHDFS.java
 ---
@@ -462,11 +523,15 @@ private String getPerms(final FsAction action) {
 
 private PathFilter createPathFilter(final ProcessContext context) {
 final Pattern filePattern = 
Pattern.compile(context.getProperty(FILE_FILTER).getValue());
--- End diff --

@ottobackwards The FILE_FILTER property does not currently support 
expression language.  The processor could be updated to enable EL for the 
property, but that is outside the scope of this PR.
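
If EL support were enabled later, the change would presumably be along these lines. This is a hedged sketch against the public NiFi API, not code from this PR; the chosen EL scope, description, and default value are assumptions.

```java
import java.util.regex.Pattern;

import org.apache.nifi.components.PropertyDescriptor;
import org.apache.nifi.expression.ExpressionLanguageScope;
import org.apache.nifi.processor.ProcessContext;
import org.apache.nifi.processor.util.StandardValidators;

class FileFilterElSketch {
    // Descriptor that opts in to Expression Language (the scope is an assumption).
    static final PropertyDescriptor FILE_FILTER = new PropertyDescriptor.Builder()
            .name("File Filter")
            .description("Regex applied to the names found during the listing")
            .required(true)
            .defaultValue("[^\\.].*")
            .addValidator(StandardValidators.REGULAR_EXPRESSION_VALIDATOR)
            .expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY)
            .build();

    Pattern compileFilter(final ProcessContext context) {
        // evaluateAttributeExpressions() resolves the EL before the regex is compiled.
        return Pattern.compile(
                context.getProperty(FILE_FILTER).evaluateAttributeExpressions().getValue());
    }
}
```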


---


[jira] [Commented] (NIFI-4434) ListHDFS applies File Filter also to subdirectory names in recursive search

2018-08-06 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-4434?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16570998#comment-16570998
 ] 

ASF GitHub Bot commented on NIFI-4434:
--

Github user jtstorck commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2937#discussion_r208072541
  
--- Diff: 
nifi-nar-bundles/nifi-hadoop-bundle/nifi-hdfs-processors/src/main/java/org/apache/nifi/processors/hadoop/ListHDFS.java
 ---
@@ -462,11 +523,15 @@ private String getPerms(final FsAction action) {
 
 private PathFilter createPathFilter(final ProcessContext context) {
 final Pattern filePattern = 
Pattern.compile(context.getProperty(FILE_FILTER).getValue());
--- End diff --

@ottobackwards The FILE_FILTER property does not currently support 
expression language.  The processor could be updated to enable EL for the 
property, but that is outside the scope of this PR.


> ListHDFS applies File Filter also to subdirectory names in recursive search
> ---
>
> Key: NIFI-4434
> URL: https://issues.apache.org/jira/browse/NIFI-4434
> Project: Apache NiFi
>  Issue Type: Bug
>Affects Versions: 1.3.0
>Reporter: Holger Frydrych
>Assignee: Jeff Storck
>Priority: Major
>
> The File Filter regex configured in the ListHDFS processor is applied not 
> just to files found, but also to subdirectories. 
> If you try to set up a recursive search to list e.g. all csv files in a 
> directory hierarchy via a regex like ".*\.csv", it will only pick up csv 
> files in the base directory, not in any subdirectory. This is because 
> subdirectories don't typically match that regex pattern.
> To fix this, either subdirectories should not be matched against the file 
> filter, or the file filter should be applied to the full path of all files 
> (relative to the base directory). The GetHDFS processor offers both options 
> via a switch.
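
To make the two proposed fixes concrete, here is a hedged Java sketch of both options expressed as Hadoop {{PathFilter}}s; it is illustrative only and not the code from the PR.

```java
import java.util.regex.Pattern;

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.PathFilter;

class ListingFilterSketch {
    // Option 1: apply the regex to the last path component only, and have the
    // listing code skip the filter entirely when it descends into directories.
    static PathFilter nameOnlyFilter(final Pattern pattern) {
        return path -> pattern.matcher(path.getName()).matches();
    }

    // Option 2: apply the regex to the path relative to the base directory, so
    // a pattern such as "sub/.*\.csv" can address files in subdirectories.
    static PathFilter relativePathFilter(final Pattern pattern, final Path baseDir) {
        final String basePrefix = Path.getPathWithoutSchemeAndAuthority(baseDir).toString();
        return path -> {
            final String full = Path.getPathWithoutSchemeAndAuthority(path).toString();
            final String relative = full.startsWith(basePrefix + "/")
                    ? full.substring(basePrefix.length() + 1)
                    : full;
            return pattern.matcher(relative).matches();
        };
    }
}
```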



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi issue #2937: NIFI-4434 Fixed recursive listing with a custom regex filt...

2018-08-06 Thread jtstorck
Github user jtstorck commented on the issue:

https://github.com/apache/nifi/pull/2937
  
@ottobackwards Regarding documentation for the filter modes, descriptions 
have been created for the allowable values.  Do these descriptions not seem 
adequate for the functionality of each mode?
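
For anyone reading along, this is roughly how allowable values carry their own documentation in NiFi; the mode names and descriptions below are placeholders for illustration, not necessarily those used in the PR.

```java
import org.apache.nifi.components.AllowableValue;
import org.apache.nifi.components.PropertyDescriptor;

class FilterModeDocsSketch {
    static final AllowableValue FILTER_DIRECTORIES_AND_FILES = new AllowableValue(
            "filter-mode-directories-and-files", "Directories and Files",
            "The regex is applied to directory and file names alike.");
    static final AllowableValue FILTER_FILES_ONLY = new AllowableValue(
            "filter-mode-files-only", "Files Only",
            "The regex is applied only to file names; directories are always traversed.");

    // Each allowable value's description is rendered in the generated processor docs.
    static final PropertyDescriptor FILTER_MODE = new PropertyDescriptor.Builder()
            .name("Filter Mode")
            .description("Determines how the File Filter regex is applied during a recursive listing")
            .allowableValues(FILTER_DIRECTORIES_AND_FILES, FILTER_FILES_ONLY)
            .defaultValue(FILTER_DIRECTORIES_AND_FILES.getValue())
            .required(true)
            .build();
}
```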


---


[jira] [Commented] (NIFI-4434) ListHDFS applies File Filter also to subdirectory names in recursive search

2018-08-06 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-4434?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16571001#comment-16571001
 ] 

ASF GitHub Bot commented on NIFI-4434:
--

Github user jtstorck commented on the issue:

https://github.com/apache/nifi/pull/2937
  
@ottobackwards Regarding documentation for the filter modes, descriptions 
have been created for the allowable values.  Do these descriptions not seem 
adequate for the functionality of each mode?


> ListHDFS applies File Filter also to subdirectory names in recursive search
> ---
>
> Key: NIFI-4434
> URL: https://issues.apache.org/jira/browse/NIFI-4434
> Project: Apache NiFi
>  Issue Type: Bug
>Affects Versions: 1.3.0
>Reporter: Holger Frydrych
>Assignee: Jeff Storck
>Priority: Major
>
> The File Filter regex configured in the ListHDFS processor is applied not 
> just to files found, but also to subdirectories. 
> If you try to set up a recursive search to list e.g. all csv files in a 
> directory hierarchy via a regex like ".*\.csv", it will only pick up csv 
> files in the base directory, not in any subdirectory. This is because 
> subdirectories don't typically match that regex pattern.
> To fix this, either subdirectories should not be matched against the file 
> filter, or the file filter should be applied to the full path of all files 
> (relative to the base directory). The GetHDFS processor offers both options 
> via a switch.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5400) NiFiHostnameVerifier should be replaced

2018-08-06 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5400?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16571066#comment-16571066
 ] 

ASF subversion and git services commented on NIFI-5400:
---

Commit 8106af699c4dba21c03b1d8ba6c3e702dc7b1380 in nifi's branch 
refs/heads/master from thenatog
[ https://git-wip-us.apache.org/repos/asf?p=nifi.git;h=8106af6 ]

NIFI-5400 - Changed the hostname verifier from the custom NiFi verifier to the 
Apache http-client DefaultHostnameVerifier
Removed NiFiHostnameVerifier. Removed NiFi WebUtils usage of 
NiFiHostnameVerifier.
Added unit tests for the DefaultHostnameVerifier to WebUtils.java
Added groovy-eclipse-compiler definition to nifi-web-utils/pom.xml to execute 
Groovy unit tests.

This closes #2919.

Co-authored-by: Andy LoPresto 
Signed-off-by: Andy LoPresto 


> NiFiHostnameVerifier should be replaced
> ---
>
> Key: NIFI-5400
> URL: https://issues.apache.org/jira/browse/NIFI-5400
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Core Framework
>Affects Versions: 1.7.0
>Reporter: Andy LoPresto
>Priority: Major
>  Labels: certificate, hostname, security, tls
>
> The {{NiFiHostnameVerifier}} does not handle wildcard certificates or complex 
> {{SubjectAlternativeNames}}. It should be replaced with a more full-featured 
> implementation, like {{OkHostnameVerifier}} from {{okhttp}} or 
> {{DefaultHostnameVerifier}} from {{http-client}}. Either of these options 
> requires introducing a new Maven dependency to {{nifi-commons}} and requires 
> further investigation. 
> *Note:* the {{sun.net.www.protocol.https.DefaultHostnameVerifier}} simply 
> returns {{false}} on all inputs and is not a valid solution.
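
For context on the replacement, a minimal hedged usage sketch of {{DefaultHostnameVerifier}} from httpclient (4.4+); the URL is a placeholder and this is not the project's code.

```java
import java.net.URL;

import javax.net.ssl.HttpsURLConnection;

import org.apache.http.conn.ssl.DefaultHostnameVerifier;

public class HostnameVerifierSketch {
    public static void main(String[] args) throws Exception {
        // DefaultHostnameVerifier implements javax.net.ssl.HostnameVerifier and
        // understands wildcard certificates and SubjectAlternativeNames.
        final HttpsURLConnection connection = (HttpsURLConnection)
                new URL("https://nifi.example.com/").openConnection();  // placeholder URL
        connection.setHostnameVerifier(new DefaultHostnameVerifier());
        connection.connect();  // hostname is checked against the server certificate here
        System.out.println("HTTP " + connection.getResponseCode());
        connection.disconnect();
    }
}
```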



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5400) NiFiHostnameVerifier should be replaced

2018-08-06 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5400?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16571067#comment-16571067
 ] 

ASF GitHub Bot commented on NIFI-5400:
--

Github user asfgit closed the pull request at:

https://github.com/apache/nifi/pull/2919


> NiFiHostnameVerifier should be replaced
> ---
>
> Key: NIFI-5400
> URL: https://issues.apache.org/jira/browse/NIFI-5400
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Core Framework
>Affects Versions: 1.7.0
>Reporter: Andy LoPresto
>Priority: Major
>  Labels: certificate, hostname, security, tls
>
> The {{NiFiHostnameVerifier}} does not handle wildcard certificates or complex 
> {{SubjectAlternativeNames}}. It should be replaced with a more full-featured 
> implementation, like {{OkHostnameVerifier}} from {{okhttp}} or 
> {{DefaultHostnameVerifier}} from {{http-client}}. Either of these options 
> requires introducing a new Maven dependency to {{nifi-commons}} and requires 
> further investigation. 
> *Note:* the {{sun.net.www.protocol.https.DefaultHostnameVerifier}} simply 
> returns {{false}} on all inputs and is not a valid solution.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi pull request #2919: NIFI-5400 - Changed the hostname verifier from the ...

2018-08-06 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/nifi/pull/2919


---


[GitHub] nifi issue #2919: NIFI-5400 - Changed the hostname verifier from the custom ...

2018-08-06 Thread alopresto
Github user alopresto commented on the issue:

https://github.com/apache/nifi/pull/2919
  
I merged this but made two changes. First, the `nifi-web-utils` tests were 
not running, because they are Groovy tests and there is nothing in 
`src/test/java`. Without a file (even empty) in that directory, the Groovy 
tests do not get picked up (neither compiled nor run). I added the 
`groovy-eclipse-compiler` plugin to `nifi-web-utils/pom.xml` to ensure this is 
run. That commit is 
[5c0232c](https://github.com/alopresto/nifi/commit/5c0232c9dd8009dc69bc5adb1fb1ef7942832911).
 

Second, there was a warning about a duplicate definition of `httpclient` 
dependency in `nifi-web-utils/pom.xml`. I removed it, and that commit is 
[5f538c6](https://github.com/alopresto/nifi/commit/5f538c69f1aebc0b6b0d6dbabf0f09c8e9854a57).
 

Both of those commits were rebased onto Nathan's rebased commits as well. 

A gist demonstrating the issue is 
[here](https://gist.github.com/alopresto/184f3631ec44a4c036d323d622ea97aa). 

Ran `contrib-check` and all tests pass. +1, merging. 
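
As an aside, the empty-file workaround mentioned above would look something like this; the package and class name are hypothetical, and the merged fix used the `groovy-eclipse-compiler` plugin instead.

```java
package org.apache.nifi.web.util;

/**
 * Intentionally empty. Its only purpose is to give Maven something to compile
 * under src/test/java so the Groovy tests get picked up as well.
 */
final class GroovyTestCompileTrigger {
}
```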


---


[jira] [Commented] (NIFI-5400) NiFiHostnameVerifier should be replaced

2018-08-06 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5400?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16571071#comment-16571071
 ] 

ASF GitHub Bot commented on NIFI-5400:
--

Github user alopresto commented on the issue:

https://github.com/apache/nifi/pull/2919
  
I merged this but made two changes. First, the `nifi-web-utils` tests were 
not running, because they are Groovy tests and there is nothing in 
`src/test/java`. Without a file (even empty) in that directory, the Groovy 
tests do not get picked up (neither compiled nor run). I added the 
`groovy-eclipse-compiler` plugin to `nifi-web-utils/pom.xml` to ensure this is 
run. That commit is 
[5c0232c](https://github.com/alopresto/nifi/commit/5c0232c9dd8009dc69bc5adb1fb1ef7942832911).
 

Second, there was a warning about a duplicate definition of `httpclient` 
dependency in `nifi-web-utils/pom.xml`. I removed it, and that commit is 
[5f538c6](https://github.com/alopresto/nifi/commit/5f538c69f1aebc0b6b0d6dbabf0f09c8e9854a57).
 

Both of those commits were rebased onto Nathan's rebased commits as well. 

A gist demonstrating the issue is 
[here](https://gist.github.com/alopresto/184f3631ec44a4c036d323d622ea97aa). 

Ran `contrib-check` and all tests pass. +1, merging. 


> NiFiHostnameVerifier should be replaced
> ---
>
> Key: NIFI-5400
> URL: https://issues.apache.org/jira/browse/NIFI-5400
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Core Framework
>Affects Versions: 1.7.0
>Reporter: Andy LoPresto
>Priority: Major
>  Labels: certificate, hostname, security, tls
>
> The {{NiFiHostnameVerifier}} does not handle wildcard certificates or complex 
> {{SubjectAlternativeNames}}. It should be replaced with a more full-featured 
> implementation, like {{OkHostnameVerifier}} from {{okhttp}} or 
> {{DefaultHostnameVerifier}} from {{http-client}}. Either of these options 
> requires introducing a new Maven dependency to {{nifi-commons}} and requires 
> further investigation. 
> *Note:* the {{sun.net.www.protocol.https.DefaultHostnameVerifier}} simply 
> returns {{false}} on all inputs and is not a valid solution.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Assigned] (NIFI-5400) NiFiHostnameVerifier should be replaced

2018-08-06 Thread Andy LoPresto (JIRA)


 [ 
https://issues.apache.org/jira/browse/NIFI-5400?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Andy LoPresto reassigned NIFI-5400:
---

Assignee: Nathan Gough

> NiFiHostnameVerifier should be replaced
> ---
>
> Key: NIFI-5400
> URL: https://issues.apache.org/jira/browse/NIFI-5400
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Core Framework
>Affects Versions: 1.7.0
>Reporter: Andy LoPresto
>Assignee: Nathan Gough
>Priority: Major
>  Labels: certificate, hostname, security, tls
> Fix For: 1.8.0
>
>
> The {{NiFiHostnameVerifier}} does not handle wildcard certificates or complex 
> {{SubjectAlternativeNames}}. It should be replaced with a more full-featured 
> implementation, like {{OkHostnameVerifier}} from {{okhttp}} or 
> {{DefaultHostnameVerifier}} from {{http-client}}. Either of these options 
> requires introducing a new Maven dependency to {{nifi-commons}} and requires 
> further investigation. 
> *Note:* the {{sun.net.www.protocol.https.DefaultHostnameVerifier}} simply 
> returns {{false}} on all inputs and is not a valid solution.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (NIFI-5400) NiFiHostnameVerifier should be replaced

2018-08-06 Thread Andy LoPresto (JIRA)


 [ 
https://issues.apache.org/jira/browse/NIFI-5400?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Andy LoPresto updated NIFI-5400:

Status: Patch Available  (was: Open)

> NiFiHostnameVerifier should be replaced
> ---
>
> Key: NIFI-5400
> URL: https://issues.apache.org/jira/browse/NIFI-5400
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Core Framework
>Affects Versions: 1.7.0
>Reporter: Andy LoPresto
>Priority: Major
>  Labels: certificate, hostname, security, tls
>
> The {{NiFiHostnameVerifier}} does not handle wildcard certificates or complex 
> {{SubjectAlternativeNames}}. It should be replaced with a more full-featured 
> implementation, like {{OkHostnameVerifier}} from {{okhttp}} or 
> {{DefaultHostnameVerifier}} from {{http-client}}. Either of these options 
> requires introducing a new Maven dependency to {{nifi-commons}} and requires 
> further investigation. 
> *Note:* the {{sun.net.www.protocol.https.DefaultHostnameVerifier}} simply 
> returns {{false}} on all inputs and is not a valid solution.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (NIFI-5400) NiFiHostnameVerifier should be replaced

2018-08-06 Thread Andy LoPresto (JIRA)


 [ 
https://issues.apache.org/jira/browse/NIFI-5400?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Andy LoPresto updated NIFI-5400:

   Resolution: Fixed
Fix Version/s: 1.8.0
   Status: Resolved  (was: Patch Available)

> NiFiHostnameVerifier should be replaced
> ---
>
> Key: NIFI-5400
> URL: https://issues.apache.org/jira/browse/NIFI-5400
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Core Framework
>Affects Versions: 1.7.0
>Reporter: Andy LoPresto
>Assignee: Nathan Gough
>Priority: Major
>  Labels: certificate, hostname, security, tls
> Fix For: 1.8.0
>
>
> The {{NiFiHostnameVerifier}} does not handle wildcard certificates or complex 
> {{SubjectAlternativeNames}}. It should be replaced with a more full-featured 
> implementation, like {{OkHostnameVerifier}} from {{okhttp}} or 
> {{DefaultHostnameVerifier}} from {{http-client}}. Either of these options 
> requires introducing a new Maven dependency to {{nifi-commons}} and requires 
> further investigation. 
> *Note:* the {{sun.net.www.protocol.https.DefaultHostnameVerifier}} simply 
> returns {{false}} on all inputs and is not a valid solution.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (NIFI-5492) UDF in Expression Language

2018-08-06 Thread Ed Berezitsky (JIRA)
Ed Berezitsky created NIFI-5492:
---

 Summary: UDF in Expression Language
 Key: NIFI-5492
 URL: https://issues.apache.org/jira/browse/NIFI-5492
 Project: Apache NiFi
  Issue Type: Wish
  Components: Core Framework
Reporter: Ed Berezitsky
Assignee: Ed Berezitsky


The set of functions available in Expression Language is limited to the 
predefined ones.

This request is to provide the ability to plug in custom/user-defined functions.

For example:

${{color:#FF}*exec*{color}('com.example.MyUDF', 'param1', 'param2')}

Should be able to support:
 # Multiple, not limited number of parameters (including zero params)
 # Param data types should  support all EL data types (dates, whole numbers, 
decimals, strings, booleans)
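
Purely as a thought experiment on what the plug-in contract could look like (nothing like this exists in NiFi today, and every name below is hypothetical):

```java
import java.util.List;

/** Hypothetical contract for a user-defined EL function. */
interface ExpressionLanguageUdf {
    /** Evaluates the function with the already-coerced EL argument values. */
    Object evaluate(List<Object> arguments);
}

/** Hypothetical implementation resolved by class name, e.g. ${exec('com.example.MyUDF', 'a', 'b')}. */
class MyUDF implements ExpressionLanguageUdf {
    @Override
    public Object evaluate(List<Object> arguments) {
        // Concatenate the parameters, just to show a zero-or-more argument UDF.
        final StringBuilder sb = new StringBuilder();
        for (Object arg : arguments) {
            sb.append(arg);
        }
        return sb.toString();
    }
}
```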

 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (NIFI-5492) UDF in Expression Language

2018-08-06 Thread Ed Berezitsky (JIRA)


 [ 
https://issues.apache.org/jira/browse/NIFI-5492?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ed Berezitsky updated NIFI-5492:

Description: 
Set of functions available to use in expression language is limited by 
predefined ones.

This request is to provide an ability to plug in custom/user defined functions.

For example:

${*exec*('com.example.MyUDF', 'param1', 'param2')}

Should be able to support:
 # Multiple, not limited number of parameters (including zero params)
 # Param data types should  support all EL data types (dates, whole numbers, 
decimals, strings, booleans)

 

  was:
Set of functions available to use in expression language is limited by 
predefined ones.

This request is to provide an ability to plug in custom/user defined functions.

For example:

${{color:#FF}*exec*{color}('com.example.MyUDF', 'param1', 'param2')}

Should be able to support:
 # Multiple, not limited number of parameters (including zero params)
 # Param data types should  support all EL data types (dates, whole numbers, 
decimals, strings, booleans)

 


> UDF in Expression Language
> --
>
> Key: NIFI-5492
> URL: https://issues.apache.org/jira/browse/NIFI-5492
> Project: Apache NiFi
>  Issue Type: Wish
>  Components: Core Framework
>Reporter: Ed Berezitsky
>Assignee: Ed Berezitsky
>Priority: Major
>  Labels: features
>
> The set of functions available in Expression Language is limited to the 
> predefined ones.
> This request is to provide the ability to plug in custom/user-defined 
> functions.
> For example:
> ${*exec*('com.example.MyUDF', 'param1', 'param2')}
> Should be able to support:
>  # Multiple, not limited number of parameters (including zero params)
>  # Param data types should  support all EL data types (dates, whole numbers, 
> decimals, strings, booleans)
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi-registry pull request #131: NIFIREG-186: Adding Ranger authorizer

2018-08-06 Thread ijokarumawak
Github user ijokarumawak commented on a diff in the pull request:

https://github.com/apache/nifi-registry/pull/131#discussion_r208113908
  
--- Diff: nifi-registry-extensions/nifi-registry-ranger-extension/README.md 
---
@@ -0,0 +1,116 @@
+
+# NiFi Registry Ranger extension
+
+This extension provides `org.apache.nifi.registry.ranger.RangerAuthorizer` 
class for NiFi Registry to authorize user requests by access policies defined 
at [Apache Ranger](https://ranger.apache.org/).
+
+## Prerequisites
+
+* Apache Ranger 1.2.0 or later is needed.
--- End diff --

This sentence may need to be updated once the improvement at Ranger project 
is committed.


---


[jira] [Commented] (NIFIREG-186) Create Authorizer implementation that uses Apache Ranger

2018-08-06 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFIREG-186?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16571175#comment-16571175
 ] 

ASF GitHub Bot commented on NIFIREG-186:


Github user ijokarumawak commented on a diff in the pull request:

https://github.com/apache/nifi-registry/pull/131#discussion_r208113908
  
--- Diff: nifi-registry-extensions/nifi-registry-ranger-extension/README.md 
---
@@ -0,0 +1,116 @@
+
+# NiFi Registry Ranger extension
+
+This extension provides `org.apache.nifi.registry.ranger.RangerAuthorizer` 
class for NiFi Registry to authorize user requests by access policies defined 
at [Apache Ranger](https://ranger.apache.org/).
+
+## Prerequisites
+
+* Apache Ranger 1.2.0 or later is needed.
--- End diff --

This sentence may need to be updated once the improvement at Ranger project 
is committed.


> Create Authorizer implementation that uses Apache Ranger
> 
>
> Key: NIFIREG-186
> URL: https://issues.apache.org/jira/browse/NIFIREG-186
> Project: NiFi Registry
>  Issue Type: Improvement
>Reporter: Koji Kawamura
>Assignee: Koji Kawamura
>Priority: Major
>
> In addition to the standard file-based Authorizer, we should provide an 
> Authorizer implementation that uses Apache Ranger, so that users can implement 
> centralized authorization against both NiFi and NiFi Registry.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

