[GitHub] nifi pull request #2889: NIFI-5426: Use NIO.2 API for ListFile

2018-07-16 Thread mgaido91
Github user mgaido91 closed the pull request at:

https://github.com/apache/nifi/pull/2889


---


[GitHub] nifi issue #2889: NIFI-5426: Use NIO.2 API for ListFile

2018-07-15 Thread mgaido91
Github user mgaido91 commented on the issue:

https://github.com/apache/nifi/pull/2889
  
Hi @markap14. I saw the comment on the JIRA that you merged this to master. 
Shall I close this PR then? thanks!


---


[GitHub] nifi pull request #2889: NIFI-5426: Use NIO.2 API for ListFile

2018-07-13 Thread mgaido91
GitHub user mgaido91 opened a pull request:

https://github.com/apache/nifi/pull/2889

NIFI-5426: Use NIO.2 API for ListFile

Thank you for submitting a contribution to Apache NiFi.

In order to streamline the review of the contribution we ask you
to ensure the following steps have been taken:

### For all changes:
- [ ] Is there a JIRA ticket associated with this PR? Is it referenced 
 in the commit message?

- [ ] Does your PR title start with NIFI-XXXX where XXXX is the JIRA number 
you are trying to resolve? Pay particular attention to the hyphen "-" character.

- [ ] Has your PR been rebased against the latest commit within the target 
branch (typically master)?

- [ ] Is your initial contribution a single, squashed commit?

### For code changes:
- [ ] Have you ensured that the full suite of tests is executed via mvn 
-Pcontrib-check clean install at the root nifi folder?
- [ ] Have you written or updated unit tests to verify your changes?
- [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)? 
- [ ] If applicable, have you updated the LICENSE file, including the main 
LICENSE file under nifi-assembly?
- [ ] If applicable, have you updated the NOTICE file, including the main 
NOTICE file found under nifi-assembly?
- [ ] If adding new Properties, have you added .displayName in addition to 
.name (programmatic access) for each of the new properties?

### For documentation related changes:
- [ ] Have you ensured that format looks appropriate for the output in 
which it is rendered?

### Note:
Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/mgaido91/nifi NIFI-5426

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/2889.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2889


commit 1ce4b51d91d7101d40f59f462929d637adb2c84d
Author: Marco Gaido 
Date:   2018-07-13T14:00:02Z

NIFI-5426: Use NIO.2 API for ListFile




---


[GitHub] nifi issue #2789: NIFI-3242: Avoid double scheduling of a task due to quartz...

2018-06-13 Thread mgaido91
Github user mgaido91 commented on the issue:

https://github.com/apache/nifi/pull/2789
  
cc @markap14 


---


[GitHub] nifi pull request #2789: NIFI-3242: Avoid double scheduling of a task due to...

2018-06-13 Thread mgaido91
GitHub user mgaido91 opened a pull request:

https://github.com/apache/nifi/pull/2789

NIFI-3242: Avoid double scheduling of a task due to quartz imprecision

Thank you for submitting a contribution to Apache NiFi.

In order to streamline the review of the contribution we ask you
to ensure the following steps have been taken:

### For all changes:
- [x] Is there a JIRA ticket associated with this PR? Is it referenced 
 in the commit message?

- [x] Does your PR title start with NIFI-XXXX where XXXX is the JIRA number 
you are trying to resolve? Pay particular attention to the hyphen "-" character.

- [x] Has your PR been rebased against the latest commit within the target 
branch (typically master)?

- [x] Is your initial contribution a single, squashed commit?

### For code changes:
- [x] Have you ensured that the full suite of tests is executed via mvn 
-Pcontrib-check clean install at the root nifi folder?
- [ ] Have you written or updated unit tests to verify your changes?
- [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)? 
- [ ] If applicable, have you updated the LICENSE file, including the main 
LICENSE file under nifi-assembly?
- [ ] If applicable, have you updated the NOTICE file, including the main 
NOTICE file found under nifi-assembly?
- [ ] If adding new Properties, have you added .displayName in addition to 
.name (programmatic access) for each of the new properties?

### For documentation related changes:
- [ ] Have you ensured that format looks appropriate for the output in 
which it is rendered?

### Note:
Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/mgaido91/nifi NIFI-3242

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/2789.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2789






---


[GitHub] nifi issue #2768: NIFI-5278: fixes JSON escaping of code parameter in Execut...

2018-06-09 Thread mgaido91
Github user mgaido91 commented on the issue:

https://github.com/apache/nifi/pull/2768
  
LGTM


---


[GitHub] nifi pull request #2768: NIFI-5278: fixes JSON escaping of code parameter in...

2018-06-08 Thread mgaido91
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2768#discussion_r194107658
  
--- Diff: 
nifi-nar-bundles/nifi-spark-bundle/nifi-livy-processors/src/test/java/org/apache/nifi/processors/livy/ExecuteSparkInteractiveTestBase.java
 ---
@@ -64,33 +66,37 @@ public void handle(String target, Request baseRequest, 
HttpServletRequest reques
 }
 session1Requests++;
 }
-
-response.setContentLength(responseBody.length());
-
-try (PrintWriter writer = response.getWriter()) {
-writer.print(responseBody);
-writer.flush();
-}
-
 } else if ("POST".equalsIgnoreCase(request.getMethod())) {
-
-String responseBody = "{}";
-response.setContentType("application/json");
-
-if ("/sessions".equalsIgnoreCase(target)) {
-responseBody = "{\"id\": 1, \"kind\": \"spark\", 
\"state\": \"idle\"}";
-} else if 
("/sessions/1/statements".equalsIgnoreCase(target)) {
-responseBody = "{\"id\": 7}";
+String requestBody = IOUtils.toString(request.getReader());
+try {
+System.out.println("requestBody: " + requestBody);
+
+new ObjectMapper().readTree(requestBody);
--- End diff --

could you please add some comments explaining what you are doing here and why? 
It is clear in the context of this PR, but for future readers I think a comment 
would be very helpful.


---


[GitHub] nifi pull request #2768: NIFI-5278: fixes JSON escaping of code parameter in...

2018-06-08 Thread mgaido91
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2768#discussion_r194106518
  
--- Diff: 
nifi-nar-bundles/nifi-spark-bundle/nifi-livy-processors/src/test/java/org/apache/nifi/processors/livy/ExecuteSparkInteractiveTestBase.java
 ---
@@ -64,33 +66,37 @@ public void handle(String target, Request baseRequest, 
HttpServletRequest reques
 }
 session1Requests++;
 }
-
-response.setContentLength(responseBody.length());
-
-try (PrintWriter writer = response.getWriter()) {
-writer.print(responseBody);
-writer.flush();
-}
-
 } else if ("POST".equalsIgnoreCase(request.getMethod())) {
-
-String responseBody = "{}";
-response.setContentType("application/json");
-
-if ("/sessions".equalsIgnoreCase(target)) {
-responseBody = "{\"id\": 1, \"kind\": \"spark\", 
\"state\": \"idle\"}";
-} else if 
("/sessions/1/statements".equalsIgnoreCase(target)) {
-responseBody = "{\"id\": 7}";
+String requestBody = IOUtils.toString(request.getReader());
+try {
+System.out.println("requestBody: " + requestBody);
--- End diff --

I think this is a leftover from your tests and should be removed.


---


[GitHub] nifi pull request #2768: NIFI-5278: fixes JSON escaping of code parameter in...

2018-06-08 Thread mgaido91
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2768#discussion_r194108030
  
--- Diff: 
nifi-nar-bundles/nifi-spark-bundle/nifi-livy-processors/src/test/java/org/apache/nifi/processors/livy/TestExecuteSparkInteractive.java
 ---
@@ -85,16 +85,17 @@ private static TestServer createServer() throws 
IOException {
 
 @Test
 public void testSparkSession() throws Exception {
-
 addHandler(new LivyAPIHandler());
 
-runner.enqueue("print \"hello world\"");
+String code = "print \"hello world\" //'?!<>[]{}()$&*=%;.|_-\\";
--- End diff --

instead of changing the existing UT, what about creating a new one for this 
specific case (see the sketch below)? It is good for each UT to have a narrow 
scope, so that a failure clearly tells the developer what their patch broke.
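
A minimal sketch of such a separate test, assuming the body mirrors the existing `testSparkSession` (the test name is illustrative and the assertions are omitted):
```
@Test
public void testSparkSessionWithSpecialCharacters() throws Exception {
    addHandler(new LivyAPIHandler());

    // Only this test exercises JSON escaping of the code parameter.
    runner.enqueue("print \"hello world\" //'?!<>[]{}()$&*=%;.|_-\\");
    runner.run();
    // ...same assertions as testSparkSession...
}
```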


---


[GitHub] nifi pull request #2768: NIFI-5278: fixes JSON escaping of code parameter in...

2018-06-08 Thread mgaido91
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2768#discussion_r194108434
  
--- Diff: 
nifi-nar-bundles/nifi-spark-bundle/nifi-livy-processors/src/test/java/org/apache/nifi/processors/livy/TestExecuteSparkInteractiveSSL.java
 ---
@@ -109,13 +109,15 @@ private static TestServer createServer() throws 
IOException {
 public void testSslSparkSession() throws Exception {
 addHandler(new LivyAPIHandler());
 
-runner.enqueue("print \"hello world\"");
+String code = "print \"hello world\" //'?!<>[]{}()$&*=%;.|_-\\";
--- End diff --

I think that adding a UT for the non-SSL case is enough, isn't it? There is 
no difference between SSL and non-SSL as far as escaping and content are concerned, IIUC.


---


[GitHub] nifi issue #2754: NIFI-5262: Retrieve file attributes only once in ListFile

2018-06-06 Thread mgaido91
Github user mgaido91 commented on the issue:

https://github.com/apache/nifi/pull/2754
  
thanks @pvillard31 


---


[GitHub] nifi pull request #2754: NIFI-5262: Retrieve file attributes only once in Li...

2018-06-04 Thread mgaido91
GitHub user mgaido91 opened a pull request:

https://github.com/apache/nifi/pull/2754

NIFI-5262: Retrieve file attributes only once in ListFile

Thank you for submitting a contribution to Apache NiFi.

In order to streamline the review of the contribution we ask you
to ensure the following steps have been taken:

### For all changes:
- [x] Is there a JIRA ticket associated with this PR? Is it referenced 
 in the commit message?

- [x] Does your PR title start with NIFI-XXXX where XXXX is the JIRA number 
you are trying to resolve? Pay particular attention to the hyphen "-" character.

- [x] Has your PR been rebased against the latest commit within the target 
branch (typically master)?

- [x] Is your initial contribution a single, squashed commit?

### For code changes:
- [x] Have you ensured that the full suite of tests is executed via mvn 
-Pcontrib-check clean install at the root nifi folder?
- [ ] Have you written or updated unit tests to verify your changes?
- [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)? 
- [ ] If applicable, have you updated the LICENSE file, including the main 
LICENSE file under nifi-assembly?
- [ ] If applicable, have you updated the NOTICE file, including the main 
NOTICE file found under nifi-assembly?
- [ ] If adding new Properties, have you added .displayName in addition to 
.name (programmatic access) for each of the new properties?

### For documentation related changes:
- [ ] Have you ensured that format looks appropriate for the output in 
which it is rendered?

### Note:
Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/mgaido91/nifi NIFI-5262

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/2754.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2754


commit 9ab780f7f8ba0dd0782796b09e734102ffb9d42f
Author: Marco Gaido 
Date:   2018-06-04T11:16:49Z

[NIFI-5262] Retrieve file attributes only once in ListFile




---


[GitHub] nifi issue #2733: NIFI-5228: Allow user to choose whether or not to add File...

2018-05-23 Thread mgaido91
Github user mgaido91 commented on the issue:

https://github.com/apache/nifi/pull/2733
  
LGTM, thanks


---


[GitHub] nifi pull request #2733: NIFI-5228: Allow user to choose whether or not to a...

2018-05-23 Thread mgaido91
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2733#discussion_r190156015
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/ListFile.java
 ---
@@ -255,43 +262,47 @@ public void onScheduled(final ProcessContext context) 
{
 final Path absPath = filePath.toAbsolutePath();
 final String absPathString = absPath.getParent().toString() + 
File.separator;
 
+final DateFormat formatter = new 
SimpleDateFormat(FILE_MODIFY_DATE_ATTR_FORMAT, Locale.US);
--- End diff --

nit: what about creating a `ThreadLocal` global variable for this, in order 
to avoid creating one instance for each item?
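
A minimal sketch of that idea, assuming the pattern string matches `ListFile.FILE_MODIFY_DATE_ATTR_FORMAT` (the value shown here is illustrative):
```
import java.text.DateFormat;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.Locale;

final class ModifyDateFormatting {
    // Illustrative pattern; the real one is ListFile.FILE_MODIFY_DATE_ATTR_FORMAT.
    private static final String FILE_MODIFY_DATE_ATTR_FORMAT = "yyyy-MM-dd'T'HH:mm:ssZ";

    // One DateFormat per thread, created lazily, instead of one instance per listed file.
    private static final ThreadLocal<DateFormat> FORMATTER =
            ThreadLocal.withInitial(() -> new SimpleDateFormat(FILE_MODIFY_DATE_ATTR_FORMAT, Locale.US));

    static String format(final long lastModifiedMillis) {
        return FORMATTER.get().format(new Date(lastModifiedMillis));
    }
}
```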


---


[GitHub] nifi pull request #2733: NIFI-5228: Allow user to choose whether or not to a...

2018-05-23 Thread mgaido91
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2733#discussion_r190155739
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/ListFile.java
 ---
@@ -255,43 +262,47 @@ public void onScheduled(final ProcessContext context) 
{
 final Path absPath = filePath.toAbsolutePath();
 final String absPathString = absPath.getParent().toString() + 
File.separator;
 
+final DateFormat formatter = new 
SimpleDateFormat(FILE_MODIFY_DATE_ATTR_FORMAT, Locale.US);
+
 attributes.put(CoreAttributes.PATH.key(), relativePathString);
 attributes.put(CoreAttributes.FILENAME.key(), 
fileInfo.getFileName());
 attributes.put(CoreAttributes.ABSOLUTE_PATH.key(), absPathString);
+attributes.put(FILE_SIZE_ATTRIBUTE, 
Long.toString(fileInfo.getSize()));
+attributes.put(FILE_LAST_MODIFY_TIME_ATTRIBUTE, 
formatter.format(new Date(fileInfo.getLastModifiedTime())));
+
+if (context.getProperty(INCLUDE_FILE_ATTRIBUTES).asBoolean()) {
--- End diff --

nit: maybe we can read this in the `onScheduled` method
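
A small sketch of what that could look like inside the processor, assuming the existing `onScheduled` callback and the `INCLUDE_FILE_ATTRIBUTES` property (the field name is illustrative):
```
// Cache the flag once per schedule instead of evaluating the property for every listed file.
private volatile boolean includeFileAttributes;

@OnScheduled
public void onScheduled(final ProcessContext context) {
    includeFileAttributes = context.getProperty(INCLUDE_FILE_ATTRIBUTES).asBoolean();
    // ...existing onScheduled logic...
}
```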


---


[GitHub] nifi issue #2657: NIFI-5109 Reset justElectedPrimaryNode flag right after re...

2018-05-22 Thread mgaido91
Github user mgaido91 commented on the issue:

https://github.com/apache/nifi/pull/2657
  
LGTM, thanks @viazovskyi.

@ijokarumawak @joewitt any other comments?


---


[GitHub] nifi issue #2657: NIFI-5109 Reset justElectedPrimaryNode flag right after re...

2018-05-21 Thread mgaido91
Github user mgaido91 commented on the issue:

https://github.com/apache/nifi/pull/2657
  
@viazovskyi I suggest something like:
```
git checkout nifi/master
git checkout -b nifi-5109_2
# make your changes on this branch and commit them
git branch -D nifi-5109
git branch -m nifi-5109
git push -f -u origin nifi-5109
```
If you want details or an explanation of each command, let me know, or let me 
know if I can help you in some other way. Thanks.


---


[GitHub] nifi issue #2657: NIFI-5109 Reset justElectedPrimaryNode flag right after re...

2018-05-21 Thread mgaido91
Github user mgaido91 commented on the issue:

https://github.com/apache/nifi/pull/2657
  
@viazovskyi I think that the easiest way is probably to create a new branch 
locally from master, reapply your changes on it, and force-push to this 
branch.


---


[GitHub] nifi issue #2657: NIFI-5109 Reset justElectedPrimaryNode flag right after re...

2018-05-21 Thread mgaido91
Github user mgaido91 commented on the issue:

https://github.com/apache/nifi/pull/2657
  
@viazovskyi your last merge broke this PR a bit. Could you please restore 
it? Thanks.


---


[GitHub] nifi pull request #2657: NIFI-5109 Reset justElectedPrimaryNode flag right a...

2018-05-20 Thread mgaido91
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2657#discussion_r189462936
  
--- Diff: 
nifi-nar-bundles/nifi-extension-utils/nifi-processor-utils/src/main/java/org/apache/nifi/processor/util/list/AbstractListProcessor.java
 ---
@@ -375,7 +376,7 @@ public void onTrigger(final ProcessContext context, 
final ProcessSession session
 // If our determined timestamp is the same as that 
of our last listing, skip this execution as there are no updates
 if 
(minTimestampToListMillis.equals(this.lastListedLatestEntryTimestampMillis)) {
 context.yield();
--- End diff --

no, I just meant moving this line, `context.yield();`, into the `if` you 
introduced, before the return statement (see the sketch below).
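
A sketch of the suggested placement, assuming the `if` guards the early return added in the PR (variable names follow the diff above):
```
if (minTimestampToListMillis.equals(this.lastListedLatestEntryTimestampMillis)) {
    // No updates since the last listing: yield so the framework backs off, then skip this execution.
    context.yield();
    return;
}
```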


---


[GitHub] nifi pull request #2657: NIFI-5109 Reset justElectedPrimaryNode flag right a...

2018-05-20 Thread mgaido91
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2657#discussion_r189455715
  
--- Diff: 
nifi-nar-bundles/nifi-extension-utils/nifi-processor-utils/src/main/java/org/apache/nifi/processor/util/list/AbstractListProcessor.java
 ---
@@ -375,7 +376,7 @@ public void onTrigger(final ProcessContext context, 
final ProcessSession session
 // If our determined timestamp is the same as that 
of our last listing, skip this execution as there are no updates
 if 
(minTimestampToListMillis.equals(this.lastListedLatestEntryTimestampMillis)) {
 context.yield();
--- End diff --

nit: I'd prefer to move this before the return statement


---


[GitHub] nifi pull request #2657: NIFI-5109 Reset justElectedPrimaryNode flag right a...

2018-05-20 Thread mgaido91
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2657#discussion_r189455719
  
--- Diff: 
nifi-nar-bundles/nifi-extension-utils/nifi-processor-utils/src/main/java/org/apache/nifi/processor/util/list/AbstractListProcessor.java
 ---
@@ -356,6 +356,7 @@ private EntityListing deserialize(final String 
serializedState) throws JsonParse
 @Override
 public void onTrigger(final ProcessContext context, final 
ProcessSession session) throws ProcessException {
 Long minTimestampToListMillis = 
lastListedLatestEntryTimestampMillis;
+boolean yielded = false;
--- End diff --

nit: I'd propose a more meaningful name, like `isUpToDate` or something 
similar.


---


[GitHub] nifi issue #2657: NIFI-5109 Reset justElectedPrimaryNode flag right after re...

2018-05-17 Thread mgaido91
Github user mgaido91 commented on the issue:

https://github.com/apache/nifi/pull/2657
  
@viazovskyi do you think it is possible to add a UT for this?


---


[GitHub] nifi pull request #2657: NIFI-5109 Reset justElectedPrimaryNode flag right a...

2018-05-17 Thread mgaido91
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2657#discussion_r188866746
  
--- Diff: 
nifi-nar-bundles/nifi-extension-utils/nifi-processor-utils/src/main/java/org/apache/nifi/processor/util/list/AbstractListProcessor.java
 ---
@@ -375,7 +375,7 @@ public void onTrigger(final ProcessContext context, 
final ProcessSession session
 // If our determined timestamp is the same as that 
of our last listing, skip this execution as there are no updates
 if 
(minTimestampToListMillis.equals(this.lastListedLatestEntryTimestampMillis)) {
 context.yield();
-return;
+break;
--- End diff --

sorry, I just realized that with this fix we are performing an extra 
listing which is not needed. Maybe it is better to just return as before, 
but prepend `justElectedPrimaryNode = false;`. So I am thinking of 
substituting this line with:

```
justElectedPrimaryNode = false;
return;
```

what do you think @ijokarumawak @joewitt  @viazovskyi ?
Thanks


---


[GitHub] nifi issue #2657: NIFI-5109 Reset justElectedPrimaryNode flag right after re...

2018-05-16 Thread mgaido91
Github user mgaido91 commented on the issue:

https://github.com/apache/nifi/pull/2657
  
I also agree with @ijokarumawak's suggestion; could you please update the PR 
accordingly, @viazovskyi?


---


[GitHub] nifi issue #2630: NIFI-5041 Adds SPNEGO authentication to LivySessionControl...

2018-04-12 Thread mgaido91
Github user mgaido91 commented on the issue:

https://github.com/apache/nifi/pull/2630
  
@ottobackwards yes, I agree, and since the passing build is the one in 
France, I assume we are downloading from a European mirror. Can we choose the 
mirror according to where the build runs?


---


[GitHub] nifi issue #2606: NIFI-5043: TailFile in Multifile mode should not open new ...

2018-04-05 Thread mgaido91
Github user mgaido91 commented on the issue:

https://github.com/apache/nifi/pull/2606
  
Thank you for taking a look at this and reviewing it, @markap14!


---


[GitHub] nifi pull request #2606: NIFI-5043: TailFile in Multifile mode should not op...

2018-04-05 Thread mgaido91
GitHub user mgaido91 opened a pull request:

https://github.com/apache/nifi/pull/2606

NIFI-5043: TailFile in Multifile mode should not open new readers in 
onTrigger

Thank you for submitting a contribution to Apache NiFi.

In order to streamline the review of the contribution we ask you
to ensure the following steps have been taken:

### For all changes:
- [x] Is there a JIRA ticket associated with this PR? Is it referenced 
 in the commit message?

- [x] Does your PR title start with NIFI-XXXX where XXXX is the JIRA number 
you are trying to resolve? Pay particular attention to the hyphen "-" character.

- [x] Has your PR been rebased against the latest commit within the target 
branch (typically master)?

- [x] Is your initial contribution a single, squashed commit?

### For code changes:
- [x] Have you ensured that the full suite of tests is executed via mvn 
-Pcontrib-check clean install at the root nifi folder?
- [ ] Have you written or updated unit tests to verify your changes? 
**Manual tests (running lsof while the processor runs)**
- [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)? 
- [ ] If applicable, have you updated the LICENSE file, including the main 
LICENSE file under nifi-assembly?
- [ ] If applicable, have you updated the NOTICE file, including the main 
NOTICE file found under nifi-assembly?
- [ ] If adding new Properties, have you added .displayName in addition to 
.name (programmatic access) for each of the new properties?

### For documentation related changes:
- [ ] Have you ensured that format looks appropriate for the output in 
which it is rendered?

### Note:
Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/mgaido91/nifi NIFI-5043

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/2606.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2606


commit d688846af44f2647d4d3e38508f9af4ca503632f
Author: Marco Gaido <marcogaido91@...>
Date:   2018-04-05T11:44:55Z

NIFI-5043: TailFile in Multifile mode should not open new readers in 
onTrigger




---


[GitHub] nifi issue #2565: NIFI-4631: Use java.nio.file.Files in ListFile to improve ...

2018-03-23 Thread mgaido91
Github user mgaido91 commented on the issue:

https://github.com/apache/nifi/pull/2565
  
@markap14 I don't know the reason, but it might be because I ran it on a 
deeply nested structure made up almost entirely of folders. The new 
solution is probably especially efficient in that case.


---


[GitHub] nifi pull request #2565: NIFI-4631: Use java.nio.file.Files in ListFile to i...

2018-03-19 Thread mgaido91
GitHub user mgaido91 opened a pull request:

https://github.com/apache/nifi/pull/2565

NIFI-4631: Use java.nio.file.Files in ListFile to improve performance

Thank you for submitting a contribution to Apache NiFi.

In order to streamline the review of the contribution we ask you
to ensure the following steps have been taken:

### For all changes:
- [x] Is there a JIRA ticket associated with this PR? Is it referenced 
 in the commit message?

- [x] Does your PR title start with NIFI-XXXX where XXXX is the JIRA number 
you are trying to resolve? Pay particular attention to the hyphen "-" character.

- [x] Has your PR been rebased against the latest commit within the target 
branch (typically master)?

- [x] Is your initial contribution a single, squashed commit?

### For code changes:
- [x] Have you ensured that the full suite of tests is executed via mvn 
-Pcontrib-check clean install at the root nifi folder?
- [ ] Have you written or updated unit tests to verify your changes? 
Performed a perf test on a randomly generated tree of about 100,000 nested 
directories: the new code takes ~2.5s versus ~8s for the old code.
- [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)? NA 
- [ ] If applicable, have you updated the LICENSE file, including the main 
LICENSE file under nifi-assembly? NA
- [ ] If applicable, have you updated the NOTICE file, including the main 
NOTICE file found under nifi-assembly? NA
- [ ] If adding new Properties, have you added .displayName in addition to 
.name (programmatic access) for each of the new properties? NA

### For documentation related changes:
- [ ] Have you ensured that format looks appropriate for the output in 
which it is rendered? NA

### Note:
Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.

For the reasoning behind the choice of implementation, please refer to 
https://github.com/brettryan/io-recurse-tests and 
https://stackoverflow.com/questions/2056221/recursively-list-files-in-java.
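
A minimal, self-contained sketch of recursive listing with NIO.2's `Files.walkFileTree` (an editor's illustration of the approach, not the PR's actual code): the visitor receives each entry's attributes directly, avoiding the per-file metadata calls of the `java.io.File` approach.
```
import java.io.IOException;
import java.nio.file.FileVisitResult;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.SimpleFileVisitor;
import java.nio.file.attribute.BasicFileAttributes;
import java.util.ArrayList;
import java.util.List;

public class WalkTreeExample {
    public static void main(final String[] args) throws IOException {
        final List<Path> files = new ArrayList<>();
        // Single traversal; attributes are handed to visitFile, so no extra stat call per entry.
        Files.walkFileTree(Paths.get(args[0]), new SimpleFileVisitor<Path>() {
            @Override
            public FileVisitResult visitFile(final Path file, final BasicFileAttributes attrs) {
                if (attrs.isRegularFile()) {
                    files.add(file);
                }
                return FileVisitResult.CONTINUE;
            }
        });
        System.out.println(files.size() + " files found under " + args[0]);
    }
}
```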


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/mgaido91/nifi NIFI-4631

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/2565.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2565


commit fc039b7738a65a4559d68da768ba3b38077a61ea
Author: Marco Gaido <marcogaido91@...>
Date:   2018-03-19T13:48:27Z

NIFI-4631: Use java.nio.file.Files in ListFile to improve performance




---


[GitHub] nifi issue #2544: NIFI-4959: Remove flowfiles and close connection for Bad R...

2018-03-16 Thread mgaido91
Github user mgaido91 commented on the issue:

https://github.com/apache/nifi/pull/2544
  
Thanks for your help @markap14


---


[GitHub] nifi issue #2544: NIFI-4959: Remove flowfiles and close connection for Bad R...

2018-03-16 Thread mgaido91
Github user mgaido91 commented on the issue:

https://github.com/apache/nifi/pull/2544
  
the test error is unrelated:
```
[INFO] --- maven-remote-resources-plugin:1.5:process 
(process-resource-bundles) @ nifi-hive-processors ---
[ERROR] Tests run: 4, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 
13.251 s <<< FAILURE! - in org.apache.nifi.jms.processors.PublishJMSTest
[ERROR] 
validateSuccessfulPublishAndTransferToSuccessWithEL(org.apache.nifi.jms.processors.PublishJMSTest)
  Time elapsed: 10.007 s  <<< ERROR!
org.junit.runners.model.TestTimedOutException: test timed out after 1 
milliseconds
at 
org.apache.nifi.jms.processors.PublishJMSTest.validateSuccessfulPublishAndTransferToSuccessWithEL(PublishJMSTest.java:103)
```


---


[GitHub] nifi pull request #2544: NIFI-4959: Remove flowfiles and close connection fo...

2018-03-16 Thread mgaido91
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2544#discussion_r175079382
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/HandleHttpRequest.java
 ---
@@ -520,6 +521,27 @@ public void onTrigger(final ProcessContext context, 
final ProcessSession session
 new Object[]{request.getRemoteAddr(), e});
 session.remove(flowFile);
 return;
+} catch (final FlowFileAccessException e) {
+// some bad requests can produce a IOException on the HTTP 
stream, which makes a FlowFileAccessException to
+// be thrown. We should handle these cases here, while other 
FlowFileAccessException are re-thrown
+if (!(e.getCause() != null && e.getCause() instanceof 
FlowFileAccessException
--- End diff --

yes, thank you very much for your comment and your help @markap14 . I am 
updating the PR accordingly. Thanks.
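
The check described above can be sketched as a walk over the exception's cause chain; this is an illustration under that assumption, not the exact code that ended up in the PR:
```
import java.io.IOException;

final class CauseInspection {
    // True when the chain of causes contains an IOException, i.e. the bad request
    // broke the underlying HTTP stream and the flow file should be dropped here.
    static boolean causedByIoException(final Throwable failure) {
        for (Throwable cause = failure.getCause(); cause != null; cause = cause.getCause()) {
            if (cause instanceof IOException) {
                return true;
            }
        }
        return false;
    }
}
```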


---


[GitHub] nifi pull request #2544: NIFI-4959: Remove flowfiles and close connection fo...

2018-03-14 Thread mgaido91
GitHub user mgaido91 opened a pull request:

https://github.com/apache/nifi/pull/2544

NIFI-4959: Remove flowfiles and close connection for Bad Requests causing 
IOException



Thank you for submitting a contribution to Apache NiFi.

In order to streamline the review of the contribution we ask you
to ensure the following steps have been taken:

### For all changes:
- [x] Is there a JIRA ticket associated with this PR? Is it referenced 
 in the commit message?

- [x] Does your PR title start with NIFI-XXXX where XXXX is the JIRA number 
you are trying to resolve? Pay particular attention to the hyphen "-" character.

- [x] Has your PR been rebased against the latest commit within the target 
branch (typically master)?

- [x] Is your initial contribution a single, squashed commit?

### For code changes:
- [x] Have you ensured that the full suite of tests is executed via mvn 
-Pcontrib-check clean install at the root nifi folder?
- [ ] Have you written or updated unit tests to verify your changes? Tested 
manually
- [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)? NA
- [ ] If applicable, have you updated the LICENSE file, including the main 
LICENSE file under nifi-assembly? NA
- [ ] If applicable, have you updated the NOTICE file, including the main 
NOTICE file found under nifi-assembly? NA
- [ ] If adding new Properties, have you added .displayName in addition to 
.name (programmatic access) for each of the new properties? NA

### For documentation related changes:
- [ ] Have you ensured that format looks appropriate for the output in 
which it is rendered?

### Note:
Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/mgaido91/nifi NIFI-4959

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/2544.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2544


commit 15f0a04a1b2b4600ccf2b43b83f0ec157dbfd500
Author: Marco Gaido <marcogaido91@...>
Date:   2018-03-14T15:27:17Z

NIFI-4959: Remove flowfiles and close connection for Bad Requests causing 
IOException




---


[GitHub] nifi pull request #2504: NIFI-4773: Fixed column type map initialization in ...

2018-03-02 Thread mgaido91
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2504#discussion_r171887381
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/QueryDatabaseTable.java
 ---
@@ -197,6 +198,12 @@ public void setup(final ProcessContext context) {
 maxValueProperties = 
getDefaultMaxValueProperties(context.getProperties());
 }
 
+@OnStopped
+public void stop() {
+// Reset the column type map in case properties change
+setupComplete.set(false);
--- End diff --

I see, thanks for your explanation


---


[GitHub] nifi pull request #2504: NIFI-4773: Fixed column type map initialization in ...

2018-03-02 Thread mgaido91
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2504#discussion_r171882320
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/QueryDatabaseTable.java
 ---
@@ -197,6 +198,12 @@ public void setup(final ProcessContext context) {
 maxValueProperties = 
getDefaultMaxValueProperties(context.getProperties());
 }
 
+@OnStopped
+public void stop() {
+// Reset the column type map in case properties change
+setupComplete.set(false);
--- End diff --

can't we just do the setup in `@OnScheduled` and either move the setupComplete 
flag only to `GenerateTableFetch` or remove it (see the sketch below)? I think 
the code would be more straightforward that way. What do you think?
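
A rough sketch of that alternative, reusing the names visible in the diff above (illustrative only; whether it is feasible depends on what `GenerateTableFetch` needs):
```
@OnScheduled
public void setup(final ProcessContext context) {
    // Rebuild the property-derived state on every schedule,
    // so no setupComplete flag or @OnStopped reset is needed.
    maxValueProperties = getDefaultMaxValueProperties(context.getProperties());
}
```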


---


[GitHub] nifi issue #2201: NIFI-4367 Fix on processor for permit deriving script clas...

2018-02-21 Thread mgaido91
Github user mgaido91 commented on the issue:

https://github.com/apache/nifi/pull/2201
  
@mattyb149 I think the test error is unrelated. I ran the failing test 
locally and it passes. Do you have further comments on this PR?


---


[GitHub] nifi issue #2201: NIFI-4367 Fix on processor for permit deriving script clas...

2018-02-15 Thread mgaido91
Github user mgaido91 commented on the issue:

https://github.com/apache/nifi/pull/2201
  
kindly ping @frett27 


---


[GitHub] nifi issue #2226: NIFI-4080: Added EL support to fields in ValidateCSV

2018-02-08 Thread mgaido91
Github user mgaido91 commented on the issue:

https://github.com/apache/nifi/pull/2226
  
+1, locally it builds and passes all the tests


---


[GitHub] nifi issue #2445: NIFI-4834: Updated AbstractJMSProcessor to use a separate ...

2018-02-07 Thread mgaido91
Github user mgaido91 commented on the issue:

https://github.com/apache/nifi/pull/2445
  
on OS X they pass too, so it may be a platform-related error


---


[GitHub] nifi issue #2445: NIFI-4834: Updated AbstractJMSProcessor to use a separate ...

2018-02-07 Thread mgaido91
Github user mgaido91 commented on the issue:

https://github.com/apache/nifi/pull/2445
  
this is causing some test failures on travis, but I am not able to 
reproduce the failures locally 
(https://api.travis-ci.org/v3/job/338096661/log.txt). Maybe we can revert it 
for the moment and merge it again after understanding and fixing the issue?


---


[GitHub] nifi issue #2226: NIFI-4080: Added EL support to fields in ValidateCSV

2018-02-07 Thread mgaido91
Github user mgaido91 commented on the issue:

https://github.com/apache/nifi/pull/2226
  
thanks @mattyb149. The test failures are not related to this PR. They are 
caused by 3ca7c3e7a19c7f496a214d9b93ace43b1507c9ec. I am investigating them. I 
am not sure if it is possible to re-trigger a build for this PR, thanks.


---


[GitHub] nifi issue #2226: NIFI-4080: Added EL support to fields in ValidateCSV

2018-02-01 Thread mgaido91
Github user mgaido91 commented on the issue:

https://github.com/apache/nifi/pull/2226
  
@mattyb149 there is a test failure because I just cherry-picked my changes 
on top of your PR, so when I ran the tests I didn't run the test you added 
here (sorry, my bad). That test is no longer valid, because we can no longer 
validate the schema up front: we only have it once we get the flow files. 
Could you please remove `testValidateWithEL` then? Thanks.


---


[GitHub] nifi issue #2201: NIFI-4367 Fix on processor for permit deriving script clas...

2018-01-29 Thread mgaido91
Github user mgaido91 commented on the issue:

https://github.com/apache/nifi/pull/2201
  
@frett27 I did some more checks and it seems to work. Are you OK with me sending 
a PR to your branch to improve the UT? Or could you please update this PR 
according to the comments (and so that it passes the tests)? Thanks.


---


[GitHub] nifi issue #2201: NIFI-4367 Fix on processor for permit deriving script clas...

2018-01-25 Thread mgaido91
Github user mgaido91 commented on the issue:

https://github.com/apache/nifi/pull/2201
  
@frett27 I tried your fix, but if you add custom properties to the scripted 
processor, it is reported as invalid with the message:
```
... is not a supported property.
```
Thus a more complete UT might help discover all the problems and solve them, 
because at the moment this fix does not seem to be enough to make 
`InvokeScriptedProcessor` work with `AbstractProcessor`.


---


[GitHub] nifi pull request #2201: NIFI-4367 Fix on processor for permit deriving scri...

2018-01-24 Thread mgaido91
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2201#discussion_r163571877
  
--- Diff: 
nifi-nar-bundles/nifi-scripting-bundle/nifi-scripting-processors/src/test/resources/groovy/test_implementingabstractProcessor.groovy
 ---
@@ -0,0 +1,40 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+import org.apache.nifi.components.PropertyDescriptor
+import org.apache.nifi.components.ValidationContext
+import org.apache.nifi.components.ValidationResult
+import org.apache.nifi.logging.ComponentLog
+import org.apache.nifi.processor.AbstractProcessor
+import org.apache.nifi.processor.ProcessContext
+import org.apache.nifi.processor.ProcessSession
+import org.apache.nifi.processor.ProcessSessionFactory
+import org.apache.nifi.processor.ProcessorInitializationContext
+import org.apache.nifi.processor.Relationship
+import org.apache.nifi.processor.exception.ProcessException
+
+class testImplementingAbstractProcessor extends AbstractProcessor {
+   
+   def ComponentLog log
--- End diff --

I think indentation is a bit strange in this file...


---


[GitHub] nifi pull request #2201: NIFI-4367 Fix on processor for permit deriving scri...

2018-01-24 Thread mgaido91
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2201#discussion_r163569718
  
--- Diff: 
nifi-nar-bundles/nifi-scripting-bundle/nifi-scripting-processors/src/main/java/org/apache/nifi/processors/script/InvokeScriptedProcessor.java
 ---
@@ -461,8 +463,13 @@ public File getKerberosConfigurationFile() {
 // if there was existing validation errors and the processor 
loaded successfully
 if (currentValidationResults.isEmpty() && instance != null) {
 try {
-// defer to the underlying processor for validation
-final Collection<ValidationResult> instanceResults = 
instance.validate(context);
+// defer to the underlying processor for validation, 
without the
+// invokescriptedprocessor properties
+final Set<PropertyDescriptor> innerPropertyDescriptor = 
new HashSet<>(scriptingComponentHelper.getDescriptors());
+
+ValidationContext innerValidationContext = new 
FilteredPropertiesValidationContextAdapter(context, innerPropertyDescriptor);
+final Collection<ValidationResult> instanceResults = 
instance.validate(innerValidationContext);
--- End diff --

I think we should do the same in the `onTrigger` method too


---


[GitHub] nifi pull request #2201: NIFI-4367 Fix on processor for permit deriving scri...

2018-01-24 Thread mgaido91
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2201#discussion_r163568827
  
--- Diff: 
nifi-nar-bundles/nifi-scripting-bundle/nifi-scripting-processors/src/main/java/org/apache/nifi/script/impl/FilteredPropertiesValidationContextAdapter.java
 ---
@@ -0,0 +1,92 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.script.impl;
+
+import java.util.ArrayList;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.Map;
+import java.util.Set;
+
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.PropertyValue;
+import org.apache.nifi.components.ValidationContext;
+
+/**
+ * filter properties in the ValidationContext, proxy approach, for 
removing unwanted properties
+ *
+ * @author pfreydiere
+ *
+ */
+public class FilteredPropertiesValidationContextAdapter extends 
ValidationContextAdapter {
+
+private Set<PropertyDescriptor> removedProperties;
+private Set<String> removedPropertyNames;
+
+public FilteredPropertiesValidationContextAdapter(ValidationContext 
validationContext, Set<PropertyDescriptor> removedProperties) {
+super(validationContext);
+this.removedProperties = removedProperties;
+Set<String> keys = new HashSet<>();
+for (PropertyDescriptor p : removedProperties) {
+keys.add(p.getName());
+}
+this.removedPropertyNames = keys;
+}
+
+@Override
+public Map<String, String> getAllProperties() {
+HashMap<String, String> returnedProperties = new 
HashMap<>(super.getAllProperties());
--- End diff --

I think that `returnedProperties` can be computed once in the 
constructor and reused, instead of being recreated every time (see the sketch below).
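
A sketch of that suggestion, following the adapter shown above (illustrative; it assumes the wrapped context's properties do not change after construction, and omits the `java.util.Collections` import):
```
private final Map<String, String> filteredProperties;

public FilteredPropertiesValidationContextAdapter(final ValidationContext validationContext,
                                                  final Set<PropertyDescriptor> removedProperties) {
    super(validationContext);
    // Build the filtered view once instead of copying the map on every getAllProperties() call.
    final Map<String, String> props = new HashMap<>(validationContext.getAllProperties());
    for (final PropertyDescriptor removed : removedProperties) {
        props.remove(removed.getName());
    }
    this.filteredProperties = Collections.unmodifiableMap(props);
}

@Override
public Map<String, String> getAllProperties() {
    return filteredProperties;
}
```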


---


[GitHub] nifi pull request #2426: NIFI-4790: support HTTPS Proxy in InvokeHTTP

2018-01-24 Thread mgaido91
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2426#discussion_r163554821
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/InvokeHTTP.java
 ---
@@ -213,11 +219,22 @@
 
 public static final PropertyDescriptor PROP_SSL_CONTEXT_SERVICE = new 
PropertyDescriptor.Builder()
 .name("SSL Context Service")
-.description("The SSL Context Service used to provide client 
certificate information for TLS/SSL (https) connections.")
+.description("The SSL Context Service used to provide client 
certificate information for TLS/SSL (https) connections."
++ " It is also used to connect to HTTPS Proxy.")
 .required(false)
 .identifiesControllerService(SSLContextService.class)
 .build();
 
+public static final PropertyDescriptor PROP_PROXY_TYPE = new 
PropertyDescriptor.Builder()
+.name("Proxy Type")
+.displayName("Proxy Type")
+.description("The type of the proxy we are connecting to.")
+.required(true)
+.allowableValues(HTTP, HTTPS)
+.defaultValue(HTTP.getValue())
+.expressionLanguageSupported(true)
--- End diff --

thanks, I'll make it like this


---


[GitHub] nifi issue #2294: NIFI-3538 Added DeleteHBaseRow

2018-01-24 Thread mgaido91
Github user mgaido91 commented on the issue:

https://github.com/apache/nifi/pull/2294
  
LGTM


---


[GitHub] nifi issue #2294: NIFI-3538 Added DeleteHBaseRow

2018-01-24 Thread mgaido91
Github user mgaido91 commented on the issue:

https://github.com/apache/nifi/pull/2294
  
thanks @MikeThomsen. Actually I was referring to [this 
comment](https://github.com/apache/nifi/pull/2294#discussion_r160140592). I 
think that is my last comment. Overall it LGTM.

On the batch size, I think the committers who will review this before merging 
can express their opinion. Your approach makes sense, but it involves a 
consistency issue: we need a tradeoff, and I think only the committers can 
decide what to do.


---


[GitHub] nifi issue #2426: NIFI-4790: support HTTPS Proxy in InvokeHTTP

2018-01-24 Thread mgaido91
Github user mgaido91 commented on the issue:

https://github.com/apache/nifi/pull/2426
  
@ijokarumawak thanks, added the EL support.
I am not sure why AppVeyor is failing; the failure does not seem related to 
this PR.


---


[GitHub] nifi issue #2294: NIFI-3538 Added DeleteHBaseRow

2018-01-23 Thread mgaido91
Github user mgaido91 commented on the issue:

https://github.com/apache/nifi/pull/2294
  
@MikeThomsen sorry, could you please address the last comment and add the 
charset parameter? Thanks.


---


[GitHub] nifi pull request #2426: NIFI-4790: support HTTPS Proxy in InvokeHTTP

2018-01-23 Thread mgaido91
GitHub user mgaido91 opened a pull request:

https://github.com/apache/nifi/pull/2426

NIFI-4790: support HTTPS Proxy in InvokeHTTP

Thank you for submitting a contribution to Apache NiFi.

In order to streamline the review of the contribution we ask you
to ensure the following steps have been taken:

### For all changes:
- [x] Is there a JIRA ticket associated with this PR? Is it referenced 
 in the commit message?

- [x] Does your PR title start with NIFI-XXXX where XXXX is the JIRA number 
you are trying to resolve? Pay particular attention to the hyphen "-" character.

- [x] Has your PR been rebased against the latest commit within the target 
branch (typically master)?

- [x] Is your initial contribution a single, squashed commit?

### For code changes:
- [x] Have you ensured that the full suite of tests is executed via mvn 
-Pcontrib-check clean install at the root nifi folder?
- [ ] Have you written or updated unit tests to verify your changes?
- [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)? 
- [ ] If applicable, have you updated the LICENSE file, including the main 
LICENSE file under nifi-assembly?
- [ ] If applicable, have you updated the NOTICE file, including the main 
NOTICE file found under nifi-assembly?
- [x] If adding new Properties, have you added .displayName in addition to 
.name (programmatic access) for each of the new properties?

### For documentation related changes:
- [ ] Have you ensured that format looks appropriate for the output in 
which it is rendered?

### Note:
Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/mgaido91/nifi NIFI-4790

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/2426.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2426


commit 51c867ad9685063acd970d594c875fc8abe11836
Author: Marco Gaido <marcogaido91@...>
Date:   2018-01-19T10:02:56Z

NIFI-4790: support HTTPS Proxy in InvokeHTTP




---


[GitHub] nifi issue #2401: NIFI-4759 Fixed a bug that left a hard-coded reference to ...

2018-01-14 Thread mgaido91
Github user mgaido91 commented on the issue:

https://github.com/apache/nifi/pull/2401
  
@joewitt @pvillard31 maybe you can help review this too, thanks.


---


[GitHub] nifi issue #2401: NIFI-4759 Fixed a bug that left a hard-coded reference to ...

2018-01-14 Thread mgaido91
Github user mgaido91 commented on the issue:

https://github.com/apache/nifi/pull/2401
  
LGTM, thanks.


---


[GitHub] nifi pull request #2401: NIFI-4759 Fixed a bug that left a hard-coded refere...

2018-01-12 Thread mgaido91
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2401#discussion_r161261787
  
--- Diff: 
nifi-nar-bundles/nifi-mongodb-bundle/nifi-mongodb-processors/src/test/java/org/apache/nifi/processors/mongodb/PutMongoTest.java
 ---
@@ -256,4 +257,74 @@ public void testUpsertWithOperators() throws Exception 
{
 Assert.assertEquals("Msg had wrong value", msg, "Hi");
 }
 }
+
+/*
+ * Start NIFI-4759 Regression Tests
+ *
+ * 2 issues with ID field:
+ *
+ * * Assumed _id is the update key, causing failures when the user 
configured a different one in the UI.
+ * * Treated _id as a string even when it is an ObjectID sent from 
another processor as a string value.
+ *
+ * Expected behavior:
+ *
+ * * update key field should work no matter what (legal) value it is 
set to be.
+ * * _ids that are ObjectID should become real ObjectIDs when added to 
Mongo.
+ * * _ids that are arbitrary strings should be still go in as strings.
+ *
+ */
+@Test
+public void testNiFi_4759_Regressions() {
+String[] upserts = new String[]{
+"{\n" +
--- End diff --

What about writing the JSON document on one line?


---


[GitHub] nifi pull request #2401: NIFI-4759 Fixed a bug that left a hard-coded refere...

2018-01-12 Thread mgaido91
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2401#discussion_r161261619
  
--- Diff: 
nifi-nar-bundles/nifi-mongodb-bundle/nifi-mongodb-processors/src/test/java/org/apache/nifi/processors/mongodb/PutMongoTest.java
 ---
@@ -256,4 +257,74 @@ public void testUpsertWithOperators() throws Exception 
{
 Assert.assertEquals("Msg had wrong value", msg, "Hi");
 }
 }
+
+/*
+ * Start NIFI-4759 Regression Tests
+ *
+ * 2 issues with ID field:
+ *
+ * * Assumed _id is the update key, causing failures when the user 
configured a different one in the UI.
+ * * Treated _id as a string even when it is an ObjectID sent from 
another processor as a string value.
+ *
+ * Expected behavior:
+ *
+ * * update key field should work no matter what (legal) value it is 
set to be.
+ * * _ids that are ObjectID should become real ObjectIDs when added to 
Mongo.
+ * * _ids that are arbitrary strings should be still go in as strings.
+ *
+ */
+@Test
+public void testNiFi_4759_Regressions() {
+String[] upserts = new String[]{
+"{\n" +
+"\t\"_id\": \"12345\",\n" +
+"\t\"$set\": {\n" +
+"\t\t\"msg\": \"Hello, world\"\n" +
+"\t}\n" +
+"}",
+
+"{\n" +
+"\t\"_id\": \"5a5617b9c1f5de6d8276e87d\",\n" +
+"\t\"$set\": {\n" +
+"\t\t\"msg\": \"Hello, world\"\n" +
+"\t}\n" +
+"}",
+
+"{\n" +
+"\t\"updateKey\": \"12345\",\n" +
+"\t\"$set\": {\n" +
+"\t\t\"msg\": \"Hello, world\"\n" +
+"\t}\n" +
+"}",
+};
+
+String[] updateKeyProps = new String[] { "_id", "_id", "updateKey" 
};
+Object[] updateKeys = new Object[] { "12345", new 
ObjectId("5a5617b9c1f5de6d8276e87d"), "12345" };
+int index = 0;
+
+runner.setProperty(PutMongo.UPDATE_MODE, 
PutMongo.UPDATE_WITH_OPERATORS);
+runner.setProperty(PutMongo.MODE, "update");
+runner.setProperty(PutMongo.UPSERT, "true");
+
+for (String upsert : upserts) {
+runner.setProperty(PutMongo.UPDATE_QUERY_KEY, 
updateKeyProps[index]);
+for (int x = 0; x < 5; x++) {
--- End diff --

What about using  2 instead of 5?


---


[GitHub] nifi pull request #2401: NIFI-4759 Fixed a bug that left a hard-coded refere...

2018-01-12 Thread mgaido91
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2401#discussion_r161262183
  
--- Diff: 
nifi-nar-bundles/nifi-mongodb-bundle/nifi-mongodb-processors/src/test/java/org/apache/nifi/processors/mongodb/PutMongoTest.java
 ---
@@ -256,4 +257,74 @@ public void testUpsertWithOperators() throws Exception 
{
 Assert.assertEquals("Msg had wrong value", msg, "Hi");
 }
 }
+
+/*
+ * Start NIFI-4759 Regression Tests
+ *
+ * 2 issues with ID field:
+ *
+ * * Assumed _id is the update key, causing failures when the user 
configured a different one in the UI.
+ * * Treated _id as a string even when it is an ObjectID sent from 
another processor as a string value.
+ *
+ * Expected behavior:
+ *
+ * * update key field should work no matter what (legal) value it is 
set to be.
+ * * _ids that are ObjectID should become real ObjectIDs when added to 
Mongo.
+ * * _ids that are arbitrary strings should be still go in as strings.
+ *
+ */
+@Test
+public void testNiFi_4759_Regressions() {
+String[] upserts = new String[]{
+"{\n" +
+"\t\"_id\": \"12345\",\n" +
+"\t\"$set\": {\n" +
+"\t\t\"msg\": \"Hello, world\"\n" +
+"\t}\n" +
+"}",
+
+"{\n" +
+"\t\"_id\": \"5a5617b9c1f5de6d8276e87d\",\n" +
+"\t\"$set\": {\n" +
+"\t\t\"msg\": \"Hello, world\"\n" +
+"\t}\n" +
+"}",
+
+"{\n" +
+"\t\"updateKey\": \"12345\",\n" +
+"\t\"$set\": {\n" +
+"\t\t\"msg\": \"Hello, world\"\n" +
+"\t}\n" +
+"}",
+};
+
+String[] updateKeyProps = new String[] { "_id", "_id", "updateKey" 
};
+Object[] updateKeys = new Object[] { "12345", new 
ObjectId("5a5617b9c1f5de6d8276e87d"), "12345" };
+int index = 0;
+
+runner.setProperty(PutMongo.UPDATE_MODE, 
PutMongo.UPDATE_WITH_OPERATORS);
+runner.setProperty(PutMongo.MODE, "update");
+runner.setProperty(PutMongo.UPSERT, "true");
+
+for (String upsert : upserts) {
+runner.setProperty(PutMongo.UPDATE_QUERY_KEY, 
updateKeyProps[index]);
+for (int x = 0; x < 5; x++) {
+runner.enqueue(upsert);
+}
+runner.run(5, true, true);
+runner.assertTransferCount(PutMongo.REL_FAILURE, 0);
+runner.assertTransferCount(PutMongo.REL_SUCCESS, 5);
+
+Document query = new Document(updateKeyProps[index], 
updateKeys[index]);
+Document result = collection.find(query).first();
+Assert.assertNotNull("Result was null", result);
+Assert.assertEquals("Count was wrong", 1, 
collection.count(query));
+runner.clearTransferState();
+index++;
+}
+}
+
+/*
--- End diff --

Nit: I'd remove this and the empty line above...


---


[GitHub] nifi pull request #2392: NIFI-4759 Fixed a bug that left a hard-coded refere...

2018-01-12 Thread mgaido91
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2392#discussion_r161213967
  
--- Diff: 
nifi-nar-bundles/nifi-mongodb-bundle/nifi-mongodb-processors/src/test/java/org/apache/nifi/processors/mongodb/PutMongoTest.java
 ---
@@ -256,4 +258,72 @@ public void testUpsertWithOperators() throws Exception 
{
 Assert.assertEquals("Msg had wrong value", msg, "Hi");
 }
 }
+
+/*
+ * Start NIFI-4759 Regression Tests
+ *
+ * 2 issues with ID field:
+ *
+ * * Assumed _id is the update key, causing failures when the user 
configured a different one in the UI.
+ * * Treated _id as a string even when it is an ObjectID sent from 
another processor as a string value.
+ *
+ * Expected behavior:
+ *
+ * * update key field should work no matter what (legal) value it is 
set to be.
+ * * _ids that are ObjectID should become real ObjectIDs when added to 
Mongo.
+ * * _ids that are arbitrary strings should be still go in as strings.
+ *
+ */
+
+@Test
+public void testNiFi_4759_Regressions() {
+String[] upserts = new String[]{
+"{\n" +
--- End diff --

I just meant that it is hard to read and understand the content of this JSON. 
I think that something like:
```
String[] upserts = new String[]{
"{...}",
"{...}",
"{...}"};
```
would improve readability.


---


[GitHub] nifi pull request #2392: NIFI-4759 Fixed a bug that left a hard-coded refere...

2018-01-12 Thread mgaido91
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2392#discussion_r161213577
  
--- Diff: 
nifi-nar-bundles/nifi-mongodb-bundle/nifi-mongodb-processors/src/test/java/org/apache/nifi/processors/mongodb/PutMongoTest.java
 ---
@@ -256,4 +258,72 @@ public void testUpsertWithOperators() throws Exception 
{
 Assert.assertEquals("Msg had wrong value", msg, "Hi");
 }
 }
+
+/*
+ * Start NIFI-4759 Regression Tests
+ *
+ * 2 issues with ID field:
+ *
+ * * Assumed _id is the update key, causing failures when the user 
configured a different one in the UI.
+ * * Treated _id as a string even when it is an ObjectID sent from 
another processor as a string value.
+ *
+ * Expected behavior:
+ *
+ * * update key field should work no matter what (legal) value it is 
set to be.
+ * * _ids that are ObjectID should become real ObjectIDs when added to 
Mongo.
+ * * _ids that are arbitrary strings should be still go in as strings.
+ *
+ */
+
+@Test
+public void testNiFi_4759_Regressions() {
+String[] upserts = new String[]{
+"{\n" +
+"\t\"_id\": \"12345\",\n" +
+"\t\"$set\": {\n" +
+"\t\t\"msg\": \"Hello, world\"\n" +
+"\t}\n" +
+"}",
+"{\n" +
+"\t\"_id\": \"5a5617b9c1f5de6d8276e87d\",\n" +
+"\t\"$set\": {\n" +
+"\t\t\"msg\": \"Hello, world\"\n" +
+"\t}\n" +
+"}",
+"{\n" +
+"\t\"updateKey\": \"12345\",\n" +
+"\t\"$set\": {\n" +
+"\t\t\"msg\": \"Hello, world\"\n" +
+"\t}\n" +
+"}",
+};
+
+String[] updateKeyProps = new String[] { "_id", "_id", "updateKey" 
};
+Object[] updateKeys = new Object[] { "12345", new 
ObjectId("5a5617b9c1f5de6d8276e87d"), "12345" };
+int index = 0;
+
+for (String upsert : upserts) {
+runner.setProperty(PutMongo.UPDATE_MODE, 
PutMongo.UPDATE_WITH_OPERATORS);
+runner.setProperty(PutMongo.MODE, "update");
+runner.setProperty(PutMongo.UPSERT, "true");
+runner.setProperty(PutMongo.UPDATE_QUERY_KEY, 
updateKeyProps[index]);
+for (int x = 0; x < 5; x++) {
--- End diff --

then 2 is enough, isn't it?


---


[GitHub] nifi pull request #2392: NIFI-4759 Fixed a bug that left a hard-coded refere...

2018-01-12 Thread mgaido91
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2392#discussion_r161213509
  
--- Diff: 
nifi-nar-bundles/nifi-mongodb-bundle/nifi-mongodb-processors/src/test/java/org/apache/nifi/processors/mongodb/PutMongoTest.java
 ---
@@ -166,6 +167,7 @@ public void testUpdateDoesNotInsert() throws Exception {
 byte[] bytes = documentToByteArray(doc);
 
 runner.setProperty(PutMongo.MODE, "update");
+runner.setProperty(PutMongo.UPSERT, "false");
--- End diff --

can we remove it then, please?


---


[GitHub] nifi pull request #2394: NIFI-4764: Add tooltips to status bar icons

2018-01-11 Thread mgaido91
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2394#discussion_r160996254
  
--- Diff: 
nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-web/nifi-web-ui/src/main/webapp/js/nf/canvas/nf-process-group.js
 ---
@@ -847,6 +870,7 @@
 .text(function (d) {
 return d.activeRemotePortCount;
 });
+transmittingCount.append("title").text("Transmitting 
Remote Process Groups");
--- End diff --

No, unfortunately every time we update the text with the function above, 
the previously added title is removed (I tried it). So if we don't put it 
here, but where you suggest, it won't show.

Alternatively, we could change the structure a bit and include both the icon 
and the text in a div, and put the title on the div.


---


[GitHub] nifi pull request #2392: NIFI-4759 Fixed a bug that left a hard-coded refere...

2018-01-11 Thread mgaido91
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2392#discussion_r160982784
  
--- Diff: 
nifi-nar-bundles/nifi-mongodb-bundle/nifi-mongodb-processors/src/test/java/org/apache/nifi/processors/mongodb/PutMongoTest.java
 ---
@@ -166,6 +167,7 @@ public void testUpdateDoesNotInsert() throws Exception {
 byte[] bytes = documentToByteArray(doc);
 
 runner.setProperty(PutMongo.MODE, "update");
+runner.setProperty(PutMongo.UPSERT, "false");
--- End diff --

why do we need to add this?


---


[GitHub] nifi pull request #2392: NIFI-4759 Fixed a bug that left a hard-coded refere...

2018-01-11 Thread mgaido91
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2392#discussion_r160975782
  
--- Diff: 
nifi-nar-bundles/nifi-mongodb-bundle/nifi-mongodb-processors/src/test/java/org/apache/nifi/processors/mongodb/PutMongoTest.java
 ---
@@ -256,4 +258,72 @@ public void testUpsertWithOperators() throws Exception 
{
 Assert.assertEquals("Msg had wrong value", msg, "Hi");
 }
 }
+
+/*
+ * Start NIFI-4759 Regression Tests
+ *
+ * 2 issues with ID field:
+ *
+ * * Assumed _id is the update key, causing failures when the user 
configured a different one in the UI.
+ * * Treated _id as a string even when it is an ObjectID sent from 
another processor as a string value.
+ *
+ * Expected behavior:
+ *
+ * * update key field should work no matter what (legal) value it is 
set to be.
+ * * _ids that are ObjectID should become real ObjectIDs when added to 
Mongo.
+ * * _ids that are arbitrary strings should still go in as strings.
+ *
+ */
+
+@Test
+public void testNiFi_4759_Regressions() {
+String[] upserts = new String[]{
+"{\n" +
+"\t\"_id\": \"12345\",\n" +
+"\t\"$set\": {\n" +
+"\t\t\"msg\": \"Hello, world\"\n" +
+"\t}\n" +
+"}",
+"{\n" +
+"\t\"_id\": \"5a5617b9c1f5de6d8276e87d\",\n" +
+"\t\"$set\": {\n" +
+"\t\t\"msg\": \"Hello, world\"\n" +
+"\t}\n" +
+"}",
+"{\n" +
+"\t\"updateKey\": \"12345\",\n" +
+"\t\"$set\": {\n" +
+"\t\t\"msg\": \"Hello, world\"\n" +
+"\t}\n" +
+"}",
+};
+
+String[] updateKeyProps = new String[] { "_id", "_id", "updateKey" 
};
+Object[] updateKeys = new Object[] { "12345", new 
ObjectId("5a5617b9c1f5de6d8276e87d"), "12345" };
+int index = 0;
+
+for (String upsert : upserts) {
+runner.setProperty(PutMongo.UPDATE_MODE, 
PutMongo.UPDATE_WITH_OPERATORS);
+runner.setProperty(PutMongo.MODE, "update");
+runner.setProperty(PutMongo.UPSERT, "true");
+runner.setProperty(PutMongo.UPDATE_QUERY_KEY, 
updateKeyProps[index]);
+for (int x = 0; x < 5; x++) {
+runner.enqueue(upsert);
+}
+runner.run(5, true, true);
+runner.assertTransferCount(PutMongo.REL_FAILURE, 0);
+runner.assertTransferCount(PutMongo.REL_SUCCESS, 5);
+
+Document query = new Document(updateKeyProps[index], 
updateKeys[index]);
+Document result = collection.find(query).first();
+Assert.assertNotNull("Result was null", result);
+Assert.assertEquals("Count was wrong", 1, 
collection.count(query));
+runner.clearTransferState();
+index++;
+}
+}
+
+/*
--- End diff --

this comment can be removed, it seems useless to me


---


[GitHub] nifi pull request #2392: NIFI-4759 Fixed a bug that left a hard-coded refere...

2018-01-11 Thread mgaido91
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2392#discussion_r160975388
  
--- Diff: 
nifi-nar-bundles/nifi-mongodb-bundle/nifi-mongodb-processors/src/test/java/org/apache/nifi/processors/mongodb/PutMongoTest.java
 ---
@@ -256,4 +258,72 @@ public void testUpsertWithOperators() throws Exception 
{
 Assert.assertEquals("Msg had wrong value", msg, "Hi");
 }
 }
+
+/*
+ * Start NIFI-4759 Regression Tests
+ *
+ * 2 issues with ID field:
+ *
+ * * Assumed _id is the update key, causing failures when the user 
configured a different one in the UI.
+ * * Treated _id as a string even when it is an ObjectID sent from 
another processor as a string value.
+ *
+ * Expected behavior:
+ *
+ * * update key field should work no matter what (legal) value it is 
set to be.
+ * * _ids that are ObjectID should become real ObjectIDs when added to 
Mongo.
+ * * _ids that are arbitrary strings should still go in as strings.
+ *
+ */
+
+@Test
+public void testNiFi_4759_Regressions() {
+String[] upserts = new String[]{
+"{\n" +
+"\t\"_id\": \"12345\",\n" +
+"\t\"$set\": {\n" +
+"\t\t\"msg\": \"Hello, world\"\n" +
+"\t}\n" +
+"}",
+"{\n" +
+"\t\"_id\": \"5a5617b9c1f5de6d8276e87d\",\n" +
+"\t\"$set\": {\n" +
+"\t\t\"msg\": \"Hello, world\"\n" +
+"\t}\n" +
+"}",
+"{\n" +
+"\t\"updateKey\": \"12345\",\n" +
+"\t\"$set\": {\n" +
+"\t\t\"msg\": \"Hello, world\"\n" +
+"\t}\n" +
+"}",
+};
+
+String[] updateKeyProps = new String[] { "_id", "_id", "updateKey" 
};
+Object[] updateKeys = new Object[] { "12345", new 
ObjectId("5a5617b9c1f5de6d8276e87d"), "12345" };
+int index = 0;
+
+for (String upsert : upserts) {
+runner.setProperty(PutMongo.UPDATE_MODE, 
PutMongo.UPDATE_WITH_OPERATORS);
+runner.setProperty(PutMongo.MODE, "update");
+runner.setProperty(PutMongo.UPSERT, "true");
+runner.setProperty(PutMongo.UPDATE_QUERY_KEY, 
updateKeyProps[index]);
+for (int x = 0; x < 5; x++) {
--- End diff --

why do this 5 times?


---


[GitHub] nifi pull request #2392: NIFI-4759 Fixed a bug that left a hard-coded refere...

2018-01-11 Thread mgaido91
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2392#discussion_r160975264
  
--- Diff: 
nifi-nar-bundles/nifi-mongodb-bundle/nifi-mongodb-processors/src/test/java/org/apache/nifi/processors/mongodb/PutMongoTest.java
 ---
@@ -256,4 +258,72 @@ public void testUpsertWithOperators() throws Exception 
{
 Assert.assertEquals("Msg had wrong value", msg, "Hi");
 }
 }
+
+/*
+ * Start NIFI-4759 Regression Tests
+ *
+ * 2 issues with ID field:
+ *
+ * * Assumed _id is the update key, causing failures when the user 
configured a different one in the UI.
+ * * Treated _id as a string even when it is an ObjectID sent from 
another processor as a string value.
+ *
+ * Expected behavior:
+ *
+ * * update key field should work no matter what (legal) value it is 
set to be.
+ * * _ids that are ObjectID should become real ObjectIDs when added to 
Mongo.
+ * * _ids that are arbitrary strings should still go in as strings.
+ *
+ */
+
--- End diff --

nit: useless blank line


---


[GitHub] nifi pull request #2392: NIFI-4759 Fixed a bug that left a hard-coded refere...

2018-01-11 Thread mgaido91
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2392#discussion_r160975171
  
--- Diff: 
nifi-nar-bundles/nifi-mongodb-bundle/nifi-mongodb-processors/src/main/java/org/apache/nifi/processors/mongodb/PutMongo.java
 ---
@@ -173,13 +166,25 @@ public void process(final InputStream in) throws 
IOException {
 // update
 final boolean upsert = 
context.getProperty(UPSERT).asBoolean();
 final String updateKey = 
context.getProperty(UPDATE_QUERY_KEY).getValue();
-final Document query = new Document(updateKey, 
((Map)doc).get(updateKey));
+final Document query;
--- End diff --

this can be moved down to where it is created


---


[GitHub] nifi pull request #2392: NIFI-4759 Fixed a bug that left a hard-coded refere...

2018-01-11 Thread mgaido91
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2392#discussion_r160974686
  
--- Diff: 
nifi-nar-bundles/nifi-mongodb-bundle/nifi-mongodb-processors/src/main/java/org/apache/nifi/processors/mongodb/PutMongo.java
 ---
@@ -173,13 +166,25 @@ public void process(final InputStream in) throws 
IOException {
 // update
 final boolean upsert = 
context.getProperty(UPSERT).asBoolean();
 final String updateKey = 
context.getProperty(UPDATE_QUERY_KEY).getValue();
-final Document query = new Document(updateKey, 
((Map)doc).get(updateKey));
+final Document query;
+
+Object keyVal = ((Map)doc).get(updateKey);
+if (updateKey.equals("_id")) {
--- End diff --

add ` && ObjectId.isValid(keyVal)`
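
Roughly what I have in mind (just a sketch, not the PR code; the class and 
method names here are made up for illustration, and it assumes the raw key 
value comes in as a String):

```
import java.util.Map;

import org.bson.Document;
import org.bson.types.ObjectId;

// Sketch only: build the update query, converting "_id" values that are valid
// ObjectId hex strings and leaving every other (legal) key value untouched.
class UpdateQuerySketch {
    static Document buildUpdateQuery(final Map<String, Object> doc, final String updateKey) {
        Object keyVal = doc.get(updateKey);
        if ("_id".equals(updateKey) && keyVal instanceof String && ObjectId.isValid((String) keyVal)) {
            keyVal = new ObjectId((String) keyVal);
        }
        return new Document(updateKey, keyVal);
    }
}
```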


---


[GitHub] nifi pull request #2392: NIFI-4759 Fixed a bug that left a hard-coded refere...

2018-01-11 Thread mgaido91
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2392#discussion_r160973857
  
--- Diff: 
nifi-nar-bundles/nifi-mongodb-bundle/nifi-mongodb-processors/src/main/java/org/apache/nifi/processors/mongodb/PutMongo.java
 ---
@@ -173,13 +166,25 @@ public void process(final InputStream in) throws 
IOException {
 // update
 final boolean upsert = 
context.getProperty(UPSERT).asBoolean();
 final String updateKey = 
context.getProperty(UPDATE_QUERY_KEY).getValue();
-final Document query = new Document(updateKey, 
((Map)doc).get(updateKey));
+final Document query;
+
+Object keyVal = ((Map)doc).get(updateKey);
+if (updateKey.equals("_id")) {
+try {
+keyVal = new ObjectId((String) keyVal);
+} catch (Exception ex) {
+getLogger().error("{} is not a valid ObjectID, 
using raw value.", new Object[]{keyVal});
+}
+}
+
+query = new Document(updateKey, keyVal);
 
 if (updateMode.equals(UPDATE_WITH_DOC.getValue())) {
+
--- End diff --

nit: useless blank line


---


[GitHub] nifi pull request #2392: NIFI-4759 Fixed a bug that left a hard-coded refere...

2018-01-11 Thread mgaido91
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2392#discussion_r160973717
  
--- Diff: 
nifi-nar-bundles/nifi-mongodb-bundle/nifi-mongodb-processors/src/main/java/org/apache/nifi/processors/mongodb/PutMongo.java
 ---
@@ -173,13 +166,25 @@ public void process(final InputStream in) throws 
IOException {
 // update
 final boolean upsert = 
context.getProperty(UPSERT).asBoolean();
 final String updateKey = 
context.getProperty(UPDATE_QUERY_KEY).getValue();
-final Document query = new Document(updateKey, 
((Map)doc).get(updateKey));
+final Document query;
+
+Object keyVal = ((Map)doc).get(updateKey);
+if (updateKey.equals("_id")) {
+try {
+keyVal = new ObjectId((String) keyVal);
+} catch (Exception ex) {
+getLogger().error("{} is not a valid ObjectID, 
using raw value.", new Object[]{keyVal});
--- End diff --

what about `keyVal + " is not a valid ObjectID, using raw value."`


---


[GitHub] nifi pull request #2392: NIFI-4759 Fixed a bug that left a hard-coded refere...

2018-01-11 Thread mgaido91
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2392#discussion_r160972295
  
--- Diff: 
nifi-nar-bundles/nifi-mongodb-bundle/nifi-mongodb-processors/src/test/java/org/apache/nifi/processors/mongodb/PutMongoTest.java
 ---
@@ -256,4 +258,72 @@ public void testUpsertWithOperators() throws Exception 
{
 Assert.assertEquals("Msg had wrong value", msg, "Hi");
 }
 }
+
+/*
+ * Start NIFI-4759 Regression Tests
+ *
+ * 2 issues with ID field:
+ *
+ * * Assumed _id is the update key, causing failures when the user 
configured a different one in the UI.
+ * * Treated _id as a string even when it is an ObjectID sent from 
another processor as a string value.
+ *
+ * Expected behavior:
+ *
+ * * update key field should work no matter what (legal) value it is 
set to be.
+ * * _ids that are ObjectID should become real ObjectIDs when added to 
Mongo.
+ * * _ids that are arbitrary strings should still go in as strings.
+ *
+ */
+
+@Test
+public void testNiFi_4759_Regressions() {
+String[] upserts = new String[]{
+"{\n" +
--- End diff --

I think having the 3 JSONs each on its own row would improve readability


---


[GitHub] nifi pull request #2392: NIFI-4759 Fixed a bug that left a hard-coded refere...

2018-01-11 Thread mgaido91
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2392#discussion_r160971847
  
--- Diff: 
nifi-nar-bundles/nifi-mongodb-bundle/nifi-mongodb-processors/src/test/java/org/apache/nifi/processors/mongodb/PutMongoTest.java
 ---
@@ -256,4 +258,72 @@ public void testUpsertWithOperators() throws Exception 
{
 Assert.assertEquals("Msg had wrong value", msg, "Hi");
 }
 }
+
+/*
+ * Start NIFI-4759 Regression Tests
+ *
+ * 2 issues with ID field:
+ *
+ * * Assumed _id is the update key, causing failures when the user 
configured a different one in the UI.
+ * * Treated _id as a string even when it is an ObjectID sent from 
another processor as a string value.
+ *
+ * Expected behavior:
+ *
+ * * update key field should work no matter what (legal) value it is 
set to be.
+ * * _ids that are ObjectID should become real ObjectIDs when added to 
Mongo.
+ * * _ids that are arbitrary strings should still go in as strings.
+ *
+ */
+
+@Test
+public void testNiFi_4759_Regressions() {
+String[] upserts = new String[]{
+"{\n" +
+"\t\"_id\": \"12345\",\n" +
+"\t\"$set\": {\n" +
+"\t\t\"msg\": \"Hello, world\"\n" +
+"\t}\n" +
+"}",
+"{\n" +
+"\t\"_id\": \"5a5617b9c1f5de6d8276e87d\",\n" +
+"\t\"$set\": {\n" +
+"\t\t\"msg\": \"Hello, world\"\n" +
+"\t}\n" +
+"}",
+"{\n" +
+"\t\"updateKey\": \"12345\",\n" +
+"\t\"$set\": {\n" +
+"\t\t\"msg\": \"Hello, world\"\n" +
+"\t}\n" +
+"}",
+};
+
+String[] updateKeyProps = new String[] { "_id", "_id", "updateKey" 
};
+Object[] updateKeys = new Object[] { "12345", new 
ObjectId("5a5617b9c1f5de6d8276e87d"), "12345" };
+int index = 0;
+
+for (String upsert : upserts) {
+runner.setProperty(PutMongo.UPDATE_MODE, 
PutMongo.UPDATE_WITH_OPERATORS);
--- End diff --

this and the following two can be set outside the for loop
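
Something along these lines (a sketch of the setup only; the enqueue/run/assert 
part of the loop body stays exactly as in the test above):

```
runner.setProperty(PutMongo.UPDATE_MODE, PutMongo.UPDATE_WITH_OPERATORS);
runner.setProperty(PutMongo.MODE, "update");
runner.setProperty(PutMongo.UPSERT, "true");

for (String upsert : upserts) {
    // only the update key changes between test documents
    runner.setProperty(PutMongo.UPDATE_QUERY_KEY, updateKeyProps[index]);
    // ... enqueue, run and assert as above ...
    index++;
}
```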


---


[GitHub] nifi issue #2394: NIFI-4764: Add tooltips to status bar icons

2018-01-11 Thread mgaido91
Github user mgaido91 commented on the issue:

https://github.com/apache/nifi/pull/2394
  
thanks for your suggestions and help @mcgilman.
I updated the PR accordingly.
Thanks.


---


[GitHub] nifi issue #2394: NIFI-4764: Add tooltips to status bar icons

2018-01-10 Thread mgaido91
Github user mgaido91 commented on the issue:

https://github.com/apache/nifi/pull/2394
  
@mcgilman yes, sure, thanks for your comment. Just one question before 
proceeding. Doing this might involve some refactoring/non-trivial js, since 
process groups are in the canvas and therefore `title` doesn't work. Is it ok?


---


[GitHub] nifi pull request #2394: NIFI-4764: Add tooltips to status bar icons

2018-01-10 Thread mgaido91
GitHub user mgaido91 opened a pull request:

https://github.com/apache/nifi/pull/2394

NIFI-4764: Add tooltips to status bar icons

Thank you for submitting a contribution to Apache NiFi.

In order to streamline the review of the contribution we ask you
to ensure the following steps have been taken:

### For all changes:
- [x] Is there a JIRA ticket associated with this PR? Is it referenced 
 in the commit message?

- [x] Does your PR title start with NIFI-XXXX where XXXX is the JIRA number 
you are trying to resolve? Pay particular attention to the hyphen "-" character.

- [x] Has your PR been rebased against the latest commit within the target 
branch (typically master)?

- [x] Is your initial contribution a single, squashed commit?

### For code changes:
- [x] Have you ensured that the full suite of tests is executed via mvn 
-Pcontrib-check clean install at the root nifi folder?
- [ ] Have you written or updated unit tests to verify your changes?
- [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)? 
- [ ] If applicable, have you updated the LICENSE file, including the main 
LICENSE file under nifi-assembly?
- [ ] If applicable, have you updated the NOTICE file, including the main 
NOTICE file found under nifi-assembly?
- [ ] If adding new Properties, have you added .displayName in addition to 
.name (programmatic access) for each of the new properties?

### For documentation related changes:
- [x] Have you ensured that format looks appropriate for the output in 
which it is rendered?

### Note:
Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/mgaido91/nifi NIFI-4764

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/2394.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2394


commit 8df5df11f30cc56de6830da7adfdb62761fae9c5
Author: Marco Gaido <marcogaido91@...>
Date:   2018-01-10T16:00:55Z

NIFI-4764: Add tooltips to status bar icons




---


[GitHub] nifi pull request #2392: NIFI-4759 Fixed a bug that left a hard-coded refere...

2018-01-10 Thread mgaido91
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2392#discussion_r160685547
  
--- Diff: 
nifi-nar-bundles/nifi-mongodb-bundle/nifi-mongodb-processors/src/main/java/org/apache/nifi/processors/mongodb/PutMongo.java
 ---
@@ -155,12 +155,7 @@ public void onTrigger(final ProcessContext context, 
final ProcessSession session
 try {
 // Read the contents of the FlowFile into a byte array
 final byte[] content = new byte[(int) flowFile.getSize()];
-session.read(flowFile, new InputStreamCallback() {
-@Override
-public void process(final InputStream in) throws 
IOException {
-StreamUtils.fillBuffer(in, content, true);
-}
-});
+session.read(flowFile, in -> StreamUtils.fillBuffer(in, 
content, true));
--- End diff --

this is causing the error on CI because of the unused import which it 
introduces. Since this is not related to the PR, can we get it back to its 
original state?
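
i.e. restoring the anonymous-class form from the removed lines above:

```
// original form, restored so the diff no longer touches this unrelated code
session.read(flowFile, new InputStreamCallback() {
    @Override
    public void process(final InputStream in) throws IOException {
        StreamUtils.fillBuffer(in, content, true);
    }
});
```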


---


[GitHub] nifi issue #2363: NIFI-4726: Avoid concurrency issues in JoltTransformJSON

2018-01-09 Thread mgaido91
Github user mgaido91 commented on the issue:

https://github.com/apache/nifi/pull/2363
  
sure, thanks @ijokarumawak and @joewitt. Sorry for the mistake and wasting 
your time.


---


[GitHub] nifi pull request #2363: NIFI-4726: Avoid concurrency issues in JoltTransfor...

2018-01-09 Thread mgaido91
Github user mgaido91 closed the pull request at:

https://github.com/apache/nifi/pull/2363


---


[GitHub] nifi pull request #2294: NIFI-3538 Added DeleteHBaseRow

2018-01-08 Thread mgaido91
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2294#discussion_r160140592
  
--- Diff: 
nifi-nar-bundles/nifi-hbase-bundle/nifi-hbase-processors/src/main/java/org/apache/nifi/hbase/DeleteHBaseRow.java
 ---
@@ -0,0 +1,154 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.hbase;
+
+import org.apache.commons.io.IOUtils;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.components.AllowableValue;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.util.StandardValidators;
+
+import java.util.ArrayList;
+import java.util.List;
+
+@Tags({ "delete", "hbase" })
+@CapabilityDescription(
+"Delete HBase records individually or in batches. The input can be 
a single row ID in the body, one ID per line, " +
+"row IDs separated by commas or a combination of the two. ")
+public class DeleteHBaseRow extends AbstractDeleteHBase {
+static final AllowableValue ROW_ID_BODY = new AllowableValue("body", 
"FlowFile content", "Get the row key(s) from the flowfile content.");
+static final AllowableValue ROW_ID_ATTR = new AllowableValue("attr", 
"FlowFile attributes", "Get the row key from an expression language 
statement.");
+
+static final PropertyDescriptor ROW_ID_LOCATION = new 
PropertyDescriptor.Builder()
+.name("delete-hb-row-id-location")
+.displayName("Row ID Location")
+.description("The location of the row ID to use for building 
the delete. Can be from the content or an expression language statement.")
+.required(true)
+.defaultValue(ROW_ID_BODY.getValue())
+.allowableValues(ROW_ID_BODY, ROW_ID_ATTR)
+.addValidator(Validator.VALID)
+.build();
+
+static final PropertyDescriptor FLOWFILE_FETCH_COUNT = new 
PropertyDescriptor.Builder()
+.name("delete-hb-flowfile-fetch-count")
+.displayName("Flowfile Fetch Count")
+.description("The number of flowfiles to fetch per run.")
+.required(true)
+.addValidator(StandardValidators.POSITIVE_INTEGER_VALIDATOR)
+.defaultValue("5")
+.expressionLanguageSupported(false)
+.build();
+
+static final PropertyDescriptor BATCH_SIZE = new 
PropertyDescriptor.Builder()
+.name("delete-hb-row-ff-count")
+.displayName("Batch Size")
+.description("The number of deletes to send per batch.")
+.required(true)
+.defaultValue("50")
+.addValidator(StandardValidators.POSITIVE_INTEGER_VALIDATOR)
+.expressionLanguageSupported(false)
+.build();
+
+static final PropertyDescriptor KEY_SEPARATOR = new 
PropertyDescriptor.Builder()
+.name("delete-hb-separator")
+.displayName("Delete Row Key Separator")
+.description("The separator character(s) that separate 
multiple row keys " +
+"when multiple row keys are provided in the flowfile 
body")
+.required(true)
+.defaultValue(",")
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.expressionLanguageSupported(true)
+.build();
+
+@Ove

[GitHub] nifi issue #2363: NIFI-4726: Avoid concurrency issues in JoltTransformJSON

2018-01-04 Thread mgaido91
Github user mgaido91 commented on the issue:

https://github.com/apache/nifi/pull/2363
  
Thanks for your suggestion @joewitt, I'll update this PR accordingly ASAP.


---


[GitHub] nifi issue #2363: NIFI-4726: Avoid concurrency issues in JoltTransformJSON

2018-01-03 Thread mgaido91
Github user mgaido91 commented on the issue:

https://github.com/apache/nifi/pull/2363
  
Thanks for your comment @joewitt. I don't think we can fix this in Jolt, since 
the root cause is our concurrent usage of their utilities, and fixing it there 
seems like overkill to me. What do you think about using a pool instead of the 
thread local, then?
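
Just to illustrate the idea (a rough sketch, not PR code; the class and method 
names are invented, and the generic T stands for whatever transform object we 
currently keep in the thread local):

```
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.function.Supplier;

// Sketch of a borrow/return pool: each onTrigger call takes exclusive
// ownership of a transform instance, so nothing is shared across threads.
class TransformPool<T> {
    private final BlockingQueue<T> available = new LinkedBlockingQueue<>();

    T borrow(final Supplier<T> factory) {
        final T pooled = available.poll();
        return pooled != null ? pooled : factory.get(); // grow on demand
    }

    void giveBack(final T transform) {
        available.offer(transform);
    }
}
```

Each task would borrow() at the start of onTrigger and giveBack() in a finally 
block, so instances get reused without any concurrent access.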


---


[GitHub] nifi pull request #2294: NIFI-3538 Added DeleteHBaseRow

2017-12-29 Thread mgaido91
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2294#discussion_r159047973
  
--- Diff: 
nifi-nar-bundles/nifi-hbase-bundle/nifi-hbase-processors/src/main/java/org/apache/nifi/hbase/DeleteHBaseRow.java
 ---
@@ -0,0 +1,158 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.hbase;
+
+import org.apache.commons.io.IOUtils;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.components.AllowableValue;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.util.StandardValidators;
+
+import java.io.IOException;
+import java.util.ArrayList;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+
+@Tags({ "delete", "hbase" })
+@CapabilityDescription(
+"Delete HBase records individually or in batches. The input can be 
a single row ID in the body, one ID per line, " +
+"row IDs separated by commas or a combination of the two. ")
+public class DeleteHBaseRow extends AbstractDeleteHBase {
+static final AllowableValue ROW_ID_BODY = new AllowableValue("body", 
"FlowFile content", "Get the row key(s) from the flowfile content.");
+static final AllowableValue ROW_ID_ATTR = new AllowableValue("attr", 
"FlowFile attributes", "Get the row key from an expression language 
statement.");
+
+static final PropertyDescriptor ROW_ID_LOCATION = new 
PropertyDescriptor.Builder()
+.name("delete-hb-row-id-location")
+.displayName("Row ID Location")
+.description("The location of the row ID to use for building 
the delete. Can be from the content or an expression language statement.")
+.required(true)
+.defaultValue(ROW_ID_BODY.getValue())
+.allowableValues(ROW_ID_BODY, ROW_ID_ATTR)
+.addValidator(Validator.VALID)
+.build();
+
+static final PropertyDescriptor FLOWFILE_FETCH_COUNT = new 
PropertyDescriptor.Builder()
+.name("delete-hb-flowfile-fetch-count")
+.displayName("Flowfile Fetch Count")
+.description("The number of flowfiles to fetch per run.")
+.required(true)
+.addValidator(StandardValidators.POSITIVE_INTEGER_VALIDATOR)
+.defaultValue("5")
+.expressionLanguageSupported(false)
+.build();
+
+static final PropertyDescriptor BATCH_SIZE = new 
PropertyDescriptor.Builder()
+.name("delete-hb-row-ff-count")
+.displayName("Batch Size")
+.description("The number of deletes to send per batch.")
+.required(true)
+.defaultValue("50")
+.addValidator(StandardValidators.POSITIVE_INTEGER_VALIDATOR)
+.expressionLanguageSupported(false)
+.build();
+
+static final PropertyDescriptor KEY_SEPARATOR = new 
PropertyDescriptor.Builder()
+.name("delete-hb-separator")
+.displayName("Delete Row Key Separator")
+.description("The separator character(s) that separate 
multiple row keys " +
+"when multiple row keys are provided in the flowfile 
body")
+.required(true)
+.defaultValue(",")
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+  

[GitHub] nifi pull request #2294: NIFI-3538 Added DeleteHBaseRow

2017-12-29 Thread mgaido91
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2294#discussion_r159047906
  
--- Diff: 
nifi-nar-bundles/nifi-hbase-bundle/nifi-hbase-processors/src/main/java/org/apache/nifi/hbase/DeleteHBaseRow.java
 ---
@@ -0,0 +1,158 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.hbase;
+
+import org.apache.commons.io.IOUtils;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.components.AllowableValue;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.util.StandardValidators;
+
+import java.io.IOException;
+import java.util.ArrayList;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+
+@Tags({ "delete", "hbase" })
+@CapabilityDescription(
+"Delete HBase records individually or in batches. The input can be 
a single row ID in the body, one ID per line, " +
+"row IDs separated by commas or a combination of the two. ")
+public class DeleteHBaseRow extends AbstractDeleteHBase {
+static final AllowableValue ROW_ID_BODY = new AllowableValue("body", 
"FlowFile content", "Get the row key(s) from the flowfile content.");
+static final AllowableValue ROW_ID_ATTR = new AllowableValue("attr", 
"FlowFile attributes", "Get the row key from an expression language 
statement.");
+
+static final PropertyDescriptor ROW_ID_LOCATION = new 
PropertyDescriptor.Builder()
+.name("delete-hb-row-id-location")
+.displayName("Row ID Location")
+.description("The location of the row ID to use for building 
the delete. Can be from the content or an expression language statement.")
+.required(true)
+.defaultValue(ROW_ID_BODY.getValue())
+.allowableValues(ROW_ID_BODY, ROW_ID_ATTR)
+.addValidator(Validator.VALID)
+.build();
+
+static final PropertyDescriptor FLOWFILE_FETCH_COUNT = new 
PropertyDescriptor.Builder()
+.name("delete-hb-flowfile-fetch-count")
+.displayName("Flowfile Fetch Count")
+.description("The number of flowfiles to fetch per run.")
+.required(true)
+.addValidator(StandardValidators.POSITIVE_INTEGER_VALIDATOR)
+.defaultValue("5")
+.expressionLanguageSupported(false)
+.build();
+
+static final PropertyDescriptor BATCH_SIZE = new 
PropertyDescriptor.Builder()
+.name("delete-hb-row-ff-count")
+.displayName("Batch Size")
+.description("The number of deletes to send per batch.")
+.required(true)
+.defaultValue("50")
+.addValidator(StandardValidators.POSITIVE_INTEGER_VALIDATOR)
+.expressionLanguageSupported(false)
+.build();
+
+static final PropertyDescriptor KEY_SEPARATOR = new 
PropertyDescriptor.Builder()
+.name("delete-hb-separator")
+.displayName("Delete Row Key Separator")
+.description("The separator character(s) that separate 
multiple row keys " +
+"when multiple row keys are provided in the flowfile 
body")
+.required(true)
+.defaultValue(",")
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+  

[GitHub] nifi issue #2294: NIFI-3538 Added DeleteHBaseRow

2017-12-29 Thread mgaido91
Github user mgaido91 commented on the issue:

https://github.com/apache/nifi/pull/2294
  
@MikeThomsen there are validation errors due to unused imports. Could you 
please fix them?
```
[WARNING] 
src/main/java/org/apache/nifi/hbase/DeleteHBaseRow.java:[31,8] (imports) 
UnusedImports: Unused import - java.io.IOException.
[WARNING] 
src/main/java/org/apache/nifi/hbase/DeleteHBaseRow.java:[33,8] (imports) 
UnusedImports: Unused import - java.util.HashMap.
[WARNING] 
src/main/java/org/apache/nifi/hbase/DeleteHBaseRow.java:[35,8] (imports) 
UnusedImports: Unused import - java.util.Map.
```


---


[GitHub] nifi pull request #2363: NIFI-4726: Avoid concurrency issues in JoltTransfor...

2017-12-29 Thread mgaido91
GitHub user mgaido91 opened a pull request:

https://github.com/apache/nifi/pull/2363

NIFI-4726: Avoid concurrency issues in JoltTransformJSON

Thank you for submitting a contribution to Apache NiFi.

In order to streamline the review of the contribution we ask you
to ensure the following steps have been taken:

### For all changes:
- [x] Is there a JIRA ticket associated with this PR? Is it referenced 
 in the commit message?

- [x] Does your PR title start with NIFI-XXXX where XXXX is the JIRA number 
you are trying to resolve? Pay particular attention to the hyphen "-" character.

- [x] Has your PR been rebased against the latest commit within the target 
branch (typically master)?

- [x] Is your initial contribution a single, squashed commit?

### For code changes:
- [x] Have you ensured that the full suite of tests is executed via mvn 
-Pcontrib-check clean install at the root nifi folder?
- [ ] Have you written or updated unit tests to verify your changes?
- [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)? NA 
- [ ] If applicable, have you updated the LICENSE file, including the main 
LICENSE file under nifi-assembly? NA
- [ ] If applicable, have you updated the NOTICE file, including the main 
NOTICE file found under nifi-assembly? NA
- [ ] If adding new Properties, have you added .displayName in addition to 
.name (programmatic access) for each of the new properties? NA

### For documentation related changes:
- [ ] Have you ensured that format looks appropriate for the output in 
which it is rendered? NA

### Note:
Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/mgaido91/nifi NIFI-4726

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/2363.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2363


commit 5f938708b797f4894005ab16300585792d6f1da0
Author: mark91 <marcogaido91@...>
Date:   2017-12-28T16:14:32Z

NIFI-4726: Avoid concurrency issues in JoltTransformJSON




---


[GitHub] nifi pull request #2356: NIFI-3660: Support schema containing a map with an ...

2017-12-21 Thread mgaido91
GitHub user mgaido91 opened a pull request:

https://github.com/apache/nifi/pull/2356

NIFI-3660: Support schema containing a map with an array value in 
ConvertAvroToORC

Thank you for submitting a contribution to Apache NiFi.

In order to streamline the review of the contribution we ask you
to ensure the following steps have been taken:

### For all changes:
- [x] Is there a JIRA ticket associated with this PR? Is it referenced 
 in the commit message?

- [x] Does your PR title start with NIFI-XXXX where XXXX is the JIRA number 
you are trying to resolve? Pay particular attention to the hyphen "-" character.

- [x] Has your PR been rebased against the latest commit within the target 
branch (typically master)?

- [x] Is your initial contribution a single, squashed commit?

### For code changes:
- [x] Have you ensured that the full suite of tests is executed via mvn 
-Pcontrib-check clean install at the root nifi folder?
- [x] Have you written or updated unit tests to verify your changes?
- [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)? NA 
- [ ] If applicable, have you updated the LICENSE file, including the main 
LICENSE file under nifi-assembly? NA
- [ ] If applicable, have you updated the NOTICE file, including the main 
NOTICE file found under nifi-assembly? NA
- [ ] If adding new Properties, have you added .displayName in addition to 
.name (programmatic access) for each of the new properties? NA

### For documentation related changes:
- [ ] Have you ensured that format looks appropriate for the output in 
which it is rendered? NA

### Note:
Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/mgaido91/nifi NIFI-3660

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/2356.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2356


commit 0a881a2f65f828172a5c2e9b33b400a78bc3816b
Author: Marco Gaido <marcogaido91@...>
Date:   2017-12-21T15:28:48Z

NIFI-3660: Support schema containing a map with an array value in 
ConvertAvroToORC




---


[GitHub] nifi issue #2343: NIFI-2169: Cache compiled regexp for RouteText

2017-12-20 Thread mgaido91
Github user mgaido91 commented on the issue:

https://github.com/apache/nifi/pull/2343
  
@markap14 do you have further comments on this? Thanks


---


[GitHub] nifi pull request #2294: NIFI-3538 Added DeleteHBaseRow

2017-12-17 Thread mgaido91
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2294#discussion_r157414765
  
--- Diff: 
nifi-nar-bundles/nifi-hbase-bundle/nifi-hbase-processors/src/main/java/org/apache/nifi/hbase/DeleteHBaseRow.java
 ---
@@ -0,0 +1,178 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.hbase;
+
+import org.apache.commons.io.IOUtils;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.components.AllowableValue;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.util.StandardValidators;
+
+import java.io.IOException;
+import java.util.ArrayList;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+
+@Tags({ "delete", "hbase" })
+@CapabilityDescription(
+"Delete HBase records individually or in batches. The input can be 
a single row ID in the body, one ID per line, " +
+"row IDs separated by commas or a combination of the two. ")
+public class DeleteHBaseRow extends AbstractDeleteHBase {
+static final AllowableValue ROW_ID_BODY = new AllowableValue("body", 
"FlowFile content", "Get the row key(s) from the flowfile content.");
+static final AllowableValue ROW_ID_ATTR = new AllowableValue("attr", 
"FlowFile attributes", "Get the row key from an expression language 
statement.");
+
+static final PropertyDescriptor ROW_ID_LOCATION = new 
PropertyDescriptor.Builder()
+.name("delete-hb-row-id-location")
+.displayName("Row ID Location")
+.description("The location of the row ID to use for building 
the delete. Can be from the content or an expression language statement.")
+.required(true)
+.defaultValue("body")
+.allowableValues(ROW_ID_BODY, ROW_ID_ATTR)
+.addValidator(Validator.VALID)
+.build();
+
+static final PropertyDescriptor FLOWFILE_FETCH_COUNT = new 
PropertyDescriptor.Builder()
+.name("delete-hb-flowfile-fetch-count")
+.displayName("Flowfile Fetch Count")
+.description("The number of flowfiles to fetch per run.")
+.required(true)
+.addValidator(StandardValidators.POSITIVE_INTEGER_VALIDATOR)
+.defaultValue("5")
+.expressionLanguageSupported(false)
+.build();
+
+static final PropertyDescriptor BATCH_SIZE = new 
PropertyDescriptor.Builder()
+.name("delete-hb-row-ff-count")
+.displayName("Batch Size")
+.description("The number of deletes to send per batch.")
+.required(true)
+.defaultValue("50")
+.addValidator(StandardValidators.POSITIVE_INTEGER_VALIDATOR)
+.expressionLanguageSupported(false)
+.build();
+
+@Override
+protected List<PropertyDescriptor> getSupportedPropertyDescriptors() {
+final List<PropertyDescriptor> properties = 
super.getSupportedPropertyDescriptors();
+properties.add(ROW_ID_LOCATION);
+properties.add(FLOWFILE_FETCH_COUNT);
+properties.add(BATCH_SIZE);
+
+return properties;
+}
+
+@Override
+protected void doDelete(ProcessContext context, ProcessSession 
session) throws Exception {
+final int batchSize = 
context.getProperty(BATCH_SIZE).asInteger();
+final String location   = 
co

[GitHub] nifi issue #2344: NIFI-4619: Enable EL on AWSCredentialsProviderControllerSe...

2017-12-17 Thread mgaido91
Github user mgaido91 commented on the issue:

https://github.com/apache/nifi/pull/2344
  
thank you for your review @jvwing!


---


[GitHub] nifi pull request #2294: NIFI-3538 Added DeleteHBaseRow

2017-12-17 Thread mgaido91
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2294#discussion_r157377689
  
--- Diff: 
nifi-nar-bundles/nifi-hbase-bundle/nifi-hbase-processors/src/main/java/org/apache/nifi/hbase/DeleteHBaseRow.java
 ---
@@ -0,0 +1,178 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.hbase;
+
+import org.apache.commons.io.IOUtils;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.components.AllowableValue;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.util.StandardValidators;
+
+import java.io.IOException;
+import java.util.ArrayList;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+
+@Tags({ "delete", "hbase" })
+@CapabilityDescription(
+"Delete HBase records individually or in batches. The input can be 
a single row ID in the body, one ID per line, " +
+"row IDs separated by commas or a combination of the two. ")
+public class DeleteHBaseRow extends AbstractDeleteHBase {
+static final AllowableValue ROW_ID_BODY = new AllowableValue("body", 
"FlowFile content", "Get the row key(s) from the flowfile content.");
+static final AllowableValue ROW_ID_ATTR = new AllowableValue("attr", 
"FlowFile attributes", "Get the row key from an expression language 
statement.");
+
+static final PropertyDescriptor ROW_ID_LOCATION = new 
PropertyDescriptor.Builder()
+.name("delete-hb-row-id-location")
+.displayName("Row ID Location")
+.description("The location of the row ID to use for building 
the delete. Can be from the content or an expression language statement.")
+.required(true)
+.defaultValue("body")
+.allowableValues(ROW_ID_BODY, ROW_ID_ATTR)
+.addValidator(Validator.VALID)
+.build();
+
+static final PropertyDescriptor FLOWFILE_FETCH_COUNT = new 
PropertyDescriptor.Builder()
+.name("delete-hb-flowfile-fetch-count")
+.displayName("Flowfile Fetch Count")
+.description("The number of flowfiles to fetch per run.")
+.required(true)
+.addValidator(StandardValidators.POSITIVE_INTEGER_VALIDATOR)
+.defaultValue("5")
+.expressionLanguageSupported(false)
+.build();
+
+static final PropertyDescriptor BATCH_SIZE = new 
PropertyDescriptor.Builder()
+.name("delete-hb-row-ff-count")
+.displayName("Batch Size")
+.description("The number of deletes to send per batch.")
+.required(true)
+.defaultValue("50")
+.addValidator(StandardValidators.POSITIVE_INTEGER_VALIDATOR)
+.expressionLanguageSupported(false)
+.build();
+
+@Override
+protected List<PropertyDescriptor> getSupportedPropertyDescriptors() {
+final List<PropertyDescriptor> properties = 
super.getSupportedPropertyDescriptors();
+properties.add(ROW_ID_LOCATION);
+properties.add(FLOWFILE_FETCH_COUNT);
+properties.add(BATCH_SIZE);
+
+return properties;
+}
+
+@Override
+protected void doDelete(ProcessContext context, ProcessSession 
session) throws Exception {
+final int batchSize = 
context.getProperty(BATCH_SIZE).asInteger();
+final String location   = 
co

[GitHub] nifi pull request #2294: NIFI-3538 Added DeleteHBaseRow

2017-12-15 Thread mgaido91
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2294#discussion_r157258216
  
--- Diff: 
nifi-nar-bundles/nifi-hbase-bundle/nifi-hbase-processors/src/main/java/org/apache/nifi/hbase/DeleteHBaseRow.java
 ---
@@ -0,0 +1,178 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.hbase;
+
+import org.apache.commons.io.IOUtils;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.components.AllowableValue;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.util.StandardValidators;
+
+import java.io.IOException;
+import java.util.ArrayList;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+
+@Tags({ "delete", "hbase" })
+@CapabilityDescription(
+"Delete HBase records individually or in batches. The input can be 
a single row ID in the body, one ID per line, " +
+"row IDs separated by commas or a combination of the two. ")
+public class DeleteHBaseRow extends AbstractDeleteHBase {
+static final AllowableValue ROW_ID_BODY = new AllowableValue("body", 
"FlowFile content", "Get the row key(s) from the flowfile content.");
+static final AllowableValue ROW_ID_ATTR = new AllowableValue("attr", 
"FlowFile attributes", "Get the row key from an expression language 
statement.");
+
+static final PropertyDescriptor ROW_ID_LOCATION = new 
PropertyDescriptor.Builder()
+.name("delete-hb-row-id-location")
+.displayName("Row ID Location")
+.description("The location of the row ID to use for building 
the delete. Can be from the content or an expression language statement.")
+.required(true)
+.defaultValue("body")
+.allowableValues(ROW_ID_BODY, ROW_ID_ATTR)
+.addValidator(Validator.VALID)
+.build();
+
+static final PropertyDescriptor FLOWFILE_FETCH_COUNT = new 
PropertyDescriptor.Builder()
+.name("delete-hb-flowfile-fetch-count")
+.displayName("Flowfile Fetch Count")
+.description("The number of flowfiles to fetch per run.")
+.required(true)
+.addValidator(StandardValidators.POSITIVE_INTEGER_VALIDATOR)
+.defaultValue("5")
+.expressionLanguageSupported(false)
+.build();
+
+static final PropertyDescriptor BATCH_SIZE = new 
PropertyDescriptor.Builder()
+.name("delete-hb-row-ff-count")
+.displayName("Batch Size")
+.description("The number of deletes to send per batch.")
+.required(true)
+.defaultValue("50")
+.addValidator(StandardValidators.POSITIVE_INTEGER_VALIDATOR)
+.expressionLanguageSupported(false)
+.build();
+
+@Override
+protected List<PropertyDescriptor> getSupportedPropertyDescriptors() {
+final List<PropertyDescriptor> properties = 
super.getSupportedPropertyDescriptors();
+properties.add(ROW_ID_LOCATION);
+properties.add(FLOWFILE_FETCH_COUNT);
+properties.add(BATCH_SIZE);
+
+return properties;
+}
+
+@Override
+protected void doDelete(ProcessContext context, ProcessSession 
session) throws Exception {
+final int batchSize = 
context.getProperty(BATCH_SIZE).asInteger();
+final String location   = 
co

[GitHub] nifi pull request #2294: NIFI-3538 Added DeleteHBaseRow

2017-12-15 Thread mgaido91
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2294#discussion_r157257248
  
--- Diff: 
nifi-nar-bundles/nifi-hbase-bundle/nifi-hbase-processors/src/main/java/org/apache/nifi/hbase/DeleteHBaseRow.java
 ---
@@ -0,0 +1,178 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.hbase;
+
+import org.apache.commons.io.IOUtils;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.components.AllowableValue;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.util.StandardValidators;
+
+import java.io.IOException;
+import java.util.ArrayList;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+
+@Tags({ "delete", "hbase" })
+@CapabilityDescription(
+"Delete HBase records individually or in batches. The input can be 
a single row ID in the body, one ID per line, " +
+"row IDs separated by commas or a combination of the two. ")
+public class DeleteHBaseRow extends AbstractDeleteHBase {
+static final AllowableValue ROW_ID_BODY = new AllowableValue("body", 
"FlowFile content", "Get the row key(s) from the flowfile content.");
+static final AllowableValue ROW_ID_ATTR = new AllowableValue("attr", 
"FlowFile attributes", "Get the row key from an expression language 
statement.");
+
+static final PropertyDescriptor ROW_ID_LOCATION = new 
PropertyDescriptor.Builder()
+.name("delete-hb-row-id-location")
+.displayName("Row ID Location")
+.description("The location of the row ID to use for building 
the delete. Can be from the content or an expression language statement.")
+.required(true)
+.defaultValue("body")
+.allowableValues(ROW_ID_BODY, ROW_ID_ATTR)
+.addValidator(Validator.VALID)
+.build();
+
+static final PropertyDescriptor FLOWFILE_FETCH_COUNT = new 
PropertyDescriptor.Builder()
+.name("delete-hb-flowfile-fetch-count")
+.displayName("Flowfile Fetch Count")
+.description("The number of flowfiles to fetch per run.")
+.required(true)
+.addValidator(StandardValidators.POSITIVE_INTEGER_VALIDATOR)
+.defaultValue("5")
+.expressionLanguageSupported(false)
+.build();
+
+static final PropertyDescriptor BATCH_SIZE = new 
PropertyDescriptor.Builder()
+.name("delete-hb-row-ff-count")
+.displayName("Batch Size")
+.description("The number of deletes to send per batch.")
+.required(true)
+.defaultValue("50")
+.addValidator(StandardValidators.POSITIVE_INTEGER_VALIDATOR)
+.expressionLanguageSupported(false)
+.build();
+
+@Override
+protected List<PropertyDescriptor> getSupportedPropertyDescriptors() {
+final List<PropertyDescriptor> properties = 
super.getSupportedPropertyDescriptors();
+properties.add(ROW_ID_LOCATION);
+properties.add(FLOWFILE_FETCH_COUNT);
+properties.add(BATCH_SIZE);
+
+return properties;
+}
+
+@Override
+protected void doDelete(ProcessContext context, ProcessSession 
session) throws Exception {
+final int batchSize = 
context.getProperty(BATCH_SIZE).asInteger();
+final String location   = 
co

[GitHub] nifi pull request #2294: NIFI-3538 Added DeleteHBaseRow

2017-12-15 Thread mgaido91
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2294#discussion_r157256777
  
--- Diff: 
nifi-nar-bundles/nifi-hbase-bundle/nifi-hbase-processors/src/main/java/org/apache/nifi/hbase/DeleteHBaseRow.java
 ---
@@ -0,0 +1,178 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.hbase;
+
+import org.apache.commons.io.IOUtils;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.components.AllowableValue;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.util.StandardValidators;
+
+import java.io.IOException;
+import java.util.ArrayList;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+
+@Tags({ "delete", "hbase" })
+@CapabilityDescription(
+        "Delete HBase records individually or in batches. The input can be a single row ID in the body, one ID per line, " +
+        "row IDs separated by commas or a combination of the two. ")
+public class DeleteHBaseRow extends AbstractDeleteHBase {
+    static final AllowableValue ROW_ID_BODY = new AllowableValue("body", "FlowFile content", "Get the row key(s) from the flowfile content.");
+    static final AllowableValue ROW_ID_ATTR = new AllowableValue("attr", "FlowFile attributes", "Get the row key from an expression language statement.");
+
+    static final PropertyDescriptor ROW_ID_LOCATION = new PropertyDescriptor.Builder()
+            .name("delete-hb-row-id-location")
+            .displayName("Row ID Location")
+            .description("The location of the row ID to use for building the delete. Can be from the content or an expression language statement.")
+            .required(true)
+            .defaultValue("body")
+            .allowableValues(ROW_ID_BODY, ROW_ID_ATTR)
+            .addValidator(Validator.VALID)
+            .build();
+
+    static final PropertyDescriptor FLOWFILE_FETCH_COUNT = new PropertyDescriptor.Builder()
+            .name("delete-hb-flowfile-fetch-count")
+            .displayName("Flowfile Fetch Count")
+            .description("The number of flowfiles to fetch per run.")
+            .required(true)
+            .addValidator(StandardValidators.POSITIVE_INTEGER_VALIDATOR)
+            .defaultValue("5")
+            .expressionLanguageSupported(false)
+            .build();
+
+    static final PropertyDescriptor BATCH_SIZE = new PropertyDescriptor.Builder()
+            .name("delete-hb-row-ff-count")
+            .displayName("Batch Size")
+            .description("The number of deletes to send per batch.")
+            .required(true)
+            .defaultValue("50")
+            .addValidator(StandardValidators.POSITIVE_INTEGER_VALIDATOR)
+            .expressionLanguageSupported(false)
+            .build();
+
+    @Override
+    protected List<PropertyDescriptor> getSupportedPropertyDescriptors() {
+        final List<PropertyDescriptor> properties = super.getSupportedPropertyDescriptors();
+        properties.add(ROW_ID_LOCATION);
+        properties.add(FLOWFILE_FETCH_COUNT);
+        properties.add(BATCH_SIZE);
+
+        return properties;
+    }
+
+    @Override
+    protected void doDelete(ProcessContext context, ProcessSession session) throws Exception {
+        final int batchSize = context.getProperty(BATCH_SIZE).asInteger();
+        final String location   = co

[GitHub] nifi pull request #2294: NIFI-3538 Added DeleteHBaseRow

2017-12-15 Thread mgaido91
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2294#discussion_r157256636
  
--- Diff: 
nifi-nar-bundles/nifi-hbase-bundle/nifi-hbase-processors/src/main/java/org/apache/nifi/hbase/DeleteHBaseRow.java
 ---
@@ -0,0 +1,178 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.hbase;
+
+import org.apache.commons.io.IOUtils;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.components.AllowableValue;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.util.StandardValidators;
+
+import java.io.IOException;
+import java.util.ArrayList;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+
+@Tags({ "delete", "hbase" })
+@CapabilityDescription(
+        "Delete HBase records individually or in batches. The input can be a single row ID in the body, one ID per line, " +
+        "row IDs separated by commas or a combination of the two. ")
+public class DeleteHBaseRow extends AbstractDeleteHBase {
+    static final AllowableValue ROW_ID_BODY = new AllowableValue("body", "FlowFile content", "Get the row key(s) from the flowfile content.");
+    static final AllowableValue ROW_ID_ATTR = new AllowableValue("attr", "FlowFile attributes", "Get the row key from an expression language statement.");
+
+    static final PropertyDescriptor ROW_ID_LOCATION = new PropertyDescriptor.Builder()
+            .name("delete-hb-row-id-location")
+            .displayName("Row ID Location")
+            .description("The location of the row ID to use for building the delete. Can be from the content or an expression language statement.")
+            .required(true)
+            .defaultValue("body")
+            .allowableValues(ROW_ID_BODY, ROW_ID_ATTR)
+            .addValidator(Validator.VALID)
+            .build();
+
+    static final PropertyDescriptor FLOWFILE_FETCH_COUNT = new PropertyDescriptor.Builder()
+            .name("delete-hb-flowfile-fetch-count")
+            .displayName("Flowfile Fetch Count")
+            .description("The number of flowfiles to fetch per run.")
+            .required(true)
+            .addValidator(StandardValidators.POSITIVE_INTEGER_VALIDATOR)
+            .defaultValue("5")
+            .expressionLanguageSupported(false)
+            .build();
+
+    static final PropertyDescriptor BATCH_SIZE = new PropertyDescriptor.Builder()
+            .name("delete-hb-row-ff-count")
+            .displayName("Batch Size")
+            .description("The number of deletes to send per batch.")
+            .required(true)
+            .defaultValue("50")
+            .addValidator(StandardValidators.POSITIVE_INTEGER_VALIDATOR)
+            .expressionLanguageSupported(false)
+            .build();
+
+    @Override
+    protected List<PropertyDescriptor> getSupportedPropertyDescriptors() {
+        final List<PropertyDescriptor> properties = super.getSupportedPropertyDescriptors();
+        properties.add(ROW_ID_LOCATION);
+        properties.add(FLOWFILE_FETCH_COUNT);
+        properties.add(BATCH_SIZE);
+
+        return properties;
+    }
+
+    @Override
+    protected void doDelete(ProcessContext context, ProcessSession session) throws Exception {
+        final int batchSize = context.getProperty(BATCH_SIZE).asInteger();
+        final String location   = co

[GitHub] nifi pull request #2294: NIFI-3538 Added DeleteHBaseRow

2017-12-15 Thread mgaido91
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2294#discussion_r157256381
  
--- Diff: 
nifi-nar-bundles/nifi-hbase-bundle/nifi-hbase-processors/src/main/java/org/apache/nifi/hbase/DeleteHBaseRow.java
 ---
@@ -0,0 +1,178 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.hbase;
+
+import org.apache.commons.io.IOUtils;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.components.AllowableValue;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.util.StandardValidators;
+
+import java.io.IOException;
+import java.util.ArrayList;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+
+@Tags({ "delete", "hbase" })
+@CapabilityDescription(
+        "Delete HBase records individually or in batches. The input can be a single row ID in the body, one ID per line, " +
+        "row IDs separated by commas or a combination of the two. ")
+public class DeleteHBaseRow extends AbstractDeleteHBase {
+    static final AllowableValue ROW_ID_BODY = new AllowableValue("body", "FlowFile content", "Get the row key(s) from the flowfile content.");
+    static final AllowableValue ROW_ID_ATTR = new AllowableValue("attr", "FlowFile attributes", "Get the row key from an expression language statement.");
+
+    static final PropertyDescriptor ROW_ID_LOCATION = new PropertyDescriptor.Builder()
+            .name("delete-hb-row-id-location")
+            .displayName("Row ID Location")
+            .description("The location of the row ID to use for building the delete. Can be from the content or an expression language statement.")
+            .required(true)
+            .defaultValue("body")
+            .allowableValues(ROW_ID_BODY, ROW_ID_ATTR)
+            .addValidator(Validator.VALID)
+            .build();
+
+    static final PropertyDescriptor FLOWFILE_FETCH_COUNT = new PropertyDescriptor.Builder()
+            .name("delete-hb-flowfile-fetch-count")
+            .displayName("Flowfile Fetch Count")
+            .description("The number of flowfiles to fetch per run.")
+            .required(true)
+            .addValidator(StandardValidators.POSITIVE_INTEGER_VALIDATOR)
+            .defaultValue("5")
+            .expressionLanguageSupported(false)
+            .build();
+
+    static final PropertyDescriptor BATCH_SIZE = new PropertyDescriptor.Builder()
+            .name("delete-hb-row-ff-count")
+            .displayName("Batch Size")
+            .description("The number of deletes to send per batch.")
+            .required(true)
+            .defaultValue("50")
+            .addValidator(StandardValidators.POSITIVE_INTEGER_VALIDATOR)
+            .expressionLanguageSupported(false)
+            .build();
+
+    @Override
+    protected List<PropertyDescriptor> getSupportedPropertyDescriptors() {
+        final List<PropertyDescriptor> properties = super.getSupportedPropertyDescriptors();
+        properties.add(ROW_ID_LOCATION);
+        properties.add(FLOWFILE_FETCH_COUNT);
+        properties.add(BATCH_SIZE);
+
+        return properties;
+    }
+
+    @Override
+    protected void doDelete(ProcessContext context, ProcessSession session) throws Exception {
+        final int batchSize = context.getProperty(BATCH_SIZE).asInteger();
--- End diff --

I think the preferred syntax in the project is to have only one space before and after the `=` sign.


---


[GitHub] nifi pull request #2294: NIFI-3538 Added DeleteHBaseRow

2017-12-15 Thread mgaido91
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2294#discussion_r157255826
  
--- Diff: 
nifi-nar-bundles/nifi-hbase-bundle/nifi-hbase-processors/src/main/java/org/apache/nifi/hbase/DeleteHBaseRow.java
 ---
@@ -0,0 +1,178 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.hbase;
+
+import org.apache.commons.io.IOUtils;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.components.AllowableValue;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.util.StandardValidators;
+
+import java.io.IOException;
+import java.util.ArrayList;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+
+@Tags({ "delete", "hbase" })
+@CapabilityDescription(
+        "Delete HBase records individually or in batches. The input can be a single row ID in the body, one ID per line, " +
+        "row IDs separated by commas or a combination of the two. ")
+public class DeleteHBaseRow extends AbstractDeleteHBase {
+    static final AllowableValue ROW_ID_BODY = new AllowableValue("body", "FlowFile content", "Get the row key(s) from the flowfile content.");
+    static final AllowableValue ROW_ID_ATTR = new AllowableValue("attr", "FlowFile attributes", "Get the row key from an expression language statement.");
+
+    static final PropertyDescriptor ROW_ID_LOCATION = new PropertyDescriptor.Builder()
+            .name("delete-hb-row-id-location")
+            .displayName("Row ID Location")
+            .description("The location of the row ID to use for building the delete. Can be from the content or an expression language statement.")
+            .required(true)
+            .defaultValue("body")
--- End diff --

What about using `defaultValue(ROW_ID_BODY.getValue())` here instead of the hard-coded "body"?
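
For clarity, a sketch of what that change would look like in the descriptor (an illustrative fragment of the class under review, not the final code):

    static final PropertyDescriptor ROW_ID_LOCATION = new PropertyDescriptor.Builder()
            .name("delete-hb-row-id-location")
            .displayName("Row ID Location")
            .description("The location of the row ID to use for building the delete. Can be from the content or an expression language statement.")
            .required(true)
            .defaultValue(ROW_ID_BODY.getValue())   // reuse the AllowableValue instead of repeating the literal "body"
            .allowableValues(ROW_ID_BODY, ROW_ID_ATTR)
            .addValidator(Validator.VALID)
            .build();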


---


[GitHub] nifi pull request #2343: NIFI-2169: Cache compiled regexp for RouteText

2017-12-15 Thread mgaido91
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2343#discussion_r157242761
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/RouteText.java
 ---
@@ -209,6 +215,30 @@
    private volatile Map<Relationship, PropertyValue> propertyMap = new HashMap<>();
    private volatile Pattern groupingRegex = null;

+    @VisibleForTesting
+    final static int PATTERNS_CACHE_MAXIMUM_ENTRIES = 10;
+
+    /**
+     * LRU cache for the compiled patterns. The size of the cache is determined by the value of
+     * {@link #PATTERNS_CACHE_MAXIMUM_ENTRIES}.
+     */
+    @VisibleForTesting
+    final ConcurrentMap<Pair<Boolean, String>, Pattern> patternsCache = CacheBuilder.newBuilder()
+            .maximumSize(PATTERNS_CACHE_MAXIMUM_ENTRIES)
+            .<Pair<Boolean, String>, Pattern>build()
+            .asMap();
+
+    private final Function<Pair<Boolean, String>, Pattern> compileRegex = ignoreCaseAndRegex -> {
--- End diff --

For the `Pair`, I'll get rid of it following your suggestion above.

For the `Function`, my goal in defining it as a field was to avoid creating a new `Function` object on every invocation, so that the same instance is always reused. What do you think?
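
As a self-contained illustration of the pattern under discussion (this assumes Guava on the classpath; the key is simplified to the raw regex String rather than a Pair, in line with the suggestion above, and the class and method names are made up for the example):

    import com.google.common.cache.CacheBuilder;

    import java.util.concurrent.ConcurrentMap;
    import java.util.function.Function;
    import java.util.regex.Pattern;

    public class PatternCacheSketch {

        static final int PATTERNS_CACHE_MAXIMUM_ENTRIES = 1024;

        // Size-bounded map built once; Guava evicts old entries when the limit is reached.
        private final ConcurrentMap<String, Pattern> patternsCache = CacheBuilder.newBuilder()
                .maximumSize(PATTERNS_CACHE_MAXIMUM_ENTRIES)
                .<String, Pattern>build()
                .asMap();

        // A single Function instance, created once and reused on every call,
        // so no new object is allocated per invocation of computeIfAbsent.
        private final Function<String, Pattern> compileRegex = regex -> Pattern.compile(regex);

        Pattern getOrCompile(final String regex) {
            return patternsCache.computeIfAbsent(regex, compileRegex);
        }

        public static void main(String[] args) {
            PatternCacheSketch sketch = new PatternCacheSketch();
            Pattern first = sketch.getOrCompile("a+b");
            Pattern second = sketch.getOrCompile("a+b");
            // The second call returns the same compiled Pattern instance from the cache.
            System.out.println(first == second);
        }
    }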


---


[GitHub] nifi pull request #2343: NIFI-2169: Cache compiled regexp for RouteText

2017-12-15 Thread mgaido91
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2343#discussion_r157241548
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/RouteText.java
 ---
@@ -209,6 +215,30 @@
    private volatile Map<Relationship, PropertyValue> propertyMap = new HashMap<>();
    private volatile Pattern groupingRegex = null;

+    @VisibleForTesting
+    final static int PATTERNS_CACHE_MAXIMUM_ENTRIES = 10;
+
+    /**
+     * LRU cache for the compiled patterns. The size of the cache is determined by the value of
+     * {@link #PATTERNS_CACHE_MAXIMUM_ENTRIES}.
+     */
+    @VisibleForTesting
+    final ConcurrentMap<Pair<Boolean, String>, Pattern> patternsCache = CacheBuilder.newBuilder()
--- End diff --

yes, this was my other option. I will do this, thanks.


---


[GitHub] nifi pull request #2343: NIFI-2169: Cache compiled regexp for RouteText

2017-12-15 Thread mgaido91
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2343#discussion_r157241247
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/RouteText.java
 ---
@@ -209,6 +215,30 @@
    private volatile Map<Relationship, PropertyValue> propertyMap = new HashMap<>();
    private volatile Pattern groupingRegex = null;

+    @VisibleForTesting
+    final static int PATTERNS_CACHE_MAXIMUM_ENTRIES = 10;
+final static int PATTERNS_CACHE_MAXIMUM_ENTRIES = 10;
--- End diff --

Yes, I was thinking about introducing a configuration property for it, but it seemed overkill to me. I agree that having more entries is not a big deal. I will set it to 1024, thanks.
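
For reference, a sketch of the described change (illustrative only; only the constant's value changes, and the PR's Javadoc describes the resulting structure as an LRU cache bounded by this size):

    @VisibleForTesting
    final static int PATTERNS_CACHE_MAXIMUM_ENTRIES = 1024;

    @VisibleForTesting
    final ConcurrentMap<Pair<Boolean, String>, Pattern> patternsCache = CacheBuilder.newBuilder()
            .maximumSize(PATTERNS_CACHE_MAXIMUM_ENTRIES)   // entries beyond this size are evicted by Guava
            .<Pair<Boolean, String>, Pattern>build()
            .asMap();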


---

