[GitHub] nifi issue #3116: NIFI-4715: ListS3 produces duplicates in frequently update...

2018-11-03 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/3116
  
Thanks, @ijokarumawak and @adamlamar!  The combined change looks good to 
me.  

I ran it through multiple test loops putting and listing about 10,000 
objects in S3.  No objects were missed.  Duplicates were very low (< 100 per 
10,000), coinciding with S3 500 errors "We encountered an internal error. 
Please try again."  I believe we are handling this well; allowing a few 
duplicates is a reasonable trade-off for at-least-once processing.


---


[GitHub] nifi issue #2361: NIFI-4715: ListS3 produces duplicates in frequently update...

2018-11-03 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/2361
  
Thanks, @adamlamar!


---


[GitHub] nifi issue #3116: NIFI-4715: ListS3 produces duplicates in frequently update...

2018-10-31 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/3116
  
Reviewing...


---


[GitHub] nifi pull request #2859: NIFI-4802 Added character configuration to PutSQS

2018-07-24 Thread jvwing
Github user jvwing commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2859#discussion_r204968906
  
--- Diff: 
nifi-nar-bundles/nifi-aws-bundle/nifi-aws-processors/src/main/java/org/apache/nifi/processors/aws/sqs/PutSQS.java
 ---
@@ -115,7 +125,19 @@ public void onTrigger(final ProcessContext context, final ProcessSession session
 entry.setId(flowFile.getAttribute("uuid"));
 final ByteArrayOutputStream baos = new ByteArrayOutputStream();
 session.exportTo(flowFile, baos);
-final String flowFileContent = baos.toString();
+
+final Charset charset = Charset.forName(context.getProperty(CHARSET).getValue());
+
+String flowFileContent;
+
+// Get the content of the FlowFile with the given Charset. Fall back to the default charset
+// if the given charset is not supported.
+try {
+flowFileContent = baos.toString(charset.name());
+} catch (UnsupportedEncodingException e) {
--- End diff --

`customValidate` is defined in `AbstractConfigurableComponent`, but can be 
overridden in processors to validate properties as the processor is started, 
before `onTrigger` is ever called.  Take a look at 
[PutCloudWatchMetric.customValidate](https://github.com/apache/nifi/blob/2834fa4ce477014dfad8c96b6d326d0f1f06a23e/nifi-nar-bundles/nifi-aws-bundle/nifi-aws-processors/src/main/java/org/apache/nifi/processors/aws/cloudwatch/PutCloudWatchMetric.java#L216)
 for a good example.  
* It's important to call `super.customValidate()` to get validation for 
inherited property descriptors.
* I also recommend TestPutCloudWatchMetric for several examples on how to 
unit test the validation cases.
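
For reference, a minimal sketch of what such an override could look like in PutSQS, validating the CHARSET property from this PR (a sketch under those assumptions, not the final code):

```java
import java.nio.charset.Charset;
import java.nio.charset.IllegalCharsetNameException;
import java.util.ArrayList;
import java.util.Collection;

import org.apache.nifi.components.ValidationContext;
import org.apache.nifi.components.ValidationResult;

@Override
protected Collection<ValidationResult> customValidate(final ValidationContext validationContext) {
    // Keep the validation results produced for inherited property descriptors
    final Collection<ValidationResult> results = new ArrayList<>(super.customValidate(validationContext));

    // Reject unsupported charset names before onTrigger is ever called
    final String charsetName = validationContext.getProperty(CHARSET).getValue();
    if (charsetName != null) {
        boolean supported;
        try {
            supported = Charset.isSupported(charsetName);
        } catch (final IllegalCharsetNameException e) {
            supported = false;
        }
        if (!supported) {
            results.add(new ValidationResult.Builder()
                    .subject(CHARSET.getDisplayName())
                    .valid(false)
                    .explanation("Charset '" + charsetName + "' is not supported by this JVM")
                    .build());
        }
    }
    return results;
}
```

Alternatively, `StandardValidators.CHARACTER_SET_VALIDATOR` on the descriptor itself covers the same check without a custom override.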


---


[GitHub] nifi pull request #2859: NIFI-4802 Added character configuration to PutSQS

2018-07-22 Thread jvwing
Github user jvwing commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2859#discussion_r204246421
  
--- Diff: 
nifi-nar-bundles/nifi-aws-bundle/nifi-aws-processors/src/main/java/org/apache/nifi/processors/aws/sqs/PutSQS.java
 ---
@@ -115,7 +125,19 @@ public void onTrigger(final ProcessContext context, final ProcessSession session
 entry.setId(flowFile.getAttribute("uuid"));
 final ByteArrayOutputStream baos = new ByteArrayOutputStream();
 session.exportTo(flowFile, baos);
-final String flowFileContent = baos.toString();
+
+final Charset charset = Charset.forName(context.getProperty(CHARSET).getValue());
+
+String flowFileContent;
+
+// Get the content of the FlowFile with the given Charset. Fall back to the default charset
+// if the given charset is not supported.
+try {
+flowFileContent = baos.toString(charset.name());
+} catch (UnsupportedEncodingException e) {
--- End diff --

I don't think we should silently fall back to a default encoding without 
notifying users.  Alternatives might include:
* Routing the flowfile to the FAILURE relationship and logging an error
* Falling back to the default encoding, but log a warning describing the 
issue (via `getLogger().warn(...)`)

Also, have you considered validating the charset with a customValidate() 
override?
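
For the second option, the fallback could look roughly like this (a sketch only, reusing the variables from the diff above, not the final code):

```java
String flowFileContent;
try {
    flowFileContent = baos.toString(charset.name());
} catch (UnsupportedEncodingException e) {
    // Fall back to the platform default charset, but make the problem visible to the user
    getLogger().warn("Charset {} is not supported, falling back to the default charset",
            new Object[]{charset.name()}, e);
    flowFileContent = baos.toString();
}
```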


---


[GitHub] nifi pull request #2859: NIFI-4802 Added character configuration to PutSQS

2018-07-22 Thread jvwing
Github user jvwing commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2859#discussion_r204246953
  
--- Diff: 
nifi-nar-bundles/nifi-aws-bundle/nifi-aws-processors/src/main/java/org/apache/nifi/processors/aws/sqs/PutSQS.java
 ---
@@ -57,6 +59,14 @@
 + "the Message Attribute and value will become the value of the Message Attribute", expressionLanguageScope = ExpressionLanguageScope.FLOWFILE_ATTRIBUTES)
 public class PutSQS extends AbstractSQSProcessor {
 
+public static final PropertyDescriptor CHARSET = new PropertyDescriptor.Builder()
--- End diff --

PutSQS.CHARSET appears similar to GetSQS.CHARSET.  Can this property 
descriptor be moved to AbstractSQSProcessor and shared?
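
For illustration, the shared descriptor could be declared once along these lines (a sketch; the exact name, description, and default are assumptions, not the PR's code):

```java
import org.apache.nifi.components.PropertyDescriptor;
import org.apache.nifi.processor.util.StandardValidators;

// In AbstractSQSProcessor, so GetSQS and PutSQS reference the same descriptor
public static final PropertyDescriptor CHARSET = new PropertyDescriptor.Builder()
        .name("Character Set")
        .displayName("Character Set")
        .description("The character set used for the SQS message body")
        .required(true)
        .defaultValue("UTF-8")
        .addValidator(StandardValidators.CHARACTER_SET_VALIDATOR)
        .build();
```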


---


[GitHub] nifi issue #2859: NIFI-4802 Added character configuration to PutSQS

2018-07-21 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/2859
  
Reviewing...


---


[GitHub] nifi issue #2825: NIFI-5352 Change distribution to have the JARs and NARs in...

2018-07-10 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/2825
  
Thanks @lfrancke 


---


[GitHub] nifi pull request #2751: NIFI-5221: Added 'Object Tagging' functionalities t...

2018-06-08 Thread jvwing
Github user jvwing commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2751#discussion_r194140856
  
--- Diff: 
nifi-nar-bundles/nifi-aws-bundle/nifi-aws-processors/src/main/java/org/apache/nifi/processors/aws/s3/ListS3.java
 ---
@@ -307,6 +328,20 @@ private boolean commit(final ProcessContext context, final ProcessSession sessio
 return willCommit;
 }
 
+private Map writeObjectTags(AmazonS3 client, S3VersionSummary versionSummary) {
+final GetObjectTaggingResult taggingResult = client.getObjectTagging(new GetObjectTaggingRequest(versionSummary.getBucketName(), versionSummary.getKey()));
--- End diff --

I agree with @pvillard31 that it should be off by default.  From comments 
on the users/developer email lists, I understand ListS3 is used to process very 
large lists of objects, easily 10,000+ on a regular basis.  Even if the 
additional API calls are quick, it will add up to be a lot of API calls.

Unfortunately, it does not look like the 
[S3ObjectSummary](https://docs.aws.amazon.com/AWSJavaSDK/latest/javadoc/com/amazonaws/services/s3/model/S3ObjectSummary.html)
 returned by the listing contains any hints on the number of tags present, if 
any.
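
One way to keep the default listing cheap would be a boolean property that gates the extra call, for example (hypothetical property name and wording, shown only to illustrate the off-by-default idea):

```java
public static final PropertyDescriptor WRITE_OBJECT_TAGS = new PropertyDescriptor.Builder()
        .name("write-s3-object-tags")
        .displayName("Write Object Tags")
        .description("If set to 'true', the tags associated with the S3 object will be fetched "
                + "(one additional API call per listed object) and written as FlowFile attributes")
        .required(true)
        .allowableValues("true", "false")
        .defaultValue("false")
        .build();

// In the listing loop, only pay for the extra request when asked to
if (context.getProperty(WRITE_OBJECT_TAGS).asBoolean()) {
    // fetch tags via client.getObjectTagging(...) and copy them into the FlowFile attributes
}
```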


---


[GitHub] nifi pull request #2704: NIFI-4199: Consistent proxy support across componen...

2018-05-17 Thread jvwing
Github user jvwing commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2704#discussion_r189164826
  
--- Diff: 
nifi-nar-bundles/nifi-aws-bundle/nifi-aws-processors/src/test/java/org/apache/nifi/processors/aws/s3/TestListS3.java
 ---
@@ -311,5 +312,10 @@ public void testGetPropertyDescriptors() throws Exception {
 assertTrue(pd.contains(ListS3.PREFIX));
 assertTrue(pd.contains(ListS3.USE_VERSIONS));
 assertTrue(pd.contains(ListS3.MIN_AGE));
+assertTrue(pd.contains(ProxyConfigurationService.PROXY_CONFIGURATION_SERVICE));
+assertTrue(pd.contains(ListS3.PROXY_HOST));
--- End diff --

Minor tweak: The check for PROXY_HOST and PROXY_HOST_PORT duplicates checks 
above on lines 309-310.  I believe this is why we add 5 lines of new 
assertions, but the count of property descriptors only goes up by 3 from 17 to 
20.  It doesn't make any difference, really, but the math was bothering me.


---


[GitHub] nifi pull request #2704: NIFI-4199: Consistent proxy support across componen...

2018-05-17 Thread jvwing
Github user jvwing commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2704#discussion_r189164994
  
--- Diff: 
nifi-nar-bundles/nifi-aws-bundle/nifi-aws-abstract-processors/src/main/java/org/apache/nifi/processors/aws/AbstractAWSProcessor.java
 ---
@@ -92,6 +94,23 @@
 .addValidator(StandardValidators.PORT_VALIDATOR)
 .build();
 
+public static final PropertyDescriptor PROXY_USERNAME = new PropertyDescriptor.Builder()
+.name("Proxy Username")
+.description("Proxy username")
+.expressionLanguageSupported(true)
+.required(false)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor PROXY_PASSWORD = new PropertyDescriptor.Builder()
+.name("Proxy Password")
--- End diff --

I recommend a separate `name` vs `displayName` for PROXY_PASSWORD.


---


[GitHub] nifi pull request #2704: NIFI-4199: Consistent proxy support across componen...

2018-05-17 Thread jvwing
Github user jvwing commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2704#discussion_r189164964
  
--- Diff: 
nifi-nar-bundles/nifi-aws-bundle/nifi-aws-abstract-processors/src/main/java/org/apache/nifi/processors/aws/AbstractAWSProcessor.java
 ---
@@ -92,6 +94,23 @@
 .addValidator(StandardValidators.PORT_VALIDATOR)
 .build();
 
+public static final PropertyDescriptor PROXY_USERNAME = new PropertyDescriptor.Builder()
+.name("Proxy Username")
--- End diff --

I recommend a separate `name` vs `displayName` for PROXY_USERNAME.


---


[GitHub] nifi issue #1888: NIFI-4015 NIFI-3999 Fix DeleteSQS Issues

2018-05-16 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/1888
  
Thanks, @pvillard31 


---


[GitHub] nifi issue #2665: NIFI-5129: AWS Processors now displays proper region names

2018-04-29 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/2665
  
Thanks, @zenfenan, this looks good.


---


[GitHub] nifi issue #2665: NIFI-5129: AWS Processors now displays proper region names

2018-04-29 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/2665
  
Reviewing...


---


[GitHub] nifi issue #2664: NIFI-5105: Improvements for nifi-aws-bundle

2018-04-28 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/2664
  
Thanks, @zenfenan, this looks good. 
- Full build with contrib-check passes
- nifi-aws-processors integration tests pass
- Manual testing with an S3 flow worked fine

And great news that the NAR file is down to ~15 MB!


---


[GitHub] nifi issue #2664: NIFI-5105: Improvements for nifi-aws-bundle

2018-04-28 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/2664
  
Reviewing...


---


[GitHub] nifi issue #2556: NIFI-4981 - Allow using FF's content as message in PutEmai...

2018-03-18 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/2556
  
LGTM:
* Full build passes
* Good use of SystemResourceConsideration for memory consideration
* Good test coverage
* PutEmail worked correctly in my testing with various content option 
combinations

Thanks @pvillard31 and @zenfenan!


---


[GitHub] nifi issue #2556: NIFI-4981 - Allow using FF's content as message in PutEmai...

2018-03-18 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/2556
  
Reviewing


---


[GitHub] nifi issue #2506: NIFI-4922 - Add badges to the README file

2018-03-03 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/2506
  
@pvillard31 - Would you please close the PR?  I merged the change, but 
fat-fingered the PR number in the commit message.


---


[GitHub] nifi issue #2506: NIFI-4922 - Add badges to the README file

2018-03-03 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/2506
  
Thanks for the update @pvillard31.  Thanks for reviewing, @zenfenan.


---


[GitHub] nifi issue #2505: NIFI-4920 Skipping sensitive properties when updating comp...

2018-03-03 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/2505
  
The extra close is a result of my carelessness, sorry.  That should have 
been for PR #2506.


---


[GitHub] nifi issue #2491: NIFI-4876 Adding Min Object Age to ListS3

2018-02-28 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/2491
  
Thanks, @pvillard31 


---


[GitHub] nifi issue #2492: NIFI-4910 Fixing slight spelling mistake in error message,...

2018-02-26 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/2492
  
Reviewing...


---


[GitHub] nifi pull request #2491: NIFI-4876 Adding Min Object Age to ListS3

2018-02-25 Thread jvwing
GitHub user jvwing opened a pull request:

https://github.com/apache/nifi/pull/2491

NIFI-4876 Adding Min Object Age to ListS3

Thank you for submitting a contribution to Apache NiFi.

In order to streamline the review of the contribution we ask you
to ensure the following steps have been taken:

### For all changes:
- [X] Is there a JIRA ticket associated with this PR? Is it referenced 
 in the commit message?

- [X] Does your PR title start with NIFI-XXXX where XXXX is the JIRA number 
you are trying to resolve? Pay particular attention to the hyphen "-" character.

- [X] Has your PR been rebased against the latest commit within the target 
branch (typically master)?

- [X] Is your initial contribution a single, squashed commit?

### For code changes:
- [X] Have you ensured that the full suite of tests is executed via mvn 
-Pcontrib-check clean install at the root nifi folder?
- [X] Have you written or updated unit tests to verify your changes?
- [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)? 
- [ ] If applicable, have you updated the LICENSE file, including the main 
LICENSE file under nifi-assembly?
- [ ] If applicable, have you updated the NOTICE file, including the main 
NOTICE file found under nifi-assembly?
- [X] If adding new Properties, have you added .displayName in addition to 
.name (programmatic access) for each of the new properties?

### For documentation related changes:
- [ ] Have you ensured that format looks appropriate for the output in 
which it is rendered?

### Note:
Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/jvwing/nifi NIFI-4876-lists3-minage-1

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/2491.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2491


commit b0fb4c6e541fc2ce8b6f6babf743c75d4f259100
Author: James Wing <jvwing@...>
Date:   2018-02-25T18:41:33Z

NIFI-4876 Adding Min Object Age to ListS3




---


[GitHub] nifi issue #2409: NIFI-4786 - Allow Expression Evaluation to Kinesis/Firehos...

2018-01-27 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/2409
  
@SunSatION, this is nice work.  Thanks for the new Kinesis expression 
language features.  And double thanks for fixing the if statements and the 
integration tests, I think your Jackson test dependency fix is a good approach.


---


[GitHub] nifi issue #2409: NIFI-4786 - Allow Expression Evaluation to Kinesis/Firehos...

2018-01-20 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/2409
  
@SunSatION , thanks for submitting this PR, it looks like good stuff.

May I ask how you tested the changes?  For PutKinesisFirehose, I was able 
to run the integration test suite successfully, and I was able to manually 
build and run a simple flow sending data to Kinesis Firehose.  For 
PutKinesisStream, the manual flow building worked, but I get exceptions running 
the integration test suite (ITPutKinesisStream.testIntegrationSuccess) as it 
calls `putRecords()`:

```
com.amazonaws.SdkClientException: Unable to marshall request to JSON: 

com.fasterxml.jackson.dataformat.cbor.CBORGenerator.getOutputContext()Lcom/fasterxml/jackson/core/json/JsonWriteContext;
```

Did you try the integration tests, and did they work for you?  I don't 
believe your change introduced this issue.  A quick check of the master branch 
suggests it was already there, but it complicated testing.


---


[GitHub] nifi pull request #2409: NIFI-4786 - Allow Expression Evaluation to Kinesis/...

2018-01-20 Thread jvwing
Github user jvwing commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2409#discussion_r162796692
  
--- Diff: 
nifi-nar-bundles/nifi-aws-bundle/nifi-aws-processors/src/main/java/org/apache/nifi/processors/aws/kinesis/firehose/PutKinesisFirehose.java
 ---
@@ -89,63 +89,80 @@ public void onTrigger(final ProcessContext context, final ProcessSession session
 
 final int batchSize = context.getProperty(BATCH_SIZE).asInteger();
 final long maxBufferSizeBytes = context.getProperty(MAX_MESSAGE_BUFFER_SIZE_MB).asDataSize(DataUnit.B).longValue();
-final String firehoseStreamName = context.getProperty(KINESIS_FIREHOSE_DELIVERY_STREAM_NAME).getValue();
 
-List<FlowFile> flowFiles = filterMessagesByMaxSize(session, batchSize, maxBufferSizeBytes, firehoseStreamName,
-AWS_KINESIS_FIREHOSE_ERROR_MESSAGE);
+List<FlowFile> flowFiles = filterMessagesByMaxSize(session, batchSize, maxBufferSizeBytes, AWS_KINESIS_FIREHOSE_ERROR_MESSAGE);
+HashMap<String, List<FlowFile>> hashFlowFiles = new HashMap<>();
+HashMap<String, List<Record>> recordHash = new HashMap<String, List<Record>>();
 
 final AmazonKinesisFirehoseClient client = getClient();
 
 try {
-List<Record> records = new ArrayList<>();
-
 List<FlowFile> failedFlowFiles = new ArrayList<>();
 List<FlowFile> successfulFlowFiles = new ArrayList<>();
 
 // Prepare batch of records
 for (int i = 0; i < flowFiles.size(); i++) {
 FlowFile flowFile = flowFiles.get(i);
 
+final String firehoseStreamName = context.getProperty(KINESIS_FIREHOSE_DELIVERY_STREAM_NAME).evaluateAttributeExpressions(flowFile).getValue();
+
 final ByteArrayOutputStream baos = new ByteArrayOutputStream();
 session.exportTo(flowFile, baos);
-records.add(new Record().withData(ByteBuffer.wrap(baos.toByteArray())));
+
+if ( !recordHash.containsKey(firehoseStreamName) )
--- End diff --

Thank you for following the pre-existing code style in this class file.  
That's usually a good practice.  However, `if` statements in NiFi code 
typically have braces even for single-line bodies, and there are typically no 
spaces just inside the parentheses:
```
if (somevar == true) {
    doStuff();
}
```
Would you please fix these up?


---


[GitHub] nifi pull request #2409: NIFI-4786 - Allow Expression Evaluation to Kinesis/...

2018-01-20 Thread jvwing
Github user jvwing commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2409#discussion_r162796710
  
--- Diff: 
nifi-nar-bundles/nifi-aws-bundle/nifi-aws-processors/src/main/java/org/apache/nifi/processors/aws/kinesis/stream/PutKinesisStream.java
 ---
@@ -111,64 +112,80 @@ public void onTrigger(final ProcessContext context, final ProcessSession session
 for (int i = 0; i < flowFiles.size(); i++) {
 FlowFile flowFile = flowFiles.get(i);
 
+String streamName = context.getProperty(KINESIS_STREAM_NAME).evaluateAttributeExpressions(flowFile).getValue();;
+
 final ByteArrayOutputStream baos = new ByteArrayOutputStream();
 session.exportTo(flowFile, baos);
 PutRecordsRequestEntry record = new PutRecordsRequestEntry().withData(ByteBuffer.wrap(baos.toByteArray()));
 
 String partitionKey = context.getProperty(PutKinesisStream.KINESIS_PARTITION_KEY)
-.evaluateAttributeExpressions(flowFiles.get(i)).getValue();
+.evaluateAttributeExpressions(flowFiles.get(i)).getValue();
 
 if ( ! StringUtils.isBlank(partitionKey) ) {
 record.setPartitionKey(partitionKey);
 } else {
 record.setPartitionKey(Integer.toString(randomParitionKeyGenerator.nextInt()));
 }
 
-records.add(record);
+if ( !recordHash.containsKey(streamName) ) {
--- End diff --

Would you also please fix these `if`s?  I completely understand they were 
already that way.


---


[GitHub] nifi issue #2409: NIFI-4786 - Allow Expression Evaluation to Kinesis/Firehos...

2018-01-17 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/2409
  
Reviewing...


---


[GitHub] nifi issue #2017: NIFI-4197 - Expose some proxy settings to GCS Processors

2018-01-06 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/2017
  
Thanks, @trixpan.


---


[GitHub] nifi issue #2344: NIFI-4619: Enable EL on AWSCredentialsProviderControllerSe...

2017-12-17 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/2344
  
Thanks, @mgaido91!  The code change looks good and works well.  And thanks 
for tuning up the tests appropriately.


---


[GitHub] nifi issue #2344: NIFI-4619: Enable EL on AWSCredentialsProviderControllerSe...

2017-12-16 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/2344
  
Reviewing...


---


[GitHub] nifi issue #2321: change all the old fasioned headings in docs to modern one...

2017-12-07 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/2321
  
@edwardzjl - Is there a JIRA ticket for this issue?  If so, would you 
please update the PR title with it.  If not, would you please create one (at 
https://issues.apache.org/jira/browse/NIFI/)?  JIRA tickets are important to 
the NiFi project to track work and define releases.


---


[GitHub] nifi issue #2291: NIFI-4256 - Add support for all AWS S3 Encryption Options

2017-12-04 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/2291
  
@baank, did you mean to close this PR?  The latest changes appear to be 
well worth reviewing, if that is OK with you?


---


[GitHub] nifi issue #2300: NIFI-4628 Add support for ListS3Version2 API

2017-11-30 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/2300
  
@aburkard Thanks again for putting this together.


---


[GitHub] nifi issue #2291: NIFI-4256 - Add support for all AWS S3 Encryption Options

2017-11-30 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/2291
  
@baank - closed?


---


[GitHub] nifi issue #2291: NIFI-4256 - Add support for all AWS S3 Encryption Options

2017-11-29 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/2291
  
@baank - thanks for the update, I think we're almost done.  Two things: 

1.) I still see a checkstyle warning for nifi-aws-processors:

> [INFO] --- maven-checkstyle-plugin:2.15:check (check-style) @ nifi-aws-processors ---
> [WARNING] src/test/java/org/apache/nifi/processors/aws/s3/TestPutS3Object.java[27:8] (imports) UnusedImports: Unused import - com.amazonaws.services.s3.model.CryptoConfiguration.

Are you able to run the full build with `mvn clean install -Pcontrib-check` 
or otherwise run the checkstyle:check task?

2.) We still have property descriptors in EncryptedS3PutEnrichmentService 
that specify expression language, but are not evaluating EL (KMS_KEY_ID, 
CUSTOMER_KEY).  Did you mean to change those also?
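
For example, evaluating EL when reading one of those properties would look roughly like this (a sketch; adjust to the scope you actually intend to support):

```java
// Either evaluate the expression when the property is read...
final String kmsKeyId = context.getProperty(KMS_KEY_ID)
        .evaluateAttributeExpressions()
        .getValue();
// ...or drop expressionLanguageSupported(true) from the descriptor.
```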


---


[GitHub] nifi pull request #2291: NIFI-4256 - Add support for all AWS S3 Encryption O...

2017-11-27 Thread jvwing
Github user jvwing commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2291#discussion_r153396888
  
--- Diff: 
nifi-nar-bundles/nifi-aws-bundle/nifi-aws-processors/src/main/java/org/apache/nifi/processors/aws/s3/PutS3Object.java
 ---
@@ -154,6 +155,14 @@
 .defaultValue(StorageClass.Standard.name())
 .build();
 
+
+public static final PropertyDescriptor PUT_ENRICHMENT_SERVICE = new PropertyDescriptor.Builder()
+.name("Put Enrichment Service")
--- End diff --

Same request for machine-readable name and human-readable displayName.


---


[GitHub] nifi pull request #2291: NIFI-4256 - Add support for all AWS S3 Encryption O...

2017-11-27 Thread jvwing
Github user jvwing commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2291#discussion_r153396684
  
--- Diff: 
nifi-nar-bundles/nifi-aws-bundle/nifi-aws-abstract-processors/src/main/java/org/apache/nifi/processors/aws/s3/AbstractS3Processor.java
 ---
@@ -125,16 +125,31 @@
 new AllowableValue("S3SignerType", "Signature v2"))
 .defaultValue("Default Signature")
 .build();
+
+public static final PropertyDescriptor CLIENT_SERVICE = new PropertyDescriptor.Builder()
+.name("Client Service")
--- End diff --

Name should be a machine-readable key like "client-service" with a 
human-readable displayName like "Client Service".  I know this has not been 
done previously in this file, but we should try to do it going forward.
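
For example (sketch only; the description text is illustrative, not the PR's wording):

```java
public static final PropertyDescriptor CLIENT_SERVICE = new PropertyDescriptor.Builder()
        .name("client-service")             // stable machine-readable key, used in saved flows
        .displayName("Client Service")      // human-readable label shown in the UI
        .description("Controller service that provides the S3 client")
        .identifiesControllerService(S3ClientService.class)
        .required(false)
        .build();
```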


---


[GitHub] nifi pull request #2291: NIFI-4256 - Add support for all AWS S3 Encryption O...

2017-11-27 Thread jvwing
Github user jvwing commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2291#discussion_r153398470
  
--- Diff: 
nifi-nar-bundles/nifi-aws-bundle/nifi-aws-processors/src/main/java/org/apache/nifi/processors/aws/s3/encryption/EncryptedS3PutEnrichmentService.java
 ---
@@ -0,0 +1,181 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.aws.s3.encryption;
+
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.List;
+
+import com.amazonaws.services.s3.model.InitiateMultipartUploadRequest;
+import com.amazonaws.services.s3.model.ObjectMetadata;
+import com.amazonaws.services.s3.model.PutObjectRequest;
+import com.amazonaws.services.s3.model.SSEAwsKeyManagementParams;
+import com.amazonaws.services.s3.model.SSECustomerKey;
+import org.apache.commons.lang3.StringUtils;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnEnabled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.controller.AbstractControllerService;
+import org.apache.nifi.controller.ConfigurationContext;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.processors.aws.s3.service.S3PutEnrichmentService;
+import org.apache.nifi.reporting.InitializationException;
+
+@Tags({"aws", "s3", "encryption", "server", "kms", "key"})
+@CapabilityDescription("Provides the ability to configure S3 Server Side Encryption once and reuse " +
+"that configuration throughout the application")
+public class EncryptedS3PutEnrichmentService extends AbstractControllerService implements S3PutEnrichmentService {
+
+public static final String METHOD_SSE_S3 = "SSE-S3";
+public static final String METHOD_SSE_KMS = "SSE-KMS";
+public static final String METHOD_SSE_C = "SSE-C";
+
+public static final String ALGORITHM_AES256 = "AES256";
+public static final String CUSTOMER_ALGORITHM_AES256 = "AES256";
+
+
+public static final PropertyDescriptor ENCRYPTION_METHOD = new PropertyDescriptor.Builder()
+.name("encryption-method")
+.displayName("Encryption Method")
+.required(true)
+.allowableValues(METHOD_SSE_S3, METHOD_SSE_KMS, METHOD_SSE_C)
+.defaultValue(METHOD_SSE_S3)
+.description("Method by which the S3 object will be encrypted server-side.")
+.build();
+
+public static final PropertyDescriptor ALGORITHM = new PropertyDescriptor.Builder()
+.name("sse-algorithm")
+.displayName("Algorithm")
+.allowableValues(ALGORITHM_AES256)
+.defaultValue(ALGORITHM_AES256)
+.description("Encryption algorithm to use (only AES256 currently supported)")
+.build();
+
+public static final PropertyDescriptor KMS_KEY_ID = new PropertyDescriptor.Builder()
+.name("sse-kme-key-id")
+.displayName("KMS Key Id")
+.required(false)
+.expressionLanguageSupported(true)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.description("Custom KMS key identifier. Supports key or alias ARN.")
+.build();
+
+public static final PropertyDescriptor CUSTOMER_KEY = new PropertyDescriptor.Builder()
+.name("sse-customer-key")
+.displayName("Customer Key")
+.required(false)
+.expressionLanguageSupported(true)
+.addValidator(StandardValidators.NON_EMPTY_VALIDA

[GitHub] nifi issue #2291: NIFI-4256 - Add support for all AWS S3 Encryption Options

2017-11-27 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/2291
  
@baank, thanks for the latest update.  Good news, we're getting down to the 
nit-picks:

1.  I had a checkstyle error running the full build with contrib check on 
nifi-aws-service-api `UnusedImports: Unused import - 
com.amazonaws.services.s3.AmazonS3Encryption`.

2.  In your services, some of the Property Descriptors are marked as 
supporting expression language, but EL is not evaluated when extracting the 
value of the property (like 
`context.getProperty(KMS_CMK_ID).evaluateAttributeExpressions().getValue()` or 
similar).  We should either evaluate the expressions or not mark them as 
supporting expression language:
  * EncryptedS3ClientService (KMS_CMK_ID, SECRET_KEY, PRIVATE_KEY, 
PUBLIC_KEY)
  * EncryptedS3PutEnrichmentService (KMS_KEY_ID, CUSTOMER_KEY)

It's fine to update this PR.  I'll work out rebasing and squashing when 
we're ready.




---


[GitHub] nifi issue #2248: NIFI-4256 - Add support for all AWS S3 Encryption Options

2017-11-19 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/2248
  
@baank, thanks for putting in all the work on this PR so far, it is looking 
pretty good.  The full build with contrib-check passed.  I have been able to 
run through the encryption functionality without problems so far.  I have a few 
comments and questions:

* I see warnings like "The service APIs should not be bundled with the 
implementations" when starting NiFi:
```
2017-11-19 03:01:29,148 WARN [main] org.apache.nifi.nar.ExtensionManager Controller Service org.apache.nifi.processors.aws.s3.encryption.EncryptedS3ClientService is bundled with its supporting APIs org.apache.nifi.processors.aws.s3.S3ClientService. The service APIs should not be bundled with the implementations.
2017-11-19 03:01:29,158 WARN [main] org.apache.nifi.nar.ExtensionManager Controller Service org.apache.nifi.processors.aws.s3.encryption.EncryptedS3PutEnrichmentService is bundled with its supporting APIs org.apache.nifi.processors.aws.s3.service.S3PutEnrichmentService. The service APIs should not be bundled with the implementations.
```

As part of the recent restructuring of the nifi-aws-bundle, service 
interfaces are now defined in the nifi-aws-service-api project, with 
implementations in one of the other projects.  I think we should move the 
interface definitions for S3ClientService and S3PutEnrichmentService into the 
nifi-aws-service-api project.

* In PutS3Object, I recommend that calling the S3PutEnrichmentService be 
the very last thing before making the request to S3, to provide the maximum 
range of modification options.  At the moment, there are some ACL settings that 
follow enrichment.  Does that make sense?
* Why does the EncryptedS3PutEnrichmentService offer to capture and use the 
MD5 hash of the key?  I vaguely understood that the AWS SDK would handle the 
MD5 hash when necessary, which I also understood to be for get requests, and 
I'm not sure when I would fill it in.  The documentation is not very clear, but 
[ObjectMetadata.setSSECustomerKeyMd5](http://docs.aws.amazon.com/AWSJavaSDK/latest/javadoc/com/amazonaws/services/s3/model/ObjectMetadata.html#setSSECustomerKeyMd5-java.lang.String-)
 says "for internal use only".  The service seemed to work just fine for me 
without providing it, and it is not described as required.  From line 163:
```
if (StringUtils.isNotBlank(customerKeyMD5)) {
putObjectRequest.getMetadata().setSSECustomerKeyMd5(customerKeyMD5);
}
```
When would we need this?


---


[GitHub] nifi issue #2248: NIFI-4256 - Add support for all AWS S3 Encryption Options

2017-11-14 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/2248
  
Reviewing


---


[GitHub] nifi issue #2140: NIFI-3950 Refactor AWS bundle

2017-10-21 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/2140
  
@christophercurrie, thanks again for the work on this infrastructure 
project. 

I merged these commits to master, except for the last commit "Limit NOTICE 
to AWS SDK" for nifi-aws-service-api-nar.  I was wrong about not needing all 
those dependency notices.  After checking with Maven, I understand HttpComponents, 
Joda Time, etc. are transitive dependencies of the AWS SDK, and we need them 
even for the interfaces.  I apologize for the confusion about that, and thank 
you for adding it as a separate commit.

I'll follow up with some text for the NiFi 1.5.0 migration notes.


---


[GitHub] nifi issue #2140: NIFI-3950 Refactor AWS bundle

2017-10-18 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/2140
  
@christophercurrie The outstanding item for now is a set of LICENSE/NOTICE 
files for nifi-aws-service-api-nar, similar to what is now in the nifi-aws-nar. 
I believe the NOTICE file can be pared down to only reference the aws-sdk.


---


[GitHub] nifi issue #2140: NIFI-3950 Refactor AWS bundle

2017-10-17 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/2140
  
@christophercurrie I think we're pretty close on this PR, any interest in 
continuing?


---


[GitHub] nifi issue #2140: NIFI-3950 Refactor AWS bundle

2017-09-18 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/2140
  
More migration notes:

* My 1.3.0 AWS processor worked OK without modification when used with this 
PR.
* My 1.3.0 AWS controller service did not work with just the NAR file, with 
the expected incompatibility error:

> AWS Credentials Provider service' validated against 
'8902b809-015e-1000-46c3-e321c1d9a1b4' is invalid because SampleAWSCreds - 
1.3.0 from sample - sample-aws-services-nar is not compatible with 
AWSCredentialsProviderService - 1.4.0-SNAPSHOT from org.apache.nifi - 
nifi-aws-nar

* However, if `nifi-aws-nar-1.3.0.nar` was included side-by-side with 
`nifi-aws-nar-1.4.0.nar`, the 1.3.0 processor worked with the 1.3.0 controller 
service, unchanged.

@bbende , I'm not sure the last option is what you described above.  I was 
expecting to add more NARs.  But it seemed plausible that the monolithic nature 
of nifi-aws-nar-1.3.0.nar might make it easier to dump in side-by-side?


---


[GitHub] nifi issue #2140: NIFI-3950 Refactor AWS bundle

2017-09-17 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/2140
  
Thanks for the update, @christophercurrie .  This PR is looking pretty good:
* Passes the full suite of unit tests with contrib-check.
* AWS processors and controller service still work OK in my testing.
* Provides a good migration experience -- just rebuild against NiFi 1.4.0 
nars -- better than I feared.  More below.

One thing we still need is a set of LICENSE/NOTICE files for 
nifi-aws-service-api-nar, similar to what is now in the nifi-aws-nar.  I 
believe the NOTICE file can be pared down to only reference the aws-sdk.

**Migration Experience**
I created a [simple AWS 
bundle](https://github.com/jvwing/sample-aws-bundle) targeting NiFi 1.3.0, and 
went through the exercise of [migrating 
it](https://github.com/jvwing/sample-aws-bundle/tree/target-nifi-1.4.0) to 
1.4.0 as of this PR.  It seems "smooth" enough to me.
* Advancing the NiFi dependency version to 1.4.0 and rebuilding is enough, 
maintaining the NAR dependency on `nifi-aws-nar`.
* For bundles that only implement controller service interfaces, they may 
optionally change their NAR dependency to `nifi-aws-service-api-nar`.  Since 
nifi-aws-nar already has this NAR dependency, I believe this is a recommended, 
but not strictly necessary step.


---


[GitHub] nifi issue #2140: NIFI-3950 Refactor AWS bundle

2017-09-13 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/2140
  
Thanks, @bbende, I'll take up the migration guidance for the wiki.  As part 
of reviewing this PR, I am building some sample processor and service projects 
to reproduce the problems and check the fix.  I plan to work through the 
migration steps myself and can document the process.  

It's very likely that I'll have more questions for you as I get into 
details of it.


---


[GitHub] nifi issue #2140: NIFI-3950 Refactor AWS bundle

2017-09-13 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/2140
  
Please add the commits to this one.  It usually helps reviewing to see the 
commits separately, and it's easy enough to squash at the end.


---


[GitHub] nifi pull request #2140: NIFI-3950 Refactor AWS bundle

2017-09-12 Thread jvwing
Github user jvwing commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2140#discussion_r138521026
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/pom.xml ---
@@ -253,11 +253,6 @@
 
 
 org.apache.nifi
-nifi-schema-registry-service-api
--- End diff --

Again, a great change in a different PR for a different ticket.


---


[GitHub] nifi pull request #2140: NIFI-3950 Refactor AWS bundle

2017-09-12 Thread jvwing
Github user jvwing commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2140#discussion_r138520916
  
--- Diff: pom.xml ---
@@ -1109,12 +1109,6 @@
 
 
 org.apache.nifi
-nifi-kudu-nar
--- End diff --

I understand.  I think that might make a great change in a different PR for 
a different ticket.  But again, I would prefer not to evaluate it as part of 
this one.


---


[GitHub] nifi pull request #2140: NIFI-3950 Refactor AWS bundle

2017-09-12 Thread jvwing
Github user jvwing commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2140#discussion_r138520837
  
--- Diff: nifi-assembly/pom.xml ---
@@ -1,13 +1,13 @@
 
-

[GitHub] nifi issue #2140: NIFI-3950 Refactor AWS bundle

2017-09-11 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/2140
  
@christophercurrie - With respect to a transition plan, I'm not sure 
exactly what we need.  I'll have to get back to you on that.  In vague concept, 
users who have built custom processors and custom controller services against 
the existing API should have a smooth upgrade experience to the new one.  I'll 
try to work out a more concrete definition for 'smooth'.


---


[GitHub] nifi pull request #2140: NIFI-3950 Refactor AWS bundle

2017-09-11 Thread jvwing
Github user jvwing commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2140#discussion_r138251181
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/pom.xml ---
@@ -253,11 +253,6 @@
 
 
 org.apache.nifi
-nifi-schema-registry-service-api
--- End diff --

Is nifi-schema-registry-service-api intentionally removed here?


---


[GitHub] nifi pull request #2140: NIFI-3950 Refactor AWS bundle

2017-09-11 Thread jvwing
Github user jvwing commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2140#discussion_r138248781
  
--- Diff: nifi-assembly/pom.xml ---
@@ -1,13 +1,13 @@
 
-

[GitHub] nifi pull request #2140: NIFI-3950 Refactor AWS bundle

2017-09-11 Thread jvwing
Github user jvwing commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2140#discussion_r138251087
  
--- Diff: pom.xml ---
@@ -1109,12 +1109,6 @@
 
 
 org.apache.nifi
-nifi-kudu-nar
--- End diff --

Did you intend to remove nifi-kudu-nar?


---


[GitHub] nifi issue #2140: NIFI-3950 Refactor AWS bundle

2017-09-11 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/2140
  
Reviewing... I agree the CI failure may not be related to this change.  
`mvn clean install -Pcontrib-check` worked OK for me.


---


[GitHub] nifi pull request #2017: NIFI-4197 - Expose some proxy settings to GCS Proce...

2017-09-10 Thread jvwing
Github user jvwing commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2017#discussion_r137959726
  
--- Diff: 
nifi-nar-bundles/nifi-gcp-bundle/nifi-gcp-processors/src/main/java/org/apache/nifi/processors/gcp/AbstractGCPProcessor.java
 ---
@@ -55,6 +55,25 @@
 .addValidator(StandardValidators.INTEGER_VALIDATOR)
 .build();
 
+public static final PropertyDescriptor PROXY_HOST = new PropertyDescriptor
+.Builder().name("proxy-host")
+.displayName("Proxy host")
+.description("IP or hostname of the proxy to be used")
+.required(false)
+.expressionLanguageSupported(false)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor PROXY_PORT = new PropertyDescriptor
+.Builder().name("proxy-port")
+.displayName("Proxy port")
+.description("Porxy port number")
--- End diff --

"**Porxy** port number" -  looks like a typo


---


[GitHub] nifi issue #2017: NIFI-4197 - Expose some proxy settings to GCS Processors

2017-09-10 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/2017
  
@trixpan - Thanks for contributing the proxy feature.  It looks pretty 
good, and worked just fine in my testing with GCS processors.  Please take a 
look at the nit-picks above, but I don't have any more substantial comments at 
the moment.


---


[GitHub] nifi pull request #2017: NIFI-4197 - Expose some proxy settings to GCS Proce...

2017-09-10 Thread jvwing
Github user jvwing commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2017#discussion_r137959785
  
--- Diff: 
nifi-nar-bundles/nifi-gcp-bundle/nifi-gcp-processors/src/main/java/org/apache/nifi/processors/gcp/AbstractGCPProcessor.java
 ---
@@ -55,6 +55,25 @@
 .addValidator(StandardValidators.INTEGER_VALIDATOR)
 .build();
 
+public static final PropertyDescriptor PROXY_HOST = new PropertyDescriptor
+.Builder().name("proxy-host")
+.displayName("Proxy host")
+.description("IP or hostname of the proxy to be used")
+.required(false)
+.expressionLanguageSupported(false)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor PROXY_PORT = new PropertyDescriptor
+.Builder().name("proxy-port")
--- End diff --

Same thing with the `gcp-` prefix.


---


[GitHub] nifi pull request #2017: NIFI-4197 - Expose some proxy settings to GCS Proce...

2017-09-10 Thread jvwing
Github user jvwing commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2017#discussion_r137959775
  
--- Diff: 
nifi-nar-bundles/nifi-gcp-bundle/nifi-gcp-processors/src/main/java/org/apache/nifi/processors/gcp/AbstractGCPProcessor.java
 ---
@@ -55,6 +55,25 @@
 .addValidator(StandardValidators.INTEGER_VALIDATOR)
 .build();
 
+public static final PropertyDescriptor PROXY_HOST = new PropertyDescriptor
+.Builder().name("proxy-host")
--- End diff --

The other property descriptors in this file prefix their names with `gcp-`. 
 Any special reason not to fit in with the others?


---


[GitHub] nifi issue #2110: NIFI-4116: Allow fields of Record returned from Lookup Ser...

2017-09-05 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/2110
  
Thanks, @markap14, looks good. 


---


[GitHub] nifi issue #2110: NIFI-4116: Allow fields of Record returned from Lookup Ser...

2017-08-30 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/2110
  
Reviewing...


---


[GitHub] nifi issue #2066: NIFI-4256 - Add support for all AWS S3 Encryption Options

2017-08-09 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/2066
  
@baank Thanks for putting together this PR, it looks like you put a lot of 
thought into covering all the possible encryption scenarios.  I haven't run it 
yet, but I have a few starter questions after looking over some of the code:

1. What was the driver behind updating the AWS SDK version?
1. Although the service interfaces and their methods are named specifically for 
encryption, the substance of their interaction is not necessarily limited to 
encryption.  What would you think about making the interfaces more generic?  
For example:
* Could the S3ClientSideEncryptionService be "S3ClientService" with 
only `getClient` methods, with the `needsEncryptedClient()` logic being 
performed internally by the concrete implementation 
StandardS3ClientSideEncryptionService?  I can see a number of use cases beyond 
encryption that could be covered by a custom client factory.
* Could the S3ServerSideEncryptionService be a more generic S3 put 
request modifier?  My efforts at thinking up a good name failed miserably here. 
 But the interface allows many non-encryption modifications of an S3 request, 
which might indeed be useful, despite the `encrypt()` naming of the methods.


---


[GitHub] nifi issue #2066: NIFI-4256 - Add support for all AWS S3 Encryption Options

2017-08-09 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/2066
  
Reviewing...


---


[GitHub] nifi pull request #1888: NIFI-4015 NIFI-3999 Fix DeleteSQS Issues

2017-07-31 Thread jvwing
Github user jvwing commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1888#discussion_r130516693
  
--- Diff: 
nifi-nar-bundles/nifi-aws-bundle/nifi-aws-processors/src/test/java/org/apache/nifi/processors/aws/sqs/ITDeleteSQS.java
 ---
@@ -0,0 +1,83 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.aws.sqs;
+
+import java.io.File;
+import java.io.IOException;
+import java.util.HashMap;
+import java.util.Map;
+
+import com.amazonaws.regions.Regions;
+import com.amazonaws.services.sqs.model.Message;
+import com.amazonaws.services.sqs.model.ReceiveMessageResult;
+import com.amazonaws.services.sqs.model.SendMessageResult;
+import org.apache.nifi.util.TestRunner;
+import org.apache.nifi.util.TestRunners;
+
+import com.amazonaws.auth.PropertiesCredentials;
+import com.amazonaws.services.sqs.AmazonSQSClient;
+
+import org.junit.Before;
+import org.junit.Ignore;
+import org.junit.Test;
+
+import static org.junit.Assert.assertEquals;
+
+
+@Ignore("For local testing only - interacts with S3 so the credentials file must be configured and all necessary queues created")
+public class ITDeleteSQS {
+
+private final String CREDENTIALS_FILE = System.getProperty("user.home") + "/aws-credentials.properties";
--- End diff --

@jzonthemtn The DeleteSQS processor can absolutely use default credentials 
through a controller service.  The integration tests across the SQS processors 
and most of the AWS bundle use a properties file, and I followed that pattern.  
I'm not sure if it was desired as the best solution, or just the one we knew 
how to use at the time.  Maybe it prevents people from accidentally running the 
integration tests with their default credentials?

What do you think?


---


[GitHub] nifi issue #2034: NIFI-4215 Fixed stackoverflow error when NiFi tries to par...

2017-07-30 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/2034
  
Looks good.  Thanks again for fixing this, @Wesley-Lawrence.


---


[GitHub] nifi issue #2030: NIFI-4212 - RethinkDB Delete Processor

2017-07-29 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/2030
  
Thanks for fixing those items, @mans2singh, I'll get this merged. 


---


[GitHub] nifi issue #2034: NIFI-4215 Fixed stackoverflow error when NiFi tries to par...

2017-07-29 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/2034
  
@Wesley-Lawrence  I think this is a good fix, your approach to the solution 
seems solid.  Thanks for adding the unit tests for the recursive and 
mutually-referential cases.  Would you please:

1. Fix the checkstyle issue and remove the change to the checkstyle 
definitions
1. Optionally change the foundSchemas/knownRecordTypes exception message
1. Squash and rebase on master




---


[GitHub] nifi pull request #2034: NIFI-4215 Fixed stackoverflow error when NiFi tries...

2017-07-29 Thread jvwing
Github user jvwing commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2034#discussion_r130225722
  
--- Diff: 
nifi-nar-bundles/nifi-extension-utils/nifi-record-utils/nifi-avro-record-utils/src/main/java/org/apache/nifi/avro/AvroTypeUtil.java
 ---
@@ -218,6 +218,15 @@ private static Schema nullable(final Schema schema) {
  * @return a Data Type that corresponds to the given Avro Schema
  */
 public static DataType determineDataType(final Schema avroSchema) {
+return determineDataType(avroSchema, new HashMap<>());
+}
+
+public static DataType determineDataType(final Schema avroSchema, Map<String, DataType> knownRecordTypes) {
+
+if (knownRecordTypes == null) {
+throw new IllegalArgumentException("'foundSchemas' cannot be null.");
--- End diff --

Did you mean this to say "knownRecordTypes" rather than "foundSchemas"?


---


[GitHub] nifi issue #1126: NIFI-1769: Implemented SSE with KMS.

2017-07-26 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/1126
  
@baank - Thanks for being willing to work on this.  A new pull request 
might be easier.


---


[GitHub] nifi issue #2034: NIFI-4215 Fixed stackoverflow error when NiFi tries to par...

2017-07-26 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/2034
  
Thanks @pvillard31!

@Wesley-Lawrence Don't worry about squashing yet, we can do that as a final 
step.


---


[GitHub] nifi issue #2034: NIFI-4215 Fixed stackoverflow error when NiFi tries to par...

2017-07-25 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/2034
  
@Wesley-Lawrence Thanks for putting this PR together.  I am still reviewing 
- the only immediate feedback I can give is that I would prefer not to update 
the checkstyle definition in pom.xml as part of this fix for the 
AvroSchemaRegistry issue.


---


[GitHub] nifi issue #2034: NIFI-4215 Fixed stackoverflow error when NiFi tries to par...

2017-07-25 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/2034
  
Reviewing...


---


[GitHub] nifi issue #2030: NIFI-4212 - RethinkDB Delete Processor

2017-07-24 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/2030
  
Overall, this looks good - tests pass, integration tests pass, the refactoring 
of the integration test credentials is nice, and the code reads well.  In my 
basic testing, it deleted objects from RethinkDB just fine.  I have a few 
questions about the **Return old Changes from the delete** option and how it is 
processed.  I am not very familiar with RethinkDB, so I had to check the 
[delete docs](https://rethinkdb.com/api/java/delete/) on how it works, and I 
have no practical experience using this option.

* Instead of a true/false dropdown list, you made this a free text field 
supporting expression language.  May I ask why?  I think it would be easier to 
edit with a dropdown, and easier to understand if we made the options more 
explanatory ("Capture changes as FlowFile content", "Do not capture changes", 
or something like that) -- see the sketch after this list.
* If the option is true, the processor writes the changes to the 
`rethinkdb.delete.changes` attribute, also includes them in the raw 
`rethinkdb.delete.result` attribute, and finally writes them to the flowfile 
content.  Do we need all three of those?
* In the event that the deleted document is large, the attributes will be 
large, and potentially unusable due to truncation at 64KB.
* The modify provenance event is reported regardless of this setting.  
Shouldn't we only do that if a change is made?
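
To make the first bullet concrete, here is a rough sketch of the dropdown 
style I have in mind, using the same `AllowableValue`/`PropertyDescriptor` 
pattern already in this bundle; the property name, display strings, and 
descriptions below are placeholders, not a request for these exact words:

```java
public static final AllowableValue CAPTURE_CHANGES = new AllowableValue("true",
        "Capture changes as FlowFile content",
        "Return the old value of the deleted document and write it to the outgoing FlowFile");
public static final AllowableValue DISCARD_CHANGES = new AllowableValue("false",
        "Do not capture changes",
        "Delete the document without returning its old value");

// Placeholder property definition, shaped like the other descriptors in this bundle.
protected static final PropertyDescriptor RETURN_CHANGES = new PropertyDescriptor.Builder()
        .name("rethinkdb-delete-return-changes")
        .displayName("Return old Changes from the delete")
        .description("Whether the old value of the deleted document is captured as FlowFile content")
        .required(true)
        .allowableValues(CAPTURE_CHANGES, DISCARD_CHANGES)
        .defaultValue(DISCARD_CHANGES.getValue())
        .build();
```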



---


[GitHub] nifi issue #2030: NIFI-4212 - RethinkDB Delete Processor

2017-07-24 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/2030
  
Reviewing...


---


[GitHub] nifi issue #2025: NIFI-917 - improve rendering of component documentation

2017-07-23 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/2025
  
Thanks, @pvillard31, for taking the time to improve the documentation 
formatting. 


---


[GitHub] nifi issue #1972: NIFI-4147 Class org.apache.nifi.bootstrap.RunNiFi is not d...

2017-07-22 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/1972
  
@peter-gergely-horvath - Thanks for the update, it looks good.  I will 
merge shortly.


---


[GitHub] nifi issue #2012: NIFI-4188 - Nifi RethinkDB Get processor

2017-07-19 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/2012
  
Thanks for those updates, @mans2singh, everything looks good.  I'll merge 
shortly. 


---


[GitHub] nifi issue #2012: NIFI-4188 - Nifi RethinkDB Get processor

2017-07-18 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/2012
  
I'm OK with the name "GetRethinkDB".  Thanks for the tagging.


---


[GitHub] nifi issue #2012: NIFI-4188 - Nifi RethinkDB Get processor

2017-07-18 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/2012
  
My question about the not_found relationship is not whether we should have it 
-- I think it is a great addition.  My question is about the logging for that 
code path.  You log an error on line 150 when a flowfile is routed to 
not_found, and I'm asking if you believe not_found is "bad enough" to deserve 
an error.  It seems entirely possible that a key not being found in the table 
is a perfectly normal occurrence, and the flowfile would be routed to not_found 
without any worries.  As you point out, the user can choose whether to treat 
not_found separately from failure.  But it is logged as an error regardless of 
their choice.
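
For what it's worth, the routing I am picturing looks roughly like this 
fragment; `document` and `REL_NOT_FOUND` are assumptions on my part standing 
in for your actual variable and relationship names:

```java
// Route quietly; let the user decide how serious not_found is by how they
// wire the relationship, rather than logging an error here.
if (document == null) {
    getLogger().debug("Document with id {} was not found", new Object[] {id});
    session.transfer(flowFile, REL_NOT_FOUND);
    return;
}
```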


---


[GitHub] nifi pull request #2012: NIFI-4188 - Nifi RethinkDB Get processor

2017-07-18 Thread jvwing
Github user jvwing commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2012#discussion_r128149561
  
--- Diff: 
nifi-nar-bundles/nifi-rethinkdb-bundle/nifi-rethinkdb-processors/src/main/java/org/apache/nifi/processors/rethinkdb/GetRethinkDB.java
 ---
@@ -137,9 +135,9 @@ public void onTrigger(final ProcessContext context, 
final ProcessSession session
 String id = 
context.getProperty(RETHINKDB_DOCUMENT_ID).evaluateAttributeExpressions(flowFile).getValue();
 String readMode = 
context.getProperty(READ_MODE).evaluateAttributeExpressions(flowFile).getValue();
 
-if ( StringUtils.isEmpty(id) ) {
-getLogger().error("Empty id '" + id + "'");
-flowFile = session.putAttribute(flowFile, 
RETHINKDB_ERROR_MESSAGE, "Empty id '" + id + "'");
+if ( StringUtils.isBlank(id) ) {
--- End diff --

I do not recommend changing this to use `isBlank()`, because it will return 
true if the id is whitespace only, and I believe whitespace is in fact a valid 
RethinkDB key.

I'm sorry I was not clear about my concern.  In the earlier code using 
`isEmpty()`, the if block would be entered if id was null or an empty string.  
However, property values are never null; at worst they are an empty string.  
So in practice the block was only entered when id was an empty string.  Given 
that id was always an empty string at that point, there was no need to 
interpolate its value into the log message -- the result is always the same.

This appeared to be confirmed by your testBlankId() unit test that 
exercised this code and checked its value against `Blank id ''`.  It appears to 
be further confirmed by the new testNullId() unit test that checks against the 
exact same value. 

How about just logging it as `Document Identifier cannot be empty` or 
similar?
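
Concretely, the fragment I am picturing keeps `isEmpty()` and a constant 
message; `RETHINKDB_ERROR_MESSAGE` comes from the quoted diff, while 
`REL_FAILURE` and the exact wording are assumptions on my part:

```java
if (StringUtils.isEmpty(id)) {
    // The id is known to be empty here, so there is nothing worth interpolating.
    getLogger().error("Document Identifier cannot be empty");
    flowFile = session.putAttribute(flowFile, RETHINKDB_ERROR_MESSAGE,
            "Document Identifier cannot be empty");
    session.transfer(flowFile, REL_FAILURE);
    return;
}
```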


---


[GitHub] nifi pull request #2012: NIFI-4188 - Nifi RethinkDB Get processor

2017-07-17 Thread jvwing
Github user jvwing commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2012#discussion_r127879683
  
--- Diff: 
nifi-nar-bundles/nifi-rethinkdb-bundle/nifi-rethinkdb-processors/src/main/java/org/apache/nifi/processors/rethinkdb/GetRethinkDB.java
 ---
@@ -0,0 +1,197 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.rethinkdb;
+
+import org.apache.commons.lang3.StringUtils;
+import org.apache.nifi.annotation.behavior.EventDriven;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.SeeAlso;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.components.AllowableValue;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.expression.AttributeExpression.ResultType;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+
+import com.google.gson.Gson;
+
+import java.io.ByteArrayInputStream;
+import java.nio.charset.Charset;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
+@EventDriven
+@Tags({"rethinkdb", "get", "read"})
+@CapabilityDescription("Processor to get a JSON document from RethinkDB 
(https://www.rethinkdb.com/) using the document id. The FlowFile will contain 
the retrieved document")
+@WritesAttributes({
+@WritesAttribute(attribute = GetRethinkDB.RETHINKDB_ERROR_MESSAGE, 
description = "RethinkDB error message"),
+})
+@SeeAlso({PutRethinkDB.class})
+public class GetRethinkDB extends AbstractRethinkDBProcessor {
+
+public static AllowableValue READ_MODE_SINGLE = new 
AllowableValue("single", "Single", "Read values from memory from primary 
replica (Default)");
+public static AllowableValue READ_MODE_MAJORITY = new 
AllowableValue("majority", "Majority", "Read values commited to disk on 
majority of replicas");
+public static AllowableValue READ_MODE_OUTDATED = new 
AllowableValue("outdated", "Outdated", "Read values from memory from an 
arbitrary replica ");
+
+protected static final PropertyDescriptor READ_MODE = new 
PropertyDescriptor.Builder()
+.name("rethinkdb-read-mode")
+.displayName("Read Mode")
+.description("Read mode used for consistency")
+.required(true)
+.defaultValue(READ_MODE_SINGLE.getValue())
+.allowableValues(READ_MODE_SINGLE, READ_MODE_MAJORITY, 
READ_MODE_OUTDATED)
+.expressionLanguageSupported(true)
+.build();
+
+public static final PropertyDescriptor RETHINKDB_DOCUMENT_ID = new 
PropertyDescriptor.Builder()
+.displayName("Document Identifier")
+.name("rethinkdb-document-identifier")
+.description("A FlowFile attribute, or attribute expression 
used " +
+"for determining RethinkDB key for the Flow File content")
+.required(true)
+
.addValidator(StandardValidators.createAttribute

[GitHub] nifi pull request #2012: NIFI-4188 - Nifi RethinkDB Get processor

2017-07-17 Thread jvwing
Github user jvwing commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2012#discussion_r127882745
  
--- Diff: 
nifi-nar-bundles/nifi-rethinkdb-bundle/nifi-rethinkdb-processors/src/main/java/org/apache/nifi/processors/rethinkdb/GetRethinkDB.java
 ---
@@ -0,0 +1,197 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.rethinkdb;
+
+import org.apache.commons.lang3.StringUtils;
+import org.apache.nifi.annotation.behavior.EventDriven;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.SeeAlso;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.components.AllowableValue;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.expression.AttributeExpression.ResultType;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+
+import com.google.gson.Gson;
+
+import java.io.ByteArrayInputStream;
+import java.nio.charset.Charset;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
+@EventDriven
+@Tags({"rethinkdb", "get", "read"})
+@CapabilityDescription("Processor to get a JSON document from RethinkDB 
(https://www.rethinkdb.com/) using the document id. The FlowFile will contain 
the retrieved document")
+@WritesAttributes({
+@WritesAttribute(attribute = GetRethinkDB.RETHINKDB_ERROR_MESSAGE, 
description = "RethinkDB error message"),
+})
+@SeeAlso({PutRethinkDB.class})
+public class GetRethinkDB extends AbstractRethinkDBProcessor {
+
+public static AllowableValue READ_MODE_SINGLE = new 
AllowableValue("single", "Single", "Read values from memory from primary 
replica (Default)");
+public static AllowableValue READ_MODE_MAJORITY = new 
AllowableValue("majority", "Majority", "Read values commited to disk on 
majority of replicas");
+public static AllowableValue READ_MODE_OUTDATED = new 
AllowableValue("outdated", "Outdated", "Read values from memory from an 
arbitrary replica ");
+
+protected static final PropertyDescriptor READ_MODE = new 
PropertyDescriptor.Builder()
+.name("rethinkdb-read-mode")
+.displayName("Read Mode")
+.description("Read mode used for consistency")
+.required(true)
+.defaultValue(READ_MODE_SINGLE.getValue())
+.allowableValues(READ_MODE_SINGLE, READ_MODE_MAJORITY, 
READ_MODE_OUTDATED)
+.expressionLanguageSupported(true)
+.build();
+
+public static final PropertyDescriptor RETHINKDB_DOCUMENT_ID = new 
PropertyDescriptor.Builder()
+.displayName("Document Identifier")
+.name("rethinkdb-document-identifier")
+.description("A FlowFile attribute, or attribute expression 
used " +
+"for determining RethinkDB key for the Flow File content")
+.required(true)
+
.addValidator(StandardValidators.createAttribute

[GitHub] nifi pull request #2012: NIFI-4188 - Nifi RethinkDB Get processor

2017-07-17 Thread jvwing
Github user jvwing commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2012#discussion_r127880492
  
--- Diff: 
nifi-nar-bundles/nifi-rethinkdb-bundle/nifi-rethinkdb-processors/src/main/java/org/apache/nifi/processors/rethinkdb/GetRethinkDB.java
 ---
@@ -0,0 +1,197 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.rethinkdb;
+
+import org.apache.commons.lang3.StringUtils;
+import org.apache.nifi.annotation.behavior.EventDriven;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.SeeAlso;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.components.AllowableValue;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.expression.AttributeExpression.ResultType;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+
+import com.google.gson.Gson;
+
+import java.io.ByteArrayInputStream;
+import java.nio.charset.Charset;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
+@EventDriven
+@Tags({"rethinkdb", "get", "read"})
+@CapabilityDescription("Processor to get a JSON document from RethinkDB 
(https://www.rethinkdb.com/) using the document id. The FlowFile will contain 
the retrieved document")
+@WritesAttributes({
+@WritesAttribute(attribute = GetRethinkDB.RETHINKDB_ERROR_MESSAGE, 
description = "RethinkDB error message"),
+})
+@SeeAlso({PutRethinkDB.class})
+public class GetRethinkDB extends AbstractRethinkDBProcessor {
+
+public static AllowableValue READ_MODE_SINGLE = new 
AllowableValue("single", "Single", "Read values from memory from primary 
replica (Default)");
+public static AllowableValue READ_MODE_MAJORITY = new 
AllowableValue("majority", "Majority", "Read values commited to disk on 
majority of replicas");
+public static AllowableValue READ_MODE_OUTDATED = new 
AllowableValue("outdated", "Outdated", "Read values from memory from an 
arbitrary replica ");
+
+protected static final PropertyDescriptor READ_MODE = new 
PropertyDescriptor.Builder()
+.name("rethinkdb-read-mode")
+.displayName("Read Mode")
+.description("Read mode used for consistency")
+.required(true)
+.defaultValue(READ_MODE_SINGLE.getValue())
+.allowableValues(READ_MODE_SINGLE, READ_MODE_MAJORITY, 
READ_MODE_OUTDATED)
+.expressionLanguageSupported(true)
+.build();
+
+public static final PropertyDescriptor RETHINKDB_DOCUMENT_ID = new 
PropertyDescriptor.Builder()
+.displayName("Document Identifier")
+.name("rethinkdb-document-identifier")
+.description("A FlowFile attribute, or attribute expression 
used " +
+"for determining RethinkDB key for the Flow File content")
+.required(true)
+
.addValidator(StandardValidators.createAttribute

[GitHub] nifi pull request #2012: NIFI-4188 - Nifi RethinkDB Get processor

2017-07-17 Thread jvwing
Github user jvwing commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2012#discussion_r127880998
  
--- Diff: 
nifi-nar-bundles/nifi-rethinkdb-bundle/nifi-rethinkdb-processors/src/main/java/org/apache/nifi/processors/rethinkdb/GetRethinkDB.java
 ---
@@ -0,0 +1,197 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.rethinkdb;
+
+import org.apache.commons.lang3.StringUtils;
+import org.apache.nifi.annotation.behavior.EventDriven;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.SeeAlso;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.components.AllowableValue;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.expression.AttributeExpression.ResultType;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+
+import com.google.gson.Gson;
+
+import java.io.ByteArrayInputStream;
+import java.nio.charset.Charset;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
+@EventDriven
+@Tags({"rethinkdb", "get", "read"})
+@CapabilityDescription("Processor to get a JSON document from RethinkDB 
(https://www.rethinkdb.com/) using the document id. The FlowFile will contain 
the retrieved document")
+@WritesAttributes({
+@WritesAttribute(attribute = GetRethinkDB.RETHINKDB_ERROR_MESSAGE, 
description = "RethinkDB error message"),
+})
+@SeeAlso({PutRethinkDB.class})
+public class GetRethinkDB extends AbstractRethinkDBProcessor {
+
+public static AllowableValue READ_MODE_SINGLE = new 
AllowableValue("single", "Single", "Read values from memory from primary 
replica (Default)");
+public static AllowableValue READ_MODE_MAJORITY = new 
AllowableValue("majority", "Majority", "Read values commited to disk on 
majority of replicas");
+public static AllowableValue READ_MODE_OUTDATED = new 
AllowableValue("outdated", "Outdated", "Read values from memory from an 
arbitrary replica ");
+
+protected static final PropertyDescriptor READ_MODE = new 
PropertyDescriptor.Builder()
+.name("rethinkdb-read-mode")
+.displayName("Read Mode")
+.description("Read mode used for consistency")
+.required(true)
+.defaultValue(READ_MODE_SINGLE.getValue())
+.allowableValues(READ_MODE_SINGLE, READ_MODE_MAJORITY, 
READ_MODE_OUTDATED)
+.expressionLanguageSupported(true)
+.build();
+
+public static final PropertyDescriptor RETHINKDB_DOCUMENT_ID = new 
PropertyDescriptor.Builder()
+.displayName("Document Identifier")
+.name("rethinkdb-document-identifier")
+.description("A FlowFile attribute, or attribute expression 
used " +
+"for determining RethinkDB key for the Flow File content")
+.required(true)
+
.addValidator(StandardValidators.createAttribute

[GitHub] nifi pull request #2012: NIFI-4188 - Nifi RethinkDB Get processor

2017-07-17 Thread jvwing
Github user jvwing commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2012#discussion_r127879552
  
--- Diff: 
nifi-nar-bundles/nifi-rethinkdb-bundle/nifi-rethinkdb-processors/src/main/java/org/apache/nifi/processors/rethinkdb/GetRethinkDB.java
 ---
@@ -0,0 +1,197 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.rethinkdb;
+
+import org.apache.commons.lang3.StringUtils;
+import org.apache.nifi.annotation.behavior.EventDriven;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.SeeAlso;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.components.AllowableValue;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.expression.AttributeExpression.ResultType;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+
+import com.google.gson.Gson;
+
+import java.io.ByteArrayInputStream;
+import java.nio.charset.Charset;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
+@EventDriven
+@Tags({"rethinkdb", "get", "read"})
+@CapabilityDescription("Processor to get a JSON document from RethinkDB 
(https://www.rethinkdb.com/) using the document id. The FlowFile will contain 
the retrieved document")
+@WritesAttributes({
+@WritesAttribute(attribute = GetRethinkDB.RETHINKDB_ERROR_MESSAGE, 
description = "RethinkDB error message"),
+})
+@SeeAlso({PutRethinkDB.class})
+public class GetRethinkDB extends AbstractRethinkDBProcessor {
+
+public static AllowableValue READ_MODE_SINGLE = new 
AllowableValue("single", "Single", "Read values from memory from primary 
replica (Default)");
+public static AllowableValue READ_MODE_MAJORITY = new 
AllowableValue("majority", "Majority", "Read values commited to disk on 
majority of replicas");
--- End diff --

Spell checker says `committed` instead of `commited`.


---


[GitHub] nifi issue #2012: NIFI-4188 - Nifi RethinkDB Get processor

2017-07-16 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/2012
  
Reviewing...


---


[GitHub] nifi pull request #1972: NIFI-4147 Class org.apache.nifi.bootstrap.RunNiFi i...

2017-07-15 Thread jvwing
Github user jvwing commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1972#discussion_r127591766
  
--- Diff: 
nifi-bootstrap/src/main/java/org/apache/nifi/bootstrap/RunNiFi.java ---
@@ -135,8 +135,8 @@
 private volatile Set<Future> loggingFutures = new HashSet<>(2);
 private final NotificationServiceManager serviceManager;
 
-public RunNiFi(final File bootstrapConfigFile, final boolean verbose) 
throws IOException {
--- End diff --

I do not recommend changing the signature of a public constructor.  
Although there is no other use of the RunNiFi constructor in the NiFi project, 
someone might be using it in their own code, elsewhere.  Is this essential to 
your testability goal?  It does look like we're not sure whether or how to use 
`verbose`, but that might be a different ticket.  If it is essential, how about 
a second constructor?

With respect to the `File bootstrapConfigFile` argument, I understand there 
is a difference because you make `getBootstrapConfigFile()` an instance method, 
so RunNiFi will not separately call `getBootstrapConfigFile()` first.  What if 
we allowed that as a nullable argument, accepting the file if given and looking 
it up when `null` is passed in?  Again, I think this would preserve 
compatibility with possible existing callers.
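
To sketch what I mean, here is a minimal, self-contained stand-in; the class 
name, system property, and fallback path below are made up for illustration 
and are not the real RunNiFi details:

```java
import java.io.File;
import java.io.IOException;

// Illustration only: a stand-in class, not the real RunNiFi. It shows how the
// existing public signature could stay intact while still accepting an
// externally supplied (or null) bootstrap config file.
public class BootstrapRunner {

    private final File bootstrapConfigFile;
    private final boolean verbose;

    // Same shape as the existing public constructor, preserved for external callers.
    public BootstrapRunner(final File bootstrapConfigFile, final boolean verbose) throws IOException {
        // Accept the file if given; fall back to a default lookup when null is passed in.
        this.bootstrapConfigFile = (bootstrapConfigFile != null)
                ? bootstrapConfigFile
                : locateDefaultBootstrapConfigFile();
        this.verbose = verbose;
    }

    // Hypothetical default lookup; the property name and fallback path are made up.
    private static File locateDefaultBootstrapConfigFile() throws IOException {
        final String path = System.getProperty("org.apache.nifi.bootstrap.config.file",
                "conf/bootstrap.conf");
        return new File(path).getCanonicalFile();
    }
}
```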


---


[GitHub] nifi issue #1972: NIFI-4147 Class org.apache.nifi.bootstrap.RunNiFi is not d...

2017-07-15 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/1972
  
@peter-gergely-horvath Thank you for submitting this PR, improving 
testability is always welcome.  I will be happy to review.


---


[GitHub] nifi issue #1888: NIFI-4015 NIFI-3999 Fix DeleteSQS Issues

2017-07-15 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/1888
  
@brosander I added an update to keep the existing attribute name.  Please 
let me know if/when we need a squash and rebase.


---


[GitHub] nifi issue #1998: NIFI-4145 Added missing write attribute annotation

2017-07-10 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/1998
  
@mans2singh Looks good, thanks for taking the time to fix this.


---


[GitHub] nifi issue #1998: NIFI-4145 Added missing write attribute annotation

2017-07-10 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/1998
  
Reviewing...


---


[GitHub] nifi issue #1982: NIFI-4154 - fix line endings in source code

2017-07-09 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/1982
  
Thanks @trkurc, @m-hogue.  I confirmed the whitespace-only changes in the 
diffs and ran a full build with contrib-check, which passed.  I'll merge shortly.


---


[GitHub] nifi issue #1942: NIFI-4118 First commit of RethinkDB put processor

2017-07-09 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/1942
  
Thanks @mans2singh!  This has been merged in.


---


[GitHub] nifi pull request #1942: NIFI-4118 First commit of RethinkDB put processor

2017-07-08 Thread jvwing
Github user jvwing commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1942#discussion_r126293212
  
--- Diff: 
nifi-nar-bundles/nifi-rethinkdb-bundle/nifi-rethinkdb-processors/src/test/java/org/apache/nifi/processors/rethinkdb/ITPutRethinkDBTest.java
 ---
@@ -0,0 +1,216 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.rethinkdb;
+
+import static org.junit.Assert.assertEquals;
+import static org.junit.Assert.assertNotNull;
+import java.util.List;
+import org.apache.nifi.util.MockFlowFile;
+import org.apache.nifi.util.TestRunner;
+import org.apache.nifi.util.TestRunners;
+import org.json.simple.JSONArray;
+import org.junit.After;
+import org.junit.Before;
+import org.junit.Ignore;
+import org.junit.Test;
+
+import com.rethinkdb.RethinkDB;
+import com.rethinkdb.net.Connection;
+
+import net.minidev.json.JSONObject;
+
+/**
+ * Integration test for RethinkDB. Please ensure that the RethinkDB is 
running
+ * on local host with default port and has database test with table test 
and user
+ * admin with password admin before running the integration tests or set 
the attributes in the
+ * test accordingly.
+ */
+@Ignore("Comment this out for running tests against a real instance of 
RethinkDB")
+public class ITPutRethinkDBTest {
+private TestRunner runner;
+private Connection connection;
+private String dbName = "test";
+private String dbHost = "localhost";
+private String dbPort = "28015";
+private String user = "admin";
+private String password = "admin";
+private String table = "test";
+
+@Before
+public void setUp() throws Exception {
+runner = TestRunners.newTestRunner(PutRethinkDB.class);
+runner.setProperty(PutRethinkDB.DB_NAME, dbName);
+runner.setProperty(PutRethinkDB.DB_HOST, dbHost);
+runner.setProperty(PutRethinkDB.DB_PORT, dbPort);
+runner.setProperty(PutRethinkDB.USERNAME, user);
+runner.setProperty(PutRethinkDB.PASSWORD, password);
+runner.setProperty(PutRethinkDB.TABLE_NAME, table);
+runner.setProperty(PutRethinkDB.CHARSET, "UTF-8");
+runner.setProperty(PutRethinkDB.CONFLICT_STRATEGY, 
PutRethinkDB.CONFLICT_STRATEGY_UPDATE);
+runner.setProperty(PutRethinkDB.DURABILITY, 
PutRethinkDB.DURABILITY_HARD);
+runner.setProperty(PutRethinkDB.MAX_DOCUMENTS_SIZE, "1 KB");
+runner.assertValid();
+
+connection = RethinkDB.r.connection().user(user, 
password).db(dbName).hostname(dbHost).port(Integer.parseInt(dbPort)).connect();
+}
+
+@After
+public void tearDown() throws Exception {
+runner = null;
+connection.close();
+connection = null;
+}
+
+@Test
+public void testValidSingleMessage() {
+RethinkDB.r.db(dbName).table(table).delete().run(connection);
+long count = 
RethinkDB.r.db(dbName).table(table).count().run(connection);
+assertEquals("Count should be same", 0L, count);
+
+JSONObject message = new JSONObject();
+message.put("hello", "rethinkdb");
+byte [] bytes = message.toJSONString().getBytes();
+runner.enqueue(bytes);
+runner.run(1,true,true);
+runner.assertAllFlowFilesTransferred(PutRethinkDB.REL_SUCCESS, 1);
+
+List flowFiles = 
runner.getFlowFilesForRelationship(PutRethinkDB.REL_SUCCESS);
+
assertEquals(flowFiles.get(0).getAttribute(PutRethinkDB.RETHINKDB_INSERT_RESULT_DELETED_KEY),
 "0");
+
assertEquals(flowFiles.get(0).getAttribute(PutRethinkDB.RETHINKDB_INSERT_RESULT_ERROR_KEY),"0");
+
assertNotNull(flowFiles.get(0).getAttribute(PutRethin

[GitHub] nifi pull request #1942: NIFI-4118 First commit of RethinkDB put processor

2017-07-08 Thread jvwing
Github user jvwing commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1942#discussion_r126293188
  
--- Diff: 
nifi-nar-bundles/nifi-rethinkdb-bundle/nifi-rethinkdb-processors/src/main/java/org/apache/nifi/processors/rethinkdb/PutRethinkDB.java
 ---
@@ -0,0 +1,237 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.rethinkdb;
+
+import org.apache.nifi.annotation.behavior.EventDriven;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.components.AllowableValue;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.json.simple.parser.JSONParser;
+import com.rethinkdb.gen.ast.Insert;
+import java.io.ByteArrayOutputStream;
+import java.nio.charset.Charset;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
+@EventDriven
+@SupportsBatching
+@Tags({"rethinkdb", "stream","insert", "update", "write", "put"})
+@CapabilityDescription("Processor to write the JSON content of a FlowFile 
to RethinkDB (https://www.rethinkdb.com/). The flow file should contain either 
JSON Object an array of JSON documents")
+@WritesAttributes({
+@WritesAttribute(attribute = PutRethinkDB.RETHINKDB_ERROR_MESSAGE, 
description = "RethinkDB error message"),
+@WritesAttribute(attribute = 
PutRethinkDB.RETHINKDB_INSERT_RESULT_ERROR_KEY, description = "Error count 
while inserting documents"),
+@WritesAttribute(attribute = 
PutRethinkDB.RETHINKDB_INSERT_RESULT_DELETED_KEY, description = "Number of 
documents deleted"),
+@WritesAttribute(attribute = 
PutRethinkDB.RETHINKDB_INSERT_RESULT_GENERATED_KEYS_KEY, description = "Keys 
generated on inserting documents"),
+@WritesAttribute(attribute = 
PutRethinkDB.RETHINKDB_INSERT_RESULT_INSERTED_KEY, description = "Number of 
documents inserted"),
+@WritesAttribute(attribute = 
PutRethinkDB.RETHINKDB_INSERT_RESULT_REPLACED_KEY, description = "Number of 
documents replaced"),
+@WritesAttribute(attribute = 
PutRethinkDB.RETHINKDB_INSERT_RESULT_SKIPPED_KEY, description = "Number of 
documents skipped because they already existed"),
+@WritesAttribute(attribute = 
PutRethinkDB.RETHINKDB_INSERT_RESULT_UNCHANGED_KEY, description = "Number of 
documents unchanged since they already existed"),
+@WritesAttribute(attribute = 
PutRethinkDB.RETHINKDB_INSERT_RESULT_FIRST_ERROR_KEY, description = "First 
error while inserting documents"),
+@WritesAttribute(attribute = 
PutRethinkDB.RETHINKDB_INSERT_RESULT_WARNINGS_KEY, description = "Warning 
message in case of large number of ids being returned on insertion")
+})
+public class PutRethinkDB extends AbstractRethinkDBProcessor {
+
+public static AllowableValue CONFLICT_STRATEGY_UPDATE = new 
AllowableValue("update", "Update", "Update the document having same id with new 
values");
+public static Allowab

[GitHub] nifi issue #1942: NIFI-4118 First commit of RethinkDB put processor

2017-07-04 Thread jvwing
Github user jvwing commented on the issue:

https://github.com/apache/nifi/pull/1942
  
@mans2singh, thanks for contributing this!  I have a few general comments 
from a first pass:

* The notice information for RethinkDB should be included in the 
nifi-assembly/NOTICE file.
* The RethinkDB project styles its name as "RethinkDB" with capitalized R, 
D, and B.  I recommend we do the same in all of the user-visible processor 
naming, docs, property descriptions, etc.
* The processor docs should include a link to the RethinkDB project.
* Thanks for including a comprehensive test suite with solid code coverage. 
 The integration tests also ran fine for me.

And I have a question about connection management.  I don't have any 
experience with RethinkDB.  A brief search for documentation suggests that the 
Java connection might be thread safe ([RethinkDB Issue 
#1041](https://github.com/rethinkdb/docs/issues/1041)).  Is that your 
understanding?
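
If the shared connection does turn out to be safe to reuse, the lifecycle I 
would expect looks roughly like the sketch below; the class and method names 
are mine, and whether a single driver connection can really be shared across 
concurrent onTrigger calls is exactly the open question:

```java
import com.rethinkdb.RethinkDB;
import com.rethinkdb.net.Connection;

// Illustration only: class and method names are mine. The open question is whether
// a single Connection like this can safely be shared across concurrent onTrigger calls.
public class RethinkDBConnectionLifecycle {

    private volatile Connection connection;

    // Intended to run once when the processor is scheduled (e.g. from @OnScheduled).
    public void connect(String user, String password, String dbName, String host, int port) {
        connection = RethinkDB.r.connection()
                .user(user, password)
                .db(dbName)
                .hostname(host)
                .port(port)
                .connect();
    }

    // Intended to run when the processor is stopped (e.g. from @OnStopped).
    public void disconnect() {
        if (connection != null) {
            connection.close();
            connection = null;
        }
    }
}
```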


---

