[GitHub] nifi pull request: NIFI-1180: Modify PutS3Object to enable encrypt...

2016-02-26 Thread adamonduty
Github user adamonduty commented on a diff in the pull request:

https://github.com/apache/nifi/pull/246#discussion_r54327468
  
--- Diff: 
nifi-nar-bundles/nifi-aws-bundle/nifi-aws-processors/src/main/java/org/apache/nifi/processors/aws/s3/PutS3Object.java
 ---
@@ -177,10 +179,19 @@
 .addValidator(StandardValidators.TIME_PERIOD_VALIDATOR)
 .build();
 
+public static final PropertyDescriptor SERVER_SIDE_ENCRYPTION = new 
PropertyDescriptor.Builder()
+.name("Server Side Encryption")
--- End diff --

That makes sense, but there are also 6 instances above this code that don't 
follow the same rule. Which is best: match the existing style, or match the 
rule?


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[GitHub] nifi pull request: NIFI-1180: Modify PutS3Object to enable encrypt...

2016-02-26 Thread alopresto
Github user alopresto commented on a diff in the pull request:

https://github.com/apache/nifi/pull/246#discussion_r54322948
  
--- Diff: 
nifi-nar-bundles/nifi-aws-bundle/nifi-aws-processors/src/main/java/org/apache/nifi/processors/aws/s3/PutS3Object.java
 ---
@@ -177,10 +179,19 @@
 .addValidator(StandardValidators.TIME_PERIOD_VALIDATOR)
 .build();
 
+public static final PropertyDescriptor SERVER_SIDE_ENCRYPTION = new 
PropertyDescriptor.Builder()
+.name("Server Side Encryption")
--- End diff --

`.name()` should be used to set a unique name (in the form 
`"server-side-encryption"`). `.displayName()` can be used for a human-readable 
form as above. 

```java
public static final PropertyDescriptor SERVER_SIDE_ENCRYPTION = new PropertyDescriptor.Builder()
        .name("server-side-encryption")
        .displayName("Server Side Encryption")
        .description("Specifies the algorithm used for server side encryption.")
        .required(true)
        .allowableValues(NO_SERVER_SIDE_ENCRYPTION, ObjectMetadata.AES_256_SERVER_SIDE_ENCRYPTION)
        .defaultValue(NO_SERVER_SIDE_ENCRYPTION)
        .build();
```




Re: [VOTE] Release Apache NiFi 0.5.1 (RC2)

2016-02-26 Thread Joe Skora
+1 Release this package as nifi-0.5.1 (non-binding)

* signature checks out
* hashes check out
* binary and files look good
* works as expected


On Tue, Feb 23, 2016 at 9:32 PM, Tony Kurc  wrote:

> Hello,
> I am pleased to be calling this vote for the source release of Apache NiFi
> nifi-0.5.1.
>
> The source zip, including signatures, digests, etc. can be found at:
> https://repository.apache.org/content/repositories/orgapachenifi-1076
>
> The Git tag is nifi-0.5.1-RC2
> The Git commit ID is 672211b87b4f1e52f8ee5153c26a467b555a331e
>
> https://git-wip-us.apache.org/repos/asf?p=nifi.git;a=commit;h=672211b87b4f1e52f8ee5153c26a467b555a331e
>
> This release candidate is a branch off of support/nifi-0.5.x at
> e2005fa059fbe128e2e278cda5ed7a27ab6e1ec3
>
> Checksums of nifi-0.5.1-source-release.zip:
> MD5: 9139aaae5d0a42a0fbbb624c2e739cdd
> SHA1: 374a24354f54c7b6c04ba0897f8e2e0b9a6a5127
>
> Release artifacts are signed with the following key:
> https://people.apache.org/keys/committer/tkurc.asc
>
> KEYS file available here:
> https://dist.apache.org/repos/dist/release/nifi/KEYS
>
> 12 issues were closed/resolved for this release:
>
> https://issues.apache.org/jira/secure/ReleaseNote.jspa?projectId=12316020&version=12334887
>
> Release note highlights can be found here:
>
> https://cwiki.apache.org/confluence/display/NIFI/Release+Notes#ReleaseNotes-Version0.5.1
>
> The vote will be open for 72 hours.
> Please download the release candidate and evaluate the necessary items
> including checking hashes, signatures, build from source, and test. Then
> please vote:
>
> [ ] +1 Release this package as nifi-0.5.1
> [ ] +0 no opinion
> [ ] -1 Do not release this package because...
>
> Thanks!
> Tony
>


[GitHub] nifi pull request: NIFI-1568: Add Filter Capability to UnpackConte...

2016-02-26 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/248#discussion_r54299720
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/UnpackContent.java
 ---
@@ -202,11 +213,11 @@ public void onTrigger(final ProcessContext context, 
final ProcessSession session
 final boolean addFragmentAttrs;
 switch (packagingFormat) {
 case TAR_FORMAT:
--- End diff --

I would. And if there are ways to improve test coverage that would help 
eliminate the "fear it would break" in the future ;)




[GitHub] nifi pull request: NIFI-901: Add QueryCassandra and PutCassandraQL...

2016-02-26 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/237#discussion_r54299220
  
--- Diff: 
nifi-nar-bundles/nifi-cassandra-bundle/nifi-cassandra-processors/src/main/java/org/apache/nifi/processors/cassandra/AbstractCassandraProcessor.java
 ---
@@ -0,0 +1,441 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.cassandra;
+
+import com.datastax.driver.core.Cluster;
+import com.datastax.driver.core.ConsistencyLevel;
+import com.datastax.driver.core.DataType;
+import com.datastax.driver.core.Metadata;
+import com.datastax.driver.core.Row;
+import com.datastax.driver.core.SSLOptions;
+import com.datastax.driver.core.Session;
+import org.apache.avro.Schema;
+import org.apache.avro.SchemaBuilder;
+import org.apache.commons.lang3.StringUtils;
+import org.apache.nifi.authorization.exception.ProviderCreationException;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.PropertyValue;
+import org.apache.nifi.components.ValidationContext;
+import org.apache.nifi.components.ValidationResult;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.logging.ProcessorLog;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.security.util.SslContextFactory;
+import org.apache.nifi.ssl.SSLContextService;
+
+import javax.net.ssl.SSLContext;
+import java.net.InetSocketAddress;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collection;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import java.util.concurrent.atomic.AtomicReference;
+
+/**
+ * AbstractCassandraProcessor is a base class for Cassandra processors and 
contains logic and variables common to most
+ * processors integrating with Apache Cassandra.
+ */
+public abstract class AbstractCassandraProcessor extends AbstractProcessor 
{
+
+private static final Validator HOSTNAME_PORT_VALIDATOR = new 
Validator() {
+@Override
--- End diff --

Don't you think it would be easier to follow a common convention and simply 
have 'host' and 'port' properties? 
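For context, a combined hostname:port check of the kind the diff's 
HOSTNAME_PORT_VALIDATOR implements can be sketched in plain Java. This is a 
hypothetical standalone illustration (the class and method names are mine, and 
NiFi's actual `org.apache.nifi.components.Validator` interface is omitted); it 
shows the extra parsing burden that separate 'host' and 'port' properties would 
avoid:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Hypothetical standalone check mirroring what a combined hostname:port
// validator has to do: split the string, then range-check the port.
public class HostnamePortCheck {

    // Hostname characters (letters, digits, dots, hyphens), a colon,
    // then one to five digits for the port.
    private static final Pattern HOSTNAME_PORT =
            Pattern.compile("^[A-Za-z0-9.-]+:(\\d{1,5})$");

    public static boolean isValidHostnamePort(String input) {
        if (input == null) {
            return false;
        }
        Matcher m = HOSTNAME_PORT.matcher(input);
        if (!m.matches()) {
            return false;
        }
        // The regex guarantees 1-5 digits, so parseInt cannot throw here.
        int port = Integer.parseInt(m.group(1));
        return port >= 1 && port <= 65535;
    }
}
```

With separate properties, the port half of this logic collapses to a stock 
PORT_VALIDATOR and the string splitting disappears entirely.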




[GitHub] nifi pull request: NIFI-1568: Add Filter Capability to UnpackConte...

2016-02-26 Thread rickysaltzer
Github user rickysaltzer commented on a diff in the pull request:

https://github.com/apache/nifi/pull/248#discussion_r54298517
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/UnpackContent.java
 ---
@@ -202,11 +213,11 @@ public void onTrigger(final ProcessContext context, 
final ProcessSession session
 final boolean addFragmentAttrs;
 switch (packagingFormat) {
 case TAR_FORMAT:
--- End diff --

Yeah, I noticed the same thing when I added the filter capability. I didn't 
want to do any major refactoring for fear it would break. I could give it a try 
though, because I agree, it doesn't make sense to run this logic for each 
incoming flowfile. 




[GitHub] nifi pull request: NIFI-1568: Add Filter Capability to UnpackConte...

2016-02-26 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/248#discussion_r54298253
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/UnpackContent.java
 ---
@@ -202,11 +213,11 @@ public void onTrigger(final ProcessContext context, 
final ProcessSession session
 final boolean addFragmentAttrs;
 switch (packagingFormat) {
 case TAR_FORMAT:
--- End diff --

Have you considered doing some restructuring of the _onTrigger()_ method?
Since the packaging format is based on a processor-wide configuration 
property (not a FlowFile attribute), most of the logic (the switch/case) could 
be performed once instead of every time _onTrigger()_ is called. That includes 
the new regex pattern compilation.
Or am I missing something big here?
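The restructuring being suggested can be sketched in plain Java. This is not 
the real UnpackContent code: the class, enum, and method names are stand-ins, 
and the NiFi @OnScheduled annotation is represented by an ordinary method. The 
point is simply that the format resolution and Pattern.compile happen once at 
schedule time rather than on every trigger:

```java
import java.util.regex.Pattern;

// Stand-in for UnpackContent: resolve the packaging format and compile the
// file-filter regex once when the processor is scheduled, not per FlowFile.
public class UnpackSketch {

    public enum Format { TAR, ZIP, FLOWFILE_V3 }

    private Format format;
    private Pattern fileFilter;

    // In a real processor this would be an @OnScheduled method reading
    // ProcessContext properties; here it takes plain strings.
    public void onScheduled(String formatName, String filterRegex) {
        this.format = Format.valueOf(formatName);
        this.fileFilter = Pattern.compile(filterRegex);
    }

    // The per-FlowFile path now only uses the precomputed state.
    public boolean accepts(String entryName) {
        return fileFilter.matcher(entryName).matches();
    }

    public Format getFormat() {
        return format;
    }
}
```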





[GitHub] nifi pull request: Nifi 1516 - AWS DynamoDB Get/Put/Delete Process...

2016-02-26 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/224#discussion_r54296973
  
--- Diff: 
nifi-nar-bundles/nifi-aws-bundle/nifi-aws-processors/src/main/java/org/apache/nifi/processors/aws/dynamodb/GetDynamoDB.java
 ---
@@ -0,0 +1,180 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.aws.dynamodb;
+
+import java.io.ByteArrayInputStream;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+
+import org.apache.commons.lang3.StringUtils;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.ReadsAttribute;
+import org.apache.nifi.annotation.behavior.ReadsAttributes;
+import org.apache.nifi.annotation.behavior.InputRequirement.Requirement;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.SeeAlso;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+
+import com.amazonaws.AmazonClientException;
+import com.amazonaws.AmazonServiceException;
+import com.amazonaws.services.dynamodbv2.document.BatchGetItemOutcome;
+import com.amazonaws.services.dynamodbv2.document.DynamoDB;
+import com.amazonaws.services.dynamodbv2.document.Item;
+import com.amazonaws.services.dynamodbv2.document.TableKeysAndAttributes;
+import com.amazonaws.services.dynamodbv2.model.AttributeValue;
+import com.amazonaws.services.dynamodbv2.model.KeysAndAttributes;
+
+@SupportsBatching
+@SeeAlso({DeleteDynamoDB.class, PutDynamoDB.class})
+@InputRequirement(Requirement.INPUT_REQUIRED)
+@Tags({"Amazon", "DynamoDB", "AWS", "Get", "Fetch"})
+@CapabilityDescription("Retrieves a document from DynamoDB based on hash 
and range key.  The key can be string or number."
++ "For any get request all the parimary keys are required (hash or 
hash and range based on the table keys)")
+@WritesAttributes({
+@WritesAttribute(attribute = 
AbstractDynamoDBProcessor.DYNAMODB_KEY_ERROR_UNPROCESSED, description = "Dynamo 
db unprocessed keys"),
+@WritesAttribute(attribute = 
AbstractDynamoDBProcessor.DYNAMODB_RANGE_KEY_VALUE_ERROR, description = 
"Dynamod db range key error"),
+@WritesAttribute(attribute = 
AbstractDynamoDBProcessor.DYNAMODB_KEY_ERROR_NOT_FOUND, description = "Dynamo 
db key not found"),
+@WritesAttribute(attribute = 
AbstractDynamoDBProcessor.DYNAMODB_ERROR_EXCEPTION_MESSAGE, description = 
"Dynamo db exception message"),
+@WritesAttribute(attribute = 
AbstractDynamoDBProcessor.DYNAMODB_ERROR_CODE, description = "Dynamo db error 
code"),
+@WritesAttribute(attribute = 
AbstractDynamoDBProcessor.DYNAMODB_ERROR_MESSAGE, description = "Dynamo db 
error message"),
+@WritesAttribute(attribute = 
AbstractDynamoDBProcessor.DYNAMODB_ERROR_TYPE, description = "Dynamo db error 
type"),
+@WritesAttribute(attribute = 
AbstractDynamoDBProcessor.DYNAMODB_ERROR_SERVICE, description = "Dynamo db 
error service"),
+@WritesAttribute(attribute = 
AbstractDynamoDBProcessor.DYNAMODB_ERROR_RETRYABLE, description = "Dynamo db 
error is retryable"),
+@WritesAttribute(attribute = 
AbstractDynamoDBProcessor.DYNAMODB_ERROR_REQUEST_ID, description = "Dynamo db 
error request id"),
+@WritesAttribute(attribute = 
AbstractDynamoDBProcessor.DYNAMODB_ERROR_STATUS_CODE, description = "Dynamo db 
status code")
+})
+@ReadsAttributes({
+@ReadsAttribute(attribute = 
AbstractDynamoDBProcessor.DYNAMODB_ITEM_HASH_KEY_VALUE, 

[GitHub] nifi pull request: Nifi 1516 - AWS DynamoDB Get/Put/Delete Process...

2016-02-26 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/224#discussion_r54296834
  
--- Diff: 
nifi-nar-bundles/nifi-aws-bundle/nifi-aws-processors/src/main/java/org/apache/nifi/processors/aws/dynamodb/DeleteDynamoDB.java
 ---
@@ -0,0 +1,152 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.aws.dynamodb;
+
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+
+import org.apache.commons.lang3.StringUtils;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.InputRequirement.Requirement;
+import org.apache.nifi.annotation.behavior.ReadsAttribute;
+import org.apache.nifi.annotation.behavior.ReadsAttributes;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.SeeAlso;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+
+import com.amazonaws.AmazonClientException;
+import com.amazonaws.AmazonServiceException;
+import com.amazonaws.services.dynamodbv2.document.BatchWriteItemOutcome;
+import com.amazonaws.services.dynamodbv2.document.DynamoDB;
+import com.amazonaws.services.dynamodbv2.document.TableWriteItems;
+
+@SupportsBatching
+@SeeAlso({GetDynamoDB.class, PutDynamoDB.class})
+@InputRequirement(Requirement.INPUT_REQUIRED)
+@Tags({"Amazon", "DynamoDB", "AWS", "Delete", "Remove"})
+@CapabilityDescription("Deletes a document from DynamoDB based on hash and 
range key. The key can be string or number."
++ " The request requires all the primary keys for the operation 
(hash or hash and range key)")
+@WritesAttributes({
+@WritesAttribute(attribute = 
AbstractDynamoDBProcessor.DYNAMODB_KEY_ERROR_UNPROCESSED, description = "Dynamo 
db unprocessed keys"),
+@WritesAttribute(attribute = 
AbstractDynamoDBProcessor.DYNAMODB_RANGE_KEY_VALUE_ERROR, description = 
"Dynamod db range key error"),
+@WritesAttribute(attribute = 
AbstractDynamoDBProcessor.DYNAMODB_KEY_ERROR_NOT_FOUND, description = "Dynamo 
db key not found"),
+@WritesAttribute(attribute = 
AbstractDynamoDBProcessor.DYNAMODB_ERROR_EXCEPTION_MESSAGE, description = 
"Dynamo db exception message"),
+@WritesAttribute(attribute = 
AbstractDynamoDBProcessor.DYNAMODB_ERROR_CODE, description = "Dynamo db error 
code"),
+@WritesAttribute(attribute = 
AbstractDynamoDBProcessor.DYNAMODB_ERROR_MESSAGE, description = "Dynamo db 
error message"),
+@WritesAttribute(attribute = 
AbstractDynamoDBProcessor.DYNAMODB_ERROR_TYPE, description = "Dynamo db error 
type"),
+@WritesAttribute(attribute = 
AbstractDynamoDBProcessor.DYNAMODB_ERROR_SERVICE, description = "Dynamo db 
error service"),
+@WritesAttribute(attribute = 
AbstractDynamoDBProcessor.DYNAMODB_ERROR_RETRYABLE, description = "Dynamo db 
error is retryable"),
+@WritesAttribute(attribute = 
AbstractDynamoDBProcessor.DYNAMODB_ERROR_REQUEST_ID, description = "Dynamo db 
error request id"),
+@WritesAttribute(attribute = 
AbstractDynamoDBProcessor.DYNAMODB_ERROR_STATUS_CODE, description = "Dynamo db 
status code")
+})
+@ReadsAttributes({
+@ReadsAttribute(attribute = 
AbstractDynamoDBProcessor.DYNAMODB_ITEM_HASH_KEY_VALUE, description = "Items 
hash key value" ),
+@ReadsAttribute(attribute = 
AbstractDynamoDBProcessor.DYNAMODB_ITEM_RANGE_KEY_VALUE, description = "Items 
range key value" ),
+})
+public class DeleteDynamoDB extends AbstractWriteDynamoDBProcessor {
+

[GitHub] nifi pull request: Nifi 1516 - AWS DynamoDB Get/Put/Delete Process...

2016-02-26 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/224#discussion_r54296295
  
--- Diff: 
nifi-nar-bundles/nifi-aws-bundle/nifi-aws-processors/src/main/java/org/apache/nifi/processors/aws/dynamodb/AbstractDynamoDBProcessor.java
 ---
@@ -0,0 +1,302 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.aws.dynamodb;
+
+import java.math.BigDecimal;
+import java.nio.charset.Charset;
+import java.util.ArrayList;
+import java.util.List;
+import java.util.Map;
+
+import org.apache.commons.lang3.StringUtils;
+import org.apache.nifi.components.AllowableValue;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.util.StandardValidators;
+import 
org.apache.nifi.processors.aws.AbstractAWSCredentialsProviderProcessor;
+
+import com.amazonaws.AmazonClientException;
+import com.amazonaws.AmazonServiceException;
+import com.amazonaws.ClientConfiguration;
+import com.amazonaws.auth.AWSCredentials;
+import com.amazonaws.auth.AWSCredentialsProvider;
+import com.amazonaws.services.dynamodbv2.AmazonDynamoDBClient;
+import com.amazonaws.services.dynamodbv2.document.DynamoDB;
+import com.amazonaws.services.dynamodbv2.model.AttributeValue;
+
+/**
+ * Base class for Nifi dynamo db related processors
+ *
+ * @see DeleteDynamoDB
+ * @see PutDynamoDB
+ * @see GetDynamoDB
+ */
+public abstract class AbstractDynamoDBProcessor extends 
AbstractAWSCredentialsProviderProcessor {
+
+public static final AllowableValue ALLOWABLE_VALUE_STRING = new 
AllowableValue("string");
+public static final AllowableValue ALLOWABLE_VALUE_NUMBER = new 
AllowableValue("number");
+
+public static final String DYNAMODB_KEY_ERROR_UNPROCESSED = 
"dynamodb.key.error.unprocessed";
+public static final String DYNAMODB_RANGE_KEY_VALUE_ERROR = 
"dynmodb.range.key.value.error";
+public static final String DYNAMODB_HASH_KEY_VALUE_ERROR = 
"dynmodb.hash.key.value.error";
+public static final String DYNAMODB_KEY_ERROR_NOT_FOUND = 
"dynamodb.key.error.not.found";
+public static final String DYNAMODB_ERROR_EXCEPTION_MESSAGE = 
"dynamodb.error.exception.message";
+public static final String DYNAMODB_ERROR_CODE = "dynamodb.error.code";
+public static final String DYNAMODB_ERROR_MESSAGE = 
"dynamodb.error.message";
+public static final String DYNAMODB_ERROR_TYPE = "dynamodb.error.type";
+public static final String DYNAMODB_ERROR_SERVICE = 
"dynamodb.error.service";
+public static final String DYNAMODB_ERROR_RETRYABLE = 
"dynamodb.error.retryable";
+public static final String DYNAMODB_ERROR_REQUEST_ID = 
"dynamodb.error.request.id";
+public static final String DYNAMODB_ERROR_STATUS_CODE = 
"dynamodb.error.status.code";
+public static final String DYNAMODB_ITEM_HASH_KEY_VALUE = "  
dynamodb.item.hash.key.value";
+public static final String DYNAMODB_ITEM_RANGE_KEY_VALUE = "  
dynamodb.item.range.key.value";
+
+protected static final String DYNAMODB_KEY_ERROR_NOT_FOUND_MESSAGE = 
"DynamoDB key not found : ";
+
+public static final PropertyDescriptor TABLE = new 
PropertyDescriptor.Builder()
+.name("Table Name")
+.required(true)
+.expressionLanguageSupported(false)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.description("The DynamoDB table name")
+.build();
+
+public static final PropertyDescriptor HASH_KEY_VALUE = new 
PropertyDescriptor.Builder()
+.name("Hash Key Value")
+.required(true)
+.expressionLanguageSupported(true)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.description("The hash key 

[GitHub] nifi pull request: Nifi 1516 - AWS DynamoDB Get/Put/Delete Process...

2016-02-26 Thread jvwing
Github user jvwing commented on the pull request:

https://github.com/apache/nifi/pull/224#issuecomment-189425709
  
I stand corrected.  The non-integration unit tests are importing 
CREDENTIALS_FILE from ITAbstractDynamoDBTest, although the integration test 
itself may not be called.

But I think the unit tests run by TravisCI should pass.  Either the tests 
requiring a file in the user's home folder are truly integration tests and need 
to be marked as such, or the file needs to get checked in to git.  I believe 
you have checked in a similar test credentials file while working on the 
AWSCredentialsProviderControllerService, and that seemed to work just fine.

It's also possible that the unit tests do not need a credentials file at 
all, since they won't make API calls to AWS.  Can't we just remove it from the 
tests?




[GitHub] nifi pull request: Nifi 1516 - AWS DynamoDB Get/Put/Delete Process...

2016-02-26 Thread mans2singh
Github user mans2singh commented on the pull request:

https://github.com/apache/nifi/pull/224#issuecomment-189404424
  
Hi James:
The unit tests are failing because the credentials file is not found in the 
home folder.  
DeleteDynamoDBTest.testStringHashStringRangeDeleteNoHashValueFailure:151 
Processor has 1 validation failures:
'Credentials File' validated against 
'/home/travis/aws-credentials.properties' is invalid because File 
/home/travis/aws-credentials.properties does not exist

My understanding was that tests marked with the IT prefix do not have to be 
ignored, so I had removed that annotation from them.  If that is required, 
please let me know and I will update them.
Thanks 

On Friday, February 26, 2016 9:18 AM, James Wing 
 wrote:
 

 I believe the unit tests are failing now, for some combination of the 
following reasons (see the TravisCI logs)
   - ITAbstractDynamoDBTest is no longer marked Ignore 
   - The file referenced in CREDENTIALS_FILE does not exist
   - Json Document is now required
—
Reply to this email directly or view it on GitHub.  

  




[GitHub] nifi pull request: Nifi 1516 - AWS DynamoDB Get/Put/Delete Process...

2016-02-26 Thread jvwing
Github user jvwing commented on the pull request:

https://github.com/apache/nifi/pull/224#issuecomment-189379141
  
I believe the unit tests are failing now, for some combination of the 
following reasons (see the [TravisCI 
logs](https://s3.amazonaws.com/archive.travis-ci.org/jobs/111920709/log.txt)) 

* ITAbstractDynamoDBTest is no longer marked Ignore 
* The file referenced in CREDENTIALS_FILE does not exist
* Json Document is now required




[GitHub] nifi pull request: NIFI-1464, NIFI-78, NIFI-1487 Refactored Proces...

2016-02-26 Thread markap14
Github user markap14 commented on the pull request:

https://github.com/apache/nifi/pull/210#issuecomment-189328683
  
Otherwise, I think all looks good. I definitely like the simplification of 
the code. I agree that the locking is no longer necessary. It was there because 
we wanted to atomically evaluate isRunning() and then update the processor, all 
as a single unit, without the processor being started in the meantime. However, 
this is not really a concern, since starting the processor with a separate 
thread would have to be done via the web tier, which will hold a lock anyway.

Great addition! Once the DTO & FlowSerializer are addressed, and the 
DISABLED state is added back in, I'll be a +1!




[GitHub] nifi pull request: NIFI-1464, NIFI-78, NIFI-1487 Refactored Proces...

2016-02-26 Thread markap14
Github user markap14 commented on the pull request:

https://github.com/apache/nifi/pull/210#issuecomment-189327203
  
We also have the same issue as above with the ProcessorDTO - in 
DtoFactory.createProcessorDto, we need to set the state to RUNNING or STOPPED; 
STARTING and STOPPING are not valid states for the DTO.




[GitHub] nifi pull request: NIFI-1464, NIFI-78, NIFI-1487 Refactored Proces...

2016-02-26 Thread markap14
Github user markap14 commented on the pull request:

https://github.com/apache/nifi/pull/210#issuecomment-189326513
  
In StandardFlowSerializer, Line 314, we write out the Processor's Scheduled 
State. This is done so that when we restore the flow on restart, we know what 
state the Processor is in. So if we are adding new states to Scheduled State, 
we need to be sure to account for that there. Specifically, I think we should 
just update StandardFlowSerializer to write out a state of RUNNING if the state 
is STARTING and a state of STOPPED if it's STOPPING. Then we don't have to 
worry about the code that restores the state because what is written to the 
flow.xml is the state that we want to use when we restore the flow.
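The mapping described above (collapse each transient state to its target state 
at serialization time) might look like the following sketch. The enum and 
method names here are illustrative, not NiFi's actual StandardFlowSerializer 
API:

```java
// Illustrative collapsing of transient scheduled states when writing flow.xml:
// STARTING is persisted as RUNNING, STOPPING as STOPPED, so that on restart
// the restored state is always a stable one.
public class ScheduledStateMapper {

    public enum ScheduledState { RUNNING, STOPPED, STARTING, STOPPING, DISABLED }

    public static ScheduledState toPersistedState(ScheduledState state) {
        switch (state) {
            case STARTING:
                return ScheduledState.RUNNING;
            case STOPPING:
                return ScheduledState.STOPPED;
            default:
                // RUNNING, STOPPED, and DISABLED persist as-is.
                return state;
        }
    }
}
```

With this in place, the restore path never has to special-case the transient 
states, because they can never appear in the serialized flow.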




[GitHub] nifi pull request: NIFI-1511 Incorporate Groovy unit tests as part...

2016-02-26 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/nifi/pull/220


---


[GitHub] nifi pull request: NIFI-1464, NIFI-78, NIFI-1487 Refactored Proces...

2016-02-26 Thread olegz
Github user olegz commented on the pull request:

https://github.com/apache/nifi/pull/210#issuecomment-189315181
  
Point taken; I wasn't sure what DISABLED meant in the context of a Processor
and thought it was a leftover that shouldn't be there. I will put it back and
add tests for it.


---


[GitHub] nifi pull request: NIFI-1464, NIFI-78, NIFI-1487 Refactored Proces...

2016-02-26 Thread markap14
Github user markap14 commented on the pull request:

https://github.com/apache/nifi/pull/210#issuecomment-189314275
  
@olegz I noticed in StandardProcessorNode.verifyCanStart, you allowed the 
Processor to start even if it's DISABLED. I think this is a bug that needs to 
be addressed.
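A hedged sketch of the guard being asked for. The real check lives in StandardProcessorNode.verifyCanStart; the enum below is a local stand-in for NiFi's ScheduledState, and the exception messages are made up for the example.

```java
// Illustrative guard for verifyCanStart: refuse to start a DISABLED
// processor, and only allow a start from STOPPED.
class StartGuardSketch {

    enum ScheduledState { RUNNING, STARTING, STOPPED, STOPPING, DISABLED }

    static void verifyCanStart(ScheduledState state) {
        if (state == ScheduledState.DISABLED) {
            throw new IllegalStateException("Processor is disabled and cannot be started");
        }
        if (state != ScheduledState.STOPPED) {
            throw new IllegalStateException("Processor is not stopped (state = " + state + ")");
        }
    }

    public static void main(String[] args) {
        verifyCanStart(ScheduledState.STOPPED); // allowed: no exception
        try {
            verifyCanStart(ScheduledState.DISABLED);
        } catch (IllegalStateException e) {
            System.out.println("rejected: " + e.getMessage());
        }
    }
}
```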


---


Re: Installing NIFI through ambari.

2016-02-26 Thread Bryan Bende
It may have something to do with the APP_ID in the metrics being
'nificluster'. It needs to line up with the APP_ID in the service. I would
recommend leaving it as the default value and seeing if that works.

On Friday, February 26, 2016, davi wrote:

> I have a similar situation. I have installed a NiFi cluster (1 NCM / 3
> nodes) with Ambari. NiFi is running fine and data are sent to the Ambari
> Metrics collector. I verified with the Phoenix query, but the metrics are
> still not showing on the Ambari web page.
>
> Result of the Phoenix query:
>
> SELECT METRIC_NAME, SERVER_TIME, APP_ID
> FROM METRIC_RECORD
> WHERE APP_ID = 'nificluster'
> ORDER BY SERVER_TIME
> LIMIT 20;
>
> +---------------------------------+---------------+
> | METRIC_NAME                     | SERVER_TIME   |
> +---------------------------------+---------------+
> | jvm.daemon_thread_count         | 1456484338751 |
> | jvm.gc.runs.PSMarkSweep         | 1456484338751 |
> | jvm.gc.runs.PSScavenge          | 1456484338751 |
> | jvm.heap_usage                  | 1456484338751 |
> | jvm.gc.time.PSMarkSweep         | 1456484338751 |
> | jvm.gc.time.PSScavenge          | 1456484338751 |
> | jvm.heap_used                   | 1456484338751 |
> | jvm.uptime                      | 1456484338751 |
> | jvm.thread_states.blocked       | 1456484338751 |
> | jvm.thread_states.runnable      | 1456484338751 |
> | jvm.file_descriptor_usage       | 1456484338751 |
> | jvm.thread_states.timed_waiting | 1456484338751 |
> | jvm.thread_states.terminated    | 1456484338751 |
> | jvm.non_heap_usage              | 1456484338751 |
> | jvm.thread_count                | 1456484338751 |
> | ActiveThreads                   | 1456484338751 |
> | BytesReadLast5Minutes           | 1456484338751 |
> | BytesReceivedLast5Minutes       | 1456484338751 |
> | FlowFilesQueued                 | 1456484338751 |
> | BytesSentLast5Minutes           | 1456484338751 |
> +---------------------------------+---------------+
>
> Ambari view:
> <http://apache-nifi-developer-list.39713.n7.nabble.com/file/n7644/nifiservice.png>
>
> --
> View this message in context:
> http://apache-nifi-developer-list.39713.n7.nabble.com/Installing-NIFI-through-ambari-tp7194p7644.html
> Sent from the Apache NiFi Developer List mailing list archive at
> Nabble.com.



-- 
Sent from Gmail Mobile


Re: Getting Started with NiFi Development

2016-02-26 Thread Oleg Zhurakousky
Colin

NiFi is pure Java, so any OS that has a JDK distribution will run NiFi and/or
can serve as a dev environment for it.
The choice of IDE is also entirely up to you. On this list I am sure people
are using at least three different IDEs (mine is Eclipse).

You can get more information here: https://nifi.apache.org/developer-guide.html
If you believe something is missing or unclear, please let us know so we can
update the guide.
Cheers
Oleg


On Feb 26, 2016, at 8:24 AM, Colin Bowdery wrote:

Hi Dev Team!

I am looking to start familiarizing myself with NiFi. I see the
documentation states Linux is supported but does not define which distro(s).

Any advice on the best Linux distro to use? How about IDE, etc.?

Many thanks,

Colin.




Getting Started with NiFi Development

2016-02-26 Thread Colin Bowdery
Hi Dev Team!

I am looking to start familiarizing myself with NiFi. I see the
documentation states Linux is supported but does not define which distro(s).

Any advice on the best Linux distro to use? How about IDE, etc.?

Many thanks,

Colin.



Re: Installing NIFI through ambari.

2016-02-26 Thread davi
I have a similar situation. I have installed a NiFi cluster (1 NCM / 3
nodes) with Ambari. NiFi is running fine and data are sent to the Ambari
Metrics collector. I verified with the Phoenix query, but the metrics are
still not showing on the Ambari web page.

Result of the Phoenix query:

SELECT METRIC_NAME, SERVER_TIME, APP_ID
FROM METRIC_RECORD
WHERE APP_ID = 'nificluster'
ORDER BY SERVER_TIME
LIMIT 20;

+---------------------------------+---------------+
| METRIC_NAME                     | SERVER_TIME   |
+---------------------------------+---------------+
| jvm.daemon_thread_count         | 1456484338751 |
| jvm.gc.runs.PSMarkSweep         | 1456484338751 |
| jvm.gc.runs.PSScavenge          | 1456484338751 |
| jvm.heap_usage                  | 1456484338751 |
| jvm.gc.time.PSMarkSweep         | 1456484338751 |
| jvm.gc.time.PSScavenge          | 1456484338751 |
| jvm.heap_used                   | 1456484338751 |
| jvm.uptime                      | 1456484338751 |
| jvm.thread_states.blocked       | 1456484338751 |
| jvm.thread_states.runnable      | 1456484338751 |
| jvm.file_descriptor_usage       | 1456484338751 |
| jvm.thread_states.timed_waiting | 1456484338751 |
| jvm.thread_states.terminated    | 1456484338751 |
| jvm.non_heap_usage              | 1456484338751 |
| jvm.thread_count                | 1456484338751 |
| ActiveThreads                   | 1456484338751 |
| BytesReadLast5Minutes           | 1456484338751 |
| BytesReceivedLast5Minutes       | 1456484338751 |
| FlowFilesQueued                 | 1456484338751 |
| BytesSentLast5Minutes           | 1456484338751 |
+---------------------------------+---------------+

Ambari view:
<http://apache-nifi-developer-list.39713.n7.nabble.com/file/n7644/nifiservice.png>



--
View this message in context: 
http://apache-nifi-developer-list.39713.n7.nabble.com/Installing-NIFI-through-ambari-tp7194p7644.html
Sent from the Apache NiFi Developer List mailing list archive at Nabble.com.