[jira] [Created] (NIFI-3654) PutKinesisFirehose Needs A Separate Outgoing Relationship for Retryable Errors

2017-03-27 Thread Nicholas Carenza (JIRA)
Nicholas Carenza created NIFI-3654:
--

 Summary: PutKinesisFirehose Needs A Separate Outgoing Relationship 
for Retryable Errors
 Key: NIFI-3654
 URL: https://issues.apache.org/jira/browse/NIFI-3654
 Project: Apache NiFi
  Issue Type: Improvement
Reporter: Nicholas Carenza
Priority: Minor


Just as the InvokeHttp processor understands which kinds of errors are worth 
retrying and which are not, so should this processor.

For example, the error raised when a flowfile exceeds the maximum size of 1000 KB 
is unrecoverable and there is no point in retrying, so it should be routed to 
failure. A "service unavailable" error should be routed to retry, and perhaps 
also trigger an administrative yield.
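The requested split can be sketched as a small classification helper. This is a hypothetical illustration with invented names (`FirehoseErrorRouting`, `Disposition`, `route`), not the actual PutKinesisFirehose implementation, assuming the 1000 KB limit and HTTP-style status codes from the AWS response:

```java
// Hypothetical routing helper -- invented names, not the actual
// PutKinesisFirehose code.
public class FirehoseErrorRouting {
    public enum Disposition { FAILURE, RETRY }

    private static final long MAX_RECORD_BYTES = 1_000L * 1024; // the 1000 KB limit

    // Called only after a put failed: decide which relationship the flowfile goes to.
    public static Disposition route(int httpStatus, long recordSizeBytes) {
        if (recordSizeBytes > MAX_RECORD_BYTES) {
            return Disposition.FAILURE;   // oversized record: retrying can never succeed
        }
        if (httpStatus >= 500 || httpStatus == 429) {
            return Disposition.RETRY;     // service unavailable / throttled: transient
        }
        return Disposition.FAILURE;       // other client-side errors: terminal
    }

    public static void main(String[] args) {
        System.out.println(route(503, 10_000));      // RETRY
        System.out.println(route(400, 2_000_000));   // FAILURE
    }
}
```

A retryable disposition would also be the natural place to call the administrative-yield hook mentioned above.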



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Resolved] (NIFI-3652) Fix link in README documentation.

2017-03-27 Thread Andy LoPresto (JIRA)

 [ 
https://issues.apache.org/jira/browse/NIFI-3652?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Andy LoPresto resolved NIFI-3652.
-
Resolution: Fixed

> Fix link in README documentation. 
> --
>
> Key: NIFI-3652
> URL: https://issues.apache.org/jira/browse/NIFI-3652
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Documentation & Website
>Affects Versions: 1.1.1
>Reporter: Andy LoPresto
>Assignee: Andy LoPresto
>Priority: Trivial
>  Labels: documentation
> Fix For: 1.2.0
>
>
> The "Export Control" link in the README doc has a syntax error. 





[GitHub] nifi pull request #1628: Add link to website and fixed link in README.md

2017-03-27 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/nifi/pull/1628


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[jira] [Commented] (NIFI-3652) Fix link in README documentation.

2017-03-27 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-3652?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15944174#comment-15944174
 ] 

ASF subversion and git services commented on NIFI-3652:
---

Commit 17ec6264a0c9f494db1d8fbfc4a145222fccf591 in nifi's branch 
refs/heads/master from [~kturner]
[ https://git-wip-us.apache.org/repos/asf?p=nifi.git;h=17ec626 ]

NIFI-3652 Add link to website and fixed link in README.md

This closes #1628.

Signed-off-by: Andy LoPresto 


> Fix link in README documentation. 
> --
>
> Key: NIFI-3652
> URL: https://issues.apache.org/jira/browse/NIFI-3652
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Documentation & Website
>Affects Versions: 1.1.1
>Reporter: Andy LoPresto
>Assignee: Andy LoPresto
>Priority: Trivial
>  Labels: documentation
> Fix For: 1.2.0
>
>
> The "Export Control" link in the README doc has a syntax error. 





[jira] [Commented] (NIFI-3650) Ensure travis-ci won't cache nifi artifacts

2017-03-27 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-3650?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15944128#comment-15944128
 ] 

ASF GitHub Bot commented on NIFI-3650:
--

Github user trixpan commented on the issue:

https://github.com/apache/nifi/pull/1625
  
I am obviously happy to remove the pre-build cache "rm -rf", but since it 
should not return an error, I reckon we can leave it there as a safety 
mechanism in case before_cache fails.


> Ensure travis-ci won't cache nifi artifacts
> ---
>
> Key: NIFI-3650
> URL: https://issues.apache.org/jira/browse/NIFI-3650
> Project: Apache NiFi
>  Issue Type: Bug
>Reporter: Andre F de Miranda
>Assignee: Andre F de Miranda
>
> While caching dependencies is a fair way of improving travis-ci build times, 
> the existence of cached NiFi artifacts within Maven's ~/.m2 directory may 
> hide issues caused by incomplete code refactoring that would otherwise be 
> caught had travis-ci been running a truly clean build.
> We should find a way of balancing caching against possible build false negatives.





[GitHub] nifi issue #1625: NIFI-3650 - Adjust travis to forcefuly remove $HOME/.m2/re...

2017-03-27 Thread trixpan
Github user trixpan commented on the issue:

https://github.com/apache/nifi/pull/1625
  
I am obviously happy to remove the pre-build cache "rm -rf", but since it 
should not return an error, I reckon we can leave it there as a safety 
mechanism in case before_cache fails.




[jira] [Commented] (NIFI-3650) Ensure travis-ci won't cache nifi artifacts

2017-03-27 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-3650?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15944126#comment-15944126
 ] 

ASF GitHub Bot commented on NIFI-3650:
--

Github user trixpan commented on the issue:

https://github.com/apache/nifi/pull/1625
  
@apiri good catch. I wasn't aware of that feature. I pushed a modified 
version. Let me know what you think.


> Ensure travis-ci won't cache nifi artifacts
> ---
>
> Key: NIFI-3650
> URL: https://issues.apache.org/jira/browse/NIFI-3650
> Project: Apache NiFi
>  Issue Type: Bug
>Reporter: Andre F de Miranda
>Assignee: Andre F de Miranda
>
> While caching dependencies is a fair way of improving travis-ci build times, 
> the existence of cached NiFi artifacts within Maven's ~/.m2 directory may 
> hide issues caused by incomplete code refactoring that would otherwise be 
> caught had travis-ci been running a truly clean build.
> We should find a way of balancing caching against possible build false negatives.





[GitHub] nifi issue #1625: NIFI-3650 - Adjust travis to forcefuly remove $HOME/.m2/re...

2017-03-27 Thread trixpan
Github user trixpan commented on the issue:

https://github.com/apache/nifi/pull/1625
  
@apiri good catch. I wasn't aware of that feature. I pushed a modified 
version. Let me know what you think.




[GitHub] nifi-minifi pull request #78: MINIFI-251 - Added explicit Java runtime argum...

2017-03-27 Thread brosander
Github user brosander commented on a diff in the pull request:

https://github.com/apache/nifi-minifi/pull/78#discussion_r108285965
  
--- Diff: 
minifi-nar-bundles/minifi-framework-bundle/minifi-framework/minifi-resources/src/main/resources/conf/bootstrap.conf
 ---
@@ -85,9 +85,13 @@ java.arg.4=-Djava.net.preferIPv4Stack=true
 java.arg.5=-Dsun.net.http.allowRestrictedHeaders=true
 java.arg.6=-Djava.protocol.handler.pkgs=sun.net.www.protocol
 
+# Sets the provider of SecureRandom to /dev/urandom to prevent blocking on VMs
+java.arg.7=-Djava.security.egd=file:/dev/urandom
--- End diff --

@trixpan have you tested that this works in Java 8?

I've used a slightly modified version found [on stack 
overflow](http://stackoverflow.com/questions/137212/how-to-solve-performance-problem-with-java-securerandom#answer-2325109):
```
-Djava.security.egd=file:/dev/./urandom
```
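For reference, the same non-blocking behavior can also be requested from Java code instead of via the egd flag. A minimal sketch, assuming a Linux JRE where the `NativePRNGNonBlocking` algorithm (backed by /dev/urandom, available on Java 8+) is registered, with a fallback to the platform default elsewhere:

```java
// Sketch: request the non-blocking SecureRandom directly rather than via
// -Djava.security.egd. Falls back to the default provider if unavailable.
import java.security.NoSuchAlgorithmException;
import java.security.SecureRandom;

public class NonBlockingRandom {
    public static SecureRandom nonBlocking() {
        try {
            // /dev/urandom-backed on Linux: never blocks waiting for entropy
            return SecureRandom.getInstance("NativePRNGNonBlocking");
        } catch (NoSuchAlgorithmException e) {
            // Platform default; may block on entropy-starved VMs
            return new SecureRandom();
        }
    }

    public static void main(String[] args) {
        byte[] seed = new byte[16];
        nonBlocking().nextBytes(seed);
        System.out.println("got " + seed.length + " random bytes");
    }
}
```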





[jira] [Created] (NIFI-3653) Allow extension of authorize method in AbstractPolicyBasedAuthorizer

2017-03-27 Thread Michael Moser (JIRA)
Michael Moser created NIFI-3653:
---

 Summary: Allow extension of authorize method in 
AbstractPolicyBasedAuthorizer
 Key: NIFI-3653
 URL: https://issues.apache.org/jira/browse/NIFI-3653
 Project: Apache NiFi
  Issue Type: Improvement
  Components: Core Framework
Reporter: Michael Moser


While investigating alternate implementations of the Authorizer interface, I 
see that AbstractPolicyBasedAuthorizer is meant to be extended.  Its 
authorize() method is final, however, and does not provide an abstract 
doAuthorize() method that subclasses can override.

In particular, the existing AbstractPolicyBasedAuthorizer authorize() method 
does not take the AuthorizationRequest "resourceContext" into account in its 
authorization decision.  This is especially important when authorizing access 
to events in Provenance, which places attributes in the resourceContext of its 
AuthorizationRequest when obtaining an authorization decision.  I would like to 
use attributes to authorize access to the Provenance download and view content 
features.

If I had my own subclass of AbstractPolicyBasedAuthorizer with a doAuthorize() 
method available, then I could maintain my own user policies for allowing 
access to flowfile content via Provenance.
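The extension point described above is the classic template-method shape: a final authorize() that keeps the framework's bookkeeping and delegates the decision to an abstract doAuthorize(). A minimal sketch with illustrative names (not NiFi's actual Authorizer API):

```java
// Template-method sketch of the requested change -- illustrative names only,
// not the real AbstractPolicyBasedAuthorizer API.
public class AuthorizerSketch {
    public enum Result { APPROVED, DENIED }

    public static abstract class PolicyBasedAuthorizer {
        // final: framework-level concerns (auditing, caching) stay here...
        public final Result authorize(String resource, java.util.Map<String, String> resourceContext) {
            return doAuthorize(resource, resourceContext);
        }
        // ...while subclasses plug in their own decision, with access to the
        // resourceContext (e.g. provenance event attributes).
        protected abstract Result doAuthorize(String resource, java.util.Map<String, String> resourceContext);
    }

    public static class AttributeAwareAuthorizer extends PolicyBasedAuthorizer {
        @Override
        protected Result doAuthorize(String resource, java.util.Map<String, String> ctx) {
            // Example policy: permit provenance content access only for one attribute value
            return "public".equals(ctx.get("classification")) ? Result.APPROVED : Result.DENIED;
        }
    }

    public static void main(String[] args) {
        PolicyBasedAuthorizer a = new AttributeAwareAuthorizer();
        System.out.println(a.authorize("/provenance-data/content",
                java.util.Collections.singletonMap("classification", "public"))); // APPROVED
    }
}
```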





[jira] [Commented] (NIFI-3652) Fix link in README documentation.

2017-03-27 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-3652?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15944061#comment-15944061
 ] 

ASF GitHub Bot commented on NIFI-3652:
--

Github user alopresto commented on the issue:

https://github.com/apache/nifi/pull/1628
  
Opened [NIFI-3652](https://issues.apache.org/jira/browse/NIFI-3652) to 
document this. 


> Fix link in README documentation. 
> --
>
> Key: NIFI-3652
> URL: https://issues.apache.org/jira/browse/NIFI-3652
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Documentation & Website
>Affects Versions: 1.1.1
>Reporter: Andy LoPresto
>Assignee: Andy LoPresto
>Priority: Trivial
>  Labels: documentation
> Fix For: 1.2.0
>
>
> The "Export Control" link in the README doc has a syntax error. 





[GitHub] nifi issue #1628: Add link to website and fixed link in README.md

2017-03-27 Thread alopresto
Github user alopresto commented on the issue:

https://github.com/apache/nifi/pull/1628
  
Opened [NIFI-3652](https://issues.apache.org/jira/browse/NIFI-3652) to 
document this. 




[jira] [Created] (NIFI-3652) Fix link in README documentation.

2017-03-27 Thread Andy LoPresto (JIRA)
Andy LoPresto created NIFI-3652:
---

 Summary: Fix link in README documentation. 
 Key: NIFI-3652
 URL: https://issues.apache.org/jira/browse/NIFI-3652
 Project: Apache NiFi
  Issue Type: Bug
  Components: Documentation & Website
Affects Versions: 1.1.1
Reporter: Andy LoPresto
Assignee: Andy LoPresto
Priority: Trivial
 Fix For: 1.2.0


The "Export Control" link in the README doc has a syntax error. 





[jira] [Created] (NIFI-3651) Consider adding logo to README.md

2017-03-27 Thread Keith Turner (JIRA)
Keith Turner created NIFI-3651:
--

 Summary: Consider adding logo to README.md
 Key: NIFI-3651
 URL: https://issues.apache.org/jira/browse/NIFI-3651
 Project: Apache NiFi
  Issue Type: Improvement
  Components: Documentation & Website
Reporter: Keith Turner


I recently made a PR to make the project's README.md link to the website first 
thing.  In my opinion it would be nicer to have the README use the project logo 
and have that logo link to the website.  We did this for Fluo and I think it 
works well.

https://github.com/apache/incubator-fluo





[GitHub] nifi issue #1628: Add link to website and fixed link in README.md

2017-03-27 Thread alopresto
Github user alopresto commented on the issue:

https://github.com/apache/nifi/pull/1628
  
Will merge. Thanks. 




[GitHub] nifi-minifi-cpp pull request #69: MINIFI-225: Add Linter for Google style gu...

2017-03-27 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/nifi-minifi-cpp/pull/69




[GitHub] nifi-minifi-cpp issue #69: MINIFI-225: Add Linter for Google style guide

2017-03-27 Thread apiri
Github user apiri commented on the issue:

https://github.com/apache/nifi-minifi-cpp/pull/69
  
@phrocker thanks for the updates, the changes look good and should cover us 
on the license front.  will get this merged.




[GitHub] nifi-minifi-cpp issue #69: MINIFI-225: Add Linter for Google style guide

2017-03-27 Thread apiri
Github user apiri commented on the issue:

https://github.com/apache/nifi-minifi-cpp/pull/69
  
reviewing updates




[GitHub] nifi-minifi pull request #75: MINIFI-238 - MiNiFi Initial Command and Contro...

2017-03-27 Thread apiri
Github user apiri commented on a diff in the pull request:

https://github.com/apache/nifi-minifi/pull/75#discussion_r107990471
  
--- Diff: minifi-c2/minifi-c2-assembly/pom.xml ---
@@ -0,0 +1,171 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<project xmlns="http://maven.apache.org/POM/4.0.0"
+         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+    <modelVersion>4.0.0</modelVersion>
+    <parent>
+        <artifactId>minifi-c2</artifactId>
+        <groupId>org.apache.nifi.minifi</groupId>
+        <version>0.2.0-SNAPSHOT</version>
+    </parent>
+    <artifactId>minifi-c2-assembly</artifactId>
+    <packaging>pom</packaging>
+    <description>This is the assembly of Apache MiNiFi's - Command And Control Server</description>
+    <properties>
+        <!-- property element names were lost in extraction; the values were
+             10080, false, ./conf/keystore.jks and jks -->
+    </properties>
--- End diff --

would need keyPasswd here as well




[GitHub] nifi pull request #1628: Add link to website and fixed link in README.md

2017-03-27 Thread keith-turner
GitHub user keith-turner opened a pull request:

https://github.com/apache/nifi/pull/1628

Add link to website and fixed link in README.md

Thank you for submitting a contribution to Apache NiFi.

In order to streamline the review of the contribution we ask you
to ensure the following steps have been taken:

### For all changes:
- [ ] Is there a JIRA ticket associated with this PR? Is it referenced 
 in the commit message?

- [ ] Does your PR title start with NIFI-XXXX where XXXX is the JIRA number 
you are trying to resolve? Pay particular attention to the hyphen "-" character.

- [X] Has your PR been rebased against the latest commit within the target 
branch (typically master)?

- [X] Is your initial contribution a single, squashed commit?

### For code changes:
- [ ] Have you ensured that the full suite of tests is executed via mvn 
-Pcontrib-check clean install at the root nifi folder?
- [ ] Have you written or updated unit tests to verify your changes?
- [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)? 
- [ ] If applicable, have you updated the LICENSE file, including the main 
LICENSE file under nifi-assembly?
- [ ] If applicable, have you updated the NOTICE file, including the main 
NOTICE file found under nifi-assembly?
- [ ] If adding new Properties, have you added .displayName in addition to 
.name (programmatic access) for each of the new properties?

### For documentation related changes:
- [X] Have you ensured that format looks appropriate for the output in 
which it is rendered?

### Note:
Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/keith-turner/nifi link

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/1628.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #1628


commit 76c695244d002f19a46560679a84350f0b9ab79f
Author: Keith Turner 
Date:   2017-03-27T20:42:50Z

Add link to website and fixed link in README.md






[GitHub] nifi-minifi pull request #75: MINIFI-238 - MiNiFi Initial Command and Contro...

2017-03-27 Thread apiri
Github user apiri commented on a diff in the pull request:

https://github.com/apache/nifi-minifi/pull/75#discussion_r107981828
  
--- Diff: 
minifi-c2/minifi-c2-api/src/main/java/org/apache/nifi/minifi/c2/api/properties/C2Properties.java
 ---
@@ -0,0 +1,89 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.minifi.c2.api.properties;
+
+import org.eclipse.jetty.util.ssl.SslContextFactory;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import java.io.IOException;
+import java.io.InputStream;
+import java.nio.file.Files;
+import java.nio.file.Path;
+import java.nio.file.Paths;
+import java.security.GeneralSecurityException;
+import java.security.KeyStore;
+import java.util.Properties;
+
+public class C2Properties extends Properties {
+public static final String MINIFI_C2_SERVER_SECURE = 
"minifi.c2.server.secure";
+public static final String MINIFI_C2_SERVER_KEYSTORE_TYPE = 
"minifi.c2.server.keystoreType";
+public static final String MINIFI_C2_SERVER_KEYSTORE = 
"minifi.c2.server.keystore";
+public static final String MINIFI_C2_SERVER_KEYSTORE_PASSWD = 
"minifi.c2.server.keystorePasswd";
+public static final String MINIFI_C2_SERVER_KEYSTORE_PASSWD1 = 
"minifi.c2.server.keystorePasswd";
--- End diff --

did you intend for this to be the keyPassword?
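Background for the question: in the JCA, the keystore password and the per-entry key password are distinct secrets, which is why a separate keyPassword property is expected alongside keystorePasswd. A minimal, self-contained sketch using only java.security.KeyStore (the passwords here are placeholders):

```java
// Sketch: a keystore password and a key-entry password are independent
// secrets -- one unlocks the store, the other unlocks a single key entry.
import java.security.KeyStore;

public class TwoPasswords {
    public static KeyStore emptyStore(char[] storePassword) {
        try {
            KeyStore ks = KeyStore.getInstance("JKS");
            ks.load(null, storePassword); // create an empty in-memory keystore
            return ks;
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        char[] storePassword = "storePass".toCharArray(); // protects the keystore as a whole
        char[] keyPassword   = "keyPass".toCharArray();   // would protect one key entry, e.g.:
        // ks.setKeyEntry("alias", privateKey, keyPassword, certChain);
        KeyStore ks = emptyStore(storePassword);
        System.out.println(ks.getType()); // JKS
    }
}
```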




[jira] [Updated] (NIFI-3643) nifi-api: deserialization error of nifiVersion field due to capitalization

2017-03-27 Thread Bryan Bende (JIRA)

 [ 
https://issues.apache.org/jira/browse/NIFI-3643?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Bryan Bende updated NIFI-3643:
--
Resolution: Fixed
Status: Resolved  (was: Patch Available)

> nifi-api: deserialization error of nifiVersion field due to capitalization
> --
>
> Key: NIFI-3643
> URL: https://issues.apache.org/jira/browse/NIFI-3643
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 1.1.1
> Environment: OpenJDK8. gson, nifi-1.1.0
>Reporter: David Arllen
>Assignee: Matt Gilman
>Priority: Trivial
> Fix For: 1.2.0
>
>
> The response from `/nifi-api/system-diagnostics` includes the field name of 
> 'niFiVersion'.  The gson serialization library works on reflection and 
> expects 'nifiVersion' to be the field name because the class private field 
> 'nifiVersion' does not have a capital F.
> Resolution is expected to involve only the class 'VersionInfoDTO'.  The fix 
> would be matching the field capitalization with the capitalization of the 
> associated getter and setter.
> Without this fix, a gson FieldNamingStrategy is required to artificially map 
> the api-provided 'niFiVersion' field name to the reflection-expected 
> 'nifiVersion'.
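The mismatch can be reproduced in a self-contained way with a stand-in class (not the real VersionInfoDTO): a field-reflecting mapper such as gson looks up JSON keys as private field names, not as getter names.

```java
// Stand-in DTO reproducing the capitalization mismatch: the private field
// uses a lowercase 'f', the accessors a capital 'F'.
public class FieldNameMismatch {
    public static class VersionInfo {
        private String nifiVersion;                              // field: "nifiVersion"
        public String getNiFiVersion() { return nifiVersion; }   // getter: "NiFiVersion"
        public void setNiFiVersion(String v) { nifiVersion = v; }
    }

    // What a field-reflecting mapper sees when it probes for a JSON key.
    public static boolean hasField(String name) {
        try {
            VersionInfo.class.getDeclaredField(name);
            return true;
        } catch (NoSuchFieldException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(hasField("nifiVersion"));  // true: what reflection expects
        System.out.println(hasField("niFiVersion"));  // false: what the API emits
    }
}
```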





[jira] [Updated] (NIFI-3380) Multiple Versions of the Same Component

2017-03-27 Thread Bryan Bende (JIRA)

 [ 
https://issues.apache.org/jira/browse/NIFI-3380?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Bryan Bende updated NIFI-3380:
--
Resolution: Fixed
Status: Resolved  (was: Patch Available)

> Multiple Versions of the Same Component
> ---
>
> Key: NIFI-3380
> URL: https://issues.apache.org/jira/browse/NIFI-3380
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Core Framework
>Reporter: Bryan Bende
>Assignee: Matt Gilman
> Fix For: 1.2.0
>
> Attachments: nifi-example-processors-nar-1.0.nar, 
> nifi-example-processors-nar-2.0.nar, nifi-example-service-api-nar-1.0.nar, 
> nifi-example-service-api-nar-2.0.nar, nifi-example-service-nar-1.0.nar, 
> nifi-example-service-nar-1.1.nar, nifi-example-service-nar-2.0.nar
>
>
> This ticket is to track the work for supporting multiple versions of the same 
> component within NiFi. The overall design for this feature is described in 
> detail at the following wiki page:
> https://cwiki.apache.org/confluence/display/NIFI/Multiple+Versions+of+the+Same+Extension
> This ticket will track only the core NiFi work, and a separate ticket will be 
> created to track enhancements for the NAR Maven Plugin.





[jira] [Commented] (NIFI-3643) nifi-api: deserialization error of nifiVersion field due to capitalization

2017-03-27 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-3643?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15944011#comment-15944011
 ] 

ASF GitHub Bot commented on NIFI-3643:
--

Github user asfgit closed the pull request at:

https://github.com/apache/nifi/pull/1626


> nifi-api: deserialization error of nifiVersion field due to capitalization
> --
>
> Key: NIFI-3643
> URL: https://issues.apache.org/jira/browse/NIFI-3643
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 1.1.1
> Environment: OpenJDK8. gson, nifi-1.1.0
>Reporter: David Arllen
>Assignee: Matt Gilman
>Priority: Trivial
> Fix For: 1.2.0
>
>
> The response from `/nifi-api/system-diagnostics` includes the field name of 
> 'niFiVersion'.  The gson serialization library works on reflection and 
> expects 'nifiVersion' to be the field name because the class private field 
> 'nifiVersion' does not have a capital F.
> Resolution is expected to involve only the class 'VersionInfoDTO'.  The fix 
> would be matching the field capitalization with the capitalization of the 
> associated getter and setter.
> Without this fix, a gson FieldNamingStrategy is required to artificially map 
> the api-provided 'niFiVersion' field name to the reflection-expected 
> 'nifiVersion'.





[GitHub] nifi pull request #1626: NIFI-3643: Fixing capitalization in Java Property

2017-03-27 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/nifi/pull/1626




[jira] [Commented] (NIFI-3643) nifi-api: deserialization error of nifiVersion field due to capitalization

2017-03-27 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-3643?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15944010#comment-15944010
 ] 

ASF subversion and git services commented on NIFI-3643:
---

Commit 4432958a65b0a3531004988fccc2339ec12f6d96 in nifi's branch 
refs/heads/master from [~mcgilman]
[ https://git-wip-us.apache.org/repos/asf?p=nifi.git;h=4432958 ]

NIFI-3643: - Addressing incorrect capitalization in VersionInfoDTO in 
NiFiVersion.

This closes #1626.

Signed-off-by: Bryan Bende 


> nifi-api: deserialization error of nifiVersion field due to capitalization
> --
>
> Key: NIFI-3643
> URL: https://issues.apache.org/jira/browse/NIFI-3643
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 1.1.1
> Environment: OpenJDK8. gson, nifi-1.1.0
>Reporter: David Arllen
>Assignee: Matt Gilman
>Priority: Trivial
> Fix For: 1.2.0
>
>
> The response from `/nifi-api/system-diagnostics` includes the field name of 
> 'niFiVersion'.  The gson serialization library works on reflection and 
> expects 'nifiVersion' to be the field name because the class private field 
> 'nifiVersion' does not have a capital F.
> Resolution is expected to involve only the class 'VersionInfoDTO'.  The fix 
> would be matching the field capitalization with the capitalization of the 
> associated getter and setter.
> Without this fix, a gson FieldNamingStrategy is required to artificially map 
> the api-provided 'niFiVersion' field name to the reflection-expected 
> 'nifiVersion'.





[jira] [Commented] (NIFI-3380) Multiple Versions of the Same Component

2017-03-27 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-3380?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15944006#comment-15944006
 ] 

ASF GitHub Bot commented on NIFI-3380:
--

Github user asfgit closed the pull request at:

https://github.com/apache/nifi/pull/1627


> Multiple Versions of the Same Component
> ---
>
> Key: NIFI-3380
> URL: https://issues.apache.org/jira/browse/NIFI-3380
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Core Framework
>Reporter: Bryan Bende
>Assignee: Matt Gilman
> Fix For: 1.2.0
>
> Attachments: nifi-example-processors-nar-1.0.nar, 
> nifi-example-processors-nar-2.0.nar, nifi-example-service-api-nar-1.0.nar, 
> nifi-example-service-api-nar-2.0.nar, nifi-example-service-nar-1.0.nar, 
> nifi-example-service-nar-1.1.nar, nifi-example-service-nar-2.0.nar
>
>
> This ticket is to track the work for supporting multiple versions of the same 
> component within NiFi. The overall design for this feature is described in 
> detail at the following wiki page:
> https://cwiki.apache.org/confluence/display/NIFI/Multiple+Versions+of+the+Same+Extension
> This ticket will track only the core NiFi work, and a separate ticket will be 
> created to track enhancements for the NAR Maven Plugin.





[jira] [Commented] (NIFI-3643) nifi-api: deserialization error of nifiVersion field due to capitalization

2017-03-27 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-3643?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15944008#comment-15944008
 ] 

ASF GitHub Bot commented on NIFI-3643:
--

Github user bbende commented on the issue:

https://github.com/apache/nifi/pull/1626
  
Looks good, will merge to master


> nifi-api: deserialization error of nifiVersion field due to capitalization
> --
>
> Key: NIFI-3643
> URL: https://issues.apache.org/jira/browse/NIFI-3643
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 1.1.1
> Environment: OpenJDK8. gson, nifi-1.1.0
>Reporter: David Arllen
>Assignee: Matt Gilman
>Priority: Trivial
> Fix For: 1.2.0
>
>
> The response from `/nifi-api/system-diagnostics` includes the field name of 
> 'niFiVersion'.  The gson serialization library works on reflection and 
> expects 'nifiVersion' to be the field name because the class private field 
> 'nifiVersion' does not have a capital F.
> Resolution is expected to involve only the class 'VersionInfoDTO'.  The fix 
> would be matching the field capitalization with the capitalization of the 
> associated getter and setter.
> Without this fix, a gson FieldNamingStrategy is required to artificially map 
> the api-provided 'niFiVersion' field name to the reflection-expected 
> 'nifiVersion'.





[GitHub] nifi issue #1626: NIFI-3643: Fixing capitalization in Java Property

2017-03-27 Thread bbende
Github user bbende commented on the issue:

https://github.com/apache/nifi/pull/1626
  
Looks good, will merge to master




[GitHub] nifi pull request #1627: NIFI-3380: Fixing paths in generated documentation

2017-03-27 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/nifi/pull/1627


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[jira] [Commented] (NIFI-3380) Multiple Versions of the Same Component

2017-03-27 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-3380?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15944001#comment-15944001
 ] 

ASF GitHub Bot commented on NIFI-3380:
--

Github user bbende commented on the issue:

https://github.com/apache/nifi/pull/1627
  
Looks good, will merge to master


> Multiple Versions of the Same Component
> ---
>
> Key: NIFI-3380
> URL: https://issues.apache.org/jira/browse/NIFI-3380
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Core Framework
>Reporter: Bryan Bende
>Assignee: Matt Gilman
> Fix For: 1.2.0
>
> Attachments: nifi-example-processors-nar-1.0.nar, 
> nifi-example-processors-nar-2.0.nar, nifi-example-service-api-nar-1.0.nar, 
> nifi-example-service-api-nar-2.0.nar, nifi-example-service-nar-1.0.nar, 
> nifi-example-service-nar-1.1.nar, nifi-example-service-nar-2.0.nar
>
>
> This ticket is to track the work for supporting multiple versions of the same 
> component within NiFi. The overall design for this feature is described in 
> detail at the following wiki page:
> https://cwiki.apache.org/confluence/display/NIFI/Multiple+Versions+of+the+Same+Extension
> This ticket will track only the core NiFi work, and a separate ticket will be 
> created to track enhancements for the NAR Maven Plugin.





[GitHub] nifi issue #1627: NIFI-3380: Fixing paths in generated documentation

2017-03-27 Thread bbende
Github user bbende commented on the issue:

https://github.com/apache/nifi/pull/1627
  
Looks good, will merge to master




[GitHub] nifi issue #1626: NIFI-3643: Fixing capitalization in Java Property

2017-03-27 Thread joat1
Github user joat1 commented on the issue:

https://github.com/apache/nifi/pull/1626
  
Just tested.  The fix enables gson serialization and deserialization.  
Thanks.  Hopefully the pull req will be accepted quickly.




[jira] [Commented] (NIFI-3643) nifi-api: deserialization error of nifiVersion field due to capitalization

2017-03-27 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-3643?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15943645#comment-15943645
 ] 

ASF GitHub Bot commented on NIFI-3643:
--

Github user joat1 commented on the issue:

https://github.com/apache/nifi/pull/1626
  
Just tested.  The fix enables gson serialization and deserialization.  
Thanks.  Hopefully the pull req will be accepted quickly.


> nifi-api: deserialization error of nifiVersion field due to capitalization
> --
>
> Key: NIFI-3643
> URL: https://issues.apache.org/jira/browse/NIFI-3643
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 1.1.1
> Environment: OpenJDK8. gson, nifi-1.1.0
>Reporter: David Arllen
>Assignee: Matt Gilman
>Priority: Trivial
> Fix For: 1.2.0
>
>
> The response from `/nifi-api/system-diagnostics` includes the field name of 
> 'niFiVersion'.  The gson serialization library works on reflection and 
> expects 'nifiVersion' to be the field name because the class private field 
> 'nifiVersion' does not have a capital F.
> Resolution is expected to involve only the class 'VersionInfoDTO'.  The fix 
> would be matching the field capitalization with the capitalization of the 
> associated getter and setter.
> Without this fix, a gson FieldNamingStrategy is required to artificially map 
> the api-provided 'niFiVersion' field name to the reflection-expected 
> 'nifiVersion'.





[jira] [Commented] (NIFI-2961) Create EncryptAttribute processor

2017-03-27 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-2961?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15943633#comment-15943633
 ] 

ASF GitHub Bot commented on NIFI-2961:
--

Github user HandOfGod94 commented on the issue:

https://github.com/apache/nifi/pull/1294
  
Hey @alopresto,
So I have both versions present on my local:
1. The original refactored version.
2. The one I've suggested above.
Which one do you suggest we go with?

I found the 2nd approach cleaner than the first, but would still like to 
know your opinion, as this approach essentially requires the deletion and 
addition of attributes. I am not sure whether that's advisable or not.


> Create EncryptAttribute processor
> -
>
> Key: NIFI-2961
> URL: https://issues.apache.org/jira/browse/NIFI-2961
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Affects Versions: 1.0.0
>Reporter: Andy LoPresto
>  Labels: attributes, encryption, security
>
> Similar to {{EncryptContent}}, the {{EncryptAttribute}} processor would allow 
> individual (and multiple) flowfile attributes to be encrypted (either 
> in-place or to a new attribute key) with various encryption algorithms (AES, 
> RSA, PBE, and PGP). 
> Specific compatibility with the {{OpenSSL EVP_BytesToKey}}, {{PBKDF2}}, 
> {{scrypt}}, and {{bcrypt}} key derivation functions should be included. 
> The processor should provide the boolean option to encrypt or decrypt (only 
> one operation per instance of the processor). The processor should also allow 
> Base64 encoding (aka ASCII armor) for the encrypted attributes to prevent 
> byte escaping/data loss. 
> If [dangerous processor 
> annotations|https://cwiki.apache.org/confluence/display/NIFI/Security+Feature+Roadmap]
>  are introduced, this processor should be marked as such and the 
> corresponding attribute protection (i.e. provenance before/after, etc.) 
> should be applied. 
> Originally requested in this [Stack Overflow 
> question|https://stackoverflow.com/questions/40294945/nifi-encrypt-json].  
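> A minimal sketch of the per-attribute encrypt/decrypt path such a processor might take, using AES-GCM and Base64 armor (all names below are hypothetical illustrations, not the proposed processor's API; a real implementation would also cover the listed KDFs, PGP, and key management):

```java
import java.nio.charset.StandardCharsets;
import java.security.SecureRandom;
import java.util.Base64;
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.GCMParameterSpec;

// Hypothetical sketch: encrypt one attribute value with AES-GCM and
// Base64-armor the result so it remains a safe flowfile attribute string.
public class AttributeCipherSketch {

    static String encryptAttribute(SecretKey key, byte[] iv, String value) throws Exception {
        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        cipher.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(128, iv));
        byte[] cipherText = cipher.doFinal(value.getBytes(StandardCharsets.UTF_8));
        return Base64.getEncoder().encodeToString(cipherText);   // ASCII armor
    }

    static String decryptAttribute(SecretKey key, byte[] iv, String armored) throws Exception {
        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        cipher.init(Cipher.DECRYPT_MODE, key, new GCMParameterSpec(128, iv));
        byte[] plain = cipher.doFinal(Base64.getDecoder().decode(armored));
        return new String(plain, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) throws Exception {
        SecretKey key = KeyGenerator.getInstance("AES").generateKey();
        byte[] iv = new byte[12];                 // 96-bit IV, standard for GCM
        new SecureRandom().nextBytes(iv);
        String armored = encryptAttribute(key, iv, "some-attribute-value");
        System.out.println(decryptAttribute(key, iv, armored));
    }
}
```

> The Base64 step is what the ticket calls "ASCII armor": it keeps the ciphertext representable as an attribute value without byte escaping or data loss.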





[GitHub] nifi issue #1294: NIFI-2961 Create EncryptAttribute processor

2017-03-27 Thread HandOfGod94
Github user HandOfGod94 commented on the issue:

https://github.com/apache/nifi/pull/1294
  
Hey @alopresto,
So I have both versions present on my local:
1. The original refactored version.
2. The one I've suggested above.
Which one do you suggest we go with?

I found the 2nd approach cleaner than the first, but would still like to 
know your opinion, as this approach essentially requires the deletion and 
addition of attributes. I am not sure whether that's advisable or not.




[jira] [Resolved] (NIFI-3477) Any changes to nifi registry properties needs nifi services restart

2017-03-27 Thread Joseph Witt (JIRA)

 [ 
https://issues.apache.org/jira/browse/NIFI-3477?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Joseph Witt resolved NIFI-3477.
---
Resolution: Duplicate

> Any changes to nifi registry properties needs nifi services restart
> ---
>
> Key: NIFI-3477
> URL: https://issues.apache.org/jira/browse/NIFI-3477
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Configuration Management, Variable Registry
>Affects Versions: 1.1.0
> Environment: Red Hat Enterprise Linux Server release 7.2 
> (Maipo),Apache NiFi - Version 1.1.0.2.1.1.0-2
>Reporter: balakrishnan ramasamy
>
> $NIFI_HOME/conf/nifi.properties 
> # external properties files for variable registry
> # supports a comma delimited list of file locations
> nifi.variable.registry.properties=/home/user/nifi_prop/customn_environment.properties
> The above customn_environment.properties contains the below properties
> user_name=bala
> password=bala_password
> Going forward, if we add a new property to customn_environment.properties, as 
> below,
> beeline_url=beeline.url
> a NiFi service restart is needed, but our business requirement needs the 
> property to take effect without a restart.
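
The external registry file referenced above is a plain java.util.Properties file. A small sketch of re-reading it on demand (class and method names are illustrative only; this shows the kind of reload behavior being requested, not NiFi's internal code):

```java
import java.io.IOException;
import java.io.Reader;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Arrays;
import java.util.Properties;

// Illustrative only: NiFi reads the registry file once at startup; re-loading
// it on demand, as sketched here, is the behavior the requester wants.
public class RegistryReload {

    static Properties load(Path propertiesFile) throws IOException {
        Properties props = new Properties();
        try (Reader reader = Files.newBufferedReader(propertiesFile)) {
            props.load(reader);
        }
        return props;
    }

    public static void main(String[] args) throws Exception {
        Path file = Files.createTempFile("custom_environment", ".properties");
        Files.write(file, Arrays.asList("user_name=bala", "beeline_url=beeline.url"));
        Properties props = load(file);   // would be re-invoked whenever the file changes
        System.out.println(props.getProperty("beeline_url")); // prints beeline.url
    }
}
```

A production version would pair this with a file-change watcher rather than polling, but the core operation is just a fresh Properties.load().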





[jira] [Updated] (NIFI-3477) Any changes to nifi registry properties needs nifi services restart

2017-03-27 Thread Mark Payne (JIRA)

 [ 
https://issues.apache.org/jira/browse/NIFI-3477?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mark Payne updated NIFI-3477:
-
Fix Version/s: (was: 1.1.0)

> Any changes to nifi registry properties needs nifi services restart
> ---
>
> Key: NIFI-3477
> URL: https://issues.apache.org/jira/browse/NIFI-3477
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Configuration Management, Variable Registry
>Affects Versions: 1.1.0
> Environment: Red Hat Enterprise Linux Server release 7.2 
> (Maipo),Apache NiFi - Version 1.1.0.2.1.1.0-2
>Reporter: balakrishnan ramasamy
>
> $NIFI_HOME/conf/nifi.properties 
> # external properties files for variable registry
> # supports a comma delimited list of file locations
> nifi.variable.registry.properties=/home/user/nifi_prop/customn_environment.properties
> The above customn_environment.properties contains the below properties
> user_name=bala
> password=bala_password
> Going forward, if we add a new property to customn_environment.properties, as 
> below,
> beeline_url=beeline.url
> a NiFi service restart is needed, but our business requirement needs the 
> property to take effect without a restart.





[jira] [Updated] (NIFI-3603) localization needed

2017-03-27 Thread Mark Payne (JIRA)

 [ 
https://issues.apache.org/jira/browse/NIFI-3603?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mark Payne updated NIFI-3603:
-
Fix Version/s: (was: 1.1.0)

> localization needed
> ---
>
> Key: NIFI-3603
> URL: https://issues.apache.org/jira/browse/NIFI-3603
> Project: Apache NiFi
>  Issue Type: New Feature
>  Components: Core UI
>Affects Versions: 1.1.0
>Reporter: PetterWang
>  Labels: features
>
> JSP:
> 1. Use the JSTL standard fmt tag for multilingualization.
>1. The resource files are placed under the org.apache.nifi.web.resources 
> package.
>2. Resource file naming rules: Messages_en.properties for the English 
> resource file, Messages_cn.properties for the Chinese resource file.
>3. Replace the Chinese characters in the JSPs with tags of the form 
> <fmt:message key="canvas.about-dailog.nf-version" />.
> 2. In the nifi-web-ui web.xml, configure the properties file location and the 
> language to use:
> <context-param>
>   <param-name>javax.servlet.jsp.jstl.fmt.localizationContext</param-name>
>   <param-value>org.apache.nifi.web.resources.Messages</param-value>
> </context-param>
> <context-param>
>   <param-name>javax.servlet.jsp.jstl.fmt.locale</param-name>
>   <param-value>en</param-value>
> </context-param>
> 3. Add the *.properties files to the maven-war-plugin configuration in the 
> nifi-web-ui pom.xml: use ${staging.dir}/WEB-INF/web.xml as the web.xml, and 
> copy src/main/java/org/apache/nifi/web/resources/ to 
> WEB-INF/classes/org/apache/nifi/web/resources/, including *.properties, with 
> filtering set to true.
> 4. There are two general situations when internationalizing a JSP page:
>1. Directly replace the text content of a tag. For example, the English 
> content Relationships is replaced with 
> <fmt:message key="partials.connection-details.Relationships" />.
>2. Replace the attribute content of a div. For example: 
> <fmt:message 
> key="partials.connection-details.configuration-tab.read-only-relationship-names-container.title"
>  var="Relationships" />
>3. Add the fmt taglib reference to the JSP page:
> <%@ taglib uri="http://java.sun.com/jsp/jstl/fmt" prefix="fmt" %>
>4. Properties file key rules. There are two:
>  - Name the key after the page structure, e.g. 
> Bulletin-board-content-bullet-bullet-bullet-shell-content-bullet
>  - If the text information does not have a unique xpath or id, use the 
> original text to name the key: enable-controller-service-dialog.Canceling = 
> Canceling ...
> JS:
> 1. Add the translation files to nf/globalization/resources.js.
> 2. During global.js initialization, determine which json resources to look up 
> according to the current file name and message.
> 3. Add resources.js to the war package via the maven-war-plugin (version 2.5) 
> in the nifi-web-ui pom.xml, including js/nf/globalization/resources.js.
> 4. Add a reference to resources.js in each JSP page:
>   Modify: nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-web/nifi-web-ui/src/main/webapp/WEB-INF/pages/bulletin-board.jsp
>   Modify: nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-web/nifi-web-ui/src/main/webapp/WEB-INF/pages/canvas.jsp
>   Modify: nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-web/nifi-web-ui/src/main/webapp/WEB-INF/pages/cluster.jsp
>   Modify: nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-web/nifi-web-ui/src/main/webapp/WEB-INF/pages/counters.jsp
>   Modify: nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-web/nifi-web-ui/src/main/webapp/WEB-INF/pages/history.jsp
>   Modify: nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-web/nifi-web-ui/src/main/webapp/WEB-INF/pages/login.jsp
>   Modify: nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-web/nifi-web-ui/src/main/webapp/WEB-INF/pages/provenance.jsp
>   Modify: nifi-nar-bundles/nifi-framework-bundle/

[jira] [Commented] (NIFI-3650) Ensure travis-ci won't cache nifi artifacts

2017-03-27 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-3650?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15943453#comment-15943453
 ] 

ASF GitHub Bot commented on NIFI-3650:
--

Github user jfrazee commented on the issue:

https://github.com/apache/nifi/pull/1625
  
@apiri Oh, I see now. You're saying move the one in `script` to 
`before_cache`.


> Ensure travis-ci won't cache nifi artifacts
> ---
>
> Key: NIFI-3650
> URL: https://issues.apache.org/jira/browse/NIFI-3650
> Project: Apache NiFi
>  Issue Type: Bug
>Reporter: Andre F de Miranda
>Assignee: Andre F de Miranda
>
> While caching dependencies is a fair way of improving travis-ci build times, 
> the existence of cached NiFi artifacts within Maven's ~/.m2 directory may 
> hide issues caused by incomplete code refactoring that would otherwise be 
> triggered had travis-ci been running a truly clean build.
> We should find a way of balancing caching against possible build false negatives.





[GitHub] nifi issue #1625: NIFI-3650 - Adjust travis to forcefuly remove $HOME/.m2/re...

2017-03-27 Thread jfrazee
Github user jfrazee commented on the issue:

https://github.com/apache/nifi/pull/1625
  
@apiri Oh, I see now. You're saying move the one in `script` to 
`before_cache`.




[jira] [Commented] (NIFI-3650) Ensure travis-ci won't cache nifi artifacts

2017-03-27 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-3650?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15943448#comment-15943448
 ] 

ASF GitHub Bot commented on NIFI-3650:
--

Github user jfrazee commented on the issue:

https://github.com/apache/nifi/pull/1625
  
@apiri I think there might be two tasks here to make sure everything is 
copacetic: (1) remove that part of the tree so we don't bother caching or 
downloading it at all, and (2) remove it before the build just to make sure 
it's not there (we will need that the first time this runs anyway if we don't 
invalidate the cache).

So I think it makes sense to add this but also keep the other.


> Ensure travis-ci won't cache nifi artifacts
> ---
>
> Key: NIFI-3650
> URL: https://issues.apache.org/jira/browse/NIFI-3650
> Project: Apache NiFi
>  Issue Type: Bug
>Reporter: Andre F de Miranda
>Assignee: Andre F de Miranda
>
> While caching dependencies is a fair way of improving travis-ci build times, 
> the existence of cached NiFi artifacts within Maven's ~/.m2 directory may 
> hide issues caused by incomplete code refactoring that would otherwise be 
> triggered had travis-ci been running a truly clean build.
> We should find a way of balancing caching against possible build false negatives.





[GitHub] nifi issue #1625: NIFI-3650 - Adjust travis to forcefuly remove $HOME/.m2/re...

2017-03-27 Thread jfrazee
Github user jfrazee commented on the issue:

https://github.com/apache/nifi/pull/1625
  
@apiri I think there might be two tasks here to make sure everything is 
copacetic: (1) remove that part of the tree so we don't bother caching or 
downloading it at all, and (2) remove it before the build just to make sure 
it's not there (we will need that the first time this runs anyway if we don't 
invalidate the cache).

So I think it makes sense to add this but also keep the other.
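
In `.travis.yml` terms, the two tasks discussed could look like the following sketch (the phase names are standard Travis lifecycle hooks; the exact repository path is an assumption based on the PR title):

```yaml
cache:
  directories:
    - $HOME/.m2
before_install:
  # (2) drop stale NiFi artifacts restored from a previous cache
  - rm -rf $HOME/.m2/repository/org/apache/nifi
before_cache:
  # (1) drop freshly built NiFi artifacts so they are never cached
  - rm -rf $HOME/.m2/repository/org/apache/nifi
```

Third-party dependencies stay cached, while org/apache/nifi artifacts are always rebuilt from source.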




[GitHub] nifi issue #1618: NIFI-3413: Add GetChangeDataCaptureMySQL processor

2017-03-27 Thread mattyb149
Github user mattyb149 commented on the issue:

https://github.com/apache/nifi/pull/1618
  
@phrocker Thank you for the review! I reproduced your "no password" issue 
and found that BinaryLogClient expects a non-null password, so I added code to 
set the password to the empty string if none was provided in the property.
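
The guard described is small; a hedged sketch (the class and method names here are hypothetical, not the processor's actual code):

```java
// Hypothetical sketch of the fix described above: BinaryLogClient rejects a
// null password, so an unset processor property is mapped to the empty string.
public class PasswordDefault {

    static String passwordOrEmpty(String fromProperty) {
        return (fromProperty == null) ? "" : fromProperty;
    }

    public static void main(String[] args) {
        // An unset property yields "" instead of null.
        System.out.println("[" + passwordOrEmpty(null) + "]"); // prints []
    }
}
```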




[jira] [Commented] (NIFI-3413) Implement a GetChangeDataCapture processor

2017-03-27 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-3413?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15943395#comment-15943395
 ] 

ASF GitHub Bot commented on NIFI-3413:
--

Github user mattyb149 commented on the issue:

https://github.com/apache/nifi/pull/1618
  
@phrocker Thank you for the review! I reproduced your "no password" issue 
and found that BinaryLogClient expects a non-null password, so I added code to 
set the password to the empty string if none was provided in the property.


> Implement a GetChangeDataCapture processor
> --
>
> Key: NIFI-3413
> URL: https://issues.apache.org/jira/browse/NIFI-3413
> Project: Apache NiFi
>  Issue Type: New Feature
>  Components: Extensions
>Reporter: Matt Burgess
>Assignee: Matt Burgess
>
> Database systems such as MySQL, Oracle, and SQL Server allow access to their 
> transactional logs and such, in order for external clients to have a "change 
> data capture" (CDC) capability. I propose a GetChangeDataCapture processor to 
> enable this in NiFi.
> The processor would be configured with a DBCPConnectionPool controller 
> service, as well as a Database Type property (similar to the one in 
> QueryDatabaseTable) for database-specific handling. Additional properties 
> might include the CDC table name, etc.  Additional database-specific 
> properties could be handled using dynamic properties (and the documentation 
> should reflect this).
> The processor would accept no incoming connections (it is a "Get" or source 
> processor), would be intended to run on the primary node only as a single 
> threaded processor, and would generate a flow file for each operation 
> (INSERT, UPDATE, DELETE, etc.) in one or some number of formats (JSON, e.g.). 
> The flow files would be transferred in time order (to enable a replication 
> solution, for example), perhaps with some auto-incrementing attribute to also 
> indicate order if need be.





[jira] [Commented] (NIFI-3413) Implement a GetChangeDataCapture processor

2017-03-27 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-3413?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15943391#comment-15943391
 ] 

ASF GitHub Bot commented on NIFI-3413:
--

Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1618#discussion_r108189411
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/GetChangeDataCaptureMySQL.java
 ---
@@ -0,0 +1,879 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard;
+
+import com.github.shyiko.mysql.binlog.BinaryLogClient;
+import com.github.shyiko.mysql.binlog.event.Event;
+import com.github.shyiko.mysql.binlog.event.EventHeaderV4;
+import com.github.shyiko.mysql.binlog.event.EventType;
+import com.github.shyiko.mysql.binlog.event.QueryEventData;
+import com.github.shyiko.mysql.binlog.event.RotateEventData;
+import com.github.shyiko.mysql.binlog.event.TableMapEventData;
+import org.apache.nifi.annotation.behavior.DynamicProperties;
+import org.apache.nifi.annotation.behavior.DynamicProperty;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.Stateful;
+import org.apache.nifi.annotation.behavior.TriggerSerially;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.PropertyValue;
+import org.apache.nifi.components.state.Scope;
+import org.apache.nifi.components.state.StateManager;
+import org.apache.nifi.components.state.StateMap;
+import org.apache.nifi.distributed.cache.client.Deserializer;
+import org.apache.nifi.distributed.cache.client.DistributedMapCacheClient;
+import org.apache.nifi.distributed.cache.client.Serializer;
+import org.apache.nifi.logging.ComponentLog;
+import org.apache.nifi.processor.AbstractSessionFactoryProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.ProcessSessionFactory;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.processors.standard.db.CDCException;
+import org.apache.nifi.processors.standard.db.event.ColumnDefinition;
+import org.apache.nifi.processors.standard.db.event.RowEventException;
+import org.apache.nifi.processors.standard.db.event.TableInfo;
+import org.apache.nifi.processors.standard.db.event.TableInfoCacheKey;
+import org.apache.nifi.processors.standard.db.event.io.EventWriter;
+import 
org.apache.nifi.processors.standard.db.impl.mysql.event.BeginTransactionEventInfo;
+import org.apache.nifi.processors.standard.db.impl.mysql.RawBinlogEvent;
+import 
org.apache.nifi.processors.standard.db.impl.mysql.BinlogEventListener;
+import 
org.apache.nifi.processors.standard.db.impl.mysql.event.BinlogEventInfo;
+import 
org.apache.nifi.processors.standard.db.impl.mysql.event.CommitTransactionEventInfo;
+import 
org.apache.nifi.processors.standard.db.impl.mysql.event.DeleteRowsEventInfo;
+import 
org.apache.nifi.processors.standard.db.impl.mysql.event.SchemaChangeEventInfo;
+import 
org.apache.nifi.processors.standard.db.impl.mysql.event.UpdateRowsEventInfo;
+import 
org.apache.nifi.processors.standard.db.impl.mysql.event.InsertRowsEventInfo;
+import 
org.apache.nifi.processors.standard.db.impl.mysql.event.io.BeginTransactionEventWriter;
+import 

[GitHub] nifi pull request #1618: NIFI-3413: Add GetChangeDataCaptureMySQL processor

2017-03-27 Thread mattyb149
Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1618#discussion_r108189411
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/GetChangeDataCaptureMySQL.java
 ---
@@ -0,0 +1,879 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard;
+
+import com.github.shyiko.mysql.binlog.BinaryLogClient;
+import com.github.shyiko.mysql.binlog.event.Event;
+import com.github.shyiko.mysql.binlog.event.EventHeaderV4;
+import com.github.shyiko.mysql.binlog.event.EventType;
+import com.github.shyiko.mysql.binlog.event.QueryEventData;
+import com.github.shyiko.mysql.binlog.event.RotateEventData;
+import com.github.shyiko.mysql.binlog.event.TableMapEventData;
+import org.apache.nifi.annotation.behavior.DynamicProperties;
+import org.apache.nifi.annotation.behavior.DynamicProperty;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.Stateful;
+import org.apache.nifi.annotation.behavior.TriggerSerially;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.PropertyValue;
+import org.apache.nifi.components.state.Scope;
+import org.apache.nifi.components.state.StateManager;
+import org.apache.nifi.components.state.StateMap;
+import org.apache.nifi.distributed.cache.client.Deserializer;
+import org.apache.nifi.distributed.cache.client.DistributedMapCacheClient;
+import org.apache.nifi.distributed.cache.client.Serializer;
+import org.apache.nifi.logging.ComponentLog;
+import org.apache.nifi.processor.AbstractSessionFactoryProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.ProcessSessionFactory;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.processors.standard.db.CDCException;
+import org.apache.nifi.processors.standard.db.event.ColumnDefinition;
+import org.apache.nifi.processors.standard.db.event.RowEventException;
+import org.apache.nifi.processors.standard.db.event.TableInfo;
+import org.apache.nifi.processors.standard.db.event.TableInfoCacheKey;
+import org.apache.nifi.processors.standard.db.event.io.EventWriter;
+import 
org.apache.nifi.processors.standard.db.impl.mysql.event.BeginTransactionEventInfo;
+import org.apache.nifi.processors.standard.db.impl.mysql.RawBinlogEvent;
+import 
org.apache.nifi.processors.standard.db.impl.mysql.BinlogEventListener;
+import 
org.apache.nifi.processors.standard.db.impl.mysql.event.BinlogEventInfo;
+import 
org.apache.nifi.processors.standard.db.impl.mysql.event.CommitTransactionEventInfo;
+import 
org.apache.nifi.processors.standard.db.impl.mysql.event.DeleteRowsEventInfo;
+import 
org.apache.nifi.processors.standard.db.impl.mysql.event.SchemaChangeEventInfo;
+import 
org.apache.nifi.processors.standard.db.impl.mysql.event.UpdateRowsEventInfo;
+import 
org.apache.nifi.processors.standard.db.impl.mysql.event.InsertRowsEventInfo;
+import 
org.apache.nifi.processors.standard.db.impl.mysql.event.io.BeginTransactionEventWriter;
+import 
org.apache.nifi.processors.standard.db.impl.mysql.event.io.CommitTransactionEventWriter;
+import 
org.apache.nifi.processors.standard.db.impl.mysql.event.io.DeleteRowsWriter;
+import 
org.apache.nifi.processors.standard.db.impl.mysql.event.io.InsertRowsWriter;
+import 

[jira] [Commented] (NIFI-3413) Implement a GetChangeDataCapture processor

2017-03-27 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-3413?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15943387#comment-15943387
 ] 

ASF GitHub Bot commented on NIFI-3413:
--

Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1618#discussion_r108188878
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/GetChangeDataCaptureMySQL.java
 ---
@@ -0,0 +1,879 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard;
+
+import com.github.shyiko.mysql.binlog.BinaryLogClient;
+import com.github.shyiko.mysql.binlog.event.Event;
+import com.github.shyiko.mysql.binlog.event.EventHeaderV4;
+import com.github.shyiko.mysql.binlog.event.EventType;
+import com.github.shyiko.mysql.binlog.event.QueryEventData;
+import com.github.shyiko.mysql.binlog.event.RotateEventData;
+import com.github.shyiko.mysql.binlog.event.TableMapEventData;
+import org.apache.nifi.annotation.behavior.DynamicProperties;
+import org.apache.nifi.annotation.behavior.DynamicProperty;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.Stateful;
+import org.apache.nifi.annotation.behavior.TriggerSerially;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.PropertyValue;
+import org.apache.nifi.components.state.Scope;
+import org.apache.nifi.components.state.StateManager;
+import org.apache.nifi.components.state.StateMap;
+import org.apache.nifi.distributed.cache.client.Deserializer;
+import org.apache.nifi.distributed.cache.client.DistributedMapCacheClient;
+import org.apache.nifi.distributed.cache.client.Serializer;
+import org.apache.nifi.logging.ComponentLog;
+import org.apache.nifi.processor.AbstractSessionFactoryProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.ProcessSessionFactory;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.processors.standard.db.CDCException;
+import org.apache.nifi.processors.standard.db.event.ColumnDefinition;
+import org.apache.nifi.processors.standard.db.event.RowEventException;
+import org.apache.nifi.processors.standard.db.event.TableInfo;
+import org.apache.nifi.processors.standard.db.event.TableInfoCacheKey;
+import org.apache.nifi.processors.standard.db.event.io.EventWriter;
+import org.apache.nifi.processors.standard.db.impl.mysql.event.BeginTransactionEventInfo;
+import org.apache.nifi.processors.standard.db.impl.mysql.RawBinlogEvent;
+import org.apache.nifi.processors.standard.db.impl.mysql.BinlogEventListener;
+import org.apache.nifi.processors.standard.db.impl.mysql.event.BinlogEventInfo;
+import org.apache.nifi.processors.standard.db.impl.mysql.event.CommitTransactionEventInfo;
+import org.apache.nifi.processors.standard.db.impl.mysql.event.DeleteRowsEventInfo;
+import org.apache.nifi.processors.standard.db.impl.mysql.event.SchemaChangeEventInfo;
+import org.apache.nifi.processors.standard.db.impl.mysql.event.UpdateRowsEventInfo;
+import org.apache.nifi.processors.standard.db.impl.mysql.event.InsertRowsEventInfo;
+import org.apache.nifi.processors.standard.db.impl.mysql.event.io.BeginTransactionEventWriter;
+import 

[jira] [Commented] (NIFI-3413) Implement a GetChangeDataCapture processor

2017-03-27 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-3413?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15943379#comment-15943379
 ] 

ASF GitHub Bot commented on NIFI-3413:
--

Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1618#discussion_r108187176
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/db/impl/mysql/event/io/InsertRowsWriter.java
 ---
@@ -0,0 +1,90 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard.db.impl.mysql.event.io;
+
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processors.standard.GetChangeDataCaptureMySQL;
+import org.apache.nifi.processors.standard.db.event.ColumnDefinition;
+import org.apache.nifi.processors.standard.db.impl.mysql.event.InsertRowsEventInfo;
+
+import java.io.IOException;
+import java.io.Serializable;
+import java.util.BitSet;
+import java.util.concurrent.atomic.AtomicLong;
+
+import static org.apache.nifi.processors.standard.db.impl.mysql.MySQLCDCUtils.getWritableObject;
+
+/**
+ * A writer class to output MySQL binlog "write rows" (aka INSERT) events to flow file(s).
+ */
+public class InsertRowsWriter extends AbstractBinlogTableEventWriter<InsertRowsEventInfo> {
+
+    /**
+     * Creates and transfers a new flow file whose contents are the JSON-serialized value of the specified event, and the sequence ID attribute set
+     *
+     * @param session   A reference to a ProcessSession from which the flow file(s) will be created and transferred
+     * @param eventInfo An event whose value will become the contents of the flow file
+     * @return The next available CDC sequence ID for use by the CDC processor
+     */
+    public long writeEvent(final ProcessSession session, final InsertRowsEventInfo eventInfo, final long currentSequenceId) {
+        final AtomicLong seqId = new AtomicLong(currentSequenceId);
+        for (Serializable[] row : eventInfo.getRows()) {
+
+            FlowFile flowFile = session.create();
+            flowFile = session.write(flowFile, outputStream -> {
+
+                super.startJson(outputStream, eventInfo);
+                super.writeJson(eventInfo);
+
+                final BitSet bitSet = eventInfo.getIncludedColumns();
+                writeRow(eventInfo, row, bitSet);
+
+                super.endJson();
+            });
+
+            flowFile = session.putAllAttributes(flowFile, getCommonAttributes(seqId.get(), eventInfo));
+            session.transfer(flowFile, GetChangeDataCaptureMySQL.REL_SUCCESS);
+            seqId.getAndIncrement();
+        }
+        return seqId.get();
+    }
+
+    protected void writeRow(InsertRowsEventInfo event, Serializable[] row, BitSet includedColumns) throws IOException {
+        jsonGenerator.writeArrayFieldStart("columns");
+        int i = includedColumns.nextSetBit(0);
+        while (i != -1) {
+            jsonGenerator.writeStartObject();
+            jsonGenerator.writeNumberField("id", i + 1);
+            ColumnDefinition columnDefinition = event.getColumnByIndex(i);
+            Integer columnType = null;
+            if (columnDefinition != null) {
+                jsonGenerator.writeStringField("name", columnDefinition.getName());
+                columnType = (int) columnDefinition.getType();
--- End diff --

Per the above comment/response, I've changed the getType() method to return 
an int, I only use Integer in case we don't have a column definition, then we 
can write "null" as the value for the column type. I'd prefer to use the JDBC 
Types constants rather than our own enum, as that is what will be used 99% of 
the time.
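A minimal sketch of the approach described above: represent the column type as the standard `java.sql.Types` int constants, boxed to `Integer` so a missing column definition can still be written as a JSON null. The `ColumnDefinition` class here is a simplified stand-in for the one in the PR, not the actual code.

```java
import java.sql.Types;

public class ColumnTypeSketch {

    // Simplified stand-in for the PR's ColumnDefinition class.
    static class ColumnDefinition {
        final String name;
        final int type; // a java.sql.Types constant, per the reviewer's preference

        ColumnDefinition(String name, int type) {
            this.name = name;
            this.type = type;
        }
    }

    // Boxed Integer so "no column definition available" can be represented
    // as null and serialized as a JSON null value.
    static Integer columnType(ColumnDefinition def) {
        return (def == null) ? null : def.type;
    }

    public static void main(String[] args) {
        ColumnDefinition idCol = new ColumnDefinition("id", Types.INTEGER);
        System.out.println(columnType(idCol)); // the JDBC constant value for INTEGER
        System.out.println(columnType(null));  // null -> written as JSON null
    }
}
```

Using `java.sql.Types` keeps the emitted type codes interoperable with anything else that speaks JDBC, which is the rationale given in the comment.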


> Implement a GetChangeDataCapture processor
> 

[jira] [Commented] (NIFI-3413) Implement a GetChangeDataCapture processor

2017-03-27 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-3413?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15943366#comment-15943366
 ] 

ASF GitHub Bot commented on NIFI-3413:
--

Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1618#discussion_r108184785
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/db/impl/mysql/event/io/AbstractBinlogTableEventWriter.java
 ---
@@ -0,0 +1,65 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard.db.impl.mysql.event.io;
+
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processors.standard.GetChangeDataCaptureMySQL;
+import org.apache.nifi.processors.standard.db.impl.mysql.event.BinlogTableEventInfo;
+
+import java.io.IOException;
+
+/**
+ * An abstract base class for writing MySQL binlog events into flow file(s), e.g.
+ */
+public abstract class AbstractBinlogTableEventWriter<T extends BinlogTableEventInfo> extends AbstractBinlogEventWriter<T> {
+
+    protected void writeJson(T event) throws IOException {
+        super.writeJson(event);
+        if (event.getDatabaseName() != null) {
+            jsonGenerator.writeStringField("database", event.getDatabaseName());
+        } else {
+            jsonGenerator.writeNullField("database");
+        }
+        if (event.getTableName() != null) {
+            jsonGenerator.writeStringField("table_name", event.getTableName());
+        } else {
+            jsonGenerator.writeNullField("table_name");
+        }
+        if (event.getTableId() != null) {
+            jsonGenerator.writeNumberField("table_id", event.getTableId());
+        } else {
+            jsonGenerator.writeNullField("table_id");
+        }
+    }
+
+    // Default implementation for binlog events
+    @Override
+    public long writeEvent(ProcessSession session, T eventInfo, long currentSequenceId) {
+        FlowFile flowFile = session.create();
+        flowFile = session.write(flowFile, (outputStream) -> {
+            super.startJson(outputStream, eventInfo);
+            super.writeJson(eventInfo);
--- End diff --

Good catch! Strangely I didn't see it manifest itself in the unit tests 
(multiple fields, e.g.), perhaps the JSON generator keeps a Map at each level 
and they just kept getting overwritten. In any case, I will remove it (and 
update the comments for this class to include "table-related" events so as not 
to match the same comments from its parent class)
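The null-safe field-writing pattern in the diff above can be sketched without Jackson. This is a toy stand-in (the real writer uses `com.fasterxml.jackson.core.JsonGenerator`, and the class and method names below are invented for illustration): the point is that each field is always emitted, with an explicit JSON null when the value is absent, so downstream consumers see a stable set of keys.

```java
import java.util.StringJoiner;

public class BinlogTableJsonSketch {

    // Tiny stand-in for a JSON object writer; not the processor's actual code.
    static class TinyJsonObject {
        private final StringJoiner fields = new StringJoiner(",", "{", "}");

        TinyJsonObject stringField(String name, String value) {
            // Emit an explicit null rather than omitting the field entirely.
            fields.add("\"" + name + "\":" + (value == null ? "null" : "\"" + value + "\""));
            return this;
        }

        TinyJsonObject numberField(String name, Long value) {
            fields.add("\"" + name + "\":" + (value == null ? "null" : value.toString()));
            return this;
        }

        @Override
        public String toString() {
            return fields.toString();
        }
    }

    // Mirrors writeJson(): database / table_name / table_id, all null-safe.
    static String tableEventJson(String database, String tableName, Long tableId) {
        return new TinyJsonObject()
                .stringField("database", database)
                .stringField("table_name", tableName)
                .numberField("table_id", tableId)
                .toString();
    }

    public static void main(String[] args) {
        System.out.println(tableEventJson("mydb", "users", 42L));
        System.out.println(tableEventJson(null, null, null));
    }
}
```

As the review thread notes, calling the serialization step twice would emit duplicate keys; a streaming generator writes fields in order rather than keeping a map per level, which is why the duplicate call needed to be removed.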


> Implement a GetChangeDataCapture processor
> --
>
> Key: NIFI-3413
> URL: https://issues.apache.org/jira/browse/NIFI-3413
> Project: Apache NiFi
>  Issue Type: New Feature
>  Components: Extensions
>Reporter: Matt Burgess
>Assignee: Matt Burgess
>
> Database systems such as MySQL, Oracle, and SQL Server allow access to their 
> transactional logs and such, in order for external clients to have a "change 
> data capture" (CDC) capability. I propose a GetChangeDataCapture processor to 
> enable this in NiFi.
> The processor would be configured with a DBCPConnectionPool controller 
> service, as well as a Database Type property (similar to the one in 
> QueryDatabaseTable) for database-specific handling. Additional properties 
> might include the CDC table name, etc.  Additional database-specific 
> properties could be handled using dynamic properties (and the documentation 
> should reflect this).
> The processor would accept no incoming connections (it is a "Get" or source 
> processor), would be intended to run on the primary node only as a single 
> threaded processor, and would generate a flow file for each operation 
> (INSERT, UPDATE, DELETE, e.g.) in one or some number of formats (JSON, e.g.). 
> 


[jira] [Comment Edited] (NIFI-3625) Add JSON support to PutHiveStreaming

2017-03-27 Thread Ryan Persaud (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-3625?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15942751#comment-15942751
 ] 

Ryan Persaud edited comment on NIFI-3625 at 3/27/17 2:28 PM:
-

I was experimenting with streaming some JSON data into a partitioned table in an 
HDP 2.5 sandbox tonight, and I encountered an Exception (below).  I built from 
master (552148e9e7d45be4d298ee48afd7471405a5bfad) and tested with the 'old' 
PutHiveStreaming processor, and I got the same error.   From what I can tell, 
the error occurs whenever partition columns are specified in the 
PutHiveStreaming processor.

On a hunch I reverted HiveUtils and HiveWriter back to the versions from 
8/4/2016 (3943d72e95ff7b18c32d12020d34f134f4e86125), and I hacked them up a bit 
to work with the newer versions of PutHiveStreaming and TestPutHiveStreaming.  
I was able to successfully stream into a table. 

Has anyone else encountered these issues since NIFI-3574 and NIFI-3530 have 
been resolved?  Any thoughts on how to proceed?

Here are the PutHiveStreaming properties:


  
hive-stream-metastore-uri: thrift://sandbox.hortonworks.com:9083
hive-config-resources: /home/rpersaud/shared/hive-site.xml
hive-stream-database-name: default
hive-stream-table-name: test_err
hive-stream-partition-cols: src
hive-stream-autocreate-partition: true
hive-stream-max-open-connections: 8
hive-stream-heartbeat-interval: 60
hive-stream-transactions-per-batch: 100
hive-stream-records-per-transaction: 1
Kerberos Principal:
Kerberos Keytab:


Here's the exception that I got from the master build 
(552148e9e7d45be4d298ee48afd7471405a5bfad).  I see the same exception when I 
build with my branch (322f36dd82507633a7d7e2c23122eb59530c8967), but the line 
numbers are different in PutHiveStreaming since the code has changed.:

2017-03-27 08:18:58,730 ERROR [Timer-Driven Process Thread-5] hive.log Got exception: java.lang.NullPointerException null
java.lang.NullPointerException: null
	at org.apache.hadoop.hive.ql.security.authorization.plugin.AuthorizationMetaStoreFilterHook.getFilteredObjects(AuthorizationMetaStoreFilterHook.java:77) ~[hive-exec-1.2.1000.2.5.0.0-1245.jar:1.2.1000.2.5.0.0-1245]
	at org.apache.hadoop.hive.ql.security.authorization.plugin.AuthorizationMetaStoreFilterHook.filterDatabases(AuthorizationMetaStoreFilterHook.java:54) ~[hive-exec-1.2.1000.2.5.0.0-1245.jar:1.2.1000.2.5.0.0-1245]
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getDatabases(HiveMetaStoreClient.java:1046) ~[hive-metastore-1.2.1000.2.5.0.0-1245.jar:1.2.1000.2.5.0.0-1245]
	at org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient.isOpen(HiveClientCache.java:367) [hive-hcatalog-core-1.2.1000.2.5.0.0-1245.jar:1.2.1000.2.5.0.0-1245]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_121]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_121]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_121]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_121]
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:155) [hive-metastore-1.2.1000.2.5.0.0-1245.jar:1.2.1000.2.5.0.0-1245]
	at com.sun.proxy.$Proxy127.isOpen(Unknown Source) [na:na]
	at org.apache.hive.hcatalog.common.HiveClientCache.get(HiveClientCache.java:205) [hive-hcatalog-core-1.2.1000.2.5.0.0-1245.jar:1.2.1000.2.5.0.0-1245]
	at org.apache.hive.hcatalog.common.HCatUtil.getHiveMetastoreClient(HCatUtil.java:558) [hive-hcatalog-core-1.2.1000.2.5.0.0-1245.jar:1.2.1000.2.5.0.0-1245]
	at org.apache.hive.hcatalog.streaming.AbstractRecordWriter.<init>(AbstractRecordWriter.java:94) [hive-hcatalog-streaming-1.2.1000.2.5.0.0-1245.jar:1.2.1000.2.5.0.0-1245]
	at org.apache.hive.hcatalog.streaming.StrictJsonWriter.<init>(StrictJsonWriter.java:82) [hive-hcatalog-streaming-1.2.1000.2.5.0.0-1245.jar:1.2.1000.2.5.0.0-1245]
	at org.apache.hive.hcatalog.streaming.StrictJsonWriter.<init>(StrictJsonWriter.java:60) [hive-hcatalog-streaming-1.2.1000.2.5.0.0-1245.jar:1.2.1000.2.5.0.0-1245]
	at org.apache.nifi.util.hive.HiveWriter.getRecordWriter(HiveWriter.java:84) [nifi-hive-processors-1.2.0-SNAPSHOT.jar:1.2.0-SNAPSHOT]
	at 

[jira] [Commented] (NIFI-118) Remove ellipsis plugin

2017-03-27 Thread Matt Gilman (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-118?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15943356#comment-15943356
 ] 

Matt Gilman commented on NIFI-118:
--

Technically, yes, this JIRA is still valid. The ellipsis plugin supports 
multi-line ellipsis which is not supported by pure CSS. Once this is supported 
in CSS or when we no longer have a need for multi-line ellipsis we should get 
rid of this code.

> Remove ellipsis plugin
> --
>
> Key: NIFI-118
> URL: https://issues.apache.org/jira/browse/NIFI-118
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Core UI
>Reporter: Matt Gilman
>Priority: Minor
>
> Allow browsers to handle all ellipsis. Not sure how to handle the multiline 
> ellipsis required for the Processor capability description.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Commented] (NIFI-3413) Implement a GetChangeDataCapture processor

2017-03-27 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-3413?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15943353#comment-15943353
 ] 

ASF GitHub Bot commented on NIFI-3413:
--

Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1618#discussion_r108182707
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/db/event/TableInfoCacheKey.java
 ---
@@ -0,0 +1,95 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard.db.event;
+
+import org.apache.nifi.distributed.cache.client.exception.SerializationException;
+
+import java.io.IOException;
+import java.io.OutputStream;
+
+import static org.apache.nifi.processors.standard.db.event.TableInfo.DB_TABLE_NAME_DELIMITER;
+
+/**
+ * This class represents a key in a cache that contains information (column definitions, e.g.) for a database table
+ */
+public class TableInfoCacheKey {
+
+    private final String databaseName;
+    private final String tableName;
+    private final long tableId;
+    private final String uuidPrefix;
+
+    public TableInfoCacheKey(String uuidPrefix, String databaseName, String tableName, long tableId) {
+        this.uuidPrefix = uuidPrefix;
+        this.databaseName = databaseName;
+        this.tableName = tableName;
+        this.tableId = tableId;
+    }
+
+    @Override
+    public boolean equals(Object o) {
+        if (this == o) return true;
+        if (o == null || getClass() != o.getClass()) return false;
+
+        TableInfoCacheKey that = (TableInfoCacheKey) o;
+
+        if (tableId != that.tableId) return false;
+        if (databaseName != null ? !databaseName.equals(that.databaseName) : that.databaseName != null) return false;
--- End diff --

Good point, will change my equals() overrides to use EqualsBuilder
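For reference, the null-safe comparison chain in the diff above collapses neatly with the JDK's `java.util.Objects` utilities, which follow the same pattern as Commons Lang's `EqualsBuilder`/`HashCodeBuilder` (a minimal stdlib-only sketch of the class from the diff, not the final implementation):

```java
import java.util.Objects;

// Sketch: TableInfoCacheKey's equals/hashCode expressed with java.util.Objects,
// which handles the null checks that the manual version spells out explicitly.
class TableInfoCacheKey {

    private final String databaseName;
    private final String tableName;
    private final long tableId;
    private final String uuidPrefix;

    TableInfoCacheKey(String uuidPrefix, String databaseName, String tableName, long tableId) {
        this.uuidPrefix = uuidPrefix;
        this.databaseName = databaseName;
        this.tableName = tableName;
        this.tableId = tableId;
    }

    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (o == null || getClass() != o.getClass()) return false;
        TableInfoCacheKey that = (TableInfoCacheKey) o;
        return tableId == that.tableId
                && Objects.equals(databaseName, that.databaseName)
                && Objects.equals(tableName, that.tableName)
                && Objects.equals(uuidPrefix, that.uuidPrefix);
    }

    @Override
    public int hashCode() {
        return Objects.hash(databaseName, tableName, tableId, uuidPrefix);
    }
}
```

Either approach is equivalent; `EqualsBuilder` just moves the same null handling behind a fluent API.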


> Implement a GetChangeDataCapture processor
> --
>
> Key: NIFI-3413
> URL: https://issues.apache.org/jira/browse/NIFI-3413
> Project: Apache NiFi
>  Issue Type: New Feature
>  Components: Extensions
>Reporter: Matt Burgess
>Assignee: Matt Burgess
>
> Database systems such as MySQL, Oracle, and SQL Server allow access to their 
> transactional logs and such, in order for external clients to have a "change 
> data capture" (CDC) capability. I propose a GetChangeDataCapture processor to 
> enable this in NiFi.
> The processor would be configured with a DBCPConnectionPool controller 
> service, as well as a Database Type property (similar to the one in 
> QueryDatabaseTable) for database-specific handling. Additional properties 
> might include the CDC table name, etc.  Additional database-specific 
> properties could be handled using dynamic properties (and the documentation 
> should reflect this).
> The processor would accept no incoming connections (it is a "Get" or source 
> processor), would be intended to run on the primary node only as a single 
> threaded processor, and would generate a flow file for each operation 
> (INSERT, UPDATE, DELETE, e.g.) in one or some number of formats (JSON, e.g.). 
> The flow files would be transferred in time order (to enable a replication 
> solution, for example), perhaps with some auto-incrementing attribute to also 
> indicate order if need be.
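The ordering idea in the proposal (an auto-incrementing attribute on each emitted event) boils down to a monotonically increasing sequence number per emitted record. A minimal stand-alone sketch, using no NiFi APIs and purely illustrative attribute names:

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.concurrent.atomic.AtomicLong;

// Sketch: tag each emitted CDC event with a monotonically increasing
// sequence attribute so downstream consumers can recover time order.
class CdcSequencer {
    private final AtomicLong sequence = new AtomicLong(0);

    // Returns the attributes that would be set on the emitted flow file.
    Map<String, String> attributesFor(String operation, String table) {
        Map<String, String> attrs = new LinkedHashMap<>();
        attrs.put("cdc.sequence.id", Long.toString(sequence.getAndIncrement()));
        attrs.put("cdc.event.type", operation);
        attrs.put("cdc.table.name", table);
        return attrs;
    }
}
```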





[jira] [Comment Edited] (NIFI-3625) Add JSON support to PutHiveStreaming

2017-03-27 Thread Ryan Persaud (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-3625?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15942751#comment-15942751
 ] 

Ryan Persaud edited comment on NIFI-3625 at 3/27/17 2:26 PM:
-

I was experimenting with streaming some JSON data into a partitioned table in a 
HDP 2.5 sandbox tonight, and I encountered an Exception (below).  I built from 
master (552148e9e7d45be4d298ee48afd7471405a5bfad) and tested with the 'old' 
PutHiveStreaming processor, and I got the same error.   From what I can tell, 
the error occurs whenever partition columns are specified in the 
PutHiveStreaming processor.

On a hunch I reverted HiveUtils and HiveWriter back to the versions from 
8/4/2016 (3943d72e95ff7b18c32d12020d34f134f4e86125), and I hacked them up a bit 
to work with the newer versions of PutHiveStreaming and TestPutHiveStreaming.  
I was able to successfully stream into a table. 

Has anyone else encountered these issues since NIFI-3574 and NIFI-3530 have 
been resolved?  Any thoughts on how to proceed?

Here are the PutHiveStreaming properties:


  
hive-stream-metastore-uri: thrift://sandbox.hortonworks.com:9083
hive-config-resources: /home/rpersaud/shared/hive-site.xml
hive-stream-database-name: default
hive-stream-table-name: test_err
hive-stream-partition-cols: src
hive-stream-autocreate-partition: true
hive-stream-max-open-connections: 8
hive-stream-heartbeat-interval: 60
hive-stream-transactions-per-batch: 100
hive-stream-records-per-transaction: 1
Kerberos Principal: (not set)
Kerberos Keytab: (not set)


Here's the exception that I got from the master build 
(552148e9e7d45be4d298ee48afd7471405a5bfad).  I see the same exception when I 
build with 322f36dd82507633a7d7e2c23122eb59530c8967, but the line numbers are 
different:

2017-03-27 08:18:58,730 ERROR [Timer-Driven Process Thread-5] hive.log Got 
exception: java.lang.NullPointerException null
java.lang.NullPointerException: null
at 
org.apache.hadoop.hive.ql.security.authorization.plugin.AuthorizationMetaStoreFilterHook.getFilteredObjects(AuthorizationMetaStoreFilterHook.java:77)
 ~[hive-exec-1.2.1000.2.5.0.0-1245.jar:1.2.1000.2.5.0.0-1245]
at 
org.apache.hadoop.hive.ql.security.authorization.plugin.AuthorizationMetaStoreFilterHook.filterDatabases(AuthorizationMetaStoreFilterHook.java:54)
 ~[hive-exec-1.2.1000.2.5.0.0-1245.jar:1.2.1000.2.5.0.0-1245]
at 
org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getDatabases(HiveMetaStoreClient.java:1046)
 ~[hive-metastore-1.2.1000.2.5.0.0-1245.jar:1.2.1000.2.5.0.0-1245]
at 
org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient.isOpen(HiveClientCache.java:367)
 [hive-hcatalog-core-1.2.1000.2.5.0.0-1245.jar:1.2.1000.2.5.0.0-1245]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
~[na:1.8.0_121]
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
~[na:1.8.0_121]
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 ~[na:1.8.0_121]
at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_121]
at 
org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:155)
 [hive-metastore-1.2.1000.2.5.0.0-1245.jar:1.2.1000.2.5.0.0-1245]
at com.sun.proxy.$Proxy127.isOpen(Unknown Source) [na:na]
at 
org.apache.hive.hcatalog.common.HiveClientCache.get(HiveClientCache.java:205) 
[hive-hcatalog-core-1.2.1000.2.5.0.0-1245.jar:1.2.1000.2.5.0.0-1245]
at 
org.apache.hive.hcatalog.common.HCatUtil.getHiveMetastoreClient(HCatUtil.java:558)
 [hive-hcatalog-core-1.2.1000.2.5.0.0-1245.jar:1.2.1000.2.5.0.0-1245]
at 
org.apache.hive.hcatalog.streaming.AbstractRecordWriter.<init>(AbstractRecordWriter.java:94)
 [hive-hcatalog-streaming-1.2.1000.2.5.0.0-1245.jar:1.2.1000.2.5.0.0-1245]
at 
org.apache.hive.hcatalog.streaming.StrictJsonWriter.<init>(StrictJsonWriter.java:82)
 [hive-hcatalog-streaming-1.2.1000.2.5.0.0-1245.jar:1.2.1000.2.5.0.0-1245]
at 
org.apache.hive.hcatalog.streaming.StrictJsonWriter.<init>(StrictJsonWriter.java:60)
 [hive-hcatalog-streaming-1.2.1000.2.5.0.0-1245.jar:1.2.1000.2.5.0.0-1245]
at 
org.apache.nifi.util.hive.HiveWriter.getRecordWriter(HiveWriter.java:84) 
[nifi-hive-processors-1.2.0-SNAPSHOT.jar:1.2.0-SNAPSHOT]
at org.apache.nifi.util.hive.HiveWriter.<init>(HiveWriter.java:71) 

[GitHub] nifi pull request #1618: NIFI-3413: Add GetChangeDataCaptureMySQL processor

2017-03-27 Thread mattyb149
Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1618#discussion_r108182707
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/db/event/TableInfoCacheKey.java
 ---
@@ -0,0 +1,95 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard.db.event;
+
+import 
org.apache.nifi.distributed.cache.client.exception.SerializationException;
+
+import java.io.IOException;
+import java.io.OutputStream;
+
+import static 
org.apache.nifi.processors.standard.db.event.TableInfo.DB_TABLE_NAME_DELIMITER;
+
+/**
+ * This class represents a key in a cache that contains information 
(column definitions, e.g.) for a database table
+ */
+public class TableInfoCacheKey {
+
+private final String databaseName;
+private final String tableName;
+private final long tableId;
+private final String uuidPrefix;
+
+public TableInfoCacheKey(String uuidPrefix, String databaseName, 
String tableName, long tableId) {
+this.uuidPrefix = uuidPrefix;
+this.databaseName = databaseName;
+this.tableName = tableName;
+this.tableId = tableId;
+}
+
+@Override
+public boolean equals(Object o) {
+if (this == o) return true;
+if (o == null || getClass() != o.getClass()) return false;
+
+TableInfoCacheKey that = (TableInfoCacheKey) o;
+
+if (tableId != that.tableId) return false;
+if (databaseName != null ? !databaseName.equals(that.databaseName) 
: that.databaseName != null) return false;
--- End diff --

Good point, will change my equals() overrides to use EqualsBuilder


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[jira] [Updated] (NIFI-3380) Multiple Versions of the Same Component

2017-03-27 Thread Matt Gilman (JIRA)

 [ 
https://issues.apache.org/jira/browse/NIFI-3380?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Matt Gilman updated NIFI-3380:
--
Status: Patch Available  (was: Reopened)

> Multiple Versions of the Same Component
> ---
>
> Key: NIFI-3380
> URL: https://issues.apache.org/jira/browse/NIFI-3380
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Core Framework
>Reporter: Bryan Bende
>Assignee: Matt Gilman
> Fix For: 1.2.0
>
> Attachments: nifi-example-processors-nar-1.0.nar, 
> nifi-example-processors-nar-2.0.nar, nifi-example-service-api-nar-1.0.nar, 
> nifi-example-service-api-nar-2.0.nar, nifi-example-service-nar-1.0.nar, 
> nifi-example-service-nar-1.1.nar, nifi-example-service-nar-2.0.nar
>
>
> This ticket is to track the work for supporting multiple versions of the same 
> component within NiFi. The overall design for this feature is described in 
> detail at the following wiki page:
> https://cwiki.apache.org/confluence/display/NIFI/Multiple+Versions+of+the+Same+Extension
> This ticket will track only the core NiFi work, and a separate ticket will be 
> created to track enhancements for the NAR Maven Plugin.





[jira] [Commented] (NIFI-3380) Multiple Versions of the Same Component

2017-03-27 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-3380?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15943347#comment-15943347
 ] 

ASF GitHub Bot commented on NIFI-3380:
--

GitHub user mcgilman opened a pull request:

https://github.com/apache/nifi/pull/1627

NIFI-3380: Fixing paths in generated documentation

NIFI-3380:
- Addressing issues with paths in generated documentation.

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/mcgilman/nifi NIFI-3380

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/1627.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #1627


commit b03b9b70fb4529498cd6d1717afea25a19d8c73f
Author: Matt Gilman 
Date:   2017-03-27T14:21:01Z

NIFI-3380:
- Addressing issues with paths in generated documentation.




> Multiple Versions of the Same Component
> ---
>
> Key: NIFI-3380
> URL: https://issues.apache.org/jira/browse/NIFI-3380
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Core Framework
>Reporter: Bryan Bende
>Assignee: Matt Gilman
> Fix For: 1.2.0
>
> Attachments: nifi-example-processors-nar-1.0.nar, 
> nifi-example-processors-nar-2.0.nar, nifi-example-service-api-nar-1.0.nar, 
> nifi-example-service-api-nar-2.0.nar, nifi-example-service-nar-1.0.nar, 
> nifi-example-service-nar-1.1.nar, nifi-example-service-nar-2.0.nar
>
>
> This ticket is to track the work for supporting multiple versions of the same 
> component within NiFi. The overall design for this feature is described in 
> detail at the following wiki page:
> https://cwiki.apache.org/confluence/display/NIFI/Multiple+Versions+of+the+Same+Extension
> This ticket will track only the core NiFi work, and a separate ticket will be 
> created to track enhancements for the NAR Maven Plugin.





[GitHub] nifi pull request #1627: NIFI-3380: Fixing paths in generated documentation

2017-03-27 Thread mcgilman
GitHub user mcgilman opened a pull request:

https://github.com/apache/nifi/pull/1627

NIFI-3380: Fixing paths in generated documentation

NIFI-3380:
- Addressing issues with paths in generated documentation.

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/mcgilman/nifi NIFI-3380

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/1627.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #1627


commit b03b9b70fb4529498cd6d1717afea25a19d8c73f
Author: Matt Gilman 
Date:   2017-03-27T14:21:01Z

NIFI-3380:
- Addressing issues with paths in generated documentation.






[jira] [Commented] (NIFI-3413) Implement a GetChangeDataCapture processor

2017-03-27 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-3413?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15943341#comment-15943341
 ] 

ASF GitHub Bot commented on NIFI-3413:
--

Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1618#discussion_r108180477
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/db/event/TableInfo.java
 ---
@@ -0,0 +1,146 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard.db.event;
+
+import 
org.apache.nifi.distributed.cache.client.exception.DeserializationException;
+import 
org.apache.nifi.distributed.cache.client.exception.SerializationException;
+
+import java.io.IOException;
+import java.io.OutputStream;
+import java.util.ArrayList;
+import java.util.List;
+import java.util.stream.Collectors;
+
+/**
+ * A POJO for holding table information related to update events.
+ */
+public class TableInfo {
+
+final static String DB_TABLE_NAME_DELIMITER = "@!@";
+
+private String databaseName;
+private String tableName;
+private Long tableId;
+private List<ColumnDefinition> columns;
+
+public TableInfo(String databaseName, String tableName, Long tableId, 
List<ColumnDefinition> columns) {
+this.databaseName = databaseName;
+this.tableName = tableName;
+this.tableId = tableId;
+this.columns = columns;
+}
+
+public String getDatabaseName() {
+return databaseName;
+}
+
+public String getTableName() {
+return tableName;
+}
+
+public void setTableName(String tableName) {
+this.tableName = tableName;
+}
+
+public Long getTableId() {
+return tableId;
+}
+
+public List<ColumnDefinition> getColumns() {
+return columns;
+}
+
+public void setColumns(List<ColumnDefinition> columns) {
+this.columns = columns;
+}
+
+@Override
+public boolean equals(Object o) {
+if (this == o) return true;
+if (o == null || getClass() != o.getClass()) return false;
+
+TableInfo tableInfo = (TableInfo) o;
+
+if (!databaseName.equals(tableInfo.databaseName)) return false;
+if (!tableName.equals(tableInfo.tableName)) return false;
+if (!tableId.equals(tableInfo.tableId)) return false;
+return columns != null ? columns.equals(tableInfo.columns) : 
tableInfo.columns == null;
+}
+
+@Override
+public int hashCode() {
+int result = databaseName.hashCode();
+result = 31 * result + tableName.hashCode();
+result = 31 * result + tableId.hashCode();
+result = 31 * result + (columns != null ? columns.hashCode() : 0);
+return result;
+}
+
+public static class Serializer implements 
org.apache.nifi.distributed.cache.client.Serializer<TableInfo> {
+
+@Override
+public void serialize(TableInfo value, OutputStream output) throws 
SerializationException, IOException {
+StringBuilder sb = new StringBuilder(value.getDatabaseName());
+sb.append(DB_TABLE_NAME_DELIMITER);
+sb.append(value.getTableName());
+sb.append(DB_TABLE_NAME_DELIMITER);
+sb.append(value.getTableId());
+List<ColumnDefinition> columnDefinitions = value.getColumns();
+if (columnDefinitions != null && !columnDefinitions.isEmpty()) 
{
+sb.append(DB_TABLE_NAME_DELIMITER);
+sb.append(columnDefinitions.stream().map((col) -> 
col.getName() + DB_TABLE_NAME_DELIMITER + 
col.getType()).collect(Collectors.joining(DB_TABLE_NAME_DELIMITER)));
+}
+output.write(sb.toString().getBytes());
+}
+}
+
+public static class 
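The delimiter-joined wire format used by the Serializer above (the message is truncated here) can be reproduced as a plain-Java round trip. This is a simplified stand-in for the NiFi cache client interfaces: the `@!@` delimiter and field order follow the diff, while the class and method names are illustrative only. Note the format implicitly relies on field values never containing the delimiter string.

```java
import java.util.List;
import java.util.stream.Collectors;

// Sketch of the delimiter-based serialization scheme from the diff:
// databaseName @!@ tableName @!@ tableId [@!@ colName @!@ colType]...
class TableInfoCodec {
    static final String DELIM = "@!@";

    // Serialize: join db, table, id, then alternating column name/type pairs.
    static String serialize(String db, String table, long id, List<String[]> cols) {
        StringBuilder sb = new StringBuilder(db).append(DELIM).append(table).append(DELIM).append(id);
        if (cols != null && !cols.isEmpty()) {
            sb.append(DELIM).append(cols.stream()
                    .map(c -> c[0] + DELIM + c[1])
                    .collect(Collectors.joining(DELIM)));
        }
        return sb.toString();
    }

    // Deserialize: split on the (regex-quoted) delimiter and rebuild the fields.
    static String[] deserialize(String input) {
        return input.split(java.util.regex.Pattern.quote(DELIM));
    }
}
```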

[GitHub] nifi pull request #1618: NIFI-3413: Add GetChangeDataCaptureMySQL processor

2017-03-27 Thread mattyb149
Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1618#discussion_r108180477
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/db/event/TableInfo.java
 ---
@@ -0,0 +1,146 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard.db.event;
+
+import 
org.apache.nifi.distributed.cache.client.exception.DeserializationException;
+import 
org.apache.nifi.distributed.cache.client.exception.SerializationException;
+
+import java.io.IOException;
+import java.io.OutputStream;
+import java.util.ArrayList;
+import java.util.List;
+import java.util.stream.Collectors;
+
+/**
+ * A POJO for holding table information related to update events.
+ */
+public class TableInfo {
+
+final static String DB_TABLE_NAME_DELIMITER = "@!@";
+
+private String databaseName;
+private String tableName;
+private Long tableId;
+private List<ColumnDefinition> columns;
+
+public TableInfo(String databaseName, String tableName, Long tableId, 
List<ColumnDefinition> columns) {
+this.databaseName = databaseName;
+this.tableName = tableName;
+this.tableId = tableId;
+this.columns = columns;
+}
+
+public String getDatabaseName() {
+return databaseName;
+}
+
+public String getTableName() {
+return tableName;
+}
+
+public void setTableName(String tableName) {
+this.tableName = tableName;
+}
+
+public Long getTableId() {
+return tableId;
+}
+
+public List<ColumnDefinition> getColumns() {
+return columns;
+}
+
+public void setColumns(List<ColumnDefinition> columns) {
+this.columns = columns;
+}
+
+@Override
+public boolean equals(Object o) {
+if (this == o) return true;
+if (o == null || getClass() != o.getClass()) return false;
+
+TableInfo tableInfo = (TableInfo) o;
+
+if (!databaseName.equals(tableInfo.databaseName)) return false;
+if (!tableName.equals(tableInfo.tableName)) return false;
+if (!tableId.equals(tableInfo.tableId)) return false;
+return columns != null ? columns.equals(tableInfo.columns) : 
tableInfo.columns == null;
+}
+
+@Override
+public int hashCode() {
+int result = databaseName.hashCode();
+result = 31 * result + tableName.hashCode();
+result = 31 * result + tableId.hashCode();
+result = 31 * result + (columns != null ? columns.hashCode() : 0);
+return result;
+}
+
+public static class Serializer implements 
org.apache.nifi.distributed.cache.client.Serializer<TableInfo> {
+
+@Override
+public void serialize(TableInfo value, OutputStream output) throws 
SerializationException, IOException {
+StringBuilder sb = new StringBuilder(value.getDatabaseName());
+sb.append(DB_TABLE_NAME_DELIMITER);
+sb.append(value.getTableName());
+sb.append(DB_TABLE_NAME_DELIMITER);
+sb.append(value.getTableId());
+List<ColumnDefinition> columnDefinitions = value.getColumns();
+if (columnDefinitions != null && !columnDefinitions.isEmpty()) 
{
+sb.append(DB_TABLE_NAME_DELIMITER);
+sb.append(columnDefinitions.stream().map((col) -> 
col.getName() + DB_TABLE_NAME_DELIMITER + 
col.getType()).collect(Collectors.joining(DB_TABLE_NAME_DELIMITER)));
+}
+output.write(sb.toString().getBytes());
+}
+}
+
+public static class Deserializer implements 
org.apache.nifi.distributed.cache.client.Deserializer<TableInfo> {
+
+@Override
+public TableInfo deserialize(byte[] input) throws 
DeserializationException, IOException {
+// 

[GitHub] nifi pull request #1618: NIFI-3413: Add GetChangeDataCaptureMySQL processor

2017-03-27 Thread mattyb149
Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1618#discussion_r108176438
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/db/event/ColumnDefinition.java
 ---
@@ -0,0 +1,70 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard.db.event;
+
+/**
+ * A class that specifies a definition for a relational table column, 
including type, name, etc.
+ */
+public class ColumnDefinition {
+
+private byte type;
+private String name = "";
+
+public ColumnDefinition(byte type) {
--- End diff --

The type values may change depending on the implementation (MySQL, Oracle, for 
example), although hopefully they are all using [JDBC 
Types](http://docs.oracle.com/javase/8/docs/api/constant-values.html#java.sql.Types.BIT).
 With that said, I can't for the life of me remember why I chose "byte" instead 
of int; the only place it's stored (currently) is from a ResultSetMetaData 
object which returns an int for getColumnType().

I will change the type to an int and then folks can use the JDBC Type 
constants in code if they like. How does that sound?
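The point about `ResultSetMetaData.getColumnType()` returning an `int` can be illustrated with the `java.sql.Types` constants directly. The sketch below needs no database connection; the helper class and method names are illustrative, not part of the patch:

```java
import java.lang.reflect.Field;
import java.sql.Types;

// Demonstrates that JDBC type codes are plain ints (java.sql.Types constants),
// which is exactly what ResultSetMetaData.getColumnType() returns.
class JdbcTypeNames {
    // Map an int type code back to its java.sql.Types constant name via reflection.
    static String nameOf(int typeCode) {
        for (Field f : Types.class.getFields()) {
            try {
                if (f.getType() == int.class && f.getInt(null) == typeCode) {
                    return f.getName();
                }
            } catch (IllegalAccessException ignored) {
                // public static fields of java.sql.Types are always accessible
            }
        }
        return "UNKNOWN(" + typeCode + ")";
    }
}
```

Storing the code as an `int` keeps it directly comparable against `Types.VARCHAR`, `Types.INTEGER`, and friends, which a `byte` cannot do for codes like `Types.ARRAY` (2003).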




[jira] [Commented] (NIFI-3413) Implement a GetChangeDataCapture processor

2017-03-27 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-3413?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15943320#comment-15943320
 ] 

ASF GitHub Bot commented on NIFI-3413:
--

Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1618#discussion_r108176438
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/db/event/ColumnDefinition.java
 ---
@@ -0,0 +1,70 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard.db.event;
+
+/**
+ * A class that specifies a definition for a relational table column, 
including type, name, etc.
+ */
+public class ColumnDefinition {
+
+private byte type;
+private String name = "";
+
+public ColumnDefinition(byte type) {
--- End diff --

The type values may change depending on the implementation (MySQL, Oracle, for 
example), although hopefully they are all using [JDBC 
Types](http://docs.oracle.com/javase/8/docs/api/constant-values.html#java.sql.Types.BIT).
 With that said, I can't for the life of me remember why I chose "byte" instead 
of int; the only place it's stored (currently) is from a ResultSetMetaData 
object which returns an int for getColumnType().

I will change the type to an int and then folks can use the JDBC Type 
constants in code if they like. How does that sound?


> Implement a GetChangeDataCapture processor
> --
>
> Key: NIFI-3413
> URL: https://issues.apache.org/jira/browse/NIFI-3413
> Project: Apache NiFi
>  Issue Type: New Feature
>  Components: Extensions
>Reporter: Matt Burgess
>Assignee: Matt Burgess
>
> Database systems such as MySQL, Oracle, and SQL Server allow access to their 
> transactional logs and such, in order for external clients to have a "change 
> data capture" (CDC) capability. I propose a GetChangeDataCapture processor to 
> enable this in NiFi.
> The processor would be configured with a DBCPConnectionPool controller 
> service, as well as a Database Type property (similar to the one in 
> QueryDatabaseTable) for database-specific handling. Additional properties 
> might include the CDC table name, etc.  Additional database-specific 
> properties could be handled using dynamic properties (and the documentation 
> should reflect this).
> The processor would accept no incoming connections (it is a "Get" or source 
> processor), would be intended to run on the primary node only as a single 
> threaded processor, and would generate a flow file for each operation 
> (INSERT, UPDATE, DELETE, e.g.) in one or some number of formats (JSON, e.g.). 
> The flow files would be transferred in time order (to enable a replication 
> solution, for example), perhaps with some auto-incrementing attribute to also 
> indicate order if need be.





[jira] [Reopened] (NIFI-3380) Multiple Versions of the Same Component

2017-03-27 Thread Matt Gilman (JIRA)

 [ 
https://issues.apache.org/jira/browse/NIFI-3380?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Matt Gilman reopened NIFI-3380:
---
  Assignee: Matt Gilman  (was: Bryan Bende)

Reopening to address path issues in generated documentation.

> Multiple Versions of the Same Component
> ---
>
> Key: NIFI-3380
> URL: https://issues.apache.org/jira/browse/NIFI-3380
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Core Framework
>Reporter: Bryan Bende
>Assignee: Matt Gilman
> Fix For: 1.2.0
>
> Attachments: nifi-example-processors-nar-1.0.nar, 
> nifi-example-processors-nar-2.0.nar, nifi-example-service-api-nar-1.0.nar, 
> nifi-example-service-api-nar-2.0.nar, nifi-example-service-nar-1.0.nar, 
> nifi-example-service-nar-1.1.nar, nifi-example-service-nar-2.0.nar
>
>
> This ticket is to track the work for supporting multiple versions of the same 
> component within NiFi. The overall design for this feature is described in 
> detail at the following wiki page:
> https://cwiki.apache.org/confluence/display/NIFI/Multiple+Versions+of+the+Same+Extension
> This ticket will track only the core NiFi work, and a separate ticket will be 
> created to track enhancements for the NAR Maven Plugin.





[GitHub] nifi issue #1596: NIFI-3596 - added attributes to GenerateTableFetch process...

2017-03-27 Thread mattyb149
Github user mattyb149 commented on the issue:

https://github.com/apache/nifi/pull/1596
  
+1 LGTM, built and ran unit tests, also ran with a real NiFi, verified all 
the attributes were added and correct. Thanks for the contribution! Merging to 
master




[jira] [Commented] (NIFI-3650) Ensure travis-ci won't cache nifi artifacts

2017-03-27 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-3650?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15943314#comment-15943314
 ] 

ASF GitHub Bot commented on NIFI-3650:
--

Github user apiri commented on the issue:

https://github.com/apache/nifi/pull/1625
  
@trixpan Do you think it makes more sense to use the before_cache directive 
instead?  (https://docs.travis-ci.com/user/caching/#before_cache-phase)  May 
just be struggling in the morning, but I believe this is likely the preferred 
approach.
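For context, the `before_cache` phase runs after the build but right before Travis packs and uploads the cache, so the cleanup could live there instead of in the install/script steps. A sketch of a possible `.travis.yml` fragment — the cached directory and artifact path are assumptions to verify against the project's actual configuration:

```yaml
cache:
  directories:
    - $HOME/.m2
# Runs after the build, just before the cache is packed and uploaded,
# so freshly installed NiFi artifacts never end up in the cache.
before_cache:
  - rm -rf $HOME/.m2/repository/org/apache/nifi
```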


> Ensure travis-ci won't cache nifi artifacts
> ---
>
> Key: NIFI-3650
> URL: https://issues.apache.org/jira/browse/NIFI-3650
> Project: Apache NiFi
>  Issue Type: Bug
>Reporter: Andre F de Miranda
>Assignee: Andre F de Miranda
>
> While caching dependencies is a fair way of improving travis-ci build times, 
> the existence of cached NiFi artifacts within maven's ~/.m2 directory may 
> hide issues caused by incomplete code refactoring that would otherwise be 
> triggered had travis-ci been running a truly clean build.
> We should find a way of balancing caching and possible build false negatives.





[GitHub] nifi issue #1625: NIFI-3650 - Adjust travis to forcefuly remove $HOME/.m2/re...

2017-03-27 Thread apiri
Github user apiri commented on the issue:

https://github.com/apache/nifi/pull/1625
  
@trixpan Do you think it makes more sense to use the before_cache directive 
instead?  (https://docs.travis-ci.com/user/caching/#before_cache-phase)  May 
just be struggling in the morning, but I believe this is likely the preferred 
approach.




[jira] [Commented] (NIFI-3596) GenerateTableFetch - Add attributes to generated flow files to ease SQL query overwrite

2017-03-27 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-3596?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15943288#comment-15943288
 ] 

ASF subversion and git services commented on NIFI-3596:
---

Commit ced6708d4bca6d179e5f9fbc0b2324102879d21e in nifi's branch 
refs/heads/master from [~pvillard]
[ https://git-wip-us.apache.org/repos/asf?p=nifi.git;h=ced6708 ]

NIFI-3596 - added attributes to GenerateTableFetch processor

Signed-off-by: Matt Burgess 

Updated test to check selected column names

Signed-off-by: Matt Burgess 

This closes #1596


> GenerateTableFetch - Add attributes to generated flow files to ease SQL query 
> overwrite
> ---
>
> Key: NIFI-3596
> URL: https://issues.apache.org/jira/browse/NIFI-3596
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Reporter: Pierre Villard
>Assignee: Pierre Villard
>Priority: Trivial
> Fix For: 1.2.0
>
>
> The GenerateTableFetch processor will generate a SQL query based on the 
> provided parameters but, if the specific DB adapter is not available, it 
> might be necessary to overwrite the SQL query in the ExecuteSQL processor. To 
> do that it would be nice to have each part of the query as attributes to take 
> advantage of expression language.
> Current workaround (when using GenerateTableFetch against a DB2 database for 
> example) is to have an intermediary ExtractText processor and have a regex 
> extracting each part of the generated query.
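> With such attributes in place, the ExecuteSQL query for an unsupported
> adapter could be rebuilt with expression language along these lines. The
> attribute names and the DB2-style paging syntax below are assumptions for
> illustration; check the merged processor documentation for the real names:

```sql
-- Hypothetical DB2-flavored override assembled from GenerateTableFetch attributes
SELECT ${generatetablefetch.columnNames}
FROM ${generatetablefetch.tableName}
WHERE ${generatetablefetch.whereClause}
ORDER BY ${generatetablefetch.maxColumnNames}
OFFSET ${generatetablefetch.offset} ROWS
FETCH FIRST ${generatetablefetch.limit} ROWS ONLY
```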





[jira] [Updated] (NIFI-3596) GenerateTableFetch - Add attributes to generated flow files to ease SQL query overwrite

2017-03-27 Thread Matt Burgess (JIRA)

 [ 
https://issues.apache.org/jira/browse/NIFI-3596?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Matt Burgess updated NIFI-3596:
---
Resolution: Fixed
Status: Resolved  (was: Patch Available)

> GenerateTableFetch - Add attributes to generated flow files to ease SQL query 
> overwrite
> ---
>
> Key: NIFI-3596
> URL: https://issues.apache.org/jira/browse/NIFI-3596
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Reporter: Pierre Villard
>Assignee: Pierre Villard
>Priority: Trivial
> Fix For: 1.2.0
>
>
> The GenerateTableFetch processor will generate a SQL query based on the 
> provided parameters but, if the specific DB adapter is not available, it 
> might be necessary to overwrite the SQL query in the ExecuteSQL processor. To 
> do that it would be nice to have each part of the query as attributes to take 
> advantage of expression language.
> Current workaround (when using GenerateTableFetch against a DB2 database for 
> example) is to have an intermediary ExtractText processor and have a regex 
> extracting each part of the generated query.





[GitHub] nifi pull request #1596: NIFI-3596 - added attributes to GenerateTableFetch ...

2017-03-27 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/nifi/pull/1596




[jira] [Commented] (NIFI-3596) GenerateTableFetch - Add attributes to generated flow files to ease SQL query overwrite

2017-03-27 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-3596?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15943291#comment-15943291
 ] 

ASF GitHub Bot commented on NIFI-3596:
--

Github user asfgit closed the pull request at:

https://github.com/apache/nifi/pull/1596


> GenerateTableFetch - Add attributes to generated flow files to ease SQL query 
> overwrite
> ---
>
> Key: NIFI-3596
> URL: https://issues.apache.org/jira/browse/NIFI-3596
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Reporter: Pierre Villard
>Assignee: Pierre Villard
>Priority: Trivial
> Fix For: 1.2.0
>
>
> The GenerateTableFetch processor will generate a SQL query based on the 
> provided parameters but, if the specific DB adapter is not available, it 
> might be necessary to overwrite the SQL query in the ExecuteSQL processor. To 
> do that it would be nice to have each part of the query as attributes to take 
> advantage of expression language.
> Current workaround (when using GenerateTableFetch against a DB2 database for 
> example) is to have an intermediary ExtractText processor and have a regex 
> extracting each part of the generated query.





[jira] [Commented] (NIFI-3596) GenerateTableFetch - Add attributes to generated flow files to ease SQL query overwrite

2017-03-27 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-3596?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15943287#comment-15943287
 ] 

ASF GitHub Bot commented on NIFI-3596:
--

Github user mattyb149 commented on the issue:

https://github.com/apache/nifi/pull/1596
  
+1 LGTM, built and ran unit tests, also ran with a real NiFi, verified all 
the attributes were added and correct. Thanks for the contribution! Merging to 
master


> GenerateTableFetch - Add attributes to generated flow files to ease SQL query 
> overwrite
> ---
>
> Key: NIFI-3596
> URL: https://issues.apache.org/jira/browse/NIFI-3596
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Reporter: Pierre Villard
>Assignee: Pierre Villard
>Priority: Trivial
> Fix For: 1.2.0
>
>
> The GenerateTableFetch processor will generate a SQL query based on the 
> provided parameters but, if the specific DB adapter is not available, it 
> might be necessary to overwrite the SQL query in the ExecuteSQL processor. To 
> do that it would be nice to have each part of the query as attributes to take 
> advantage of expression language.
> Current workaround (when using GenerateTableFetch against a DB2 database for 
> example) is to have an intermediary ExtractText processor and have a regex 
> extracting each part of the generated query.





[jira] [Commented] (NIFI-1705) AttributesToCSV

2017-03-27 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-1705?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15943276#comment-15943276
 ] 

ASF GitHub Bot commented on NIFI-1705:
--

Github user joetrite commented on the issue:

https://github.com/apache/nifi/pull/1589
  
@mattyb149 thanks for reviewing, let me have a think on these things. I 
like the idea of allowing the attribute list to be dynamic.


> AttributesToCSV
> ---
>
> Key: NIFI-1705
> URL: https://issues.apache.org/jira/browse/NIFI-1705
> Project: Apache NiFi
>  Issue Type: Sub-task
>  Components: Extensions
>Reporter: Randy Gelhausen
>
> Create a new processor which converts a Flowfile's attributes into CSV 
> content.
> Should support the same configuration options as the AttributesToJSON 
> processor
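> A minimal sketch of what the core of such a processor might do — turn a
> flow file's attribute map into one RFC 4180-style CSV record. Class and
> method names and the escaping rules are illustrative only; this is not
> the (yet unwritten) processor's contract:

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.stream.Collectors;

// Hypothetical core of an AttributesToCSV processor: serialize a flow
// file's attribute values as a single CSV line, quoting any value that
// contains a comma, a double quote, or a newline.
public class AttributesToCsvSketch {

    // Quote and double embedded quotes when the value needs escaping.
    static String escape(String value) {
        if (value.contains(",") || value.contains("\"") || value.contains("\n")) {
            return "\"" + value.replace("\"", "\"\"") + "\"";
        }
        return value;
    }

    // Join the (ordered) attribute values into one CSV record.
    static String toCsvLine(Map<String, String> attributes) {
        return attributes.values().stream()
                .map(AttributesToCsvSketch::escape)
                .collect(Collectors.joining(","));
    }

    public static void main(String[] args) {
        Map<String, String> attrs = new LinkedHashMap<>();
        attrs.put("filename", "report,final.txt");
        attrs.put("path", "/data/in");
        System.out.println(toCsvLine(attrs)); // "report,final.txt",/data/in
    }
}
```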





[GitHub] nifi issue #1589: NIFI-1705 Adding AttributesToCSV processor

2017-03-27 Thread joetrite
Github user joetrite commented on the issue:

https://github.com/apache/nifi/pull/1589
  
@mattyb149 thanks for reviewing, let me have a think on these things. I 
like the idea of allowing the attribute list to be dynamic.




[jira] [Commented] (NIFI-3643) nifi-api: deserialization error of nifiVersion field due to capitalization

2017-03-27 Thread Matt Gilman (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-3643?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15943232#comment-15943232
 ] 

Matt Gilman commented on NIFI-3643:
---

[~joat1] Thanks for reporting this! Feel free to check out the PR. I'm not 
using gson serialization but I understand the issue. I made the suggested 
(backward compatible) change by updating the capitalization of the private 
member variable. It would be nice to have the extra verification that gson 
serialization is working. Thanks!

> nifi-api: deserialization error of nifiVersion field due to capitalization
> --
>
> Key: NIFI-3643
> URL: https://issues.apache.org/jira/browse/NIFI-3643
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 1.1.1
> Environment: OpenJDK8. gson, nifi-1.1.0
>Reporter: David Arllen
>Assignee: Matt Gilman
>Priority: Trivial
> Fix For: 1.2.0
>
>
> The response from `/nifi-api/system-diagnostics` includes the field name of 
> 'niFiVersion'.  The gson serialization library works on reflection and 
> expects 'nifiVersion' to be the field name because the class private field 
> 'nifiVersion' does not have a capital F.
> Resolution is expected to involve only the class 'VersionInfoDTO'.  The fix 
> would be matching the field capitalization with the capitalization of the 
> associated getter and setter.
> Without this fix, a gson FieldNamingStrategy is required to artificially map 
> the api-provided 'niFiVersion' field name to the reflection-expected 
> 'nifiVersion'
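> The mismatch can be reproduced with a stdlib-only sketch (gson itself is
> not used here): field-based reflection sees the private field name, while
> a getter-based serializer derives its JSON key from the getter, so the two
> keys disagree until the field is renamed. The helper below is a simplified
> stand-in for the JavaBeans naming rule, not gson's actual implementation:

```java
import java.lang.reflect.Field;

// Illustrative stand-in for the pre-fix VersionInfoDTO: the getter implies a
// JSON key of "niFiVersion", while field-based reflection (what gson uses)
// sees the private field "nifiVersion" -- so deserialization misses the value.
class VersionInfoDTO {
    private String nifiVersion; // pre-fix name; the fix renames this to niFiVersion
    public String getNiFiVersion() { return nifiVersion; }
    public void setNiFiVersion(String niFiVersion) { this.nifiVersion = niFiVersion; }
}

public class FieldNameMismatch {
    // JSON key a getter-based serializer would derive: drop "get",
    // lower-case the first character (simplified JavaBeans rule).
    static String keyFromGetter(String getterName) {
        String stem = getterName.substring(3);
        return Character.toLowerCase(stem.charAt(0)) + stem.substring(1);
    }

    public static void main(String[] args) throws NoSuchFieldException {
        Field field = VersionInfoDTO.class.getDeclaredField("nifiVersion");
        String fieldKey = field.getName();               // what field reflection sees
        String apiKey = keyFromGetter("getNiFiVersion"); // what the API emits
        System.out.println(apiKey + " vs " + fieldKey);  // niFiVersion vs nifiVersion
        System.out.println(apiKey.equals(fieldKey));     // false
    }
}
```

> Until the rename ships, a client-side FieldNamingStrategy that maps the
> api-provided key back onto the field remains the workaround described above.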





[jira] [Updated] (NIFI-3643) nifi-api: deserialization error of nifiVersion field due to capitalization

2017-03-27 Thread Matt Gilman (JIRA)

 [ 
https://issues.apache.org/jira/browse/NIFI-3643?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Matt Gilman updated NIFI-3643:
--
Fix Version/s: (was: 1.1.1)
Affects Version/s: (was: 1.2.0)
   Status: Patch Available  (was: In Progress)

> nifi-api: deserialization error of nifiVersion field due to capitalization
> --
>
> Key: NIFI-3643
> URL: https://issues.apache.org/jira/browse/NIFI-3643
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 1.1.1
> Environment: OpenJDK8. gson, nifi-1.1.0
>Reporter: David Arllen
>Assignee: Matt Gilman
>Priority: Trivial
> Fix For: 1.2.0
>
>
> The response from `/nifi-api/system-diagnostics` includes the field name of 
> 'niFiVersion'.  The gson serialization library works on reflection and 
> expects 'nifiVersion' to be the field name because the class private field 
> 'nifiVersion' does not have a capital F.
> Resolution is expected to involve only the class 'VersionInfoDTO'.  The fix 
> would be matching the field capitalization with the capitalization of the 
> associated getter and setter.
> Without this fix, a gson FieldNamingStrategy is required to artificially map 
> the api-provided 'niFiVersion' field name to the reflection-expected 
> 'nifiVersion'





[jira] [Commented] (NIFI-3643) nifi-api: deserialization error of nifiVersion field due to capitalization

2017-03-27 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-3643?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15943229#comment-15943229
 ] 

ASF GitHub Bot commented on NIFI-3643:
--

GitHub user mcgilman opened a pull request:

https://github.com/apache/nifi/pull/1626

NIFI-3643: Fixing capitalization in Java Property

NIFI-3643:
- Addressing incorrect capitalization in VersionInfoDTO in NiFiVersion.

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/mcgilman/nifi NIFI-3643

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/1626.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #1626


commit 7ecdfe0bad832f9d469960fe561b3b47add206a0
Author: Matt Gilman 
Date:   2017-03-27T13:06:37Z

NIFI-3643:
- Addressing incorrect capitalization in VersionInfoDTO in NiFiVersion.




> nifi-api: deserialization error of nifiVersion field due to capitalization
> --
>
> Key: NIFI-3643
> URL: https://issues.apache.org/jira/browse/NIFI-3643
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 1.2.0, 1.1.1
> Environment: OpenJDK8. gson, nifi-1.1.0
>Reporter: David Arllen
>Assignee: Matt Gilman
>Priority: Trivial
> Fix For: 1.2.0, 1.1.1
>
>
> The response from `/nifi-api/system-diagnostics` includes the field name of 
> 'niFiVersion'.  The gson serialization library works on reflection and 
> expects 'nifiVersion' to be the field name because the class private field 
> 'nifiVersion' does not have a capital F.
> Resolution is expected to involve only the class 'VersionInfoDTO'.  The fix 
> would be matching the field capitalization with the capitalization of the 
> associated getter and setter.
> Without this fix, a gson FieldNamingStrategy is required to artificially map 
> the api-provided 'niFiVersion' field name to the reflection-expected 
> 'nifiVersion'





[GitHub] nifi pull request #1626: NIFI-3643: Fixing capitalization in Java Property

2017-03-27 Thread mcgilman
GitHub user mcgilman opened a pull request:

https://github.com/apache/nifi/pull/1626

NIFI-3643: Fixing capitalization in Java Property

NIFI-3643:
- Addressing incorrect capitalization in VersionInfoDTO in NiFiVersion.

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/mcgilman/nifi NIFI-3643

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/1626.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #1626


commit 7ecdfe0bad832f9d469960fe561b3b47add206a0
Author: Matt Gilman 
Date:   2017-03-27T13:06:37Z

NIFI-3643:
- Addressing incorrect capitalization in VersionInfoDTO in NiFiVersion.






[jira] [Updated] (NIFIREG-2) Design logo for Registry

2017-03-27 Thread Rob Moran (JIRA)

 [ 
https://issues.apache.org/jira/browse/NIFIREG-2?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Rob Moran updated NIFIREG-2:

Attachment: registry-logo-concept_2017-03-27.png

I've updated the design integrating the latest feedback. The description has 
been updated with the addition of the curved line and the use of NiFi/MiNiFi colors.

> Design logo for Registry
> 
>
> Key: NIFIREG-2
> URL: https://issues.apache.org/jira/browse/NIFIREG-2
> Project: NiFi Registry
>  Issue Type: Task
>Reporter: Rob Moran
>Assignee: Rob Moran
>Priority: Minor
> Attachments: registry-logo-concept_2017-03-27.png
>
>
> The attached image contains the proposed logo design for Registry. The points 
> below describe some of the thinking behind it:
> * Relationship to NiFi and MiNiFi through the use of the same color palette, 
> typeface, and block elements representing bits of data
> * For Registry these blocks also represent the storage/organization aspect 
> through their even distribution and arrangement
> * The 3 gradated blocks across the top – forming the terminal part of a 
> lowercase *r* – represent movement (e.g., a versioned flow being saved to 
> NiFi or imported to NiFi from the registry)
> * Relating back to the original water/flow concept of NiFi, the curved line 
> integrated into the gradated blocks represents the continuous motion of 
> flowing water
> * The light gray block helps with the idea of storage as previously mentioned, 
> but also alludes to unused storage/free space
> * The gray block also helps establish the strong diagonal slicing through it 
> and the lowest green block. Again this helps with the idea of movement, but 
> more so speaks to how Registry operates in the background, tucked away, 
> largely unseen by NiFi operators as it facilitates deployment tasks





[jira] [Updated] (NIFIREG-2) Design logo for Registry

2017-03-27 Thread Rob Moran (JIRA)

 [ 
https://issues.apache.org/jira/browse/NIFIREG-2?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Rob Moran updated NIFIREG-2:

Description: 
The attached image contains the proposed logo design for Registry. The points 
below describe some of the thinking behind it:
* Relationship to NiFi and MiNiFi through the use of the same color palette, 
typeface, and block elements representing bits of data
* For Registry these blocks also represent the storage/organization aspect 
through their even distribution and arrangement
* The 3 gradated blocks across the top – forming the terminal part of a 
lowercase *r* – represent movement (e.g., a versioned flow being saved to NiFi 
or imported to NiFi from the registry)
* Relating back to the original water/flow concept of NiFi, the curved line 
integrated into the gradated blocks represents the continuous motion of flowing 
water
* The light gray block helps with the idea of storage as previously mentioned, but 
also alludes to unused storage/free space
* The gray block also helps establish the strong diagonal slicing through it 
and the lowest green block. Again this helps with the idea of movement, but 
more so speaks to how Registry operates in the background, tucked away, largely 
unseen by NiFi operators as it facilitates deployment tasks

  was:
The attached image contains variations of the proposed logo design for 
Registry. The points below describe some of the thinking behind it:
* Relationship to NiFi and MiNiFi through the use of the same typeface and use 
of blocks representing bits of data
* For Registry these blocks also represent the storage/organization aspect 
through their even distribution and arrangement
* The 3 gradated blocks across the top – forming the terminal part of a 
lowercase *r* – represent movement (e.g., a versioned flow being saved to NiFi 
or imported to NiFi from the registry)
* The solid gray (in color versions) and outlined block (in one-color versions) 
help with the idea of storage as previously mentioned, but also allude to unused 
storage/free space
* The gray block also helps establish the strong diagonal slicing through it 
and the lowest green block. Again this helps with the idea of movement, but 
more so speaks to how Registry operates in the background, tucked away, largely 
unseen by NiFi operators
* A departure from the NiFi color palette signifies how Registry functions more 
as a standalone application


> Design logo for Registry
> 
>
> Key: NIFIREG-2
> URL: https://issues.apache.org/jira/browse/NIFIREG-2
> Project: NiFi Registry
>  Issue Type: Task
>Reporter: Rob Moran
>Assignee: Rob Moran
>Priority: Minor
>
> The attached image contains the proposed logo design for Registry. The points 
> below describe some of the thinking behind it:
> * Relationship to NiFi and MiNiFi through the use of the same color palette, 
> typeface, and block elements representing bits of data
> * For Registry these blocks also represent the storage/organization aspect 
> through their even distribution and arrangement
> * The 3 gradated blocks across the top – forming the terminal part of a 
> lowercase *r* – represent movement (e.g., a versioned flow being saved to 
> NiFi or imported to NiFi from the registry)
> * Relating back to the original water/flow concept of NiFi, the curved line 
> integrated into the gradated blocks represents the continuous motion of 
> flowing water
> * The light gray block helps with the idea of storage as previously mentioned, 
> but also alludes to unused storage/free space
> * The gray block also helps establish the strong diagonal slicing through it 
> and the lowest green block. Again this helps with the idea of movement, but 
> more so speaks to how Registry operates in the background, tucked away, 
> largely unseen by NiFi operators as it facilitates deployment tasks





[jira] [Assigned] (NIFI-3643) nifi-api: deserialization error of nifiVersion field due to capitalization

2017-03-27 Thread Matt Gilman (JIRA)

 [ 
https://issues.apache.org/jira/browse/NIFI-3643?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Matt Gilman reassigned NIFI-3643:
-

Assignee: Matt Gilman

> nifi-api: deserialization error of nifiVersion field due to capitalization
> --
>
> Key: NIFI-3643
> URL: https://issues.apache.org/jira/browse/NIFI-3643
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 1.2.0, 1.1.1
> Environment: OpenJDK8. gson, nifi-1.1.0
>Reporter: David Arllen
>Assignee: Matt Gilman
>Priority: Trivial
> Fix For: 1.2.0, 1.1.1
>
>
> The response from `/nifi-api/system-diagnostics` includes the field name of 
> 'niFiVersion'.  The gson serialization library works on reflection and 
> expects 'nifiVersion' to be the field name because the class private field 
> 'nifiVersion' does not have a capital F.
> Resolution is expected to involve only the class 'VersionInfoDTO'.  The fix 
> would be matching the field capitalization with the capitalization of the 
> associated getter and setter.
> Without this fix, a gson FieldNamingStrategy is required to artifically map 
> the api-provided 'niFiVersion' field name to the reflection-expected 
> 'nifiVersion'





[jira] [Updated] (NIFIREG-2) Design logo for Registry

2017-03-27 Thread Rob Moran (JIRA)

 [ 
https://issues.apache.org/jira/browse/NIFIREG-2?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Rob Moran updated NIFIREG-2:

Attachment: (was: registry-logo-concept.png)

> Design logo for Registry
> 
>
> Key: NIFIREG-2
> URL: https://issues.apache.org/jira/browse/NIFIREG-2
> Project: NiFi Registry
>  Issue Type: Task
>Reporter: Rob Moran
>Assignee: Rob Moran
>Priority: Minor
>
> The attached image contains variations of the proposed logo design for 
> Registry. The points below describe some of the thinking behind it:
> * Relationship to NiFi and MiNiFi through the use of the same typeface and 
> use of blocks representing bits of data
> * For Registry these blocks also represent the storage/organization aspect 
> through their even distribution and arrangement
> * The 3 gradated blocks across the top – forming the terminal part of a 
> lowercase *r* – represent movement (e.g., a versioned flow being saved to 
> NiFi or imported to NiFi from the registry)
> * The solid gray (in color versions) and outlined block (in one-color 
> versions) help with the idea of storage as previously mentioned, but also 
> allude to unused storage/free space
> * The gray block also helps establish the strong diagonal slicing through it 
> and the lowest green block. Again this helps with the idea of movement, but 
> more so speaks to how Registry operates in the background, tucked away, 
> largely unseen by NiFi operators
> * A departure from the NiFi color palette signifies how Registry functions 
> more as a standalone application





[jira] [Comment Edited] (NIFI-3625) Add JSON support to PutHiveStreaming

2017-03-27 Thread Ryan Persaud (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-3625?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15942751#comment-15942751
 ] 

Ryan Persaud edited comment on NIFI-3625 at 3/27/17 7:42 AM:
-

I was experimenting with streaming some JSON data into a partitioned table in a 
HDP 2.5 sandbox tonight, and I encountered an Exception (below).  I built from 
master (552148e9e7d45be4d298ee48afd7471405a5bfad) and tested with the 'old' 
PutHiveStreaming processor, and I got the same error.   From what I can tell, 
the error occurs whenever partition columns are specified in the 
PutHiveStreaming processor.

On a hunch I reverted HiveUtils and HiveWriter back to the versions from 
8/4/2016 (3943d72e95ff7b18c32d12020d34f134f4e86125), and I hacked them up a bit 
to work with the newer versions of PutHiveStreaming and TestPutHiveStreaming.  
I was able to successfully stream into a table. 

Has anyone else encountered these issues since NIFI-3574 and NIFI-3530 have 
been resolved?  Any thoughts on how to proceed?

Here are the PutHiveStreaming properties:


  
hive-stream-metastore-uri = thrift://sandbox.hortonworks.com:9083
hive-config-resources = /home/rpersaud/shared/hive-site.xml
hive-stream-database-name = default
hive-stream-table-name = test_err
hive-stream-partition-cols = src
hive-stream-autocreate-partition = true
hive-stream-max-open-connections = 8
hive-stream-heartbeat-interval = 60
hive-stream-transactions-per-batch = 100
hive-stream-records-per-transaction = 1
Kerberos Principal =
Kerberos Keytab =


2017-03-26 23:50:31,460 ERROR [Timer-Driven Process Thread-6] hive.log Got 
exception: java.lang.NullPointerException null
java.lang.NullPointerException: null
at 
org.apache.hadoop.hive.ql.security.authorization.plugin.AuthorizationMetaStoreFilterHook.getFilteredObjects(AuthorizationMetaStoreFilterHook.java:77)
 ~[hive-exec-1.2.1000.2.5.0.0-1245.jar:1.2.1000.2.5.0.0-1245]
at 
org.apache.hadoop.hive.ql.security.authorization.plugin.AuthorizationMetaStoreFilterHook.filterDatabases(AuthorizationMetaStoreFilterHook.java:54)
 ~[hive-exec-1.2.1000.2.5.0.0-1245.jar:1.2.1000.2.5.0.0-1245]
at 
org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getDatabases(HiveMetaStoreClient.java:1046)
 ~[hive-metastore-1.2.1000.2.5.0.0-1245.jar:1.2.1000.2.5.0.0-1245]
at 
org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient.isOpen(HiveClientCache.java:367)
 [hive-hcatalog-core-1.2.1000.2.5.0.0-1245.jar:1.2.1000.2.5.0.0-1245]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
~[na:1.8.0_121]
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
~[na:1.8.0_121]
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 ~[na:1.8.0_121]
at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_121]
at 
org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:155)
 [hive-metastore-1.2.1000.2.5.0.0-1245.jar:1.2.1000.2.5.0.0-1245]
at com.sun.proxy.$Proxy130.isOpen(Unknown Source) [na:na]
at 
org.apache.hive.hcatalog.common.HiveClientCache.get(HiveClientCache.java:205) 
[hive-hcatalog-core-1.2.1000.2.5.0.0-1245.jar:1.2.1000.2.5.0.0-1245]
at 
org.apache.hive.hcatalog.common.HCatUtil.getHiveMetastoreClient(HCatUtil.java:558)
 [hive-hcatalog-core-1.2.1000.2.5.0.0-1245.jar:1.2.1000.2.5.0.0-1245]
at 
org.apache.hive.hcatalog.streaming.AbstractRecordWriter.&lt;init&gt;(AbstractRecordWriter.java:94)
 [hive-hcatalog-streaming-1.2.1000.2.5.0.0-1245.jar:1.2.1000.2.5.0.0-1245]
at 
org.apache.hive.hcatalog.streaming.StrictJsonWriter.&lt;init&gt;(StrictJsonWriter.java:82)
 [hive-hcatalog-streaming-1.2.1000.2.5.0.0-1245.jar:1.2.1000.2.5.0.0-1245]
at 
org.apache.hive.hcatalog.streaming.StrictJsonWriter.&lt;init&gt;(StrictJsonWriter.java:60)
 [hive-hcatalog-streaming-1.2.1000.2.5.0.0-1245.jar:1.2.1000.2.5.0.0-1245]
at 
org.apache.nifi.util.hive.HiveWriter.getRecordWriter(HiveWriter.java:84) 
[nifi-hive-processors-1.2.0-SNAPSHOT.jar:1.2.0-SNAPSHOT]
at org.apache.nifi.util.hive.HiveWriter.&lt;init&gt;(HiveWriter.java:71) 
[nifi-hive-processors-1.2.0-SNAPSHOT.jar:1.2.0-SNAPSHOT]
at 
org.apache.nifi.util.hive.HiveUtils.makeHiveWriter(HiveUtils.java:46) 
[nifi-hive-processors-1.2.0-SNAPSHOT.jar:1.2.0-SNAPSHOT]
at 

[jira] [Comment Edited] (NIFI-3625) Add JSON support to PutHiveStreaming

2017-03-27 Thread Ryan Persaud (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-3625?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15942751#comment-15942751
 ] 

Ryan Persaud edited comment on NIFI-3625 at 3/27/17 7:29 AM:
-

I was experimenting with streaming some JSON data into a partitioned table in a 
HDP 2.5 sandbox tonight, and I encountered an Exception (below).  I built from 
master (552148e9e7d45be4d298ee48afd7471405a5bfad) and tested with the 'old' 
PutHiveStreaming processor, and I got the same error.   From what I can tell, 
the error occurs whenever partition columns are specified in the 
PutHiveStreaming processor.

On a hunch I reverted HiveUtils and HiveWriter back to the versions from 
8/4/2016 (3943d72e95ff7b18c32d12020d34f134f4e86125), and I hacked them up a bit 
to work with the newer versions of PutHiveStreaming and TestPutHiveStreaming.  
I was able to successfully stream into a table. 

Has anyone else encountered these issues since NIFI-3574 and NIFI-3530 have 
been resolved?  Any thoughts on how to proceed?


[jira] [Commented] (NIFI-3625) Add JSON support to PutHiveStreaming

2017-03-27 Thread Ryan Persaud (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-3625?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15942751#comment-15942751
 ] 

Ryan Persaud commented on NIFI-3625:


I was experimenting with streaming some JSON data into a partitioned table in an 
HDP 2.5 sandbox tonight, and I encountered an Exception (below).  I built from 
master (552148e9e7d45be4d298ee48afd7471405a5bfad) and tested with the 'old' 
PutHiveStreaming processor, and I got the same error.  From what I can tell, 
the error occurs whenever partition columns are specified in the 
PutHiveStreaming processor.

On a hunch I reverted HiveUtils and HiveWriter back to the versions from 
8/4/2016 (3943d72e95ff7b18c32d12020d34f134f4e86125), and I hacked them up a bit 
to work with the newer versions of PutHiveStreaming and TestPutHiveStreaming.  
I was able to successfully stream into a table. 

Has anyone else encountered these issues after NIFI-3574?  Any thoughts on how 
to proceed?

2017-03-26 23:50:31,460 ERROR [Timer-Driven Process Thread-6] hive.log Got 
exception: java.lang.NullPointerException null
java.lang.NullPointerException: null
at 
org.apache.hadoop.hive.ql.security.authorization.plugin.AuthorizationMetaStoreFilterHook.getFilteredObjects(AuthorizationMetaStoreFilterHook.java:77)
 ~[hive-exec-1.2.1000.2.5.0.0-1245.jar:1.2.1000.2.5.0.0-1245]
at 
org.apache.hadoop.hive.ql.security.authorization.plugin.AuthorizationMetaStoreFilterHook.filterDatabases(AuthorizationMetaStoreFilterHook.java:54)
 ~[hive-exec-1.2.1000.2.5.0.0-1245.jar:1.2.1000.2.5.0.0-1245]
at 
org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getDatabases(HiveMetaStoreClient.java:1046)
 ~[hive-metastore-1.2.1000.2.5.0.0-1245.jar:1.2.1000.2.5.0.0-1245]
at 
org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient.isOpen(HiveClientCache.java:367)
 [hive-hcatalog-core-1.2.1000.2.5.0.0-1245.jar:1.2.1000.2.5.0.0-1245]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
~[na:1.8.0_121]
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
~[na:1.8.0_121]
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 ~[na:1.8.0_121]
at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_121]
at 
org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:155)
 [hive-metastore-1.2.1000.2.5.0.0-1245.jar:1.2.1000.2.5.0.0-1245]
at com.sun.proxy.$Proxy130.isOpen(Unknown Source) [na:na]
at 
org.apache.hive.hcatalog.common.HiveClientCache.get(HiveClientCache.java:205) 
[hive-hcatalog-core-1.2.1000.2.5.0.0-1245.jar:1.2.1000.2.5.0.0-1245]
at 
org.apache.hive.hcatalog.common.HCatUtil.getHiveMetastoreClient(HCatUtil.java:558)
 [hive-hcatalog-core-1.2.1000.2.5.0.0-1245.jar:1.2.1000.2.5.0.0-1245]
at 
org.apache.hive.hcatalog.streaming.AbstractRecordWriter.<init>(AbstractRecordWriter.java:94)
 [hive-hcatalog-streaming-1.2.1000.2.5.0.0-1245.jar:1.2.1000.2.5.0.0-1245]
at 
org.apache.hive.hcatalog.streaming.StrictJsonWriter.<init>(StrictJsonWriter.java:82)
 [hive-hcatalog-streaming-1.2.1000.2.5.0.0-1245.jar:1.2.1000.2.5.0.0-1245]
at 
org.apache.hive.hcatalog.streaming.StrictJsonWriter.<init>(StrictJsonWriter.java:60)
 [hive-hcatalog-streaming-1.2.1000.2.5.0.0-1245.jar:1.2.1000.2.5.0.0-1245]
at 
org.apache.nifi.util.hive.HiveWriter.getRecordWriter(HiveWriter.java:84) 
[nifi-hive-processors-1.2.0-SNAPSHOT.jar:1.2.0-SNAPSHOT]
at org.apache.nifi.util.hive.HiveWriter.<init>(HiveWriter.java:71) 
[nifi-hive-processors-1.2.0-SNAPSHOT.jar:1.2.0-SNAPSHOT]
at 
org.apache.nifi.util.hive.HiveUtils.makeHiveWriter(HiveUtils.java:46) 
[nifi-hive-processors-1.2.0-SNAPSHOT.jar:1.2.0-SNAPSHOT]
at 
org.apache.nifi.processors.hive.PutHiveStreaming.makeHiveWriter(PutHiveStreaming.java:1011)
 [nifi-hive-processors-1.2.0-SNAPSHOT.jar:1.2.0-SNAPSHOT]
at 
org.apache.nifi.processors.hive.PutHiveStreaming.getOrCreateWriter(PutHiveStreaming.java:922)
 [nifi-hive-processors-1.2.0-SNAPSHOT.jar:1.2.0-SNAPSHOT]
at 
org.apache.nifi.processors.hive.PutHiveStreaming.writeToHive(PutHiveStreaming.java:405)
 [nifi-hive-processors-1.2.0-SNAPSHOT.jar:1.2.0-SNAPSHOT]
at 
org.apache.nifi.processors.hive.PutHiveStreaming.lambda$processJSON$5(PutHiveStreaming.java:702)
 [nifi-hive-processors-1.2.0-SNAPSHOT.jar:1.2.0-SNAPSHOT]
at 
org.apache.nifi.controller.repository.StandardProcessSession.read(StandardProcessSession.java:2120)
 ~[na:na]
at 
org.apache.nifi.controller.repository.StandardProcessSession.read(StandardProcessSession.java:2090)
 ~[na:na]
at 
org.apache.nifi.processors.hive.PutHiveStreaming.processJSON(PutHiveStreaming.java:669)
 [nifi-hive-processors-1.2.0-SNAPSHOT.jar:1.2.0-SNAPSHOT]
at
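The frames reported as <init> in the trace above show that the 
NullPointerException is raised while the StrictJsonWriter/HiveWriter is still 
being constructed, before any record is written. As an illustration only (this 
helper is hypothetical and not part of NiFi or Hive), a minimal parser that 
recovers the class, method, source file, and line number from such frames:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Illustrative helper (not part of NiFi or Hive): pulls class, method, source
// file, and line number out of a standard JVM stack-trace frame. A method
// named "<init>" means the exception escaped a constructor.
public class FrameParser {

    // Matches e.g. "at pkg.Cls.<init>(Cls.java:82)"; trailing jar/version
    // annotations such as "[hive-exec-....jar:...]" are ignored by the ".*".
    private static final Pattern FRAME = Pattern.compile(
            "at\\s+([\\w.$]+)\\.(<init>|[\\w$]+)\\(([\\w.]+):(\\d+)\\).*");

    /** Returns {className, methodName, fileName, lineNumber}, or null if the
     *  frame has no file:line info (e.g. "Unknown Source" or native frames). */
    public static String[] parse(String frame) {
        Matcher m = FRAME.matcher(frame.trim());
        if (!m.matches()) {
            return null;
        }
        return new String[] { m.group(1), m.group(2), m.group(3), m.group(4) };
    }

    public static void main(String[] args) {
        String[] f = parse(
                "at org.apache.hive.hcatalog.streaming.StrictJsonWriter.<init>(StrictJsonWriter.java:82)");
        // "<init>" => the NPE was raised during construction of the writer
        System.out.println(f[1] + " at line " + f[3]); // prints "<init> at line 82"
    }
}
```

Running this against the frames above confirms the failure happens inside the 
writer constructors, consistent with the report that the error only appears 
when partition columns are configured.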