[jira] [Commented] (NIFI-3666) Skipped tests on windows need to be validated or fixed

2017-04-23 Thread Koji Kawamura (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-3666?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15980611#comment-15980611
 ] 

Koji Kawamura commented on NIFI-3666:
-

I see, I've created a sub-ticket. Thanks [~joewitt]

> Skipped tests on windows need to be validated or fixed
> --
>
> Key: NIFI-3666
> URL: https://issues.apache.org/jira/browse/NIFI-3666
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework, Extensions
>Reporter: Joseph Witt
>Priority: Critical
>
> In NIFI-3440, a number of relatively recently created tests were failing on 
> Windows.  These tests were skipped when running on Windows to help keep the 
> build moving along and to continue testing for regressions in older, more 
> stable tests.  However, this approach leaves room for error: we must go 
> through each skipped test and validate whether it is a bad test that needs to 
> be fixed to be more stable/portable, OR whether the test exposed a bug in the 
> code and its behavior on Windows.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Resolved] (NIFI-3729) TlsToolkit using .equals() comparison for parent and root

2017-04-23 Thread Koji Kawamura (JIRA)

 [ 
https://issues.apache.org/jira/browse/NIFI-3729?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Koji Kawamura resolved NIFI-3729.
-
Resolution: Fixed

Fixed by PR #1641:
https://github.com/apache/nifi/pull/1641/files

> TlsToolkit using .equals() comparison for parent and root
> -
>
> Key: NIFI-3729
> URL: https://issues.apache.org/jira/browse/NIFI-3729
> Project: Apache NiFi
>  Issue Type: Sub-task
>  Components: Tools and Build
> Environment: Windows
>Reporter: Koji Kawamura
>Assignee: Bryan Rosander
> Fix For: 1.2.0
>
>
> TlsToolkitStandaloneCommandLine.java uses '==' to compare file paths, and 
> this fails on Windows.





[jira] [Created] (NIFI-3729) TlsToolkit using .equals() comparison for parent and root

2017-04-23 Thread Koji Kawamura (JIRA)
Koji Kawamura created NIFI-3729:
---

 Summary: TlsToolkit using .equals() comparison for parent and root
 Key: NIFI-3729
 URL: https://issues.apache.org/jira/browse/NIFI-3729
 Project: Apache NiFi
  Issue Type: Sub-task
  Components: Tools and Build
 Environment: Windows
Reporter: Koji Kawamura
Assignee: Bryan Rosander
 Fix For: 1.2.0


TlsToolkitStandaloneCommandLine.java uses '==' to compare file paths, and this 
fails on Windows.
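The failure mode can be reproduced in a few lines: '==' compares object references, while .equals() compares string contents, so two path strings that were built separately at runtime can compare unequal under '=='. The path strings below are illustrative only:

```java
public class PathCompareDemo {
    // '==' is reference equality; .equals() is value equality.
    static boolean sameByReference(String a, String b) { return a == b; }
    static boolean sameByValue(String a, String b) { return a.equals(b); }

    public static void main(String[] args) {
        // Two equal path strings built as distinct objects: '==' reports
        // false even though the paths are character-for-character identical.
        String parent = new String("C:\\nifi\\conf");
        String root = "C:\\nifi\\conf";
        System.out.println(sameByReference(parent, root)); // false
        System.out.println(sameByValue(parent, root));     // true
    }
}
```

On Windows this bites more often because paths are frequently re-derived (drive letters, separators), so the interning that sometimes masks the bug on other platforms does not apply.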





[jira] [Commented] (NIFI-3704) Add PutDatabaseRecord processor

2017-04-23 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-3704?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15980605#comment-15980605
 ] 

ASF GitHub Bot commented on NIFI-3704:
--

Github user ijokarumawak commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1677#discussion_r112848809
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/PutDatabaseRecord.java
 ---
@@ -0,0 +1,1058 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard;
+
+import org.apache.commons.lang3.StringUtils;
+import org.apache.nifi.annotation.behavior.EventDriven;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.InputRequirement.Requirement;
+import org.apache.nifi.annotation.behavior.ReadsAttribute;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.AllowableValue;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.dbcp.DBCPService;
+import org.apache.nifi.expression.AttributeExpression;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.logging.ComponentLog;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.serialization.MalformedRecordException;
+import org.apache.nifi.serialization.RecordReader;
+import org.apache.nifi.serialization.RowRecordReaderFactory;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.serialization.record.RecordField;
+import org.apache.nifi.serialization.record.RecordSchema;
+
+import java.io.IOException;
+import java.io.InputStream;
+import java.sql.Connection;
+import java.sql.DatabaseMetaData;
+import java.sql.PreparedStatement;
+import java.sql.ResultSet;
+import java.sql.ResultSetMetaData;
+import java.sql.SQLException;
+import java.sql.Statement;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.LinkedHashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import java.util.concurrent.TimeUnit;
+import java.util.concurrent.atomic.AtomicInteger;
+import java.util.stream.IntStream;
+
+
+@EventDriven
+@InputRequirement(Requirement.INPUT_REQUIRED)
+@Tags({"sql", "record", "jdbc", "put", "database", "update", "insert", 
"delete"})
+@CapabilityDescription("The PutDatabaseRecord processor uses a specified 
RecordReader to input (possibly multiple) records from an incoming flow file. 
These records are translated to SQL "
++ "statements and executed as a single batch. If any errors occur, 
the flow file is routed to failure or retry, and if the records are transmitted 
successfully, the incoming flow file is "
++ "routed to success.  The type of statement executed by the 
processor is specified via the Statement Type property, which accepts some 
hard-coded values such as INSERT, UPDATE, and DELETE, "
++ "as well as 'Use statement.type Attribute', which causes the 
processor to get the statement type from a flow file attribute.  IMPORTANT: If 
the Statement Type is UPDATE, then the incoming "
++ "records must not alter the value(s) of the primary keys (or 
user-specified Update Keys). If such records are encountered, the UPDATE 
statement issued to the database may do nothing "
++ "(if no existing records with the new primary key values are 
found), or could inadvertently corrupt the existing data (by changing records 
for which the new values of the primary keys "
++ "exist).")

[GitHub] nifi pull request #1677: NIFI-3704: Add PutDatabaseRecord processor

2017-04-23 Thread ijokarumawak
Github user ijokarumawak commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1677#discussion_r112848809
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/PutDatabaseRecord.java
 ---
@@ -0,0 +1,1058 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard;
+
+import org.apache.commons.lang3.StringUtils;
+import org.apache.nifi.annotation.behavior.EventDriven;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.InputRequirement.Requirement;
+import org.apache.nifi.annotation.behavior.ReadsAttribute;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.AllowableValue;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.dbcp.DBCPService;
+import org.apache.nifi.expression.AttributeExpression;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.logging.ComponentLog;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.serialization.MalformedRecordException;
+import org.apache.nifi.serialization.RecordReader;
+import org.apache.nifi.serialization.RowRecordReaderFactory;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.serialization.record.RecordField;
+import org.apache.nifi.serialization.record.RecordSchema;
+
+import java.io.IOException;
+import java.io.InputStream;
+import java.sql.Connection;
+import java.sql.DatabaseMetaData;
+import java.sql.PreparedStatement;
+import java.sql.ResultSet;
+import java.sql.ResultSetMetaData;
+import java.sql.SQLException;
+import java.sql.Statement;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.LinkedHashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import java.util.concurrent.TimeUnit;
+import java.util.concurrent.atomic.AtomicInteger;
+import java.util.stream.IntStream;
+
+
+@EventDriven
+@InputRequirement(Requirement.INPUT_REQUIRED)
+@Tags({"sql", "record", "jdbc", "put", "database", "update", "insert", 
"delete"})
+@CapabilityDescription("The PutDatabaseRecord processor uses a specified 
RecordReader to input (possibly multiple) records from an incoming flow file. 
These records are translated to SQL "
++ "statements and executed as a single batch. If any errors occur, 
the flow file is routed to failure or retry, and if the records are transmitted 
successfully, the incoming flow file is "
++ "routed to success.  The type of statement executed by the 
processor is specified via the Statement Type property, which accepts some 
hard-coded values such as INSERT, UPDATE, and DELETE, "
++ "as well as 'Use statement.type Attribute', which causes the 
processor to get the statement type from a flow file attribute.  IMPORTANT: If 
the Statement Type is UPDATE, then the incoming "
++ "records must not alter the value(s) of the primary keys (or 
user-specified Update Keys). If such records are encountered, the UPDATE 
statement issued to the database may do nothing "
++ "(if no existing records with the new primary key values are 
found), or could inadvertently corrupt the existing data (by changing records 
for which the new values of the primary keys "
++ "exist).")
+@ReadsAttribute(attribute = 
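The 'Use statement.type Attribute' behavior described in the capability text can be sketched as a small resolver; the constant, method, and attribute names below are illustrative, not taken from the processor's actual code:

```java
import java.util.Map;

public class StatementTypeDemo {
    static final String USE_ATTR_TYPE = "Use statement.type Attribute";

    // Resolve the SQL statement type as the description outlines: use the
    // property's hard-coded value (INSERT, UPDATE, DELETE), or fall back to
    // the flow file's 'statement.type' attribute when the property says so.
    static String resolveStatementType(String propertyValue,
                                       Map<String, String> flowFileAttributes) {
        if (USE_ATTR_TYPE.equals(propertyValue)) {
            return flowFileAttributes.get("statement.type");
        }
        return propertyValue;
    }

    public static void main(String[] args) {
        System.out.println(resolveStatementType("INSERT", Map.of()));   // INSERT
        System.out.println(resolveStatementType(USE_ATTR_TYPE,
                Map.of("statement.type", "UPDATE")));                   // UPDATE
    }
}
```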

[jira] [Commented] (NIFI-3704) Add PutDatabaseRecord processor

2017-04-23 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-3704?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15980606#comment-15980606
 ] 

ASF GitHub Bot commented on NIFI-3704:
--

Github user ijokarumawak commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1677#discussion_r112849492
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/PutDatabaseRecord.java
 ---
@@ -0,0 +1,1058 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard;
+
+import org.apache.commons.lang3.StringUtils;
+import org.apache.nifi.annotation.behavior.EventDriven;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.InputRequirement.Requirement;
+import org.apache.nifi.annotation.behavior.ReadsAttribute;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.AllowableValue;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.dbcp.DBCPService;
+import org.apache.nifi.expression.AttributeExpression;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.logging.ComponentLog;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.serialization.MalformedRecordException;
+import org.apache.nifi.serialization.RecordReader;
+import org.apache.nifi.serialization.RowRecordReaderFactory;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.serialization.record.RecordField;
+import org.apache.nifi.serialization.record.RecordSchema;
+
+import java.io.IOException;
+import java.io.InputStream;
+import java.sql.Connection;
+import java.sql.DatabaseMetaData;
+import java.sql.PreparedStatement;
+import java.sql.ResultSet;
+import java.sql.ResultSetMetaData;
+import java.sql.SQLException;
+import java.sql.Statement;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.LinkedHashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import java.util.concurrent.TimeUnit;
+import java.util.concurrent.atomic.AtomicInteger;
+import java.util.stream.IntStream;
+
+
+@EventDriven
+@InputRequirement(Requirement.INPUT_REQUIRED)
+@Tags({"sql", "record", "jdbc", "put", "database", "update", "insert", 
"delete"})
+@CapabilityDescription("The PutDatabaseRecord processor uses a specified 
RecordReader to input (possibly multiple) records from an incoming flow file. 
These records are translated to SQL "
++ "statements and executed as a single batch. If any errors occur, 
the flow file is routed to failure or retry, and if the records are transmitted 
successfully, the incoming flow file is "
++ "routed to success.  The type of statement executed by the 
processor is specified via the Statement Type property, which accepts some 
hard-coded values such as INSERT, UPDATE, and DELETE, "
++ "as well as 'Use statement.type Attribute', which causes the 
processor to get the statement type from a flow file attribute.  IMPORTANT: If 
the Statement Type is UPDATE, then the incoming "
++ "records must not alter the value(s) of the primary keys (or 
user-specified Update Keys). If such records are encountered, the UPDATE 
statement issued to the database may do nothing "
++ "(if no existing records with the new primary key values are 
found), or could inadvertently corrupt the existing data (by changing records 
for which the new values of the primary keys "
++ "exist).")

[GitHub] nifi pull request #1677: NIFI-3704: Add PutDatabaseRecord processor

2017-04-23 Thread ijokarumawak
Github user ijokarumawak commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1677#discussion_r112849492
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/PutDatabaseRecord.java
 ---
@@ -0,0 +1,1058 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard;
+
+import org.apache.commons.lang3.StringUtils;
+import org.apache.nifi.annotation.behavior.EventDriven;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.InputRequirement.Requirement;
+import org.apache.nifi.annotation.behavior.ReadsAttribute;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.AllowableValue;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.dbcp.DBCPService;
+import org.apache.nifi.expression.AttributeExpression;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.logging.ComponentLog;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.serialization.MalformedRecordException;
+import org.apache.nifi.serialization.RecordReader;
+import org.apache.nifi.serialization.RowRecordReaderFactory;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.serialization.record.RecordField;
+import org.apache.nifi.serialization.record.RecordSchema;
+
+import java.io.IOException;
+import java.io.InputStream;
+import java.sql.Connection;
+import java.sql.DatabaseMetaData;
+import java.sql.PreparedStatement;
+import java.sql.ResultSet;
+import java.sql.ResultSetMetaData;
+import java.sql.SQLException;
+import java.sql.Statement;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.LinkedHashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import java.util.concurrent.TimeUnit;
+import java.util.concurrent.atomic.AtomicInteger;
+import java.util.stream.IntStream;
+
+
+@EventDriven
+@InputRequirement(Requirement.INPUT_REQUIRED)
+@Tags({"sql", "record", "jdbc", "put", "database", "update", "insert", 
"delete"})
+@CapabilityDescription("The PutDatabaseRecord processor uses a specified 
RecordReader to input (possibly multiple) records from an incoming flow file. 
These records are translated to SQL "
++ "statements and executed as a single batch. If any errors occur, 
the flow file is routed to failure or retry, and if the records are transmitted 
successfully, the incoming flow file is "
++ "routed to success.  The type of statement executed by the 
processor is specified via the Statement Type property, which accepts some 
hard-coded values such as INSERT, UPDATE, and DELETE, "
++ "as well as 'Use statement.type Attribute', which causes the 
processor to get the statement type from a flow file attribute.  IMPORTANT: If 
the Statement Type is UPDATE, then the incoming "
++ "records must not alter the value(s) of the primary keys (or 
user-specified Update Keys). If such records are encountered, the UPDATE 
statement issued to the database may do nothing "
++ "(if no existing records with the new primary key values are 
found), or could inadvertently corrupt the existing data (by changing records 
for which the new values of the primary keys "
++ "exist).")
+@ReadsAttribute(attribute = 

[jira] [Created] (NIFI-3728) CaptureChangeMySQL to capture truncate table statement

2017-04-23 Thread Koji Kawamura (JIRA)
Koji Kawamura created NIFI-3728:
---

 Summary: CaptureChangeMySQL to capture truncate table statement
 Key: NIFI-3728
 URL: https://issues.apache.org/jira/browse/NIFI-3728
 Project: Apache NiFi
  Issue Type: Bug
  Components: Extensions
Reporter: Koji Kawamura


CaptureChangeMySQL captures several specific statements, such as 'alter 
table', 'create table', and 'drop table'. However, it currently does not 
capture 'truncate table', and support for it should be added.

Also, this may be off topic, but these [DDL | 
https://en.wikipedia.org/wiki/Data_definition_language] statements can be 
dangerous, and some use cases may require human intervention before executing 
them instead of fully automating with PutDatabaseRecord. It would be useful if 
CaptureChangeMySQL had a 'Capture DDL statements' property.
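The kind of DDL recognition being requested could be sketched as follows; the pattern and method names are illustrative and not taken from CaptureChangeMySQL's actual implementation:

```java
import java.util.regex.Pattern;

public class DdlMatchDemo {
    // Recognize the DDL statements the ticket lists, extended with
    // TRUNCATE TABLE; a 'Capture DDL statements' property could simply
    // gate whether matching events are emitted at all.
    private static final Pattern DDL_PATTERN = Pattern.compile(
            "^\\s*(ALTER|CREATE|DROP|TRUNCATE)\\s+TABLE\\b.*",
            Pattern.CASE_INSENSITIVE | Pattern.DOTALL);

    static boolean isDdl(String sql) {
        return DDL_PATTERN.matcher(sql).matches();
    }

    public static void main(String[] args) {
        System.out.println(isDdl("TRUNCATE TABLE users"));         // true
        System.out.println(isDdl("INSERT INTO users VALUES (1)")); // false
    }
}
```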





[jira] [Created] (NIFI-3727) Add ImpFuzzy support to FuzzyHashContent

2017-04-23 Thread Andre F de Miranda (JIRA)
Andre F de Miranda created NIFI-3727:


 Summary: Add ImpFuzzy support to FuzzyHashContent
 Key: NIFI-3727
 URL: https://issues.apache.org/jira/browse/NIFI-3727
 Project: Apache NiFi
  Issue Type: Improvement
Reporter: Andre F de Miranda
Assignee: Andre F de Miranda


FuzzyHashContent could support PE-specific hash functions, such as ImpFuzzy or 
ImpHash, to provide better hashing coverage for PE files.





[jira] [Created] (NIFI-3726) Create FuzzyHash comparison processor

2017-04-23 Thread Andre F de Miranda (JIRA)
Andre F de Miranda created NIFI-3726:


 Summary: Create FuzzyHash comparison processor
 Key: NIFI-3726
 URL: https://issues.apache.org/jira/browse/NIFI-3726
 Project: Apache NiFi
  Issue Type: Improvement
Reporter: Andre F de Miranda
Assignee: Andre F de Miranda


Now that the NiFi cyber-security package supports "Fuzzy Hashing", it may be a 
good idea to add a processor that uses it to compare hashes and route 
matches.







[jira] [Resolved] (NIFI-880) Add bootstrap.conf configuration to the administrators guide

2017-04-23 Thread Andre F de Miranda (JIRA)

 [ 
https://issues.apache.org/jira/browse/NIFI-880?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Andre F de Miranda resolved NIFI-880.
-
   Resolution: Fixed
Fix Version/s: 0.3.0

Already fixed by commit 992e84102786b9e2f0cf906b55243779e1707c10 (i.e. NIFI-948)



> Add bootstrap.conf configuration to the administrators guide
> 
>
> Key: NIFI-880
> URL: https://issues.apache.org/jira/browse/NIFI-880
> Project: Apache NiFi
>  Issue Type: Improvement
>Reporter: Joseph Percivall
>Assignee: Andre F de Miranda
> Fix For: 0.3.0
>
>
> Currently, the NiFi System Administrator's Guide does not have any 
> information on the bootstrap.conf configuration. It should be part of it.
> https://nifi.apache.org/docs/nifi-docs/html/administration-guide.html





[GitHub] nifi issue #1016: NIFI-2724 New JMX Processor

2017-04-23 Thread trixpan
Github user trixpan commented on the issue:

https://github.com/apache/nifi/pull/1016
  
Hey @brianburnett, thank you for submitting a new commit. 

It looks like you merged master into your PR; you should instead have 
rebased the PR onto master.

Please refer to the following comment for how to fix it:

https://github.com/apache/nifi/pull/1016#issuecomment-280458891

Cheers!
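For reference, the merge-vs-rebase difference can be reproduced in a throwaway repository; the branch, file, and commit names below are illustrative:

```shell
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q repo
cd repo
git config user.email demo@example.com
git config user.name demo
main=$(git symbolic-ref --short HEAD)   # default branch name (master or main)

echo base > base.txt && git add base.txt && git commit -qm "base"
git checkout -qb feature
echo feat > feature.txt && git add feature.txt && git commit -qm "feature work"

git checkout -q "$main"
echo more > more.txt && git add more.txt && git commit -qm "mainline moves on"

# Instead of 'git merge' (which adds a merge commit to the PR branch),
# replay the feature commits on top of the updated branch, keeping the
# PR history linear:
git checkout -q feature
git rebase -q "$main"
git log --oneline --no-decorate
```

After the rebase, the log is a straight line of three commits with "feature work" on top, which is what reviewers expect a PR branch to look like.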


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[jira] [Commented] (NIFI-2724) JMX Processor

2017-04-23 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-2724?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15980568#comment-15980568
 ] 

ASF GitHub Bot commented on NIFI-2724:
--

Github user trixpan commented on the issue:

https://github.com/apache/nifi/pull/1016
  
Hey @brianburnett, thank you for submitting a new commit. 

It looks like you merged master into your PR; you should instead have 
rebased the PR onto master.

Please refer to the following comment for how to fix it:

https://github.com/apache/nifi/pull/1016#issuecomment-280458891

Cheers!


> JMX Processor
> -
>
> Key: NIFI-2724
> URL: https://issues.apache.org/jira/browse/NIFI-2724
> Project: Apache NiFi
>  Issue Type: New Feature
>  Components: Extensions
>Affects Versions: 1.0.0
> Environment: All platforms with Java RMI support for JMX
>Reporter: Brian Burnett
>Assignee: Andre F de Miranda
>Priority: Minor
>  Labels: processor
> Attachments: 0001-NIFI-2724-New-JMX-Processor.patch
>
>   Original Estimate: 24h
>  Remaining Estimate: 24h
>
> The JMX Processor feature addition includes only GetJMX, without 
> SecurityManager capabilities at this time.  The processor in its current 
> state is capable of pulling MBean property and attribute values from a remote 
> RMI server.  Each set of MBean data is wrapped in a JSON-formatted FlowFile 
> for downstream processing.  It can control content with whitelist and 
> blacklist properties.
> A possible use for this processor, and the reason it was created, is to help 
> make sense of Kafka server metrics.
> A SecurityManager Context Service and a PutJMX processor will follow.
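The kind of MBean read described above can be sketched against the local platform MBeanServer; a remote GetJMX would instead connect through JMXConnectorFactory with an RMI service URL. The class and method names below are illustrative, not the processor's actual code:

```java
import java.lang.management.ManagementFactory;
import javax.management.MBeanServer;
import javax.management.ObjectName;

public class JmxReadDemo {
    // Read one attribute of one MBean; the processor described in the
    // ticket does this over RMI for many MBeans, filtered by
    // whitelist/blacklist properties.
    static Object readAttribute(String objectName, String attribute) throws Exception {
        MBeanServer server = ManagementFactory.getPlatformMBeanServer();
        return server.getAttribute(new ObjectName(objectName), attribute);
    }

    public static void main(String[] args) throws Exception {
        Object count = readAttribute("java.lang:type=Threading", "ThreadCount");
        // Wrap the value in a minimal JSON object, one per MBean set, as the
        // ticket describes for downstream processing.
        System.out.println("{\"ThreadCount\": " + count + "}");
    }
}
```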


