[GitHub] nifi issue #2180: Added GetMongoAggregation to support running Mongo aggrega...

2018-01-26 Thread MikeThomsen
Github user MikeThomsen commented on the issue:

https://github.com/apache/nifi/pull/2180
  
@mattyb149 We should be good to go now. I just checked in a change that 
addresses the few minor points left over.


---


[jira] [Commented] (NIFI-4289) Implement put processor for InfluxDB

2018-01-26 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4289?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16341785#comment-16341785
 ] 

ASF GitHub Bot commented on NIFI-4289:
--

Github user MikeThomsen commented on the issue:

https://github.com/apache/nifi/pull/2101
  
@mans2singh Figured out the issue; it's a time conversion. I added 
```:multiply(100)``` to the EL and it worked. InfluxDB uses nanoseconds; I 
was supplying milliseconds.

@joewitt FWIW +1 LGTM on merging.
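
For reference, a minimal sketch of the conversion being discussed: `now():toNumber()` in NiFi Expression Language yields epoch milliseconds, while InfluxDB line-protocol timestamps default to nanosecond precision, so the value has to be scaled up by 1,000,000 before it is written. The snippet below is illustrative only, not the PR's code; the measurement, tags, and fields reuse the sample from earlier in the thread.

```java
public class LineProtocolTimestampSketch {
    public static void main(String[] args) {
        // now():toNumber() in NiFi EL yields epoch milliseconds;
        // InfluxDB line-protocol timestamps default to nanoseconds.
        long epochMillis = System.currentTimeMillis();
        long epochNanos = epochMillis * 1_000_000L;
        // Measurement/tags/fields reuse the sample from the thread; purely illustrative.
        System.out.println("water,country=US,city=sf rain=1,humidity=0.6 " + epochNanos);
    }
}
```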


> Implement put processor for InfluxDB
> 
>
> Key: NIFI-4289
> URL: https://issues.apache.org/jira/browse/NIFI-4289
> Project: Apache NiFi
>  Issue Type: New Feature
>  Components: Extensions
>Affects Versions: 1.3.0
> Environment: All
>Reporter: Mans Singh
>Assignee: Mans Singh
>Priority: Minor
>  Labels: insert, measurements,, put, timeseries
>
> Support inserting time series measurements into InfluxDB.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (NIFIREG-130) Refreshing the Change Log updates the versions displayed in the flow details, but doesn't update the Versions visible when the flow is collapsed

2018-01-26 Thread Andrew Lim (JIRA)

 [ 
https://issues.apache.org/jira/browse/NIFIREG-130?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Andrew Lim updated NIFIREG-130:
---
Description: 
Steps to reproduce:
 # Have a flow in the registry with 1 version.  Find this flow in the registry 
and it will display with the Flow Name - Bucket and VERSIONS with "1" below it. 
 Expand the flow to see the details and under Change Log it will display 
"Version 1"
 # In NiFi, make local changes to the flow and commit the changes to save 
Version 2 of the flow.
 # Back in the Registry, select "Refresh" icon.  The Flow details will now show 
"Version 2" and "Version 1".
 # But in the section that shows the Flow Name - Bucket, VERSIONS will still 
show "1" below it, when it should have been updated to "2".

See attached screenshot.

  was:
Steps to reproduce:
 # Have a flow in the registry with 1 version.  Find this flow in the registry 
and it will display with the Flow Name - Bucket and VERSIONS with "1" below it. 
 Expand the flow to see the details and under Change Log it will display 
"Version 1"
 # In NiFi, make local changes to the flow and commit the changes to save 
Version 2 of the flow.
 # Back in the Registry, select "Refresh" icon.  The Flow details will now show 
"Version 2" and "Version 1".
 # But in the section that shows the Flow Name - Bucket, VERSIONS will still 
show "1" below it, when it should have been updated to "2".


> Refreshing the Change Log updates the versions displayed in the flow details, 
> but doesn't update the Versions visible when the flow is collapsed
> 
>
> Key: NIFIREG-130
> URL: https://issues.apache.org/jira/browse/NIFIREG-130
> Project: NiFi Registry
>  Issue Type: Bug
>Affects Versions: 0.1.0
>Reporter: Andrew Lim
>Priority: Minor
> Attachments: NIFIREG-130.png
>
>
> Steps to reproduce:
>  # Have a flow in the registry with 1 version.  Find this flow in the 
> registry and it will display with the Flow Name - Bucket and VERSIONS with 
> "1" below it.  Expand the flow to see the details and under Change Log it 
> will display "Version 1"
>  # In NiFi, make local changes to the flow and commit the changes to save 
> Version 2 of the flow.
>  # Back in the Registry, select "Refresh" icon.  The Flow details will now 
> show "Version 2" and "Version 1".
>  # But in the section that shows the Flow Name - Bucket, VERSIONS will still 
> show "1" below it, when it should have been updated to "2".
> See attached screenshot.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (NIFIREG-130) Refreshing the Change Log updates the versions displayed in the flow details, but doesn't update the Versions visible when the flow is collapsed

2018-01-26 Thread Andrew Lim (JIRA)

 [ 
https://issues.apache.org/jira/browse/NIFIREG-130?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Andrew Lim updated NIFIREG-130:
---
Attachment: NIFIREG-130.png

> Refreshing the Change Log updates the versions displayed in the flow details, 
> but doesn't update the Versions visible when the flow is collapsed
> 
>
> Key: NIFIREG-130
> URL: https://issues.apache.org/jira/browse/NIFIREG-130
> Project: NiFi Registry
>  Issue Type: Bug
>Affects Versions: 0.1.0
>Reporter: Andrew Lim
>Priority: Minor
> Attachments: NIFIREG-130.png
>
>
> Steps to reproduce:
>  # Have a flow in the registry with 1 version.  Find this flow in the 
> registry and it will display with the Flow Name - Bucket and VERSIONS with 
> "1" below it.  Expand the flow to see the details and under Change Log it 
> will display "Version 1"
>  # In NiFi, make local changes to the flow and commit the changes to save 
> Version 2 of the flow.
>  # Back in the Registry, select "Refresh" icon.  The Flow details will now 
> show "Version 2" and "Version 1".
>  # But in the section that shows the Flow Name - Bucket, VERSIONS will still 
> show "1" below it, when it should have been updated to "2".
> See attached screenshot.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (NIFIREG-130) Refreshing the Change Log updates the versions displayed in the flow details, but doesn't update the Versions visible when the flow is collapsed

2018-01-26 Thread Andrew Lim (JIRA)
Andrew Lim created NIFIREG-130:
--

 Summary: Refreshing the Change Log updates the versions displayed 
in the flow details, but doesn't update the Versions visible when the flow is 
collapsed
 Key: NIFIREG-130
 URL: https://issues.apache.org/jira/browse/NIFIREG-130
 Project: NiFi Registry
  Issue Type: Bug
Affects Versions: 0.1.0
Reporter: Andrew Lim


Steps to reproduce:
 # Have a flow in the registry with 1 version.  Find this flow in the registry 
and it will display with the Flow Name - Bucket and VERSIONS with "1" below it. 
 Expand the flow to see the details and under Change Log it will display 
"Version 1"
 # In NiFi, make local changes to the flow and commit the changes to save 
Version 2 of the flow.
 # Back in the Registry, select "Refresh" icon.  The Flow details will now show 
"Version 2" and "Version 1".
 # But in the section that shows the Flow Name - Bucket, VERSIONS will still 
show "1" below it, when it should have been updated to "2".



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (NIFI-4822) ValidateRecord does not maintain order of CSV records

2018-01-26 Thread Bryan Bende (JIRA)
Bryan Bende created NIFI-4822:
-

 Summary: ValidateRecord does not maintain order of CSV records
 Key: NIFI-4822
 URL: https://issues.apache.org/jira/browse/NIFI-4822
 Project: Apache NiFi
  Issue Type: Bug
Affects Versions: 1.5.0, 1.4.0
Reporter: Bryan Bende


If you have ValidateRecord configured with a CSV reader and CSV writer and send 
in some valid data, the flow file is routed to "valid", but the columns are 
written out in a different order than they were read in.

This means if the next processor is another record-oriented processor using the 
exact same schema and reader, it will fail to read it because the first column 
won't be what it expects.

From doing some digging, it appears that in WriteCsvResult there is a method 
getFieldNames() that does this:
{code:java}
final Set<String> allFields = new LinkedHashSet<>();
allFields.addAll(record.getRawFieldNames());
allFields.addAll(recordSchema.getFieldNames());{code}
In this case, record.getRawFieldNames() comes from the key set of a HashMap, 
which means it does not maintain the order in which the fields were read.

CsvRecordReader line 97:
{code:java}
final Map<String, Object> values = new HashMap<>(recordFields.size() * 2);{code}
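
To make the ordering issue concrete, here is a small, self-contained illustration (not NiFi code): keys put into a plain HashMap can come back in a different order, while a LinkedHashMap preserves insertion order, which is why an order-preserving map is one plausible direction for a fix.

{code:java}
import java.util.HashMap;
import java.util.LinkedHashMap;
import java.util.Map;

public class FieldOrderSketch {
    public static void main(String[] args) {
        String[] csvHeader = {"zip", "name", "id"};   // column order as read from the CSV
        Map<String, Object> hashed = new HashMap<>();
        Map<String, Object> linked = new LinkedHashMap<>();
        for (String field : csvHeader) {
            hashed.put(field, "");
            linked.put(field, "");
        }
        // HashMap iteration order depends on hashing, not insertion order:
        System.out.println(hashed.keySet());   // may print e.g. [name, id, zip]
        // LinkedHashMap preserves the order the fields were added:
        System.out.println(linked.keySet());   // [zip, name, id]
    }
}
{code}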



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-4289) Implement put processor for InfluxDB

2018-01-26 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4289?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16341574#comment-16341574
 ] 

ASF GitHub Bot commented on NIFI-4289:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2101#discussion_r164218451
  
--- Diff: 
nifi-nar-bundles/nifi-influxdb-bundle/nifi-influxdb-processors/src/main/java/org/apache/nifi/processors/influxdb/AbstractInfluxDBProcessor.java
 ---
@@ -0,0 +1,142 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.influxdb;
+
+import org.apache.commons.lang3.StringUtils;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.DataUnit;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.influxdb.InfluxDB;
+import org.influxdb.InfluxDBFactory;
+
+/**
+ * Abstract base class for InfluxDB processors
+ */
+abstract class AbstractInfluxDBProcessor extends AbstractProcessor {
+
+protected static final PropertyDescriptor CHARSET = new 
PropertyDescriptor.Builder()
+.name("influxdb-charset")
+.displayName("Character Set")
+.description("Specifies the character set of the document 
data.")
+.required(true)
+.defaultValue("UTF-8")
+.expressionLanguageSupported(true)
+.addValidator(StandardValidators.CHARACTER_SET_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor INFLUX_DB_URL = new 
PropertyDescriptor.Builder()
+.name("influxdb-url")
+.displayName("InfluxDB connection url")
+.description("InfluxDB url to connect to")
+.required(true)
+.addValidator(StandardValidators.URL_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor DB_NAME = new 
PropertyDescriptor.Builder()
+.name("influxdb-dbname")
+.displayName("DB Name")
+.description("InfluxDB database to connect to")
+.required(true)
+.expressionLanguageSupported(true)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor USERNAME = new 
PropertyDescriptor.Builder()
+.name("influxdb-username")
+.displayName("Username")
+.required(false)
+.description("Username for accessing InfluxDB")
+.addValidator(Validator.VALID)
+.build();
+
+public static final PropertyDescriptor PASSWORD = new 
PropertyDescriptor.Builder()
+.name("influxdb-password")
+.displayName("Password")
+.description("Password for user")
+.addValidator(Validator.VALID)
+.sensitive(true)
+.build();
+
+protected static final PropertyDescriptor MAX_RECORDS_SIZE = new 
PropertyDescriptor.Builder()
+.name("influxdb-max-records-size")
+.displayName("Max size of records")
+.description("Maximum size of records allowed to be posted in 
one batch")
+.defaultValue("1 MB")
+.required(true)
+.addValidator(StandardValidators.DATA_SIZE_VALIDATOR)
+.build();
+
+static final Relationship REL_SUCCESS = new 
Relationship.Builder().name("success")
+.description("Sucessful FlowFiles are routed to this 
relationship").build();
+
+static final 

[GitHub] nifi pull request #2101: NIFI-4289 - InfluxDB put processor

2018-01-26 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2101#discussion_r164218451
  
--- Diff: 
nifi-nar-bundles/nifi-influxdb-bundle/nifi-influxdb-processors/src/main/java/org/apache/nifi/processors/influxdb/AbstractInfluxDBProcessor.java
 ---
@@ -0,0 +1,142 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.influxdb;
+
+import org.apache.commons.lang3.StringUtils;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.DataUnit;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.influxdb.InfluxDB;
+import org.influxdb.InfluxDBFactory;
+
+/**
+ * Abstract base class for InfluxDB processors
+ */
+abstract class AbstractInfluxDBProcessor extends AbstractProcessor {
+
+protected static final PropertyDescriptor CHARSET = new 
PropertyDescriptor.Builder()
+.name("influxdb-charset")
+.displayName("Character Set")
+.description("Specifies the character set of the document 
data.")
+.required(true)
+.defaultValue("UTF-8")
+.expressionLanguageSupported(true)
+.addValidator(StandardValidators.CHARACTER_SET_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor INFLUX_DB_URL = new 
PropertyDescriptor.Builder()
+.name("influxdb-url")
+.displayName("InfluxDB connection url")
+.description("InfluxDB url to connect to")
+.required(true)
+.addValidator(StandardValidators.URL_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor DB_NAME = new 
PropertyDescriptor.Builder()
+.name("influxdb-dbname")
+.displayName("DB Name")
+.description("InfluxDB database to connect to")
+.required(true)
+.expressionLanguageSupported(true)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor USERNAME = new 
PropertyDescriptor.Builder()
+.name("influxdb-username")
+.displayName("Username")
+.required(false)
+.description("Username for accessing InfluxDB")
+.addValidator(Validator.VALID)
+.build();
+
+public static final PropertyDescriptor PASSWORD = new 
PropertyDescriptor.Builder()
+.name("influxdb-password")
+.displayName("Password")
+.description("Password for user")
+.addValidator(Validator.VALID)
+.sensitive(true)
+.build();
+
+protected static final PropertyDescriptor MAX_RECORDS_SIZE = new 
PropertyDescriptor.Builder()
+.name("influxdb-max-records-size")
+.displayName("Max size of records")
+.description("Maximum size of records allowed to be posted in 
one batch")
+.defaultValue("1 MB")
+.required(true)
+.addValidator(StandardValidators.DATA_SIZE_VALIDATOR)
+.build();
+
+static final Relationship REL_SUCCESS = new 
Relationship.Builder().name("success")
+.description("Sucessful FlowFiles are routed to this 
relationship").build();
+
+static final Relationship REL_FAILURE = new 
Relationship.Builder().name("failure")
+.description("Failed FlowFiles are routed to this 
relationship").build();
+
+public static final String INFLUX_DB_ERROR_MESSAGE = 

[jira] [Commented] (NIFI-4289) Implement put processor for InfluxDB

2018-01-26 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4289?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16341567#comment-16341567
 ] 

ASF GitHub Bot commented on NIFI-4289:
--

Github user MikeThomsen commented on the issue:

https://github.com/apache/nifi/pull/2101
  
Everything I insert gets defaulted to some time around the epoch. I tried 
your sample, but it doesn't seem to work for me at least (data gets written, 
but it doesn't have the appropriate timestamp):

```
water,country=US,city=sf rain=1,humidity=0.6 ${now():toNumber()}
```

Ex output:

```
water,country=US,city=sf rain=1,humidity=0.6 1516999446718
```

Using this query in Chronograf, I get the following results:

```
SELECT * FROM "test_data"."autogen"."water"
```

https://user-images.githubusercontent.com/108184/35459708-4ccb9ba6-02af-11e8-852f-7aeabd1806dc.png

So overall LGTM on code quality, but we need to figure out if this is just 
something wrong on my end or with the processor before someone merges it.


> Implement put processor for InfluxDB
> 
>
> Key: NIFI-4289
> URL: https://issues.apache.org/jira/browse/NIFI-4289
> Project: Apache NiFi
>  Issue Type: New Feature
>  Components: Extensions
>Affects Versions: 1.3.0
> Environment: All
>Reporter: Mans Singh
>Assignee: Mans Singh
>Priority: Minor
>  Labels: insert, measurements,, put, timeseries
>
> Support inserting time series measurements into InfluxDB.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi issue #2101: NIFI-4289 - InfluxDB put processor

2018-01-26 Thread MikeThomsen
Github user MikeThomsen commented on the issue:

https://github.com/apache/nifi/pull/2101
  
Everything I insert gets defaulted to some time around the epoch. I tried 
your sample, but it doesn't seem to work for me at least (data gets written, 
but it doesn't have the appropriate timestamp):

```
water,country=US,city=sf rain=1,humidity=0.6 ${now():toNumber()}
```

Ex output:

```
water,country=US,city=sf rain=1,humidity=0.6 1516999446718
```

Using this query in Chronograf, I get the following results:

```
SELECT * FROM "test_data"."autogen"."water"
```

https://user-images.githubusercontent.com/108184/35459708-4ccb9ba6-02af-11e8-852f-7aeabd1806dc.png

So overall LGTM on code quality, but we need to figure out if this is just 
something wrong on my end or with the processor before someone merges it.


---


[jira] [Commented] (NIFI-4289) Implement put processor for InfluxDB

2018-01-26 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4289?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16341554#comment-16341554
 ] 

ASF GitHub Bot commented on NIFI-4289:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2101#discussion_r164210019
  
--- Diff: 
nifi-nar-bundles/nifi-influxdb-bundle/nifi-influxdb-processors/src/main/java/org/apache/nifi/processors/influxdb/AbstractInfluxDBProcessor.java
 ---
@@ -0,0 +1,142 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.influxdb;
+
+import org.apache.commons.lang3.StringUtils;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.DataUnit;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.influxdb.InfluxDB;
+import org.influxdb.InfluxDBFactory;
+
+/**
+ * Abstract base class for InfluxDB processors
+ */
+abstract class AbstractInfluxDBProcessor extends AbstractProcessor {
+
+protected static final PropertyDescriptor CHARSET = new 
PropertyDescriptor.Builder()
+.name("influxdb-charset")
+.displayName("Character Set")
+.description("Specifies the character set of the document 
data.")
+.required(true)
+.defaultValue("UTF-8")
+.expressionLanguageSupported(true)
+.addValidator(StandardValidators.CHARACTER_SET_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor INFLUX_DB_URL = new 
PropertyDescriptor.Builder()
+.name("influxdb-url")
+.displayName("InfluxDB connection url")
+.description("InfluxDB url to connect to")
+.required(true)
+.addValidator(StandardValidators.URL_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor DB_NAME = new 
PropertyDescriptor.Builder()
+.name("influxdb-dbname")
+.displayName("DB Name")
+.description("InfluxDB database to connect to")
+.required(true)
+.expressionLanguageSupported(true)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor USERNAME = new 
PropertyDescriptor.Builder()
+.name("influxdb-username")
+.displayName("Username")
+.required(false)
+.description("Username for accessing InfluxDB")
+.addValidator(Validator.VALID)
+.build();
+
+public static final PropertyDescriptor PASSWORD = new 
PropertyDescriptor.Builder()
+.name("influxdb-password")
+.displayName("Password")
+.description("Password for user")
+.addValidator(Validator.VALID)
+.sensitive(true)
+.build();
+
+protected static final PropertyDescriptor MAX_RECORDS_SIZE = new 
PropertyDescriptor.Builder()
+.name("influxdb-max-records-size")
+.displayName("Max size of records")
+.description("Maximum size of records allowed to be posted in 
one batch")
+.defaultValue("1 MB")
+.required(true)
+.addValidator(StandardValidators.DATA_SIZE_VALIDATOR)
+.build();
+
+static final Relationship REL_SUCCESS = new 
Relationship.Builder().name("success")
+.description("Sucessful FlowFiles are routed to this 
relationship").build();
+
+static final 

[jira] [Commented] (NIFI-4289) Implement put processor for InfluxDB

2018-01-26 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4289?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16341549#comment-16341549
 ] 

ASF GitHub Bot commented on NIFI-4289:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2101#discussion_r164208755
  
--- Diff: 
nifi-nar-bundles/nifi-influxdb-bundle/nifi-influxdb-processors/src/main/java/org/apache/nifi/processors/influxdb/AbstractInfluxDBProcessor.java
 ---
@@ -0,0 +1,142 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.influxdb;
+
+import org.apache.commons.lang3.StringUtils;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.DataUnit;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.influxdb.InfluxDB;
+import org.influxdb.InfluxDBFactory;
+
+/**
+ * Abstract base class for InfluxDB processors
+ */
+abstract class AbstractInfluxDBProcessor extends AbstractProcessor {
+
+protected static final PropertyDescriptor CHARSET = new 
PropertyDescriptor.Builder()
+.name("influxdb-charset")
+.displayName("Character Set")
+.description("Specifies the character set of the document 
data.")
+.required(true)
+.defaultValue("UTF-8")
+.expressionLanguageSupported(true)
+.addValidator(StandardValidators.CHARACTER_SET_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor INFLUX_DB_URL = new 
PropertyDescriptor.Builder()
+.name("influxdb-url")
+.displayName("InfluxDB connection url")
+.description("InfluxDB url to connect to")
+.required(true)
+.addValidator(StandardValidators.URL_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor DB_NAME = new 
PropertyDescriptor.Builder()
+.name("influxdb-dbname")
+.displayName("DB Name")
+.description("InfluxDB database to connect to")
+.required(true)
+.expressionLanguageSupported(true)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor USERNAME = new 
PropertyDescriptor.Builder()
+.name("influxdb-username")
+.displayName("Username")
+.required(false)
+.description("Username for accessing InfluxDB")
+.addValidator(Validator.VALID)
+.build();
+
+public static final PropertyDescriptor PASSWORD = new 
PropertyDescriptor.Builder()
+.name("influxdb-password")
+.displayName("Password")
+.description("Password for user")
+.addValidator(Validator.VALID)
--- End diff --

Try StandardValidators.NON_BLANK_VALIDATOR here too.
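
A sketch of what that would look like for the password property, assuming the same builder and imports shown in the diff above (a suggestion, not the committed code):

```java
public static final PropertyDescriptor PASSWORD = new PropertyDescriptor.Builder()
        .name("influxdb-password")
        .displayName("Password")
        .description("Password for user")
        // Suggested change: reject whitespace-only values instead of accepting anything.
        .addValidator(StandardValidators.NON_BLANK_VALIDATOR)
        .sensitive(true)
        .build();
```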


> Implement put processor for InfluxDB
> 
>
> Key: NIFI-4289
> URL: https://issues.apache.org/jira/browse/NIFI-4289
> Project: Apache NiFi
>  Issue Type: New Feature
>  Components: Extensions
>Affects Versions: 1.3.0
> Environment: All
>Reporter: Mans Singh
>Assignee: Mans Singh
>Priority: Minor
>  Labels: insert, measurements,, put, timeseries
>
> Support inserting time series measurements into InfluxDB.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-4289) Implement put processor for InfluxDB

2018-01-26 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4289?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16341559#comment-16341559
 ] 

ASF GitHub Bot commented on NIFI-4289:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2101#discussion_r164212590
  
--- Diff: 
nifi-nar-bundles/nifi-influxdb-bundle/nifi-influxdb-processors/src/test/java/org/apache/nifi/processors/influxdb/ITPutInfluxDBTest.java
 ---
@@ -0,0 +1,189 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.influxdb;
+
+import static org.junit.Assert.assertEquals;
+import java.util.List;
+import org.apache.nifi.util.MockFlowFile;
+import org.apache.nifi.util.TestRunner;
+import org.apache.nifi.util.TestRunners;
+import org.influxdb.InfluxDB;
+import org.influxdb.InfluxDBFactory;
+import org.influxdb.dto.Query;
+import org.influxdb.dto.QueryResult;
+import org.junit.After;
+import org.junit.Before;
+import org.junit.Ignore;
+import org.junit.Test;
+
+/**
+ * Integration test for InfluxDB. Please ensure that the InfluxDB is 
running
+ * on local host with default port and has database test with table test. 
Please set user
+ * and password if applicable before running the integration tests.
+ */
+@Ignore("Comment this out for running tests against a real instance of 
InfluxDB")
+public class ITPutInfluxDBTest {
+private TestRunner runner;
+private InfluxDB influxDB;
+private String dbName = "test";
+private String dbUrl = "http://localhost:8086";
+private String user = "admin";
+private String password = "admin";
+
+@Before
+public void setUp() throws Exception {
+runner = TestRunners.newTestRunner(PutInfluxDB.class);
+runner.setProperty(PutInfluxDB.DB_NAME, dbName);
+runner.setProperty(PutInfluxDB.USERNAME, user);
+runner.setProperty(PutInfluxDB.PASSWORD, password);
+runner.setProperty(PutInfluxDB.INFLUX_DB_URL, dbUrl);
+runner.setProperty(PutInfluxDB.CHARSET, "UTF-8");
+
runner.setProperty(PutInfluxDB.CONSISTENCY_LEVEL,PutInfluxDB.CONSISTENCY_LEVEL_ONE.getValue());
+runner.setProperty(PutInfluxDB.RETENTION_POLICY,"autogen");
+runner.setProperty(PutInfluxDB.MAX_RECORDS_SIZE, "1 KB");
+runner.assertValid();
+
+influxDB = InfluxDBFactory.connect(dbUrl,user,password);
+
+if ( influxDB.databaseExists(dbName) ) {
+QueryResult result = influxDB.query(new Query("DROP 
measurement water", dbName));
+checkError(result);
+result = influxDB.query(new Query("DROP measurement testm", 
dbName));
+checkError(result);
+} else {
+influxDB.createDatabase(dbName);
+int max = 10;
+while (!influxDB.databaseExists(dbName) && (max-- < 0)) {
+Thread.sleep(5);
+}
+if ( ! influxDB.databaseExists(dbName) ) {
+throw new Exception("unable to create database " + dbName);
+}
+}
+}
+
+protected void checkError(QueryResult result) {
+if ( result.hasError() )
+throw new IllegalStateException("Error while dropping 
measurements " + result.getError());
+}
+
+@After
+public void tearDown() throws Exception {
+runner = null;
+if ( influxDB != null )
--- End diff --

Curly brackets...
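
What the reviewer is asking for, sketched out. The close() call is an assumption (the quoted diff is cut off at this point); the point is simply to brace the single-statement if:

```java
@After
public void tearDown() throws Exception {
    runner = null;
    if (influxDB != null) {
        influxDB.close();   // assumed body; the original diff is truncated here
    }
}
```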


> Implement put processor for InfluxDB
> 
>
> Key: NIFI-4289
> URL: https://issues.apache.org/jira/browse/NIFI-4289
> Project: Apache NiFi
>  Issue Type: New Feature
>  Components: Extensions
>Affects Versions: 1.3.0
> Environment: All
>Reporter: Mans Singh
>

[jira] [Commented] (NIFI-4289) Implement put processor for InfluxDB

2018-01-26 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4289?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16341550#comment-16341550
 ] 

ASF GitHub Bot commented on NIFI-4289:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2101#discussion_r164209539
  
--- Diff: 
nifi-nar-bundles/nifi-influxdb-bundle/nifi-influxdb-processors/src/main/java/org/apache/nifi/processors/influxdb/AbstractInfluxDBProcessor.java
 ---
@@ -0,0 +1,142 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.influxdb;
+
+import org.apache.commons.lang3.StringUtils;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.DataUnit;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.influxdb.InfluxDB;
+import org.influxdb.InfluxDBFactory;
+
+/**
+ * Abstract base class for InfluxDB processors
+ */
+abstract class AbstractInfluxDBProcessor extends AbstractProcessor {
+
+protected static final PropertyDescriptor CHARSET = new 
PropertyDescriptor.Builder()
+.name("influxdb-charset")
+.displayName("Character Set")
+.description("Specifies the character set of the document 
data.")
+.required(true)
+.defaultValue("UTF-8")
+.expressionLanguageSupported(true)
+.addValidator(StandardValidators.CHARACTER_SET_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor INFLUX_DB_URL = new 
PropertyDescriptor.Builder()
+.name("influxdb-url")
+.displayName("InfluxDB connection url")
+.description("InfluxDB url to connect to")
+.required(true)
+.addValidator(StandardValidators.URL_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor DB_NAME = new 
PropertyDescriptor.Builder()
+.name("influxdb-dbname")
+.displayName("DB Name")
+.description("InfluxDB database to connect to")
+.required(true)
+.expressionLanguageSupported(true)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor USERNAME = new 
PropertyDescriptor.Builder()
+.name("influxdb-username")
+.displayName("Username")
+.required(false)
+.description("Username for accessing InfluxDB")
+.addValidator(Validator.VALID)
+.build();
+
+public static final PropertyDescriptor PASSWORD = new 
PropertyDescriptor.Builder()
+.name("influxdb-password")
+.displayName("Password")
+.description("Password for user")
+.addValidator(Validator.VALID)
+.sensitive(true)
+.build();
+
+protected static final PropertyDescriptor MAX_RECORDS_SIZE = new 
PropertyDescriptor.Builder()
+.name("influxdb-max-records-size")
+.displayName("Max size of records")
+.description("Maximum size of records allowed to be posted in 
one batch")
+.defaultValue("1 MB")
+.required(true)
+.addValidator(StandardValidators.DATA_SIZE_VALIDATOR)
+.build();
+
+static final Relationship REL_SUCCESS = new 
Relationship.Builder().name("success")
+.description("Sucessful FlowFiles are routed to this 
relationship").build();
+
+static final 

[jira] [Commented] (NIFI-4289) Implement put processor for InfluxDB

2018-01-26 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4289?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16341548#comment-16341548
 ] 

ASF GitHub Bot commented on NIFI-4289:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2101#discussion_r164208704
  
--- Diff: 
nifi-nar-bundles/nifi-influxdb-bundle/nifi-influxdb-processors/src/main/java/org/apache/nifi/processors/influxdb/AbstractInfluxDBProcessor.java
 ---
@@ -0,0 +1,142 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.influxdb;
+
+import org.apache.commons.lang3.StringUtils;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.DataUnit;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.influxdb.InfluxDB;
+import org.influxdb.InfluxDBFactory;
+
+/**
+ * Abstract base class for InfluxDB processors
+ */
+abstract class AbstractInfluxDBProcessor extends AbstractProcessor {
+
+protected static final PropertyDescriptor CHARSET = new 
PropertyDescriptor.Builder()
+.name("influxdb-charset")
+.displayName("Character Set")
+.description("Specifies the character set of the document 
data.")
+.required(true)
+.defaultValue("UTF-8")
+.expressionLanguageSupported(true)
+.addValidator(StandardValidators.CHARACTER_SET_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor INFLUX_DB_URL = new 
PropertyDescriptor.Builder()
+.name("influxdb-url")
+.displayName("InfluxDB connection url")
+.description("InfluxDB url to connect to")
+.required(true)
+.addValidator(StandardValidators.URL_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor DB_NAME = new 
PropertyDescriptor.Builder()
+.name("influxdb-dbname")
+.displayName("DB Name")
+.description("InfluxDB database to connect to")
+.required(true)
+.expressionLanguageSupported(true)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor USERNAME = new 
PropertyDescriptor.Builder()
+.name("influxdb-username")
+.displayName("Username")
+.required(false)
+.description("Username for accessing InfluxDB")
+.addValidator(Validator.VALID)
--- End diff --

Might want to try StandardValidators.NON_BLANK_VALIDATOR here, otherwise the user 
could try to send a blank username. Since it's not a required field, leaving it 
blank should mean that it's not being specified.
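
A sketch of the suggested change for the username property (again assuming the builder shown in the diff; not the merged code):

```java
public static final PropertyDescriptor USERNAME = new PropertyDescriptor.Builder()
        .name("influxdb-username")
        .displayName("Username")
        .required(false)
        .description("Username for accessing InfluxDB")
        // Suggested change: an optional property may be left unset, but if a value
        // is supplied it should not be blank.
        .addValidator(StandardValidators.NON_BLANK_VALIDATOR)
        .build();
```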


> Implement put processor for InfluxDB
> 
>
> Key: NIFI-4289
> URL: https://issues.apache.org/jira/browse/NIFI-4289
> Project: Apache NiFi
>  Issue Type: New Feature
>  Components: Extensions
>Affects Versions: 1.3.0
> Environment: All
>Reporter: Mans Singh
>Assignee: Mans Singh
>Priority: Minor
>  Labels: insert, measurements,, put, timeseries
>
> Support inserting time series measurements into InfluxDB.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-4289) Implement put processor for InfluxDB

2018-01-26 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4289?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16341553#comment-16341553
 ] 

ASF GitHub Bot commented on NIFI-4289:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2101#discussion_r164209671
  
--- Diff: 
nifi-nar-bundles/nifi-influxdb-bundle/nifi-influxdb-processors/src/main/java/org/apache/nifi/processors/influxdb/AbstractInfluxDBProcessor.java
 ---
@@ -0,0 +1,142 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.influxdb;
+
+import org.apache.commons.lang3.StringUtils;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.DataUnit;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.influxdb.InfluxDB;
+import org.influxdb.InfluxDBFactory;
+
+/**
+ * Abstract base class for InfluxDB processors
+ */
+abstract class AbstractInfluxDBProcessor extends AbstractProcessor {
+
+protected static final PropertyDescriptor CHARSET = new 
PropertyDescriptor.Builder()
+.name("influxdb-charset")
+.displayName("Character Set")
+.description("Specifies the character set of the document 
data.")
+.required(true)
+.defaultValue("UTF-8")
+.expressionLanguageSupported(true)
+.addValidator(StandardValidators.CHARACTER_SET_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor INFLUX_DB_URL = new 
PropertyDescriptor.Builder()
+.name("influxdb-url")
+.displayName("InfluxDB connection url")
+.description("InfluxDB url to connect to")
+.required(true)
+.addValidator(StandardValidators.URL_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor DB_NAME = new 
PropertyDescriptor.Builder()
+.name("influxdb-dbname")
+.displayName("DB Name")
+.description("InfluxDB database to connect to")
+.required(true)
+.expressionLanguageSupported(true)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor USERNAME = new 
PropertyDescriptor.Builder()
+.name("influxdb-username")
+.displayName("Username")
+.required(false)
+.description("Username for accessing InfluxDB")
+.addValidator(Validator.VALID)
+.build();
+
+public static final PropertyDescriptor PASSWORD = new 
PropertyDescriptor.Builder()
+.name("influxdb-password")
+.displayName("Password")
+.description("Password for user")
+.addValidator(Validator.VALID)
+.sensitive(true)
+.build();
+
+protected static final PropertyDescriptor MAX_RECORDS_SIZE = new 
PropertyDescriptor.Builder()
+.name("influxdb-max-records-size")
+.displayName("Max size of records")
+.description("Maximum size of records allowed to be posted in 
one batch")
+.defaultValue("1 MB")
+.required(true)
+.addValidator(StandardValidators.DATA_SIZE_VALIDATOR)
+.build();
+
+static final Relationship REL_SUCCESS = new 
Relationship.Builder().name("success")
+.description("Sucessful FlowFiles are routed to this 
relationship").build();
+
+static final 

[GitHub] nifi pull request #2101: NIFI-4289 - InfluxDB put processor

2018-01-26 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2101#discussion_r164209603
  
--- Diff: 
nifi-nar-bundles/nifi-influxdb-bundle/nifi-influxdb-processors/src/main/java/org/apache/nifi/processors/influxdb/AbstractInfluxDBProcessor.java
 ---
@@ -0,0 +1,142 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.influxdb;
+
+import org.apache.commons.lang3.StringUtils;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.DataUnit;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.influxdb.InfluxDB;
+import org.influxdb.InfluxDBFactory;
+
+/**
+ * Abstract base class for InfluxDB processors
+ */
+abstract class AbstractInfluxDBProcessor extends AbstractProcessor {
+
+protected static final PropertyDescriptor CHARSET = new 
PropertyDescriptor.Builder()
+.name("influxdb-charset")
+.displayName("Character Set")
+.description("Specifies the character set of the document 
data.")
+.required(true)
+.defaultValue("UTF-8")
+.expressionLanguageSupported(true)
+.addValidator(StandardValidators.CHARACTER_SET_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor INFLUX_DB_URL = new 
PropertyDescriptor.Builder()
+.name("influxdb-url")
+.displayName("InfluxDB connection url")
+.description("InfluxDB url to connect to")
+.required(true)
+.addValidator(StandardValidators.URL_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor DB_NAME = new 
PropertyDescriptor.Builder()
+.name("influxdb-dbname")
+.displayName("DB Name")
+.description("InfluxDB database to connect to")
+.required(true)
+.expressionLanguageSupported(true)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor USERNAME = new 
PropertyDescriptor.Builder()
+.name("influxdb-username")
+.displayName("Username")
+.required(false)
+.description("Username for accessing InfluxDB")
+.addValidator(Validator.VALID)
+.build();
+
+public static final PropertyDescriptor PASSWORD = new 
PropertyDescriptor.Builder()
+.name("influxdb-password")
+.displayName("Password")
+.description("Password for user")
+.addValidator(Validator.VALID)
+.sensitive(true)
+.build();
+
+protected static final PropertyDescriptor MAX_RECORDS_SIZE = new 
PropertyDescriptor.Builder()
+.name("influxdb-max-records-size")
+.displayName("Max size of records")
+.description("Maximum size of records allowed to be posted in 
one batch")
+.defaultValue("1 MB")
+.required(true)
+.addValidator(StandardValidators.DATA_SIZE_VALIDATOR)
+.build();
+
+static final Relationship REL_SUCCESS = new 
Relationship.Builder().name("success")
+.description("Sucessful FlowFiles are routed to this 
relationship").build();
+
+static final Relationship REL_FAILURE = new 
Relationship.Builder().name("failure")
+.description("Failed FlowFiles are routed to this 
relationship").build();
+
+public static final String INFLUX_DB_ERROR_MESSAGE = 

[jira] [Commented] (NIFI-4289) Implement put processor for InfluxDB

2018-01-26 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4289?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16341556#comment-16341556
 ] 

ASF GitHub Bot commented on NIFI-4289:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2101#discussion_r164209408
  
--- Diff: 
nifi-nar-bundles/nifi-influxdb-bundle/nifi-influxdb-processors/src/main/java/org/apache/nifi/processors/influxdb/AbstractInfluxDBProcessor.java
 ---
@@ -0,0 +1,142 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.influxdb;
+
+import org.apache.commons.lang3.StringUtils;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.DataUnit;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.influxdb.InfluxDB;
+import org.influxdb.InfluxDBFactory;
+
+/**
+ * Abstract base class for InfluxDB processors
+ */
+abstract class AbstractInfluxDBProcessor extends AbstractProcessor {
+
+protected static final PropertyDescriptor CHARSET = new 
PropertyDescriptor.Builder()
+.name("influxdb-charset")
+.displayName("Character Set")
+.description("Specifies the character set of the document 
data.")
+.required(true)
+.defaultValue("UTF-8")
+.expressionLanguageSupported(true)
+.addValidator(StandardValidators.CHARACTER_SET_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor INFLUX_DB_URL = new 
PropertyDescriptor.Builder()
+.name("influxdb-url")
+.displayName("InfluxDB connection url")
+.description("InfluxDB url to connect to")
+.required(true)
+.addValidator(StandardValidators.URL_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor DB_NAME = new 
PropertyDescriptor.Builder()
+.name("influxdb-dbname")
+.displayName("DB Name")
+.description("InfluxDB database to connect to")
+.required(true)
+.expressionLanguageSupported(true)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor USERNAME = new 
PropertyDescriptor.Builder()
+.name("influxdb-username")
+.displayName("Username")
+.required(false)
+.description("Username for accessing InfluxDB")
+.addValidator(Validator.VALID)
+.build();
+
+public static final PropertyDescriptor PASSWORD = new 
PropertyDescriptor.Builder()
+.name("influxdb-password")
+.displayName("Password")
+.description("Password for user")
+.addValidator(Validator.VALID)
+.sensitive(true)
+.build();
+
+protected static final PropertyDescriptor MAX_RECORDS_SIZE = new 
PropertyDescriptor.Builder()
+.name("influxdb-max-records-size")
+.displayName("Max size of records")
+.description("Maximum size of records allowed to be posted in 
one batch")
+.defaultValue("1 MB")
+.required(true)
+.addValidator(StandardValidators.DATA_SIZE_VALIDATOR)
+.build();
+
+static final Relationship REL_SUCCESS = new 
Relationship.Builder().name("success")
+.description("Sucessful FlowFiles are routed to this 
relationship").build();
+
+static final 

[GitHub] nifi pull request #2101: NIFI-4289 - InfluxDB put processor

2018-01-26 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2101#discussion_r164209671
  
--- Diff: 
nifi-nar-bundles/nifi-influxdb-bundle/nifi-influxdb-processors/src/main/java/org/apache/nifi/processors/influxdb/AbstractInfluxDBProcessor.java
 ---
@@ -0,0 +1,142 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.influxdb;
+
+import org.apache.commons.lang3.StringUtils;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.DataUnit;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.influxdb.InfluxDB;
+import org.influxdb.InfluxDBFactory;
+
+/**
+ * Abstract base class for InfluxDB processors
+ */
+abstract class AbstractInfluxDBProcessor extends AbstractProcessor {
+
+protected static final PropertyDescriptor CHARSET = new 
PropertyDescriptor.Builder()
+.name("influxdb-charset")
+.displayName("Character Set")
+.description("Specifies the character set of the document 
data.")
+.required(true)
+.defaultValue("UTF-8")
+.expressionLanguageSupported(true)
+.addValidator(StandardValidators.CHARACTER_SET_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor INFLUX_DB_URL = new 
PropertyDescriptor.Builder()
+.name("influxdb-url")
+.displayName("InfluxDB connection url")
+.description("InfluxDB url to connect to")
+.required(true)
+.addValidator(StandardValidators.URL_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor DB_NAME = new 
PropertyDescriptor.Builder()
+.name("influxdb-dbname")
+.displayName("DB Name")
+.description("InfluxDB database to connect to")
+.required(true)
+.expressionLanguageSupported(true)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor USERNAME = new 
PropertyDescriptor.Builder()
+.name("influxdb-username")
+.displayName("Username")
+.required(false)
+.description("Username for accessing InfluxDB")
+.addValidator(Validator.VALID)
+.build();
+
+public static final PropertyDescriptor PASSWORD = new 
PropertyDescriptor.Builder()
+.name("influxdb-password")
+.displayName("Password")
+.description("Password for user")
+.addValidator(Validator.VALID)
+.sensitive(true)
+.build();
+
+protected static final PropertyDescriptor MAX_RECORDS_SIZE = new 
PropertyDescriptor.Builder()
+.name("influxdb-max-records-size")
+.displayName("Max size of records")
+.description("Maximum size of records allowed to be posted in 
one batch")
+.defaultValue("1 MB")
+.required(true)
+.addValidator(StandardValidators.DATA_SIZE_VALIDATOR)
+.build();
+
+static final Relationship REL_SUCCESS = new 
Relationship.Builder().name("success")
+.description("Sucessful FlowFiles are routed to this 
relationship").build();
+
+static final Relationship REL_FAILURE = new 
Relationship.Builder().name("failure")
+.description("Failed FlowFiles are routed to this 
relationship").build();
+
+public static final String INFLUX_DB_ERROR_MESSAGE = 

[jira] [Commented] (NIFI-4289) Implement put processor for InfluxDB

2018-01-26 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4289?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16341552#comment-16341552
 ] 

ASF GitHub Bot commented on NIFI-4289:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2101#discussion_r164209928
  
--- Diff: 
nifi-nar-bundles/nifi-influxdb-bundle/nifi-influxdb-processors/src/main/java/org/apache/nifi/processors/influxdb/AbstractInfluxDBProcessor.java
 ---
@@ -0,0 +1,142 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.influxdb;
+
+import org.apache.commons.lang3.StringUtils;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.DataUnit;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.influxdb.InfluxDB;
+import org.influxdb.InfluxDBFactory;
+
+/**
+ * Abstract base class for InfluxDB processors
+ */
+abstract class AbstractInfluxDBProcessor extends AbstractProcessor {
+
+protected static final PropertyDescriptor CHARSET = new 
PropertyDescriptor.Builder()
+.name("influxdb-charset")
+.displayName("Character Set")
+.description("Specifies the character set of the document 
data.")
+.required(true)
+.defaultValue("UTF-8")
+.expressionLanguageSupported(true)
+.addValidator(StandardValidators.CHARACTER_SET_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor INFLUX_DB_URL = new 
PropertyDescriptor.Builder()
+.name("influxdb-url")
+.displayName("InfluxDB connection url")
+.description("InfluxDB url to connect to")
+.required(true)
+.addValidator(StandardValidators.URL_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor DB_NAME = new 
PropertyDescriptor.Builder()
+.name("influxdb-dbname")
+.displayName("DB Name")
+.description("InfluxDB database to connect to")
+.required(true)
+.expressionLanguageSupported(true)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor USERNAME = new 
PropertyDescriptor.Builder()
+.name("influxdb-username")
+.displayName("Username")
+.required(false)
+.description("Username for accessing InfluxDB")
+.addValidator(Validator.VALID)
+.build();
+
+public static final PropertyDescriptor PASSWORD = new 
PropertyDescriptor.Builder()
+.name("influxdb-password")
+.displayName("Password")
+.description("Password for user")
+.addValidator(Validator.VALID)
+.sensitive(true)
+.build();
+
+protected static final PropertyDescriptor MAX_RECORDS_SIZE = new 
PropertyDescriptor.Builder()
+.name("influxdb-max-records-size")
+.displayName("Max size of records")
+.description("Maximum size of records allowed to be posted in 
one batch")
+.defaultValue("1 MB")
+.required(true)
+.addValidator(StandardValidators.DATA_SIZE_VALIDATOR)
+.build();
+
+static final Relationship REL_SUCCESS = new 
Relationship.Builder().name("success")
+.description("Sucessful FlowFiles are routed to this 
relationship").build();
+
+static final 

[jira] [Commented] (NIFI-4289) Implement put processor for InfluxDB

2018-01-26 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4289?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16341557#comment-16341557
 ] 

ASF GitHub Bot commented on NIFI-4289:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2101#discussion_r164212552
  
--- Diff: 
nifi-nar-bundles/nifi-influxdb-bundle/nifi-influxdb-processors/src/test/java/org/apache/nifi/processors/influxdb/ITPutInfluxDBTest.java
 ---
@@ -0,0 +1,189 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.influxdb;
+
+import static org.junit.Assert.assertEquals;
+import java.util.List;
+import org.apache.nifi.util.MockFlowFile;
+import org.apache.nifi.util.TestRunner;
+import org.apache.nifi.util.TestRunners;
+import org.influxdb.InfluxDB;
+import org.influxdb.InfluxDBFactory;
+import org.influxdb.dto.Query;
+import org.influxdb.dto.QueryResult;
+import org.junit.After;
+import org.junit.Before;
+import org.junit.Ignore;
+import org.junit.Test;
+
+/**
+ * Integration test for InfluxDB. Please ensure that the InfluxDB is 
running
+ * on local host with default port and has database test with table test. 
Please set user
+ * and password if applicable before running the integration tests.
+ */
+@Ignore("Comment this out for running tests against a real instance of 
InfluxDB")
+public class ITPutInfluxDBTest {
+private TestRunner runner;
+private InfluxDB influxDB;
+private String dbName = "test";
+private String dbUrl = "http://localhost:8086";
+private String user = "admin";
+private String password = "admin";
+
+@Before
+public void setUp() throws Exception {
+runner = TestRunners.newTestRunner(PutInfluxDB.class);
+runner.setProperty(PutInfluxDB.DB_NAME, dbName);
+runner.setProperty(PutInfluxDB.USERNAME, user);
+runner.setProperty(PutInfluxDB.PASSWORD, password);
+runner.setProperty(PutInfluxDB.INFLUX_DB_URL, dbUrl);
+runner.setProperty(PutInfluxDB.CHARSET, "UTF-8");
+
runner.setProperty(PutInfluxDB.CONSISTENCY_LEVEL,PutInfluxDB.CONSISTENCY_LEVEL_ONE.getValue());
+runner.setProperty(PutInfluxDB.RETENTION_POLICY,"autogen");
+runner.setProperty(PutInfluxDB.MAX_RECORDS_SIZE, "1 KB");
+runner.assertValid();
+
+influxDB = InfluxDBFactory.connect(dbUrl,user,password);
+
+if ( influxDB.databaseExists(dbName) ) {
+QueryResult result = influxDB.query(new Query("DROP 
measurement water", dbName));
+checkError(result);
+result = influxDB.query(new Query("DROP measurement testm", 
dbName));
+checkError(result);
+} else {
+influxDB.createDatabase(dbName);
+int max = 10;
+while (!influxDB.databaseExists(dbName) && (max-- > 0)) {
+Thread.sleep(5);
+}
+if ( ! influxDB.databaseExists(dbName) ) {
+throw new Exception("unable to create database " + dbName);
+}
+}
+}
+
+protected void checkError(QueryResult result) {
+if ( result.hasError() )
--- End diff --

Please add curly brackets.
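For illustration, the braced form of that check could look like the following (a sketch of the suggested style fix, not the committed code):

```java
protected void checkError(QueryResult result) {
    // Braces make the guarded throw explicit, per the style requested above
    if (result.hasError()) {
        throw new IllegalStateException("Error while dropping measurements " + result.getError());
    }
}
```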


> Implement put processor for InfluxDB
> 
>
> Key: NIFI-4289
> URL: https://issues.apache.org/jira/browse/NIFI-4289
> Project: Apache NiFi
>  Issue Type: New Feature
>  Components: Extensions
>Affects Versions: 1.3.0
> Environment: All
>Reporter: Mans Singh
>Assignee: Mans Singh
>Priority: Minor
>  Labels: insert, measurements,, put, timeseries
>
> Support inserting time series measurements into InfluxDB.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-4289) Implement put processor for InfluxDB

2018-01-26 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4289?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16341558#comment-16341558
 ] 

ASF GitHub Bot commented on NIFI-4289:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2101#discussion_r164210261
  
--- Diff: 
nifi-nar-bundles/nifi-influxdb-bundle/nifi-influxdb-processors/src/main/java/org/apache/nifi/processors/influxdb/PutInfluxDB.java
 ---
@@ -0,0 +1,176 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.influxdb;
+
+import org.apache.nifi.annotation.behavior.EventDriven;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.components.AllowableValue;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.influxdb.InfluxDB;
+import java.io.ByteArrayOutputStream;
+import java.nio.charset.Charset;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Set;
+
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
+@EventDriven
+@SupportsBatching
+@Tags({"influxdb", "measurement","insert", "write", "put", "timeseries"})
+@CapabilityDescription("Processor to write the content of a FlowFile (in 
line protocol 
https://docs.influxdata.com/influxdb/v1.3/write_protocols/line_protocol_tutorial/)
 to InfluxDB (https://www.influxdb.com/). "
++ "  The flow file can contain single measurement point or 
multiple measurement points separated by line separator")
+@WritesAttributes({
+@WritesAttribute(attribute = 
AbstractInfluxDBProcessor.INFLUX_DB_ERROR_MESSAGE, description = "InfluxDB 
error message"),
+})
+public class PutInfluxDB extends AbstractInfluxDBProcessor {
+
+public static AllowableValue CONSISTENCY_LEVEL_ALL = new 
AllowableValue("ALL", "All", "Return success when all nodes have responded with 
write success");
+public static AllowableValue CONSISTENCY_LEVEL_ANY = new 
AllowableValue("ANY", "Any", "Return success when any nodes have responded with 
write success");
+public static AllowableValue CONSISTENCY_LEVEL_ONE = new 
AllowableValue("ONE", "One", "Return success when one node has responded with 
write success");
+public static AllowableValue CONSISTENCY_LEVEL_QUORUM = new 
AllowableValue("QUORUM", "Quorum", "Return success when a majority of nodes 
have responded with write success");
+
+public static final PropertyDescriptor CONSISTENCY_LEVEL = new 
PropertyDescriptor.Builder()
+.name("influxdb-consistency-level")
+.displayName("Consistency Level")
+.description("InfluxDB consistency level")
+.required(true)
+.defaultValue(CONSISTENCY_LEVEL_ONE.getValue())
+.expressionLanguageSupported(true)
+
.allowableValues(CONSISTENCY_LEVEL_ONE,CONSISTENCY_LEVEL_ANY,CONSISTENCY_LEVEL_ALL,CONSISTENCY_LEVEL_QUORUM)
--- End diff --

Nit: unneeded white space.


> Implement put processor for InfluxDB
> 
>
> Key: NIFI-4289
> URL: 

[GitHub] nifi pull request #2101: NIFI-4289 - InfluxDB put processor

2018-01-26 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2101#discussion_r164212464
  
--- Diff: 
nifi-nar-bundles/nifi-influxdb-bundle/nifi-influxdb-processors/src/test/java/org/apache/nifi/processors/influxdb/ITPutInfluxDBTest.java
 ---
@@ -0,0 +1,189 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.influxdb;
+
+import static org.junit.Assert.assertEquals;
+import java.util.List;
+import org.apache.nifi.util.MockFlowFile;
+import org.apache.nifi.util.TestRunner;
+import org.apache.nifi.util.TestRunners;
+import org.influxdb.InfluxDB;
+import org.influxdb.InfluxDBFactory;
+import org.influxdb.dto.Query;
+import org.influxdb.dto.QueryResult;
+import org.junit.After;
+import org.junit.Before;
+import org.junit.Ignore;
+import org.junit.Test;
+
+/**
+ * Integration test for InfluxDB. Please ensure that the InfluxDB is 
running
+ * on local host with default port and has database test with table test. 
Please set user
+ * and password if applicable before running the integration tests.
+ */
+@Ignore("Comment this out for running tests against a real instance of 
InfluxDB")
--- End diff --

Maven has the ability to do integration tests. You might want to think 
about setting up a new profile to do that instead of using Ignore. I have [an 
example here on this 
PR](https://github.com/MikeThomsen/nifi/commit/cbc2d61f2d98fdf29ff6f1a46fbb524691fd11c0#diff-9fbcd1bfda73f61e6c31fb2fcb3371f3R111).
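
As a rough illustration of that approach (a sketch only, not the profile from the linked commit; the profile id and plugin version below are assumptions), a Maven profile can bind the failsafe plugin so that the IT*-named classes run only when the profile is activated, e.g. with `mvn verify -Pintegration-tests`:

```xml
<!-- Sketch only: failsafe picks up classes matching its default IT*/*IT patterns -->
<profile>
    <id>integration-tests</id>
    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-failsafe-plugin</artifactId>
                <version>2.20.1</version>
                <executions>
                    <execution>
                        <goals>
                            <goal>integration-test</goal>
                            <goal>verify</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
</profile>
```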


---


[GitHub] nifi pull request #2101: NIFI-4289 - InfluxDB put processor

2018-01-26 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2101#discussion_r164209408
  
--- Diff: 
nifi-nar-bundles/nifi-influxdb-bundle/nifi-influxdb-processors/src/main/java/org/apache/nifi/processors/influxdb/AbstractInfluxDBProcessor.java
 ---
@@ -0,0 +1,142 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.influxdb;
+
+import org.apache.commons.lang3.StringUtils;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.DataUnit;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.influxdb.InfluxDB;
+import org.influxdb.InfluxDBFactory;
+
+/**
+ * Abstract base class for InfluxDB processors
+ */
+abstract class AbstractInfluxDBProcessor extends AbstractProcessor {
+
+protected static final PropertyDescriptor CHARSET = new 
PropertyDescriptor.Builder()
+.name("influxdb-charset")
+.displayName("Character Set")
+.description("Specifies the character set of the document 
data.")
+.required(true)
+.defaultValue("UTF-8")
+.expressionLanguageSupported(true)
+.addValidator(StandardValidators.CHARACTER_SET_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor INFLUX_DB_URL = new 
PropertyDescriptor.Builder()
+.name("influxdb-url")
+.displayName("InfluxDB connection url")
+.description("InfluxDB url to connect to")
+.required(true)
+.addValidator(StandardValidators.URL_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor DB_NAME = new 
PropertyDescriptor.Builder()
+.name("influxdb-dbname")
+.displayName("DB Name")
+.description("InfluxDB database to connect to")
+.required(true)
+.expressionLanguageSupported(true)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor USERNAME = new 
PropertyDescriptor.Builder()
+.name("influxdb-username")
+.displayName("Username")
+.required(false)
+.description("Username for accessing InfluxDB")
+.addValidator(Validator.VALID)
+.build();
+
+public static final PropertyDescriptor PASSWORD = new 
PropertyDescriptor.Builder()
+.name("influxdb-password")
+.displayName("Password")
+.description("Password for user")
+.addValidator(Validator.VALID)
+.sensitive(true)
+.build();
+
+protected static final PropertyDescriptor MAX_RECORDS_SIZE = new 
PropertyDescriptor.Builder()
+.name("influxdb-max-records-size")
+.displayName("Max size of records")
+.description("Maximum size of records allowed to be posted in 
one batch")
+.defaultValue("1 MB")
+.required(true)
+.addValidator(StandardValidators.DATA_SIZE_VALIDATOR)
+.build();
+
+static final Relationship REL_SUCCESS = new 
Relationship.Builder().name("success")
+.description("Sucessful FlowFiles are routed to this 
relationship").build();
+
+static final Relationship REL_FAILURE = new 
Relationship.Builder().name("failure")
+.description("Failed FlowFiles are routed to this 
relationship").build();
+
+public static final String INFLUX_DB_ERROR_MESSAGE = 

[jira] [Commented] (NIFI-4289) Implement put processor for InfluxDB

2018-01-26 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4289?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16341551#comment-16341551
 ] 

ASF GitHub Bot commented on NIFI-4289:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2101#discussion_r164209603
  
--- Diff: 
nifi-nar-bundles/nifi-influxdb-bundle/nifi-influxdb-processors/src/main/java/org/apache/nifi/processors/influxdb/AbstractInfluxDBProcessor.java
 ---
@@ -0,0 +1,142 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.influxdb;
+
+import org.apache.commons.lang3.StringUtils;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.DataUnit;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.influxdb.InfluxDB;
+import org.influxdb.InfluxDBFactory;
+
+/**
+ * Abstract base class for InfluxDB processors
+ */
+abstract class AbstractInfluxDBProcessor extends AbstractProcessor {
+
+protected static final PropertyDescriptor CHARSET = new 
PropertyDescriptor.Builder()
+.name("influxdb-charset")
+.displayName("Character Set")
+.description("Specifies the character set of the document 
data.")
+.required(true)
+.defaultValue("UTF-8")
+.expressionLanguageSupported(true)
+.addValidator(StandardValidators.CHARACTER_SET_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor INFLUX_DB_URL = new 
PropertyDescriptor.Builder()
+.name("influxdb-url")
+.displayName("InfluxDB connection url")
+.description("InfluxDB url to connect to")
+.required(true)
+.addValidator(StandardValidators.URL_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor DB_NAME = new 
PropertyDescriptor.Builder()
+.name("influxdb-dbname")
+.displayName("DB Name")
+.description("InfluxDB database to connect to")
+.required(true)
+.expressionLanguageSupported(true)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor USERNAME = new 
PropertyDescriptor.Builder()
+.name("influxdb-username")
+.displayName("Username")
+.required(false)
+.description("Username for accessing InfluxDB")
+.addValidator(Validator.VALID)
+.build();
+
+public static final PropertyDescriptor PASSWORD = new 
PropertyDescriptor.Builder()
+.name("influxdb-password")
+.displayName("Password")
+.description("Password for user")
+.addValidator(Validator.VALID)
+.sensitive(true)
+.build();
+
+protected static final PropertyDescriptor MAX_RECORDS_SIZE = new 
PropertyDescriptor.Builder()
+.name("influxdb-max-records-size")
+.displayName("Max size of records")
+.description("Maximum size of records allowed to be posted in 
one batch")
+.defaultValue("1 MB")
+.required(true)
+.addValidator(StandardValidators.DATA_SIZE_VALIDATOR)
+.build();
+
+static final Relationship REL_SUCCESS = new 
Relationship.Builder().name("success")
+.description("Sucessful FlowFiles are routed to this 
relationship").build();
+
+static final 

[jira] [Commented] (NIFI-4289) Implement put processor for InfluxDB

2018-01-26 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4289?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16341555#comment-16341555
 ] 

ASF GitHub Bot commented on NIFI-4289:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2101#discussion_r164212464
  
--- Diff: 
nifi-nar-bundles/nifi-influxdb-bundle/nifi-influxdb-processors/src/test/java/org/apache/nifi/processors/influxdb/ITPutInfluxDBTest.java
 ---
@@ -0,0 +1,189 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.influxdb;
+
+import static org.junit.Assert.assertEquals;
+import java.util.List;
+import org.apache.nifi.util.MockFlowFile;
+import org.apache.nifi.util.TestRunner;
+import org.apache.nifi.util.TestRunners;
+import org.influxdb.InfluxDB;
+import org.influxdb.InfluxDBFactory;
+import org.influxdb.dto.Query;
+import org.influxdb.dto.QueryResult;
+import org.junit.After;
+import org.junit.Before;
+import org.junit.Ignore;
+import org.junit.Test;
+
+/**
+ * Integration test for InfluxDB. Please ensure that the InfluxDB is 
running
+ * on local host with default port and has database test with table test. 
Please set user
+ * and password if applicable before running the integration tests.
+ */
+@Ignore("Comment this out for running tests against a real instance of 
InfluxDB")
--- End diff --

Maven has the ability to do integration tests. You might want to think 
about setting up a new profile to do that instead of using Ignore. I have [an 
example here on this 
PR](https://github.com/MikeThomsen/nifi/commit/cbc2d61f2d98fdf29ff6f1a46fbb524691fd11c0#diff-9fbcd1bfda73f61e6c31fb2fcb3371f3R111).


> Implement put processor for InfluxDB
> 
>
> Key: NIFI-4289
> URL: https://issues.apache.org/jira/browse/NIFI-4289
> Project: Apache NiFi
>  Issue Type: New Feature
>  Components: Extensions
>Affects Versions: 1.3.0
> Environment: All
>Reporter: Mans Singh
>Assignee: Mans Singh
>Priority: Minor
>  Labels: insert, measurements,, put, timeseries
>
> Support inserting time series measurements into InfluxDB.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi pull request #2101: NIFI-4289 - InfluxDB put processor

2018-01-26 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2101#discussion_r164209928
  
--- Diff: 
nifi-nar-bundles/nifi-influxdb-bundle/nifi-influxdb-processors/src/main/java/org/apache/nifi/processors/influxdb/AbstractInfluxDBProcessor.java
 ---
@@ -0,0 +1,142 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.influxdb;
+
+import org.apache.commons.lang3.StringUtils;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.DataUnit;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.influxdb.InfluxDB;
+import org.influxdb.InfluxDBFactory;
+
+/**
+ * Abstract base class for InfluxDB processors
+ */
+abstract class AbstractInfluxDBProcessor extends AbstractProcessor {
+
+protected static final PropertyDescriptor CHARSET = new 
PropertyDescriptor.Builder()
+.name("influxdb-charset")
+.displayName("Character Set")
+.description("Specifies the character set of the document 
data.")
+.required(true)
+.defaultValue("UTF-8")
+.expressionLanguageSupported(true)
+.addValidator(StandardValidators.CHARACTER_SET_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor INFLUX_DB_URL = new 
PropertyDescriptor.Builder()
+.name("influxdb-url")
+.displayName("InfluxDB connection url")
+.description("InfluxDB url to connect to")
+.required(true)
+.addValidator(StandardValidators.URL_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor DB_NAME = new 
PropertyDescriptor.Builder()
+.name("influxdb-dbname")
+.displayName("DB Name")
+.description("InfluxDB database to connect to")
+.required(true)
+.expressionLanguageSupported(true)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor USERNAME = new 
PropertyDescriptor.Builder()
+.name("influxdb-username")
+.displayName("Username")
+.required(false)
+.description("Username for accessing InfluxDB")
+.addValidator(Validator.VALID)
+.build();
+
+public static final PropertyDescriptor PASSWORD = new 
PropertyDescriptor.Builder()
+.name("influxdb-password")
+.displayName("Password")
+.description("Password for user")
+.addValidator(Validator.VALID)
+.sensitive(true)
+.build();
+
+protected static final PropertyDescriptor MAX_RECORDS_SIZE = new 
PropertyDescriptor.Builder()
+.name("influxdb-max-records-size")
+.displayName("Max size of records")
+.description("Maximum size of records allowed to be posted in 
one batch")
+.defaultValue("1 MB")
+.required(true)
+.addValidator(StandardValidators.DATA_SIZE_VALIDATOR)
+.build();
+
+static final Relationship REL_SUCCESS = new 
Relationship.Builder().name("success")
+.description("Sucessful FlowFiles are routed to this 
relationship").build();
+
+static final Relationship REL_FAILURE = new 
Relationship.Builder().name("failure")
+.description("Failed FlowFiles are routed to this 
relationship").build();
+
+public static final String INFLUX_DB_ERROR_MESSAGE = 

[GitHub] nifi pull request #2101: NIFI-4289 - InfluxDB put processor

2018-01-26 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2101#discussion_r164210261
  
--- Diff: 
nifi-nar-bundles/nifi-influxdb-bundle/nifi-influxdb-processors/src/main/java/org/apache/nifi/processors/influxdb/PutInfluxDB.java
 ---
@@ -0,0 +1,176 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.influxdb;
+
+import org.apache.nifi.annotation.behavior.EventDriven;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.components.AllowableValue;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.influxdb.InfluxDB;
+import java.io.ByteArrayOutputStream;
+import java.nio.charset.Charset;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Set;
+
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
+@EventDriven
+@SupportsBatching
+@Tags({"influxdb", "measurement","insert", "write", "put", "timeseries"})
+@CapabilityDescription("Processor to write the content of a FlowFile (in 
line protocol 
https://docs.influxdata.com/influxdb/v1.3/write_protocols/line_protocol_tutorial/)
 to InfluxDB (https://www.influxdb.com/). "
++ "  The flow file can contain single measurement point or 
multiple measurement points separated by line separator")
+@WritesAttributes({
+@WritesAttribute(attribute = 
AbstractInfluxDBProcessor.INFLUX_DB_ERROR_MESSAGE, description = "InfluxDB 
error message"),
+})
+public class PutInfluxDB extends AbstractInfluxDBProcessor {
+
+public static AllowableValue CONSISTENCY_LEVEL_ALL = new 
AllowableValue("ALL", "All", "Return success when all nodes have responded with 
write success");
+public static AllowableValue CONSISTENCY_LEVEL_ANY = new 
AllowableValue("ANY", "Any", "Return success when any nodes have responded with 
write success");
+public static AllowableValue CONSISTENCY_LEVEL_ONE = new 
AllowableValue("ONE", "One", "Return success when one node has responded with 
write success");
+public static AllowableValue CONSISTENCY_LEVEL_QUORUM = new 
AllowableValue("QUORUM", "Quorum", "Return success when a majority of nodes 
have responded with write success");
+
+public static final PropertyDescriptor CONSISTENCY_LEVEL = new 
PropertyDescriptor.Builder()
+.name("influxdb-consistency-level")
+.displayName("Consistency Level")
+.description("InfluxDB consistency level")
+.required(true)
+.defaultValue(CONSISTENCY_LEVEL_ONE.getValue())
+.expressionLanguageSupported(true)
+
.allowableValues(CONSISTENCY_LEVEL_ONE,CONSISTENCY_LEVEL_ANY,CONSISTENCY_LEVEL_ALL,CONSISTENCY_LEVEL_QUORUM)
--- End diff --

Nit: unneeded white space.
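
To make the line-protocol reference in the CapabilityDescription quoted above concrete: a single point is written as `measurement,tag=value field=value timestamp`, with the timestamp in nanoseconds. A minimal sketch of feeding one such point through the TestRunner from ITPutInfluxDBTest (assuming the setUp shown elsewhere in this thread; the measurement, tag, and field names are made up, and imports of java.util.concurrent.TimeUnit and java.nio.charset.StandardCharsets are assumed):

```java
// Sketch: one line-protocol point; the trailing value is a timestamp in nanoseconds
String point = "water,country=US,city=newark rain=1,humidity=0.6 "
        + TimeUnit.MILLISECONDS.toNanos(System.currentTimeMillis());
runner.enqueue(point.getBytes(StandardCharsets.UTF_8));
runner.run();
runner.assertAllFlowFilesTransferred(PutInfluxDB.REL_SUCCESS, 1);
```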


---


[GitHub] nifi pull request #2101: NIFI-4289 - InfluxDB put processor

2018-01-26 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2101#discussion_r164208704
  
--- Diff: 
nifi-nar-bundles/nifi-influxdb-bundle/nifi-influxdb-processors/src/main/java/org/apache/nifi/processors/influxdb/AbstractInfluxDBProcessor.java
 ---
@@ -0,0 +1,142 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.influxdb;
+
+import org.apache.commons.lang3.StringUtils;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.DataUnit;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.influxdb.InfluxDB;
+import org.influxdb.InfluxDBFactory;
+
+/**
+ * Abstract base class for InfluxDB processors
+ */
+abstract class AbstractInfluxDBProcessor extends AbstractProcessor {
+
+protected static final PropertyDescriptor CHARSET = new 
PropertyDescriptor.Builder()
+.name("influxdb-charset")
+.displayName("Character Set")
+.description("Specifies the character set of the document 
data.")
+.required(true)
+.defaultValue("UTF-8")
+.expressionLanguageSupported(true)
+.addValidator(StandardValidators.CHARACTER_SET_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor INFLUX_DB_URL = new 
PropertyDescriptor.Builder()
+.name("influxdb-url")
+.displayName("InfluxDB connection url")
+.description("InfluxDB url to connect to")
+.required(true)
+.addValidator(StandardValidators.URL_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor DB_NAME = new 
PropertyDescriptor.Builder()
+.name("influxdb-dbname")
+.displayName("DB Name")
+.description("InfluxDB database to connect to")
+.required(true)
+.expressionLanguageSupported(true)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor USERNAME = new 
PropertyDescriptor.Builder()
+.name("influxdb-username")
+.displayName("Username")
+.required(false)
+.description("Username for accessing InfluxDB")
+.addValidator(Validator.VALID)
--- End diff --

Might want to try StandardValidators.NON_BLANK_VALIDATOR here; otherwise the user 
could send a blank username. Since it's not a required field, leaving it 
blank should mean that it's not being specified.
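
As a rough sketch of that suggestion (reusing the property names from the diff above; not the code as merged), the optional credentials could be declared so that a present-but-blank value fails validation. The same applies to the password property flagged in a later comment:

```java
// Sketch only: optional credential properties that reject blank values when set
public static final PropertyDescriptor USERNAME = new PropertyDescriptor.Builder()
        .name("influxdb-username")
        .displayName("Username")
        .description("Username for accessing InfluxDB")
        .required(false)
        .addValidator(StandardValidators.NON_BLANK_VALIDATOR)
        .build();

public static final PropertyDescriptor PASSWORD = new PropertyDescriptor.Builder()
        .name("influxdb-password")
        .displayName("Password")
        .description("Password for user")
        .required(false)
        .sensitive(true)
        .addValidator(StandardValidators.NON_BLANK_VALIDATOR)
        .build();
```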


---


[GitHub] nifi pull request #2101: NIFI-4289 - InfluxDB put processor

2018-01-26 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2101#discussion_r164208755
  
--- Diff: 
nifi-nar-bundles/nifi-influxdb-bundle/nifi-influxdb-processors/src/main/java/org/apache/nifi/processors/influxdb/AbstractInfluxDBProcessor.java
 ---
@@ -0,0 +1,142 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.influxdb;
+
+import org.apache.commons.lang3.StringUtils;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.DataUnit;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.influxdb.InfluxDB;
+import org.influxdb.InfluxDBFactory;
+
+/**
+ * Abstract base class for InfluxDB processors
+ */
+abstract class AbstractInfluxDBProcessor extends AbstractProcessor {
+
+protected static final PropertyDescriptor CHARSET = new 
PropertyDescriptor.Builder()
+.name("influxdb-charset")
+.displayName("Character Set")
+.description("Specifies the character set of the document 
data.")
+.required(true)
+.defaultValue("UTF-8")
+.expressionLanguageSupported(true)
+.addValidator(StandardValidators.CHARACTER_SET_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor INFLUX_DB_URL = new 
PropertyDescriptor.Builder()
+.name("influxdb-url")
+.displayName("InfluxDB connection url")
+.description("InfluxDB url to connect to")
+.required(true)
+.addValidator(StandardValidators.URL_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor DB_NAME = new 
PropertyDescriptor.Builder()
+.name("influxdb-dbname")
+.displayName("DB Name")
+.description("InfluxDB database to connect to")
+.required(true)
+.expressionLanguageSupported(true)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor USERNAME = new 
PropertyDescriptor.Builder()
+.name("influxdb-username")
+.displayName("Username")
+.required(false)
+.description("Username for accessing InfluxDB")
+.addValidator(Validator.VALID)
+.build();
+
+public static final PropertyDescriptor PASSWORD = new 
PropertyDescriptor.Builder()
+.name("influxdb-password")
+.displayName("Password")
+.description("Password for user")
+.addValidator(Validator.VALID)
--- End diff --

Try StandardValidators.NON_BLANK_VALIDATOR here too.


---


[GitHub] nifi pull request #2101: NIFI-4289 - InfluxDB put processor

2018-01-26 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2101#discussion_r164209539
  
--- Diff: 
nifi-nar-bundles/nifi-influxdb-bundle/nifi-influxdb-processors/src/main/java/org/apache/nifi/processors/influxdb/AbstractInfluxDBProcessor.java
 ---
@@ -0,0 +1,142 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.influxdb;
+
+import org.apache.commons.lang3.StringUtils;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.DataUnit;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.influxdb.InfluxDB;
+import org.influxdb.InfluxDBFactory;
+
+/**
+ * Abstract base class for InfluxDB processors
+ */
+abstract class AbstractInfluxDBProcessor extends AbstractProcessor {
+
+protected static final PropertyDescriptor CHARSET = new 
PropertyDescriptor.Builder()
+.name("influxdb-charset")
+.displayName("Character Set")
+.description("Specifies the character set of the document 
data.")
+.required(true)
+.defaultValue("UTF-8")
+.expressionLanguageSupported(true)
+.addValidator(StandardValidators.CHARACTER_SET_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor INFLUX_DB_URL = new 
PropertyDescriptor.Builder()
+.name("influxdb-url")
+.displayName("InfluxDB connection url")
+.description("InfluxDB url to connect to")
+.required(true)
+.addValidator(StandardValidators.URL_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor DB_NAME = new 
PropertyDescriptor.Builder()
+.name("influxdb-dbname")
+.displayName("DB Name")
+.description("InfluxDB database to connect to")
+.required(true)
+.expressionLanguageSupported(true)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor USERNAME = new 
PropertyDescriptor.Builder()
+.name("influxdb-username")
+.displayName("Username")
+.required(false)
+.description("Username for accessing InfluxDB")
+.addValidator(Validator.VALID)
+.build();
+
+public static final PropertyDescriptor PASSWORD = new 
PropertyDescriptor.Builder()
+.name("influxdb-password")
+.displayName("Password")
+.description("Password for user")
+.addValidator(Validator.VALID)
+.sensitive(true)
+.build();
+
+protected static final PropertyDescriptor MAX_RECORDS_SIZE = new 
PropertyDescriptor.Builder()
+.name("influxdb-max-records-size")
+.displayName("Max size of records")
+.description("Maximum size of records allowed to be posted in 
one batch")
+.defaultValue("1 MB")
+.required(true)
+.addValidator(StandardValidators.DATA_SIZE_VALIDATOR)
+.build();
+
+static final Relationship REL_SUCCESS = new 
Relationship.Builder().name("success")
+.description("Sucessful FlowFiles are routed to this 
relationship").build();
+
+static final Relationship REL_FAILURE = new 
Relationship.Builder().name("failure")
+.description("Failed FlowFiles are routed to this 
relationship").build();
+
+public static final String INFLUX_DB_ERROR_MESSAGE = 

[GitHub] nifi pull request #2101: NIFI-4289 - InfluxDB put processor

2018-01-26 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2101#discussion_r164212590
  
--- Diff: 
nifi-nar-bundles/nifi-influxdb-bundle/nifi-influxdb-processors/src/test/java/org/apache/nifi/processors/influxdb/ITPutInfluxDBTest.java
 ---
@@ -0,0 +1,189 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.influxdb;
+
+import static org.junit.Assert.assertEquals;
+import java.util.List;
+import org.apache.nifi.util.MockFlowFile;
+import org.apache.nifi.util.TestRunner;
+import org.apache.nifi.util.TestRunners;
+import org.influxdb.InfluxDB;
+import org.influxdb.InfluxDBFactory;
+import org.influxdb.dto.Query;
+import org.influxdb.dto.QueryResult;
+import org.junit.After;
+import org.junit.Before;
+import org.junit.Ignore;
+import org.junit.Test;
+
+/**
+ * Integration test for InfluxDB. Please ensure that the InfluxDB is 
running
+ * on local host with default port and has database test with table test. 
Please set user
+ * and password if applicable before running the integration tests.
+ */
+@Ignore("Comment this out for running tests against a real instance of 
InfluxDB")
+public class ITPutInfluxDBTest {
+private TestRunner runner;
+private InfluxDB influxDB;
+private String dbName = "test";
+private String dbUrl = "http://localhost:8086";
+private String user = "admin";
+private String password = "admin";
+
+@Before
+public void setUp() throws Exception {
+runner = TestRunners.newTestRunner(PutInfluxDB.class);
+runner.setProperty(PutInfluxDB.DB_NAME, dbName);
+runner.setProperty(PutInfluxDB.USERNAME, user);
+runner.setProperty(PutInfluxDB.PASSWORD, password);
+runner.setProperty(PutInfluxDB.INFLUX_DB_URL, dbUrl);
+runner.setProperty(PutInfluxDB.CHARSET, "UTF-8");
+
runner.setProperty(PutInfluxDB.CONSISTENCY_LEVEL,PutInfluxDB.CONSISTENCY_LEVEL_ONE.getValue());
+runner.setProperty(PutInfluxDB.RETENTION_POLICY,"autogen");
+runner.setProperty(PutInfluxDB.MAX_RECORDS_SIZE, "1 KB");
+runner.assertValid();
+
+influxDB = InfluxDBFactory.connect(dbUrl,user,password);
+
+if ( influxDB.databaseExists(dbName) ) {
+QueryResult result = influxDB.query(new Query("DROP 
measurement water", dbName));
+checkError(result);
+result = influxDB.query(new Query("DROP measurement testm", 
dbName));
+checkError(result);
+} else {
+influxDB.createDatabase(dbName);
+int max = 10;
+while (!influxDB.databaseExists(dbName) && (max-- > 0)) {
+Thread.sleep(5);
+}
+if ( ! influxDB.databaseExists(dbName) ) {
+throw new Exception("unable to create database " + dbName);
+}
+}
+}
+
+protected void checkError(QueryResult result) {
+if ( result.hasError() )
+throw new IllegalStateException("Error while dropping 
measurements " + result.getError());
+}
+
+@After
+public void tearDown() throws Exception {
+runner = null;
+if ( influxDB != null )
--- End diff --

Curly brackets...


---


[GitHub] nifi pull request #2101: NIFI-4289 - InfluxDB put processor

2018-01-26 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2101#discussion_r164210019
  
--- Diff: 
nifi-nar-bundles/nifi-influxdb-bundle/nifi-influxdb-processors/src/main/java/org/apache/nifi/processors/influxdb/AbstractInfluxDBProcessor.java
 ---
@@ -0,0 +1,142 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.influxdb;
+
+import org.apache.commons.lang3.StringUtils;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.DataUnit;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.influxdb.InfluxDB;
+import org.influxdb.InfluxDBFactory;
+
+/**
+ * Abstract base class for InfluxDB processors
+ */
+abstract class AbstractInfluxDBProcessor extends AbstractProcessor {
+
+protected static final PropertyDescriptor CHARSET = new 
PropertyDescriptor.Builder()
+.name("influxdb-charset")
+.displayName("Character Set")
+.description("Specifies the character set of the document 
data.")
+.required(true)
+.defaultValue("UTF-8")
+.expressionLanguageSupported(true)
+.addValidator(StandardValidators.CHARACTER_SET_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor INFLUX_DB_URL = new 
PropertyDescriptor.Builder()
+.name("influxdb-url")
+.displayName("InfluxDB connection url")
+.description("InfluxDB url to connect to")
+.required(true)
+.addValidator(StandardValidators.URL_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor DB_NAME = new 
PropertyDescriptor.Builder()
+.name("influxdb-dbname")
+.displayName("DB Name")
+.description("InfluxDB database to connect to")
+.required(true)
+.expressionLanguageSupported(true)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor USERNAME = new 
PropertyDescriptor.Builder()
+.name("influxdb-username")
+.displayName("Username")
+.required(false)
+.description("Username for accessing InfluxDB")
+.addValidator(Validator.VALID)
+.build();
+
+public static final PropertyDescriptor PASSWORD = new 
PropertyDescriptor.Builder()
+.name("influxdb-password")
+.displayName("Password")
+.description("Password for user")
+.addValidator(Validator.VALID)
+.sensitive(true)
+.build();
+
+protected static final PropertyDescriptor MAX_RECORDS_SIZE = new 
PropertyDescriptor.Builder()
+.name("influxdb-max-records-size")
+.displayName("Max size of records")
+.description("Maximum size of records allowed to be posted in 
one batch")
+.defaultValue("1 MB")
+.required(true)
+.addValidator(StandardValidators.DATA_SIZE_VALIDATOR)
+.build();
+
+static final Relationship REL_SUCCESS = new 
Relationship.Builder().name("success")
+.description("Sucessful FlowFiles are routed to this 
relationship").build();
+
+static final Relationship REL_FAILURE = new 
Relationship.Builder().name("failure")
+.description("Failed FlowFiles are routed to this 
relationship").build();
+
+public static final String INFLUX_DB_ERROR_MESSAGE = 

[GitHub] nifi pull request #2101: NIFI-4289 - InfluxDB put processor

2018-01-26 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2101#discussion_r164212552
  
--- Diff: 
nifi-nar-bundles/nifi-influxdb-bundle/nifi-influxdb-processors/src/test/java/org/apache/nifi/processors/influxdb/ITPutInfluxDBTest.java
 ---
@@ -0,0 +1,189 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.influxdb;
+
+import static org.junit.Assert.assertEquals;
+import java.util.List;
+import org.apache.nifi.util.MockFlowFile;
+import org.apache.nifi.util.TestRunner;
+import org.apache.nifi.util.TestRunners;
+import org.influxdb.InfluxDB;
+import org.influxdb.InfluxDBFactory;
+import org.influxdb.dto.Query;
+import org.influxdb.dto.QueryResult;
+import org.junit.After;
+import org.junit.Before;
+import org.junit.Ignore;
+import org.junit.Test;
+
+/**
+ * Integration test for InfluxDB. Please ensure that the InfluxDB is 
running
+ * on local host with default port and has database test with table test. 
Please set user
+ * and password if applicable before running the integration tests.
+ */
+@Ignore("Comment this out for running tests against a real instance of 
InfluxDB")
+public class ITPutInfluxDBTest {
+private TestRunner runner;
+private InfluxDB influxDB;
+private String dbName = "test";
+private String dbUrl = "http://localhost:8086";
+private String user = "admin";
+private String password = "admin";
+
+@Before
+public void setUp() throws Exception {
+runner = TestRunners.newTestRunner(PutInfluxDB.class);
+runner.setProperty(PutInfluxDB.DB_NAME, dbName);
+runner.setProperty(PutInfluxDB.USERNAME, user);
+runner.setProperty(PutInfluxDB.PASSWORD, password);
+runner.setProperty(PutInfluxDB.INFLUX_DB_URL, dbUrl);
+runner.setProperty(PutInfluxDB.CHARSET, "UTF-8");
+
runner.setProperty(PutInfluxDB.CONSISTENCY_LEVEL,PutInfluxDB.CONSISTENCY_LEVEL_ONE.getValue());
+runner.setProperty(PutInfluxDB.RETENTION_POLICY,"autogen");
+runner.setProperty(PutInfluxDB.MAX_RECORDS_SIZE, "1 KB");
+runner.assertValid();
+
+influxDB = InfluxDBFactory.connect(dbUrl,user,password);
+
+if ( influxDB.databaseExists(dbName) ) {
+QueryResult result = influxDB.query(new Query("DROP 
measurement water", dbName));
+checkError(result);
+result = influxDB.query(new Query("DROP measurement testm", 
dbName));
+checkError(result);
+} else {
+influxDB.createDatabase(dbName);
+int max = 10;
+while (!influxDB.databaseExists(dbName) && (max-- > 0)) {
+Thread.sleep(5);
+}
+if ( ! influxDB.databaseExists(dbName) ) {
+throw new Exception("unable to create database " + dbName);
+}
+}
+}
+
+protected void checkError(QueryResult result) {
+if ( result.hasError() )
--- End diff --

Please add curly brackets.
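
For reference, something like the following would do it (the exception body is only an illustration; the diff shows just the if statement):

```java
protected void checkError(final QueryResult result) {
    if (result.hasError()) {
        throw new IllegalStateException("Error while running InfluxDB query: " + result.getError());
    }
}
```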


---


[jira] [Commented] (NIFI-4821) Upgrade to Apache POI 3.16 or newer

2018-01-26 Thread Joseph Witt (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4821?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16341514#comment-16341514
 ] 

Joseph Witt commented on NIFI-4821:
---

{quote}

./nar/extensions/nifi-media-nar-1.5.0.nar-unpacked/META-INF/bundled-dependencies/poi-ooxml-schemas-3.17-beta1.jar
./nar/extensions/nifi-media-nar-1.5.0.nar-unpacked/META-INF/bundled-dependencies/poi-3.17-beta1.jar
./nar/extensions/nifi-media-nar-1.5.0.nar-unpacked/META-INF/bundled-dependencies/poi-ooxml-3.17-beta1.jar
./nar/extensions/nifi-media-nar-1.5.0.nar-unpacked/META-INF/bundled-dependencies/poi-scratchpad-3.17-beta1.jar
./nar/extensions/nifi-aws-nar-1.5.0.nar-unpacked/META-INF/bundled-dependencies/aws-java-sdk-pinpoint-1.11.68.jar
./nar/extensions/nifi-poi-nar-1.5.0.nar-unpacked/META-INF/MANIFEST.MF
./nar/extensions/nifi-poi-nar-1.5.0.nar-unpacked/META-INF/LICENSE
./nar/extensions/nifi-poi-nar-1.5.0.nar-unpacked/META-INF/maven/org.apache.nifi/nifi-poi-nar/pom.xml
./nar/extensions/nifi-poi-nar-1.5.0.nar-unpacked/META-INF/maven/org.apache.nifi/nifi-poi-nar/pom.properties
./nar/extensions/nifi-poi-nar-1.5.0.nar-unpacked/META-INF/DEPENDENCIES
./nar/extensions/nifi-poi-nar-1.5.0.nar-unpacked/META-INF/NOTICE
./nar/extensions/nifi-poi-nar-1.5.0.nar-unpacked/META-INF/bundled-dependencies/xmlbeans-2.6.0.jar
./nar/extensions/nifi-poi-nar-1.5.0.nar-unpacked/META-INF/bundled-dependencies/commons-csv-1.4.jar
./nar/extensions/nifi-poi-nar-1.5.0.nar-unpacked/META-INF/bundled-dependencies/bcprov-jdk15on-1.55.jar
./nar/extensions/nifi-poi-nar-1.5.0.nar-unpacked/META-INF/bundled-dependencies/bcpkix-jdk15on-1.55.jar
./nar/extensions/nifi-poi-nar-1.5.0.nar-unpacked/META-INF/bundled-dependencies/nifi-poi-processors-1.5.0.jar
./nar/extensions/nifi-poi-nar-1.5.0.nar-unpacked/META-INF/bundled-dependencies/xercesImpl-2.11.0.jar
./nar/extensions/nifi-poi-nar-1.5.0.nar-unpacked/META-INF/bundled-dependencies/poi-3.17.jar
./nar/extensions/nifi-poi-nar-1.5.0.nar-unpacked/META-INF/bundled-dependencies/jackson-annotations-2.9.0.jar
./nar/extensions/nifi-poi-nar-1.5.0.nar-unpacked/META-INF/bundled-dependencies/nifi-standard-record-utils-1.5.0.jar
./nar/extensions/nifi-poi-nar-1.5.0.nar-unpacked/META-INF/bundled-dependencies/poi-ooxml-schemas-3.17.jar
./nar/extensions/nifi-poi-nar-1.5.0.nar-unpacked/META-INF/bundled-dependencies/commons-collections4-4.1.jar
./nar/extensions/nifi-poi-nar-1.5.0.nar-unpacked/META-INF/bundled-dependencies/poi-ooxml-3.17.jar
./nar/extensions/nifi-poi-nar-1.5.0.nar-unpacked/META-INF/bundled-dependencies/commons-codec-1.11.jar
./nar/extensions/nifi-poi-nar-1.5.0.nar-unpacked/META-INF/bundled-dependencies/commons-io-2.6.jar
./nar/extensions/nifi-poi-nar-1.5.0.nar-unpacked/META-INF/bundled-dependencies/nifi-security-utils-1.5.0.jar
./nar/extensions/nifi-poi-nar-1.5.0.nar-unpacked/META-INF/bundled-dependencies/xml-apis-1.4.01.jar
./nar/extensions/nifi-poi-nar-1.5.0.nar-unpacked/META-INF/bundled-dependencies/jackson-databind-2.9.1.jar
./nar/extensions/nifi-poi-nar-1.5.0.nar-unpacked/META-INF/bundled-dependencies/jackson-core-2.9.1.jar
./nar/extensions/nifi-poi-nar-1.5.0.nar-unpacked/META-INF/bundled-dependencies/stax-api-1.0.1.jar
./nar/extensions/nifi-poi-nar-1.5.0.nar-unpacked/META-INF/bundled-dependencies/nifi-utils-1.5.0.jar
./nar/extensions/nifi-poi-nar-1.5.0.nar-unpacked/META-INF/bundled-dependencies/commons-lang3-3.7.jar
./nar/extensions/nifi-poi-nar-1.5.0.nar-unpacked/META-INF/bundled-dependencies/nifi-processor-utils-1.5.0.jar
./nar/extensions/nifi-poi-nar-1.5.0.nar-unpacked/META-INF/bundled-dependencies/curvesapi-1.04.jar
./nar/extensions/nifi-poi-nar-1.5.0.nar-unpacked/nar-md5sum
./nar/extensions/nifi-email-nar-1.5.0.nar-unpacked/META-INF/bundled-dependencies/poi-3.16.jar
./nar/extensions/nifi-email-nar-1.5.0.nar-unpacked/META-INF/bundled-dependencies/poi-scratchpad-3.16.jar
./docs/components/org.apache.nifi/nifi-poi-nar/1.5.0/org.apache.nifi.processors.poi.ConvertExcelToCSVProcessor/index.html
./docs/components/org.apache.nifi/nifi-poi-nar/1.5.0/org.apache.nifi.processors.poi.ConvertExcelToCSVProcessor/additionalDetails.html

{quote}

> Upgrade to Apache POI 3.16 or newer
> ---
>
> Key: NIFI-4821
> URL: https://issues.apache.org/jira/browse/NIFI-4821
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Reporter: Joseph Witt
>Assignee: Joseph Witt
>Priority: Major
> Fix For: 1.6.0
>
>
> CVE-2017-12626 was announced today with the text:
>  
> Title: CVE-2017-12626 – Denial of Service Vulnerabilities in Apache POI < 3.17
> Severity: Important
> Vendor: The Apache Software Foundation
> Versions affected: versions prior to version 3.17
> Description:   
>     Apache POI versions prior to release 3.17 are vulnerable to Denial of 
> Service Attacks:
>     * Infinite Loops while parsing specially crafted WMF, 

[jira] [Updated] (NIFI-4821) Upgrade to Apache POI 3.16 or newer

2018-01-26 Thread Joseph Witt (JIRA)

 [ 
https://issues.apache.org/jira/browse/NIFI-4821?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Joseph Witt updated NIFI-4821:
--
Component/s: Extensions

> Upgrade to Apache POI 3.16 or newer
> ---
>
> Key: NIFI-4821
> URL: https://issues.apache.org/jira/browse/NIFI-4821
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Reporter: Joseph Witt
>Priority: Major
> Fix For: 1.6.0
>
>
> CVE-2017-12626 was announced today with the text:
>  
> Title: CVE-2017-12626 – Denial of Service Vulnerabilities in Apache POI < 3.17
> Severity: Important
> Vendor: The Apache Software Foundation
> Versions affected: versions prior to version 3.17
> Description:   
>     Apache POI versions prior to release 3.17 are vulnerable to Denial of 
> Service Attacks:
>     * Infinite Loops while parsing specially crafted WMF, EMF, MSG and macros
>           (POI bugs 61338 [0] and 61294 [1])
>     * Out of Memory Exceptions while parsing specially crafted DOC, PPT and 
> XLS 
>           (POI bugs 52372 [2] and 61295 [3])
> Mitigation:  Users with applications which accept content from external or 
> untrusted sources are advised to upgrade to Apache POI 3.17 or newer.
> -Tim Allison
> on behalf of the Apache POI PMC
>  
> [0] [https://bz.apache.org/bugzilla/show_bug.cgi?id=61338]
> [1] [https://bz.apache.org/bugzilla/show_bug.cgi?id=61294]
> [2] [https://bz.apache.org/bugzilla/show_bug.cgi?id=52372]
> [3] [https://bz.apache.org/bugzilla/show_bug.cgi?id=61295]



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Assigned] (NIFI-4821) Upgrade to Apache POI 3.16 or newer

2018-01-26 Thread Joseph Witt (JIRA)

 [ 
https://issues.apache.org/jira/browse/NIFI-4821?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Joseph Witt reassigned NIFI-4821:
-

Assignee: Joseph Witt

> Upgrade to Apache POI 3.16 or newer
> ---
>
> Key: NIFI-4821
> URL: https://issues.apache.org/jira/browse/NIFI-4821
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Reporter: Joseph Witt
>Assignee: Joseph Witt
>Priority: Major
> Fix For: 1.6.0
>
>
> CVE-2017-12626 was announced today with the text:
>  
> Title: CVE-2017-12626 – Denial of Service Vulnerabilities in Apache POI < 3.17
> Severity: Important
> Vendor: The Apache Software Foundation
> Versions affected: versions prior to version 3.17
> Description:   
>     Apache POI versions prior to release 3.17 are vulnerable to Denial of 
> Service Attacks:
>     * Infinite Loops while parsing specially crafted WMF, EMF, MSG and macros
>           (POI bugs 61338 [0] and 61294 [1])
>     * Out of Memory Exceptions while parsing specially crafted DOC, PPT and 
> XLS 
>           (POI bugs 52372 [2] and 61295 [3])
> Mitigation:  Users with applications which accept content from external or 
> untrusted sources are advised to upgrade to Apache POI 3.17 or newer.
> -Tim Allison
> on behalf of the Apache POI PMC
>  
> [0] [https://bz.apache.org/bugzilla/show_bug.cgi?id=61338]
> [1] [https://bz.apache.org/bugzilla/show_bug.cgi?id=61294]
> [2] [https://bz.apache.org/bugzilla/show_bug.cgi?id=52372]
> [3] [https://bz.apache.org/bugzilla/show_bug.cgi?id=61295]



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (NIFI-4821) Upgrade to Apache POI 3.16 or newer

2018-01-26 Thread Joseph Witt (JIRA)
Joseph Witt created NIFI-4821:
-

 Summary: Upgrade to Apache POI 3.16 or newer
 Key: NIFI-4821
 URL: https://issues.apache.org/jira/browse/NIFI-4821
 Project: Apache NiFi
  Issue Type: Improvement
Reporter: Joseph Witt


CVE-2017-12626 was announced today with the text:

 

Title: CVE-2017-12626 – Denial of Service Vulnerabilities in Apache POI < 3.17

Severity: Important

Vendor: The Apache Software Foundation

Versions affected: versions prior to version 3.17

Description:   
    Apache POI versions prior to release 3.17 are vulnerable to Denial of 
Service Attacks:
    * Infinite Loops while parsing specially crafted WMF, EMF, MSG and macros
          (POI bugs 61338 [0] and 61294 [1])
    * Out of Memory Exceptions while parsing specially crafted DOC, PPT and XLS 
          (POI bugs 52372 [2] and 61295 [3])


Mitigation:  Users with applications which accept content from external or 
untrusted sources are advised to upgrade to Apache POI 3.17 or newer.

-Tim Allison

on behalf of the Apache POI PMC

 

[0] [https://bz.apache.org/bugzilla/show_bug.cgi?id=61338]
[1] [https://bz.apache.org/bugzilla/show_bug.cgi?id=61294]
[2] [https://bz.apache.org/bugzilla/show_bug.cgi?id=52372]
[3] [https://bz.apache.org/bugzilla/show_bug.cgi?id=61295]



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (NIFI-4821) Upgrade to Apache POI 3.16 or newer

2018-01-26 Thread Joseph Witt (JIRA)

 [ 
https://issues.apache.org/jira/browse/NIFI-4821?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Joseph Witt updated NIFI-4821:
--
Fix Version/s: 1.6.0

> Upgrade to Apache POI 3.16 or newer
> ---
>
> Key: NIFI-4821
> URL: https://issues.apache.org/jira/browse/NIFI-4821
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Reporter: Joseph Witt
>Priority: Major
> Fix For: 1.6.0
>
>
> CVE-2017-12626 was announced today with the text:
>  
> Title: CVE-2017-12626 – Denial of Service Vulnerabilities in Apache POI < 3.17
> Severity: Important
> Vendor: The Apache Software Foundation
> Versions affected: versions prior to version 3.17
> Description:   
>     Apache POI versions prior to release 3.17 are vulnerable to Denial of 
> Service Attacks:
>     * Infinite Loops while parsing specially crafted WMF, EMF, MSG and macros
>           (POI bugs 61338 [0] and 61294 [1])
>     * Out of Memory Exceptions while parsing specially crafted DOC, PPT and 
> XLS 
>           (POI bugs 52372 [2] and 61295 [3])
> Mitigation:  Users with applications which accept content from external or 
> untrusted sources are advised to upgrade to Apache POI 3.17 or newer.
> -Tim Allison
> on behalf of the Apache POI PMC
>  
> [0] [https://bz.apache.org/bugzilla/show_bug.cgi?id=61338]
> [1] [https://bz.apache.org/bugzilla/show_bug.cgi?id=61294]
> [2] [https://bz.apache.org/bugzilla/show_bug.cgi?id=52372]
> [3] [https://bz.apache.org/bugzilla/show_bug.cgi?id=61295]



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi pull request #2180: Added GetMongoAggregation to support running Mongo ...

2018-01-26 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2180#discussion_r164200676
  
--- Diff: 
nifi-nar-bundles/nifi-mongodb-bundle/nifi-mongodb-processors/src/test/java/org/apache/nifi/processors/mongodb/GetMongoTest.java
 ---
@@ -135,15 +135,15 @@ public void testValidators() {
 // invalid projection
 runner.setVariable("projection", "{a: x,y,z}");
 runner.setProperty(GetMongo.QUERY, "{a: 1}");
-runner.setProperty(GetMongo.PROJECTION, "${projection}");
+runner.setProperty(GetMongo.PROJECTION, "{a: z}");
--- End diff --

The test was reporting a false positive for me until I made that change. It 
seemed to think that {a: x,y,z} was a valid projection, but when I changed it to 
{a: z} it recognized that it was invalid. Since none of that is valid JSON, I'm not 
even sure why any of it works at all instead of the Mongo client API throwing an 
exception while parsing it.
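
As an aside, the leniency of the driver's JSON parser may be part of it. A hypothetical check (not the validator the processor actually uses) built directly on Document.parse shows how much non-strict JSON the driver will still accept:

```java
import org.bson.Document;

public class ProjectionParseCheck {
    /** Hypothetical helper: returns true if the MongoDB driver can parse the string as a document at all. */
    public static boolean isParseable(final String json) {
        try {
            Document.parse(json);
            return true;
        } catch (final RuntimeException e) {
            // org.bson.json.JsonParseException and similar parse failures are unchecked
            return false;
        }
    }

    public static void main(final String[] args) {
        System.out.println(isParseable("{a: 1}"));      // unquoted key: the driver accepts this
        System.out.println(isParseable("{a: x,y,z}"));  // the projection used in the test above
    }
}
```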


---


[jira] [Commented] (NIFI-4805) allow delayed transfer

2018-01-26 Thread Martin Mucha (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4805?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16341415#comment-16341415
 ] 

Martin Mucha commented on NIFI-4805:


I understand that, but a retry loop is a valid use case. There are even 'how-to's 
on the web implemented by just looping without any slowing down (active waiting).

Please give me a resolution on this. 

I don't want an active-waiting loop. I don't want to implement my own scheduler 
either, persisting FlowFiles until their retry time comes. IIUC penalization is 
the simplest correct way, since it already exists. Or would you recommend something 
else?  (And if some developer abuses the feature, it's their users who should 
suffer, not other developers/users, no?)
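
For what it's worth, a minimal sketch of the retry loop being discussed, using only today's API (hypothetical processor, relationship, and attribute names): the FlowFile is penalized and routed back through a retry relationship, so there is no active waiting, but the delay is the fixed Penalty Duration from the processor's Settings rather than something the code can grow per attempt.

```java
import org.apache.nifi.flowfile.FlowFile;
import org.apache.nifi.processor.AbstractProcessor;
import org.apache.nifi.processor.ProcessContext;
import org.apache.nifi.processor.ProcessSession;
import org.apache.nifi.processor.Relationship;
import org.apache.nifi.processor.exception.ProcessException;

public class RetryableWorkProcessor extends AbstractProcessor {

    static final Relationship REL_SUCCESS = new Relationship.Builder().name("success").build();
    static final Relationship REL_RETRY = new Relationship.Builder().name("retry").build();

    @Override
    public void onTrigger(final ProcessContext context, final ProcessSession session) throws ProcessException {
        FlowFile flowFile = session.get();
        if (flowFile == null) {
            return;
        }
        try {
            doWork(flowFile);
            session.transfer(flowFile, REL_SUCCESS);
        } catch (final ProcessException e) {
            // Track how many attempts have been made so downstream logic can give up after N retries.
            final String prior = flowFile.getAttribute("retry.count");
            final int attempts = prior == null ? 0 : Integer.parseInt(prior);
            flowFile = session.putAttribute(flowFile, "retry.count", String.valueOf(attempts + 1));
            // penalize() delays the FlowFile by the fixed Penalty Duration from the Settings tab.
            // A penalize(FlowFile, period) overload is exactly what this ticket is asking for.
            flowFile = session.penalize(flowFile);
            session.transfer(flowFile, REL_RETRY);
        }
    }

    private void doWork(final FlowFile flowFile) throws ProcessException {
        // placeholder for the actual unit of work that may fail
    }
}
```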

> allow delayed transfer
> --
>
> Key: NIFI-4805
> URL: https://issues.apache.org/jira/browse/NIFI-4805
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Reporter: Martin Mucha
>Priority: Minor
> Attachments: retry.xml
>
>
> NiFi has a concept of penalization, but this penalization has a fixed delay, and 
> there isn't a way to change it dynamically. 
> If we want to implement a retry flow, where a FlowFile flows in a loop, we can 
> either lower the performance of the Processor by yielding it, or we can do active 
> waiting. And active waiting is actually recommended as the correct way to do this.
> It seems that we could easily implement a better RetryProcessor; all we are missing 
> is a `session.penalize` which accepts a `penalizationPeriod`. The Processor could then 
> gradually prolong the waiting time after each failure.
>  
> Would it be possible to make such a method visible?



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Assigned] (NIFI-4089) Allow standalone NiFi to use ZooKeeper

2018-01-26 Thread Jeff Storck (JIRA)

 [ 
https://issues.apache.org/jira/browse/NIFI-4089?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jeff Storck reassigned NIFI-4089:
-

Assignee: Jeff Storck

> Allow standalone NiFi to use ZooKeeper
> --
>
> Key: NIFI-4089
> URL: https://issues.apache.org/jira/browse/NIFI-4089
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Core Framework
>Affects Versions: 1.3.0
>Reporter: Jeff Storck
>Assignee: Jeff Storck
>Priority: Minor
>
> NiFi, when running standalone, will not start an embedded ZooKeeper, and will 
> ignore configuration to use an external ZooKeeper.  While this saves the user 
> from doing extra setup and conserves resources of the host on which NiFi is 
> running when not running in a cluster, it may make moving this standalone 
> node into a cluster more difficult since local state is not migrated to ZK.  
> This may cause data to be reprocessed, since the most recent state of 
> processing would not be present in ZK.
> If a user plans on eventually expanding a standalone NiFi into a cluster, it 
> would be beneficial to allow a standalone NiFi to use ZK as a state 
> provider to make adding nodes to create a cluster easier.
> NIFI-4088 would be helpful in the scenario of migrating from a standalone 
> NiFi to a cluster by allowing the migration of local state to ZK.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-4805) allow delayed transfer

2018-01-26 Thread Mark Payne (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4805?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16341218#comment-16341218
 ] 

Mark Payne commented on NIFI-4805:
--

[~alfonz] the delay that is imposed by session penalization is controlled 
in the Processor's Settings configuration dialog. We used to have a 
session.penalize method that allowed the developer to indicate a penalization 
period. As a result, developers typically would expose this configuration via a 
processor property, or sometimes multiple properties - "Penalization Period for 
ABC", "Penalization Period for XYZ", "Penalization Period for This Other 
Thing". This ended up becoming confusing for users, because it was just more 
complicated to configure and every processor configured it differently. 
In the end it was removed altogether and placed just on the Settings pane 
for the processor.
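
To make that historical pattern concrete, the removed per-processor configuration looked roughly like this (illustrative field only, not taken from any current processor):

```java
import org.apache.nifi.components.PropertyDescriptor;
import org.apache.nifi.processor.util.StandardValidators;

public class PenalizationPeriodPropertyExample {
    // The kind of property each processor used to expose, which was eventually dropped
    // in favour of the single Penalty Duration field on the processor's Settings tab.
    static final PropertyDescriptor PENALIZATION_PERIOD = new PropertyDescriptor.Builder()
            .name("Penalization Period")
            .description("How long to penalize a FlowFile that cannot be processed yet")
            .required(true)
            .defaultValue("30 sec")
            .addValidator(StandardValidators.TIME_PERIOD_VALIDATOR)
            .build();
}
```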

> allow delayed transfer
> --
>
> Key: NIFI-4805
> URL: https://issues.apache.org/jira/browse/NIFI-4805
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Reporter: Martin Mucha
>Priority: Minor
> Attachments: retry.xml
>
>
> NiFi has a concept of penalization, but this penalization has a fixed delay, and 
> there isn't a way to change it dynamically. 
> If we want to implement a retry flow, where a FlowFile flows in a loop, we can 
> either lower the performance of the Processor by yielding it, or we can do active 
> waiting. And active waiting is actually recommended as the correct way to do this.
> It seems that we could easily implement a better RetryProcessor; all we are missing 
> is a `session.penalize` which accepts a `penalizationPeriod`. The Processor could then 
> gradually prolong the waiting time after each failure.
>  
> Would it be possible to make such a method visible?



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (NIFI-4820) Extend dynamic JAAS support for Kafka processors

2018-01-26 Thread Robert Batts (JIRA)

 [ 
https://issues.apache.org/jira/browse/NIFI-4820?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Robert Batts updated NIFI-4820:
---
Description: NIFI-3528 added dynamic JAAS support, but was made 
specifically for Kerberos. SASL/SCRAM (KAFKA-3751), SASL/PLAINTEXT 
(KAFKA-2658 / KAFKA-3149), and any future Kafka SASL mechanism should all have the 
ability to use a dynamic JAAS configuration as well, since they are implemented in 
Kafka. I would imagine this would come in the form of a Service for re-usability, 
but a less attractive option (in terms of NiFi security) may be to allow users to 
set sasl.jaas.config directly.  (was: NIFI-3528 added dynamic JAAS support, but was 
made specifically for Kerberos. SASL/SCRAM (KAFKA-3751), SASL/PLAINTEXT 
(KAFKA-2658), and any future Kafka SASL mechanism should all have the ability to 
use a dynamic JAAS configuration as well, since they are implemented in Kafka. I 
would imagine this would come in the form of a Service for re-usability, but a less 
attractive option (in terms of NiFi security) may be to allow users to set 
sasl.jaas.config directly.)

> Extend dynamic JAAS support for Kafka processors
> 
>
> Key: NIFI-4820
> URL: https://issues.apache.org/jira/browse/NIFI-4820
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Affects Versions: 1.4.0, 1.5.0, 1.6.0
>Reporter: Robert Batts
>Priority: Major
>  Labels: kafka
>
> NIFI-3528 added dynamic JAAS support, but was made specifically for Kerberos. 
> SASL/SCRAM (KAFKA-3751), SASL/PLAINTEXT (KAFKA-2658 / KAFKA-3149), and 
> any future Kafka SASL mechanism should all have the ability to use a dynamic 
> JAAS configuration as well, since they are implemented in Kafka. I would imagine 
> this would come in the form of a Service for re-usability, but a less attractive 
> option (in terms of NiFi security) may be to allow users to set sasl.jaas.config 
> directly.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi-minifi pull request #112: MINIFI-431 Resolving dependency issues from J...

2018-01-26 Thread apiri
GitHub user apiri opened a pull request:

https://github.com/apache/nifi-minifi/pull/112

MINIFI-431 Resolving dependency issues from Jersey 2.26 upgrade.

MINIFI-431 Resolving dependency issues from Jersey 2.26 upgrade.



You can merge this pull request into a Git repository by running:

$ git pull https://github.com/apiri/nifi-minifi MINIFI-431

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi-minifi/pull/112.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #112


commit 26de588910d0a8a0c887b2d0dd4e9506445c8cd6
Author: Aldrin Piri 
Date:   2018-01-26T14:56:46Z

MINIFI-431 Resolving dependency issues from Jersey 2.26 upgrade.




---


[jira] [Created] (NIFI-4820) Extend dynamic JAAS support for Kafka processors

2018-01-26 Thread Robert Batts (JIRA)
Robert Batts created NIFI-4820:
--

 Summary: Extend dynamic JAAS support for Kafka processors
 Key: NIFI-4820
 URL: https://issues.apache.org/jira/browse/NIFI-4820
 Project: Apache NiFi
  Issue Type: Improvement
  Components: Extensions
Affects Versions: 1.5.0, 1.4.0, 1.6.0
Reporter: Robert Batts


NIFI-3528 added dynamic JAAS support, but was made specifically for Kerberos. 
SASL/SCRAM (KAFKA-3751), SASL/PLAINTEXT (KAFKA-2658), and any future Kafka 
SASL mechanism should all have the ability to use a dynamic JAAS configuration as 
well, since they are implemented in Kafka. I would imagine this would come in the 
form of a Service for re-usability, but a less attractive option (in terms of NiFi 
security) may be to allow users to set sasl.jaas.config directly.
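
For illustration, this is roughly what an inline SASL/SCRAM configuration looks like with the plain Kafka client (hypothetical broker address and credentials); the request here is for NiFi's Kafka processors to be able to supply an equivalent sasl.jaas.config dynamically, for example through a reusable controller service, instead of requiring a static jaas.conf file:

```java
import java.util.Properties;

public class ScramJaasConfigExample {
    public static Properties scramClientProperties() {
        final Properties props = new Properties();
        props.put("bootstrap.servers", "broker1:9093");   // hypothetical broker address
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "SCRAM-SHA-256");
        // Inline JAAS entry, equivalent to a KafkaClient section in a jaas.conf file.
        props.put("sasl.jaas.config",
                "org.apache.kafka.common.security.scram.ScramLoginModule required "
                + "username=\"nifi\" password=\"nifi-secret\";");
        return props;
    }
}
```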



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (NIFI-4794) Improve Garbage Collection required by Provenance Repository

2018-01-26 Thread Mark Payne (JIRA)

 [ 
https://issues.apache.org/jira/browse/NIFI-4794?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mark Payne updated NIFI-4794:
-
Fix Version/s: 1.6.0

> Improve Garbage Collection required by Provenance Repository
> 
>
> Key: NIFI-4794
> URL: https://issues.apache.org/jira/browse/NIFI-4794
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Core Framework
>Reporter: Mark Payne
>Assignee: Mark Payne
>Priority: Major
> Fix For: 1.6.0
>
>
> The EventIdFirstSchemaRecordWriter that is used by the provenance repository 
> has a writeRecord(ProvenanceEventRecord) method. Within this method, it 
> serializes the given record into a byte array by serializing to a 
> ByteArrayOutputStream (after wrapping the BAOS in a DataOutputStream). Once 
> this is done, it calls toByteArray() on that BAOS so that it can write the 
> byte[] directly to another OutputStream.
> This can create a rather large amount of garbage to be collected. We can 
> improve this significantly:
>  # Instead of creating a new ByteArrayOutputStream each time, create a pool 
> of them. This avoids constantly having to garbage collect them.
>  # If said BAOS grows beyond a certain size, we should not return it to the 
> pool, because we don't want to keep a large buffer pinned on the heap.
>  # Instead of wrapping the BAOS in a new DataOutputStream, the 
> DataOutputStream should be pooled/recycled as well. Since it must create an 
> internal byte[] for the writeUTF method, this can save a significant amount 
> of garbage.
>  # Avoid calling ByteArrayOutputStream.toByteArray(). We can instead just use 
> ByteArrayOutputStream.writeTo(OutputStream). This avoids both allocating that 
> new array/copying the data, and the GC overhead.
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (NIFI-4794) Improve Garbage Collection required by Provenance Repository

2018-01-26 Thread Mark Payne (JIRA)

 [ 
https://issues.apache.org/jira/browse/NIFI-4794?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mark Payne updated NIFI-4794:
-
Status: Patch Available  (was: Open)

> Improve Garbage Collection required by Provenance Repository
> 
>
> Key: NIFI-4794
> URL: https://issues.apache.org/jira/browse/NIFI-4794
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Core Framework
>Reporter: Mark Payne
>Assignee: Mark Payne
>Priority: Major
>
> The EventIdFirstSchemaRecordWriter that is used by the provenance repository 
> has a writeRecord(ProvenanceEventRecord) method. Within this method, it 
> serializes the given record into a byte array by serializing to a 
> ByteArrayOutputStream (after wrapping the BAOS in a DataOutputStream). Once 
> this is done, it calls toByteArray() on that BAOS so that it can write the 
> byte[] directly to another OutputStream.
> This can create a rather large amount of garbage to be collected. We can 
> improve this significantly:
>  # Instead of creating a new ByteArrayOutputStream each time, create a pool 
> of them. This avoids constantly having to garbage collect them.
>  # If said BAOS grows beyond a certain size, we should not return it to the 
> pool, because we don't want to keep a large buffer pinned on the heap.
>  # Instead of wrapping the BAOS in a new DataOutputStream, the 
> DataOutputStream should be pooled/recycled as well. Since it must create an 
> internal byte[] for the writeUTF method, this can save a significant amount 
> of garbage.
>  # Avoid calling ByteArrayOutputStream.toByteArray(). We can instead just use 
> ByteArrayOutputStream.writeTo(OutputStream). This avoids both allocating that 
> new array/copying the data, and the GC overhead.
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-4794) Improve Garbage Collection required by Provenance Repository

2018-01-26 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4794?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16341181#comment-16341181
 ] 

ASF GitHub Bot commented on NIFI-4794:
--

GitHub user markap14 opened a pull request:

https://github.com/apache/nifi/pull/2437

NIFI-4794: Updated event writers to avoid creating a lot of byte[] by…

… reusing buffers. Also removed synchronization on EventWriter when rolling 
over the writer and just moved the writing of the header to happen before 
making the writer available to any other threads. This reduces thread 
contention during rollover.

Thank you for submitting a contribution to Apache NiFi.

In order to streamline the review of the contribution we ask you
to ensure the following steps have been taken:

### For all changes:
- [ ] Is there a JIRA ticket associated with this PR? Is it referenced 
 in the commit message?

- [ ] Does your PR title start with NIFI-XXXX where XXXX is the JIRA number 
you are trying to resolve? Pay particular attention to the hyphen "-" character.

- [ ] Has your PR been rebased against the latest commit within the target 
branch (typically master)?

- [ ] Is your initial contribution a single, squashed commit?

### For code changes:
- [ ] Have you ensured that the full suite of tests is executed via mvn 
-Pcontrib-check clean install at the root nifi folder?
- [ ] Have you written or updated unit tests to verify your changes?
- [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)? 
- [ ] If applicable, have you updated the LICENSE file, including the main 
LICENSE file under nifi-assembly?
- [ ] If applicable, have you updated the NOTICE file, including the main 
NOTICE file found under nifi-assembly?
- [ ] If adding new Properties, have you added .displayName in addition to 
.name (programmatic access) for each of the new properties?

### For documentation related changes:
- [ ] Have you ensured that format looks appropriate for the output in 
which it is rendered?

### Note:
Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/markap14/nifi NIFI-4794

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/2437.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2437


commit 3bbd64bfebe81cb099a4b8017d839e591b3d9bc7
Author: Mark Payne 
Date:   2018-01-25T17:16:56Z

NIFI-4794: Updated event writers to avoid creating a lot of byte[] by 
reusing buffers. Also removed synchronization on EventWriter when rolling over 
the writer and just moved the writing of the header to happen before making the 
writer available to any other threads. This reduces thread contention during 
rollover.




> Improve Garbage Collection required by Provenance Repository
> 
>
> Key: NIFI-4794
> URL: https://issues.apache.org/jira/browse/NIFI-4794
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Core Framework
>Reporter: Mark Payne
>Assignee: Mark Payne
>Priority: Major
>
> The EventIdFirstSchemaRecordWriter that is used by the provenance repository 
> has a writeRecord(ProvenanceEventRecord) method. Within this method, it 
> serializes the given record into a byte array by serializing to a 
> ByteArrayOutputStream (after wrapping the BAOS in a DataOutputStream). Once 
> this is done, it calls toByteArray() on that BAOS so that it can write the 
> byte[] directly to another OutputStream.
> This can create a rather large amount of garbage to be collected. We can 
> improve this significantly:
>  # Instead of creating a new ByteArrayOutputStream each time, create a pool 
> of them. This avoids constantly having to garbage collect them.
>  # If said BAOS grows beyond a certain size, we should not return it to the 
> pool, because we don't want to keep a large buffer pinned on the heap.
>  # Instead of wrapping the BAOS in a new DataOutputStream, the 
> DataOutputStream should be pooled/recycled as well. Since it must create an 
> internal byte[] for the writeUTF method, this can save a significant amount 
> of garbage.
>  # Avoid calling ByteArrayOutputStream.toByteArray(). We can instead just use 
> ByteArrayOutputStream.writeTo(OutputStream). This avoids both allocating that 
> new array/copying the data, and the GC 

[GitHub] nifi pull request #2437: NIFI-4794: Updated event writers to avoid creating ...

2018-01-26 Thread markap14
GitHub user markap14 opened a pull request:

https://github.com/apache/nifi/pull/2437

NIFI-4794: Updated event writers to avoid creating a lot of byte[] by…

… reusing buffers. Also removed synchronization on EventWriter when 
rolling over the writer and just moved the writing of the header to happen 
before making the writer available to any other threads. This reduces thread 
contention during rollover.

Thank you for submitting a contribution to Apache NiFi.

In order to streamline the review of the contribution we ask you
to ensure the following steps have been taken:

### For all changes:
- [ ] Is there a JIRA ticket associated with this PR? Is it referenced 
 in the commit message?

- [ ] Does your PR title start with NIFI-XXXX where XXXX is the JIRA number 
you are trying to resolve? Pay particular attention to the hyphen "-" character.

- [ ] Has your PR been rebased against the latest commit within the target 
branch (typically master)?

- [ ] Is your initial contribution a single, squashed commit?

### For code changes:
- [ ] Have you ensured that the full suite of tests is executed via mvn 
-Pcontrib-check clean install at the root nifi folder?
- [ ] Have you written or updated unit tests to verify your changes?
- [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)? 
- [ ] If applicable, have you updated the LICENSE file, including the main 
LICENSE file under nifi-assembly?
- [ ] If applicable, have you updated the NOTICE file, including the main 
NOTICE file found under nifi-assembly?
- [ ] If adding new Properties, have you added .displayName in addition to 
.name (programmatic access) for each of the new properties?

### For documentation related changes:
- [ ] Have you ensured that format looks appropriate for the output in 
which it is rendered?

### Note:
Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/markap14/nifi NIFI-4794

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/2437.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2437


commit 3bbd64bfebe81cb099a4b8017d839e591b3d9bc7
Author: Mark Payne 
Date:   2018-01-25T17:16:56Z

NIFI-4794: Updated event writers to avoid creating a lot of byte[] by 
reusing buffers. Also removed synchronization on EventWriter when rolling over 
the writer and just moved the writing of the header to happen before making the 
writer available to any other threads. This reduces thread contention during 
rollover.
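
A minimal sketch of the buffer-reuse idea described above (hypothetical class, not the actual EventIdFirstSchemaRecordWriter change): recycle ByteArrayOutputStreams instead of allocating one per event, and copy with writeTo() instead of toByteArray().

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.util.Queue;
import java.util.concurrent.LinkedBlockingQueue;

public class RecycledBufferWriter {
    private static final int MAX_POOLED_BUFFER_SIZE = 1024 * 1024; // don't keep huge buffers on the heap
    private final Queue<ByteArrayOutputStream> pool = new LinkedBlockingQueue<>(16);

    public void writeRecord(final byte[] serializedEvent, final OutputStream out) throws IOException {
        ByteArrayOutputStream baos = pool.poll();
        if (baos == null) {
            baos = new ByteArrayOutputStream(); // pool empty; allocate a fresh buffer
        }
        try {
            baos.write(serializedEvent);  // stand-in for the real event serialization
            baos.writeTo(out);            // avoids the extra copy that toByteArray() would make
        } finally {
            final int used = baos.size();
            baos.reset();                 // keeps the backing array, clears the count
            if (used <= MAX_POOLED_BUFFER_SIZE) {
                pool.offer(baos);         // return reasonably sized buffers to the pool
            }
        }
    }
}
```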




---


[jira] [Commented] (NIFI-3753) ListenBeats: Compressed beats packets may cause: Error decoding Beats frame: Error decompressing frame: invalid distance too far back

2018-01-26 Thread Glenn Ambler (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-3753?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16341136#comment-16341136
 ] 

Glenn Ambler commented on NIFI-3753:


Also getting this error; an update would be much appreciated. Thanks 

> ListenBeats: Compressed beats packets may cause: Error decoding Beats  frame: 
> Error decompressing  frame: invalid distance too far back
> ---
>
> Key: NIFI-3753
> URL: https://issues.apache.org/jira/browse/NIFI-3753
> Project: Apache NiFi
>  Issue Type: Bug
>Reporter: Andre F de Miranda
>Priority: Critical
>
> 2017-04-28 02:03:37,153 ERROR [pool-106-thread-1] 
> o.a.nifi.processors.beats.List
> enBeats
> org.apache.nifi.processors.beats.frame.BeatsFrameException: Error decoding 
> Beats
>  frame: Error decompressing  frame: invalid distance too far back
> at 
> org.apache.nifi.processors.beats.frame.BeatsDecoder.process(BeatsDeco
> der.java:123) ~[nifi-beats-processors-1.2.0-SNAPSHOT.jar:1.2.0-SNAPSHOT]
> at 
> org.apache.nifi.processors.beats.handler.BeatsSocketChannelHandler.pr
> ocessBuffer(BeatsSocketChannelHandler.java:71) 
> ~[nifi-beats-processors-1.2.0-SNA
> PSHOT.jar:1.2.0-SNAPSHOT]
> at 
> org.apache.nifi.processor.util.listen.handler.socket.StandardSocketCh
> annelHandler.run(StandardSocketChannelHandler.java:76) 
> [nifi-processor-utils-1.2
> .0-SNAPSHOT.jar:1.2.0-SNAPSHOT]
> at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.
> java:1142) [na:1.8.0_131]
> at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>  [na:1.8.0_131]
> at java.lang.Thread.run(Thread.java:748) [na:1.8.0_131]
> Caused by: org.apache.nifi.processors.beats.frame.BeatsFrameException: Error 
> decompressing  frame: invalid distance too far back
> at 
> org.apache.nifi.processors.beats.frame.BeatsDecoder.processPAYLOAD(BeatsDecoder.java:292)
>  ~[nifi-beats-processors-1.2.0-SNAPSHOT.jar:1.2.0-SNAPSHOT]
> at 
> org.apache.nifi.processors.beats.frame.BeatsDecoder.process(BeatsDecoder.java:103)
>  ~[nifi-beats-processors-1.2.0-SNAPSHOT.jar:1.2.0-SNAPSHOT]
> ... 5 common frames omitted
> Caused by: java.util.zip.ZipException: invalid distance too far back
> at 
> java.util.zip.InflaterInputStream.read(InflaterInputStream.java:164) 
> ~[na:1.8.0_131]
> at java.io.FilterInputStream.read(FilterInputStream.java:107) 
> ~[na:1.8.0_131]
> at 
> org.apache.nifi.processors.beats.frame.BeatsDecoder.processPAYLOAD(BeatsDecoder.java:277)
>  ~[nifi-beats-processors-1.2.0-SNAPSHOT.jar:1.2.0-SNAPSHOT]
> ... 6 common frames omitted



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi pull request #2180: Added GetMongoAggregation to support running Mongo ...

2018-01-26 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2180#discussion_r164106059
  
--- Diff: 
nifi-nar-bundles/nifi-mongodb-bundle/nifi-mongodb-processors/src/main/java/org/apache/nifi/processors/mongodb/AbstractMongoProcessor.java
 ---
@@ -95,15 +102,54 @@
 
 public static final PropertyDescriptor WRITE_CONCERN = new 
PropertyDescriptor.Builder()
 .name("Write Concern")
+.displayName("Write Concern")
 .description("The write concern to use")
 .required(true)
 .allowableValues(WRITE_CONCERN_ACKNOWLEDGED, 
WRITE_CONCERN_UNACKNOWLEDGED, WRITE_CONCERN_FSYNCED, WRITE_CONCERN_JOURNALED,
 WRITE_CONCERN_REPLICA_ACKNOWLEDGED, 
WRITE_CONCERN_MAJORITY)
 .defaultValue(WRITE_CONCERN_ACKNOWLEDGED)
 .build();
 
+static final PropertyDescriptor RESULTS_PER_FLOWFILE = new 
PropertyDescriptor.Builder()
+.name("results-per-flowfile")
+.displayName("Results Per FlowFile")
+.description("How many results to put into a flowfile at once. 
The whole body will be treated as a JSON array of results.")
+.required(false)
+.addValidator(StandardValidators.POSITIVE_INTEGER_VALIDATOR)
+.defaultValue("1")
+.build();
+
+static final PropertyDescriptor BATCH_SIZE = new 
PropertyDescriptor.Builder()
+.name("Batch Size")
+.displayName("Batch Size")
+.description("The number of elements returned from the server 
in one batch")
+.required(false)
+.addValidator(StandardValidators.POSITIVE_INTEGER_VALIDATOR)
--- End diff --

I think 0 would be confusing, so leaving it blank is the best bet. I should 
put some text in there to explain that.


---


[jira] [Comment Edited] (NIFI-4819) Add support to delete blob from Azure Storage container

2018-01-26 Thread zenfenaan (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4819?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16340893#comment-16340893
 ] 

zenfenaan edited comment on NIFI-4819 at 1/26/18 10:46 AM:
---

Created a PR. [#2436|https://github.com/apache/nifi/pull/2436]


was (Author: sivaprasanna):
Created a PR. #2436

> Add support to delete blob from Azure Storage container
> ---
>
> Key: NIFI-4819
> URL: https://issues.apache.org/jira/browse/NIFI-4819
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Reporter: zenfenaan
>Priority: Major
>
> Implement a delete processor that handles deleting a blob from an Azure Storage 
> container. This should be an extension of the nifi-azure-nar bundle. Currently, 
> the azure bundle's storage processors have support to list, fetch, and put Azure 
> Storage blobs.
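
For reference, deleting a blob with the classic Azure Storage SDK for Java (com.microsoft.azure.storage) looks roughly like this (illustrative connection string, container, and blob names; a delete processor would wrap a call of this kind):

```java
import com.microsoft.azure.storage.CloudStorageAccount;
import com.microsoft.azure.storage.blob.CloudBlobClient;
import com.microsoft.azure.storage.blob.CloudBlobContainer;
import com.microsoft.azure.storage.blob.CloudBlockBlob;

public class DeleteBlobExample {
    public static void main(final String[] args) throws Exception {
        final String connectionString =
                "DefaultEndpointsProtocol=https;AccountName=myaccount;AccountKey=...";
        final CloudStorageAccount account = CloudStorageAccount.parse(connectionString);
        final CloudBlobClient client = account.createCloudBlobClient();
        final CloudBlobContainer container = client.getContainerReference("my-container");
        final CloudBlockBlob blob = container.getBlockBlobReference("path/to/blob.txt");
        final boolean deleted = blob.deleteIfExists(); // true if the blob existed and was removed
        System.out.println("Deleted: " + deleted);
    }
}
```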



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-4819) Add support to delete blob from Azure Storage container

2018-01-26 Thread zenfenaan (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4819?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16340893#comment-16340893
 ] 

zenfenaan commented on NIFI-4819:
-

Created a PR. #2436

> Add support to delete blob from Azure Storage container
> ---
>
> Key: NIFI-4819
> URL: https://issues.apache.org/jira/browse/NIFI-4819
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Reporter: zenfenaan
>Priority: Major
>
> Implement a delete processor that handles deleting a blob from an Azure Storage 
> container. This should be an extension of the nifi-azure-nar bundle. Currently, 
> the azure bundle's storage processors have support to list, fetch, and put Azure 
> Storage blobs.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)