[jira] [Commented] (NIFI-5510) Allow records to be put directly into Cassandra

2018-09-16 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5510?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16617076#comment-16617076
 ] 

ASF GitHub Bot commented on NIFI-5510:
--

Github user zenfenan commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2992#discussion_r217948182
  
--- Diff: 
nifi-nar-bundles/nifi-cassandra-bundle/nifi-cassandra-processors/src/main/java/org/apache/nifi/processors/cassandra/PutCassandraRecord.java
 ---
@@ -0,0 +1,239 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.cassandra;
+
+import com.datastax.driver.core.BatchStatement;
+import com.datastax.driver.core.ConsistencyLevel;
+import com.datastax.driver.core.Session;
+import com.datastax.driver.core.exceptions.AuthenticationException;
+import com.datastax.driver.core.exceptions.NoHostAvailableException;
+import com.datastax.driver.core.querybuilder.Insert;
+import com.datastax.driver.core.querybuilder.QueryBuilder;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.annotation.lifecycle.OnShutdown;
+import org.apache.nifi.annotation.lifecycle.OnUnscheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.ValidationContext;
+import org.apache.nifi.components.ValidationResult;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.serialization.RecordReaderFactory;
+import org.apache.nifi.serialization.RecordReader;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.serialization.record.RecordFieldType;
+import org.apache.nifi.serialization.record.RecordSchema;
+import org.apache.nifi.serialization.record.util.DataTypeUtils;
+import org.apache.nifi.util.StopWatch;
+
+import java.io.InputStream;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collection;
+import java.util.Collections;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import java.util.concurrent.TimeUnit;
+import java.util.concurrent.atomic.AtomicInteger;
+
+@Tags({"cassandra", "cql", "put", "insert", "update", "set", "record"})
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
+@CapabilityDescription("Writes the content of the incoming FlowFile as individual records to Apache Cassandra using native protocol version 3 or higher.")
+public class PutCassandraRecord extends AbstractCassandraProcessor {
+
+    static final PropertyDescriptor RECORD_READER_FACTORY = new PropertyDescriptor.Builder()
+            .name("put-cassandra-record-reader")
+            .displayName("Record Reader")
+            .description("Specifies the type of Record Reader controller service to use for parsing the incoming data "
+                    + "and determining the schema")
+            .identifiesControllerService(RecordReaderFactory.class)
+            .required(true)
+            .build();
+
+    static final PropertyDescriptor TABLE = new PropertyDescriptor.Builder()
+            .name("put-cassandra-record-table")
+            .displayName("Table name")
+            .description("The name of the Cassandra table to which the records have to be written.")
+            .required(true)
+            .addValidator(StandardValidators.NON_EMPTY_EL_VALIDATOR)
+            .expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES)
+            .build();


[jira] [Commented] (NIFI-5510) Allow records to be put directly into Cassandra

2018-09-16 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5510?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16617074#comment-16617074
 ] 

ASF GitHub Bot commented on NIFI-5510:
--

Github user zenfenan commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2992#discussion_r217948138
  
--- Diff: 
nifi-nar-bundles/nifi-cassandra-bundle/nifi-cassandra-processors/src/main/java/org/apache/nifi/processors/cassandra/PutCassandraRecord.java
 ---
+    static final PropertyDescriptor TABLE = new PropertyDescriptor.Builder()
+            .name("put-cassandra-record-table")
+            .displayName("Table name")
+            .description("The name of the Cassandra table to which the records have to be written.")
+            .required(true)
+            .addValidator(StandardValidators.NON_EMPTY_EL_VALIDATOR)


[GitHub] nifi pull request #2992: NIFI-5510: Introducing PutCassandraRecord processor

2018-09-16 Thread zenfenan
Github user zenfenan commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2992#discussion_r217947988
  
--- Diff: 
nifi-nar-bundles/nifi-cassandra-bundle/nifi-cassandra-processors/src/main/java/org/apache/nifi/processors/cassandra/PutCassandraRecord.java
 ---
+@Tags({"cassandra", "cql", "put", "insert", "update", "set", "record"})
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
+@CapabilityDescription("Writes the content of the incoming FlowFile as individual records to Apache Cassandra using native protocol version 3 or higher.")
--- End diff --

Understood. I'll update it.


---




> Allow records to be put directly into Cassandra
> ---
>
> Key: NIFI-5510
> URL: https://issues.apache.org/jira/browse/NIFI-5510
> Project: Apache NiFi
>  Issue Type: New Feature
>  Components: Extensions
> Reporter: Matt Burgess
> Assignee: Sivaprasanna Sethuraman
> Priority: Major
>
> Currently the standard way of getting data into Cassandra through NiFi is to 
> use PutCassandraQL, which often means raw data needs to be converted to CQL 
> statements, usually done (with modifications) via ConvertJSONToSQL.
> It would be better to have something closer to PutDatabaseRecord, a processor 
> called PutCassandraRecord perhaps, that would take the raw data and input it 
> into Cassandra "directly", without the need for the user to convert the data 
> into CQL statements.
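The record-to-statement idea described above can be sketched as follows. This is a minimal, hypothetical illustration, not the processor's actual code: it treats a record as an ordered `Map<String, Object>` and assembles a parameterized CQL INSERT by hand, whereas the real PutCassandraRecord reads records through a NiFi RecordReader and builds statements with the DataStax `QueryBuilder` and `BatchStatement` APIs.

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class CqlInsertSketch {

    // Build a parameterized CQL INSERT for one record. Values would be
    // bound separately by the driver, which avoids manual quoting.
    static String buildInsert(String table, Map<String, Object> record) {
        List<String> columns = new ArrayList<>(record.keySet());
        String placeholders = String.join(", ",
                Collections.nCopies(columns.size(), "?"));
        return "INSERT INTO " + table + " (" + String.join(", ", columns)
                + ") VALUES (" + placeholders + ")";
    }

    public static void main(String[] args) {
        // A "record" here is just an ordered field map.
        Map<String, Object> record = new LinkedHashMap<>();
        record.put("id", 1);
        record.put("name", "nifi");
        System.out.println(buildInsert("users", record));
        // prints: INSERT INTO users (id, name) VALUES (?, ?)
    }
}
```

Because the column list comes straight from the record's schema, no per-table CQL has to be written by the user, which is the core convenience PutDatabaseRecord offers for relational databases.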



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

[jira] [Commented] (NIFI-5239) Make MongoDBControllerService able to act as a configuration source for MongoDB processors

2018-09-16 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5239?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16617055#comment-16617055
 ] 

ASF GitHub Bot commented on NIFI-5239:
--

Github user zenfenan commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2896#discussion_r217945557
  
--- Diff: 
nifi-nar-bundles/nifi-mongodb-bundle/nifi-mongodb-services/src/main/java/org/apache/nifi/mongodb/MongoDBControllerService.java
 ---
@@ -17,156 +17,162 @@
 
 package org.apache.nifi.mongodb;
 
-import com.mongodb.client.FindIterable;
+import com.mongodb.MongoClient;
+import com.mongodb.MongoClientOptions;
+import com.mongodb.MongoClientURI;
+import com.mongodb.WriteConcern;
 import com.mongodb.client.MongoCollection;
-import com.mongodb.client.MongoCursor;
 import com.mongodb.client.MongoDatabase;
-import com.mongodb.client.model.UpdateOptions;
-
+import org.apache.commons.lang3.StringUtils;
 import org.apache.nifi.annotation.documentation.CapabilityDescription;
 import org.apache.nifi.annotation.documentation.Tags;
 import org.apache.nifi.annotation.lifecycle.OnDisabled;
 import org.apache.nifi.annotation.lifecycle.OnEnabled;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.authentication.exception.ProviderCreationException;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.controller.AbstractControllerService;
 import org.apache.nifi.controller.ConfigurationContext;
-import org.apache.nifi.reporting.InitializationException;
+import org.apache.nifi.security.util.SslContextFactory;
+import org.apache.nifi.ssl.SSLContextService;
 import org.bson.Document;
 
-import java.io.IOException;
+import javax.net.ssl.SSLContext;
 import java.util.ArrayList;
 import java.util.List;
 
 @Tags({"mongo", "mongodb", "service"})
 @CapabilityDescription(
         "Provides a controller service that wraps most of the functionality of the MongoDB driver."
 )
-public class MongoDBControllerService extends AbstractMongoDBControllerService implements MongoDBClientService {
+public class MongoDBControllerService extends AbstractControllerService implements MongoDBClientService {
--- End diff --

If possible, I would be +1 for a controller service with a better name. I'm not saying `MongoDBControllerService` is a bad name, but a name that reflects the functionality this controller service offers would sound even better, something like `MongoDBClientProvider`.


> Make MongoDBControllerService able to act as a configuration source for 
> MongoDB processors
> --
>
> Key: NIFI-5239
> URL: https://issues.apache.org/jira/browse/NIFI-5239
> Project: Apache NiFi
>  Issue Type: New Feature
> Reporter: Mike Thomsen
> Assignee: Mike Thomsen
> Priority: Major
>
> The MongoDBControllerService should be able to provide the getDatabase and 
> getCollection functionality that is built into the MongoDB processors 
> through AbstractMongoDBProcessor. Using the controller service with the 
> processors should be optional in the first release it's added and then 
> mandatory going forward.
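The pattern proposed above, one shared controller service that hands out database and collection handles instead of each processor building its own client, can be sketched like this. All names here are hypothetical and simplified; the real interface is NiFi's MongoDBClientService backed by the MongoDB Java driver's `MongoClient`, which these string-returning stubs merely stand in for.

```java
// Hypothetical stand-in for the controller-service interface.
interface MongoClientSketch {
    String getDatabase(String name);
    String getCollection(String db, String collection);
}

// One place holds the connection configuration; every "processor"
// that references the service shares it.
class SharedMongoClientSketch implements MongoClientSketch {
    private final String uri;

    SharedMongoClientSketch(String uri) {
        this.uri = uri;
    }

    @Override
    public String getDatabase(String name) {
        // A real service would return a MongoDatabase handle here.
        return uri + "/" + name;
    }

    @Override
    public String getCollection(String db, String collection) {
        return getDatabase(db) + "." + collection;
    }
}

public class ControllerServiceSketch {
    public static void main(String[] args) {
        // Several processors would share this one configured service.
        MongoClientSketch service =
                new SharedMongoClientSketch("mongodb://localhost:27017");
        System.out.println(service.getCollection("nifi", "flows"));
        // prints: mongodb://localhost:27017/nifi.flows
    }
}
```

The design benefit is the one the ticket describes: connection settings live in a single controller service, so changing the cluster URI does not require reconfiguring every MongoDB processor individually.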





[jira] [Commented] (NIFI-5239) Make MongoDBControllerService able to act as a configuration source for MongoDB processors

2018-09-16 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5239?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16617057#comment-16617057
 ] 

ASF GitHub Bot commented on NIFI-5239:
--

Github user zenfenan commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2896#discussion_r217945056
  
--- Diff: 
nifi-nar-bundles/nifi-mongodb-bundle/nifi-mongodb-services/src/main/java/org/apache/nifi/mongodb/MongoDBControllerService.java
 ---
@@ -17,156 +17,162 @@
 
 package org.apache.nifi.mongodb;
 
 @Tags({"mongo", "mongodb", "service"})
 @CapabilityDescription(
         "Provides a controller service that wraps most of the functionality of the MongoDB driver."
--- End diff --

`CapabilityDescription` has to be updated.


> Make MongoDBControllerService able to act as a configuration source for 
> MongoDB processors
> --
>
> Key: NIFI-5239
> URL: https://issues.apache.org/jira/browse/NIFI-5239
> Project: Apache NiFi
>  Issue Type: New Feature
>Reporter: Mike Thomsen
>Assignee: Mike Thomsen
>Priority: Major
>
> The MongoDBControllerService should be able to provide the getDatabase and 
> getCollection functionality that are built into the MongoDB processors 
> through AbstractMongoDBProcessor. Using the controller service with the 
> processors should be optional in the first release it's added and then 
> mandatory going forward.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5239) Make MongoDBControllerService able to act as a configuration source for MongoDB processors

2018-09-16 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5239?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16617056#comment-16617056
 ] 

ASF GitHub Bot commented on NIFI-5239:
--

Github user zenfenan commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2896#discussion_r217945280
  
--- Diff: 
nifi-nar-bundles/nifi-mongodb-bundle/nifi-mongodb-services/src/main/java/org/apache/nifi/mongodb/MongoDBControllerService.java
 ---
@@ -17,156 +17,162 @@
 
 package org.apache.nifi.mongodb;
 
-import com.mongodb.client.FindIterable;
+import com.mongodb.MongoClient;
+import com.mongodb.MongoClientOptions;
+import com.mongodb.MongoClientURI;
+import com.mongodb.WriteConcern;
 import com.mongodb.client.MongoCollection;
-import com.mongodb.client.MongoCursor;
 import com.mongodb.client.MongoDatabase;
-import com.mongodb.client.model.UpdateOptions;
-
+import org.apache.commons.lang3.StringUtils;
 import org.apache.nifi.annotation.documentation.CapabilityDescription;
 import org.apache.nifi.annotation.documentation.Tags;
 import org.apache.nifi.annotation.lifecycle.OnDisabled;
 import org.apache.nifi.annotation.lifecycle.OnEnabled;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.authentication.exception.ProviderCreationException;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.controller.AbstractControllerService;
 import org.apache.nifi.controller.ConfigurationContext;
-import org.apache.nifi.reporting.InitializationException;
+import org.apache.nifi.security.util.SslContextFactory;
+import org.apache.nifi.ssl.SSLContextService;
 import org.bson.Document;
 
-import java.io.IOException;
+import javax.net.ssl.SSLContext;
 import java.util.ArrayList;
 import java.util.List;
 
 @Tags({"mongo", "mongodb", "service"})
 @CapabilityDescription(
 "Provides a controller service that wraps most of the functionality of 
the MongoDB driver."
 )
-public class MongoDBControllerService extends 
AbstractMongoDBControllerService implements MongoDBClientService {
+public class MongoDBControllerService extends AbstractControllerService 
implements MongoDBClientService {
 private MongoDatabase db;
 private MongoCollection<Document> col;
+private String uri;
 
 @OnEnabled
-public void onEnabled(final ConfigurationContext context) throws 
InitializationException, IOException, InterruptedException {
+public void onEnabled(final ConfigurationContext context) {
+this.uri = 
context.getProperty(URI).evaluateAttributeExpressions().getValue();
 this.createClient(context);
-this.db = 
this.mongoClient.getDatabase(context.getProperty(MongoDBControllerService.DATABASE_NAME).getValue());
-this.col = 
this.db.getCollection(context.getProperty(MongoDBControllerService.COLLECTION_NAME).getValue());
-}
-
-@OnDisabled
-public void onDisable() {
-this.mongoClient.close();
-}
-
-@Override
-public long count(Document query) {
-return this.col.count(query);
-}
-
-@Override
-public void delete(Document query) {
-this.col.deleteMany(query);
 }
 
-@Override
-public boolean exists(Document query) {
-return this.col.count(query) > 0;
-}
-
-@Override
-public Document findOne(Document query) {
-MongoCursor<Document> cursor = this.col.find(query).limit(1).iterator();
-Document retVal = cursor.tryNext();
-cursor.close();
-
-return retVal;
-}
+static List<PropertyDescriptor> descriptors = new ArrayList<>();
 
-@Override
-public Document findOne(Document query, Document projection) {
-MongoCursor<Document> cursor = projection != null
-? this.col.find(query).projection(projection).limit(1).iterator()
-: this.col.find(query).limit(1).iterator();
-Document retVal = cursor.tryNext();
-cursor.close();
-
-return retVal;
-}
-
-@Override
-public List<Document> findMany(Document query) {
-return findMany(query, null, -1);
+static {
+descriptors.add(URI);
+descriptors.add(SSL_CONTEXT_SERVICE);
--- End diff --

Why are we exposing only `URI`? I thought this PR intended to offer a 
controller service where we can configure a connection to a MongoDB database 
and collection, and use that controller service optionally instead of the 
processor-level `URI`, `Database Name`, and `Collection Name`.
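
As a sketch of what that could look like, here is a minimal, self-contained example (plain Java; the `PropertyDescriptor` class is a simplified stand-in for NiFi's builder-based one, and the `DATABASE_NAME`/`COLLECTION_NAME` descriptor names are hypothetical, not taken from the PR):

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

// Simplified stand-in for NiFi's PropertyDescriptor; real descriptors are
// built with PropertyDescriptor.Builder and carry validators, defaults, etc.
public class DescriptorSketch {
    static final class PropertyDescriptor {
        final String name;
        PropertyDescriptor(String name) { this.name = name; }
        @Override public String toString() { return name; }
    }

    static final PropertyDescriptor URI = new PropertyDescriptor("mongo-uri");
    static final PropertyDescriptor SSL_CONTEXT_SERVICE = new PropertyDescriptor("ssl-context-service");
    // The two descriptors the review comment asks for (hypothetical names):
    static final PropertyDescriptor DATABASE_NAME = new PropertyDescriptor("mongo-db-name");
    static final PropertyDescriptor COLLECTION_NAME = new PropertyDescriptor("mongo-collection-name");

    static final List<PropertyDescriptor> DESCRIPTORS;
    static {
        final List<PropertyDescriptor> d = new ArrayList<>();
        d.add(URI);
        d.add(SSL_CONTEXT_SERVICE);
        d.add(DATABASE_NAME);      // proposed alongside URI
        d.add(COLLECTION_NAME);    // proposed alongside URI
        DESCRIPTORS = Collections.unmodifiableList(d);
    }

    public static void main(String[] args) {
        System.out.println(DESCRIPTORS);
    }
}
```

With the database and collection exposed as service-level properties, processors could fall back to the service configuration when their own `Database Name`/`Collection Name` are unset.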


> Make MongoDBControllerService able to act as a configuration source for 
> MongoDB processors

[GitHub] nifi pull request #2896: NIFI-5239 Made a client service an optional source ...

2018-09-16 Thread zenfenan
Github user zenfenan commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2896#discussion_r217945557
  
--- Diff: 
nifi-nar-bundles/nifi-mongodb-bundle/nifi-mongodb-services/src/main/java/org/apache/nifi/mongodb/MongoDBControllerService.java
 ---
@@ -17,156 +17,162 @@
 
 package org.apache.nifi.mongodb;
 
-import com.mongodb.client.FindIterable;
+import com.mongodb.MongoClient;
+import com.mongodb.MongoClientOptions;
+import com.mongodb.MongoClientURI;
+import com.mongodb.WriteConcern;
 import com.mongodb.client.MongoCollection;
-import com.mongodb.client.MongoCursor;
 import com.mongodb.client.MongoDatabase;
-import com.mongodb.client.model.UpdateOptions;
-
+import org.apache.commons.lang3.StringUtils;
 import org.apache.nifi.annotation.documentation.CapabilityDescription;
 import org.apache.nifi.annotation.documentation.Tags;
 import org.apache.nifi.annotation.lifecycle.OnDisabled;
 import org.apache.nifi.annotation.lifecycle.OnEnabled;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.authentication.exception.ProviderCreationException;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.controller.AbstractControllerService;
 import org.apache.nifi.controller.ConfigurationContext;
-import org.apache.nifi.reporting.InitializationException;
+import org.apache.nifi.security.util.SslContextFactory;
+import org.apache.nifi.ssl.SSLContextService;
 import org.bson.Document;
 
-import java.io.IOException;
+import javax.net.ssl.SSLContext;
 import java.util.ArrayList;
 import java.util.List;
 
 @Tags({"mongo", "mongodb", "service"})
 @CapabilityDescription(
 "Provides a controller service that wraps most of the functionality of 
the MongoDB driver."
 )
-public class MongoDBControllerService extends 
AbstractMongoDBControllerService implements MongoDBClientService {
+public class MongoDBControllerService extends AbstractControllerService 
implements MongoDBClientService {
--- End diff --

If possible, I would be +1 for a controller service with a better name. I'm 
not saying `MongoDBControllerService` is a bad name, but a name that reflects 
the functionality this controller service offers would sound even better, 
something like `MongoDBClientProvider`.


---


[GitHub] nifi pull request #2896: NIFI-5239 Made a client service an optional source ...

2018-09-16 Thread zenfenan
Github user zenfenan commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2896#discussion_r217945280
  
--- Diff: 
nifi-nar-bundles/nifi-mongodb-bundle/nifi-mongodb-services/src/main/java/org/apache/nifi/mongodb/MongoDBControllerService.java
 ---
@@ -17,156 +17,162 @@
 
 package org.apache.nifi.mongodb;
 
-import com.mongodb.client.FindIterable;
+import com.mongodb.MongoClient;
+import com.mongodb.MongoClientOptions;
+import com.mongodb.MongoClientURI;
+import com.mongodb.WriteConcern;
 import com.mongodb.client.MongoCollection;
-import com.mongodb.client.MongoCursor;
 import com.mongodb.client.MongoDatabase;
-import com.mongodb.client.model.UpdateOptions;
-
+import org.apache.commons.lang3.StringUtils;
 import org.apache.nifi.annotation.documentation.CapabilityDescription;
 import org.apache.nifi.annotation.documentation.Tags;
 import org.apache.nifi.annotation.lifecycle.OnDisabled;
 import org.apache.nifi.annotation.lifecycle.OnEnabled;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.authentication.exception.ProviderCreationException;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.controller.AbstractControllerService;
 import org.apache.nifi.controller.ConfigurationContext;
-import org.apache.nifi.reporting.InitializationException;
+import org.apache.nifi.security.util.SslContextFactory;
+import org.apache.nifi.ssl.SSLContextService;
 import org.bson.Document;
 
-import java.io.IOException;
+import javax.net.ssl.SSLContext;
 import java.util.ArrayList;
 import java.util.List;
 
 @Tags({"mongo", "mongodb", "service"})
 @CapabilityDescription(
 "Provides a controller service that wraps most of the functionality of 
the MongoDB driver."
 )
-public class MongoDBControllerService extends 
AbstractMongoDBControllerService implements MongoDBClientService {
+public class MongoDBControllerService extends AbstractControllerService 
implements MongoDBClientService {
 private MongoDatabase db;
 private MongoCollection<Document> col;
+private String uri;
 
 @OnEnabled
-public void onEnabled(final ConfigurationContext context) throws 
InitializationException, IOException, InterruptedException {
+public void onEnabled(final ConfigurationContext context) {
+this.uri = 
context.getProperty(URI).evaluateAttributeExpressions().getValue();
 this.createClient(context);
-this.db = 
this.mongoClient.getDatabase(context.getProperty(MongoDBControllerService.DATABASE_NAME).getValue());
-this.col = 
this.db.getCollection(context.getProperty(MongoDBControllerService.COLLECTION_NAME).getValue());
-}
-
-@OnDisabled
-public void onDisable() {
-this.mongoClient.close();
-}
-
-@Override
-public long count(Document query) {
-return this.col.count(query);
-}
-
-@Override
-public void delete(Document query) {
-this.col.deleteMany(query);
 }
 
-@Override
-public boolean exists(Document query) {
-return this.col.count(query) > 0;
-}
-
-@Override
-public Document findOne(Document query) {
-MongoCursor<Document> cursor = this.col.find(query).limit(1).iterator();
-Document retVal = cursor.tryNext();
-cursor.close();
-
-return retVal;
-}
+static List<PropertyDescriptor> descriptors = new ArrayList<>();
 
-@Override
-public Document findOne(Document query, Document projection) {
-MongoCursor<Document> cursor = projection != null
-? this.col.find(query).projection(projection).limit(1).iterator()
-: this.col.find(query).limit(1).iterator();
-Document retVal = cursor.tryNext();
-cursor.close();
-
-return retVal;
-}
-
-@Override
-public List<Document> findMany(Document query) {
-return findMany(query, null, -1);
+static {
+descriptors.add(URI);
+descriptors.add(SSL_CONTEXT_SERVICE);
--- End diff --

Why are we exposing only `URI`? I thought this PR intended to offer a 
controller service where we can configure a connection to a MongoDB database 
and collection, and use that controller service optionally instead of the 
processor-level `URI`, `Database Name`, and `Collection Name`.


---


[GitHub] nifi pull request #2896: NIFI-5239 Made a client service an optional source ...

2018-09-16 Thread zenfenan
Github user zenfenan commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2896#discussion_r217945056
  
--- Diff: 
nifi-nar-bundles/nifi-mongodb-bundle/nifi-mongodb-services/src/main/java/org/apache/nifi/mongodb/MongoDBControllerService.java
 ---
@@ -17,156 +17,162 @@
 
 package org.apache.nifi.mongodb;
 
-import com.mongodb.client.FindIterable;
+import com.mongodb.MongoClient;
+import com.mongodb.MongoClientOptions;
+import com.mongodb.MongoClientURI;
+import com.mongodb.WriteConcern;
 import com.mongodb.client.MongoCollection;
-import com.mongodb.client.MongoCursor;
 import com.mongodb.client.MongoDatabase;
-import com.mongodb.client.model.UpdateOptions;
-
+import org.apache.commons.lang3.StringUtils;
 import org.apache.nifi.annotation.documentation.CapabilityDescription;
 import org.apache.nifi.annotation.documentation.Tags;
 import org.apache.nifi.annotation.lifecycle.OnDisabled;
 import org.apache.nifi.annotation.lifecycle.OnEnabled;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.authentication.exception.ProviderCreationException;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.controller.AbstractControllerService;
 import org.apache.nifi.controller.ConfigurationContext;
-import org.apache.nifi.reporting.InitializationException;
+import org.apache.nifi.security.util.SslContextFactory;
+import org.apache.nifi.ssl.SSLContextService;
 import org.bson.Document;
 
-import java.io.IOException;
+import javax.net.ssl.SSLContext;
 import java.util.ArrayList;
 import java.util.List;
 
 @Tags({"mongo", "mongodb", "service"})
 @CapabilityDescription(
 "Provides a controller service that wraps most of the functionality of 
the MongoDB driver."
--- End diff --

`CapabilityDescription` has to be updated.


---


[jira] [Commented] (NIFI-4532) OpenID Connect User Authentication

2018-09-16 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-4532?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16616799#comment-16616799
 ] 

ASF GitHub Bot commented on NIFI-4532:
--

GitHub user SarthakSahu opened a pull request:

https://github.com/apache/nifi/pull/3009

NIFI-4532 OpenID Connect User Authentication

Thank you for submitting a contribution to Apache NiFi.

In order to streamline the review of the contribution we ask you
to ensure the following steps have been taken:

### For all changes:
- [ ] Is there a JIRA ticket associated with this PR? Is it referenced 
 in the commit message?

- [ ] Does your PR title start with NIFI-XXXX where XXXX is the JIRA number 
you are trying to resolve? Pay particular attention to the hyphen "-" character.

- [ ] Has your PR been rebased against the latest commit within the target 
branch (typically master)?

- [ ] Is your initial contribution a single, squashed commit?

### For code changes:
- [ ] Have you ensured that the full suite of tests is executed via mvn 
-Pcontrib-check clean install at the root nifi folder?
- [ ] Have you written or updated unit tests to verify your changes?
- [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)? 
- [ ] If applicable, have you updated the LICENSE file, including the main 
LICENSE file under nifi-assembly?
- [ ] If applicable, have you updated the NOTICE file, including the main 
NOTICE file found under nifi-assembly?
- [ ] If adding new Properties, have you added .displayName in addition to 
.name (programmatic access) for each of the new properties?

### For documentation related changes:
- [ ] Have you ensured that format looks appropriate for the output in 
which it is rendered?

### Note:
Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/SarthakSahu/nifi NIFI-4532

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/3009.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #3009


commit cfc5a4b4e85e5d90d45d079f6ec46a135d0e2d61
Author: Sahu Sarthak 
Date:   2018-09-16T16:51:25Z

NIFI-4532 OpenID Connect User Authentication




> OpenID Connect User Authentication
> --
>
> Key: NIFI-4532
> URL: https://issues.apache.org/jira/browse/NIFI-4532
> Project: Apache NiFi
>  Issue Type: Sub-task
>  Components: Docker
>Reporter: Aldrin Piri
>Assignee: Sarthak
>Priority: Major
>
> Provide configuration for OpenID Connect user authentication in Docker images





[GitHub] nifi pull request #3009: NIFI-4532 OpenID Connect User Authentication

2018-09-16 Thread SarthakSahu
GitHub user SarthakSahu opened a pull request:

https://github.com/apache/nifi/pull/3009

NIFI-4532 OpenID Connect User Authentication

Thank you for submitting a contribution to Apache NiFi.

In order to streamline the review of the contribution we ask you
to ensure the following steps have been taken:

### For all changes:
- [ ] Is there a JIRA ticket associated with this PR? Is it referenced 
 in the commit message?

- [ ] Does your PR title start with NIFI-XXXX where XXXX is the JIRA number 
you are trying to resolve? Pay particular attention to the hyphen "-" character.

- [ ] Has your PR been rebased against the latest commit within the target 
branch (typically master)?

- [ ] Is your initial contribution a single, squashed commit?

### For code changes:
- [ ] Have you ensured that the full suite of tests is executed via mvn 
-Pcontrib-check clean install at the root nifi folder?
- [ ] Have you written or updated unit tests to verify your changes?
- [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)? 
- [ ] If applicable, have you updated the LICENSE file, including the main 
LICENSE file under nifi-assembly?
- [ ] If applicable, have you updated the NOTICE file, including the main 
NOTICE file found under nifi-assembly?
- [ ] If adding new Properties, have you added .displayName in addition to 
.name (programmatic access) for each of the new properties?

### For documentation related changes:
- [ ] Have you ensured that format looks appropriate for the output in 
which it is rendered?

### Note:
Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/SarthakSahu/nifi NIFI-4532

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/3009.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #3009


commit cfc5a4b4e85e5d90d45d079f6ec46a135d0e2d61
Author: Sahu Sarthak 
Date:   2018-09-16T16:51:25Z

NIFI-4532 OpenID Connect User Authentication




---


[GitHub] nifi issue #3008: NIFI-5492_EXEC Adding UDF to EL

2018-09-16 Thread bdesert
Github user bdesert commented on the issue:

https://github.com/apache/nifi/pull/3008
  
@joewitt, @alopresto, @mattyb149 
Since we had discussions on this topic before, I would like to get your 
opinion.
I've addressed all the issues related to security (separate class loader), 
API changes (there are none), and scope (a separate interface implementation 
is enforced).
If that's not enough to feel safe about this, do not hesitate to trash this 
PR, or give me some feedback on how it can be improved.
Thank you!
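
A minimal, self-contained sketch of that isolation approach (all names are hypothetical, not from the PR): a UDF must implement a dedicated interface, and the framework instantiates it reflectively, as it would inside a separate class loader:

```java
// Hypothetical sketch of isolating user-defined EL functions behind an
// interface. In the real PR the Class.forName lookup would go through a
// separate class loader; here the system class loader stands in for it.
public class UdfSketch {
    // The contract every UDF must implement (hypothetical name).
    public interface EvaluableFunction {
        String evaluate(String... args);
    }

    // An example UDF that upper-cases its first argument.
    public static class ToUpper implements EvaluableFunction {
        @Override
        public String evaluate(String... args) {
            return args[0].toUpperCase();
        }
    }

    // Framework side: load by name, enforce the interface, then invoke.
    static String invoke(String className, String... args) throws Exception {
        Class<?> clazz = Class.forName(className);
        if (!EvaluableFunction.class.isAssignableFrom(clazz)) {
            throw new IllegalArgumentException(className + " does not implement EvaluableFunction");
        }
        EvaluableFunction fn = (EvaluableFunction) clazz.getDeclaredConstructor().newInstance();
        return fn.evaluate(args);
    }

    public static void main(String[] args) throws Exception {
        System.out.println(invoke("UdfSketch$ToUpper", "nifi")); // prints NIFI
    }
}
```

The interface check is what keeps arbitrary classes from being invoked as EL functions; combining it with a dedicated class loader additionally keeps UDF dependencies out of the framework's classpath.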



---


[GitHub] nifi pull request #3008: NIFI-5492_EXEC Adding UDF to EL

2018-09-16 Thread bdesert
GitHub user bdesert opened a pull request:

https://github.com/apache/nifi/pull/3008

NIFI-5492_EXEC Adding UDF to EL

**This PR adds new functionality to the expression language: user-defined 
functions.**

Thank you for submitting a contribution to Apache NiFi.

In order to streamline the review of the contribution we ask you
to ensure the following steps have been taken:

### For all changes:
- [x] Is there a JIRA ticket associated with this PR? Is it referenced 
 in the commit message?

- [x] Does your PR title start with NIFI-XXXX where XXXX is the JIRA number 
you are trying to resolve? Pay particular attention to the hyphen "-" character.

- [x] Has your PR been rebased against the latest commit within the target 
branch (typically master)?

- [x] Is your initial contribution a single, squashed commit?

### For code changes:
- [x] Have you ensured that the full suite of tests is executed via mvn 
-Pcontrib-check clean install at the root nifi folder?
- [x] Have you written or updated unit tests to verify your changes?
- [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)? 
- [ ] If applicable, have you updated the LICENSE file, including the main 
LICENSE file under nifi-assembly?
- [ ] If applicable, have you updated the NOTICE file, including the main 
NOTICE file found under nifi-assembly?
- [ ] If adding new Properties, have you added .displayName in addition to 
.name (programmatic access) for each of the new properties?

### For documentation related changes:
- [ ] Have you ensured that format looks appropriate for the output in 
which it is rendered?

### Note:
Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/bdesert/nifi NIFI-5492_EXEC

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/3008.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #3008


commit 9cdb0f7b9fa963cfa29100d7b9ae76645388404b
Author: Ed B 
Date:   2018-09-16T14:31:27Z

NIFI-5492_EXEC Adding UDF to EL

this PR adds new functionality to the expression language: user-defined 
functions.




---


[jira] [Commented] (NIFI-5596) GetSplunk 401 unauthorized :Multi-cookie based session is not supported by splunk-sdk-java 1.5.0

2018-09-16 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5596?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16616704#comment-16616704
 ] 

ASF GitHub Bot commented on NIFI-5596:
--

Github user MikeThomsen commented on the issue:

https://github.com/apache/nifi/pull/3003
  
That concern wasn't over the CI issues, but over breaking compatibility by 
dropping support for versions of Splunk that need to be supported.


> GetSplunk 401 unauthorized :Multi-cookie based session is not supported by 
> splunk-sdk-java 1.5.0
> 
>
> Key: NIFI-5596
> URL: https://issues.apache.org/jira/browse/NIFI-5596
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 1.7.1
> Environment: all
>Reporter: Mohit
>Priority: Major
>  Labels: pull-request-available
> Fix For: 1.8.0
>
>
> A load-balanced Splunk REST API server requires that clients support and 
> send cookies with requests. In my corporate environment, Splunk 6.6.6 
> Enterprise is deployed behind a load balancer, and NiFi fails when it tries 
> to connect to it. 
> Multi-cookie support was added in splunk-sdk-java v1.6.3 onwards, which is 
> used in the nifi-splunk processor.
>  
> Proposed resolution:
> I have updated the version of splunk-sdk-java in pom.xml of 
> [nifi-splunk-bundle|https://github.com/apache/nifi/tree/master/nifi-nar-bundles/nifi-splunk-bundle].
>  Also performed manual testing from the build; it now works fine and the 
> build is sane. I will propose a PR for this Jira issue. 
> *Update*
> Submitted : https://github.com/apache/nifi/pull/3007 





[GitHub] nifi issue #3003: NIFI-5596 : Upgraded splunk-sdk-java version

2018-09-16 Thread MikeThomsen
Github user MikeThomsen commented on the issue:

https://github.com/apache/nifi/pull/3003
  
That concern wasn't over the CI issues, but over breaking compatibility by 
dropping support for versions of Splunk that need to be supported.


---


[jira] [Updated] (NIFI-5596) GetSplunk 401 unauthorized :Multi-cookie based session is not supported by splunk-sdk-java 1.5.0

2018-09-16 Thread Mohit (JIRA)


 [ 
https://issues.apache.org/jira/browse/NIFI-5596?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mohit updated NIFI-5596:

   Labels: pull-request-available  (was: )
Fix Version/s: 1.8.0

> GetSplunk 401 unauthorized :Multi-cookie based session is not supported by 
> splunk-sdk-java 1.5.0
> 
>
> Key: NIFI-5596
> URL: https://issues.apache.org/jira/browse/NIFI-5596
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 1.7.1
> Environment: all
>Reporter: Mohit
>Priority: Major
>  Labels: pull-request-available
> Fix For: 1.8.0
>
>
> A load-balanced Splunk REST API server requires that clients support and 
> send cookies with requests. In my corporate environment, Splunk 6.6.6 
> Enterprise is deployed behind a load balancer, and NiFi fails when it tries 
> to connect to it. 
> Multi-cookie support was added in splunk-sdk-java v1.6.3 onwards, which is 
> used in the nifi-splunk processor.
>  
> Proposed resolution:
> I have updated the version of splunk-sdk-java in pom.xml of 
> [nifi-splunk-bundle|https://github.com/apache/nifi/tree/master/nifi-nar-bundles/nifi-splunk-bundle].
>  Also performed manual testing from the build; it now works fine and the 
> build is sane. I will propose a PR for this Jira issue. 
> *Update*
> Submitted : https://github.com/apache/nifi/pull/3007 





[jira] [Updated] (NIFI-5596) GetSplunk 401 unauthorized :Multi-cookie based session is not supported by splunk-sdk-java 1.5.0

2018-09-16 Thread Mohit (JIRA)


 [ 
https://issues.apache.org/jira/browse/NIFI-5596?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mohit updated NIFI-5596:

Description: 
A load-balanced Splunk REST API server requires that clients support and send 
cookies with requests. In my corporate environment, Splunk 6.6.6 Enterprise is 
deployed behind a load balancer, and NiFi fails when it tries to connect to 
it. 

Multi-cookie support was added in splunk-sdk-java v1.6.3 onwards, which is 
used in the nifi-splunk processor.

 

Proposed resolution:

I have updated the version of splunk-sdk-java in pom.xml of 
[nifi-splunk-bundle|https://github.com/apache/nifi/tree/master/nifi-nar-bundles/nifi-splunk-bundle].
 Also performed manual testing from the build; it now works fine and the build 
is sane. I will propose a PR for this Jira issue. 

*Update*

Submitted : https://github.com/apache/nifi/pull/3007 
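
The proposed resolution is a one-line dependency bump in the bundle's pom.xml. A sketch (the exact target version chosen for the PR is not stated here, but per the note above it must be 1.6.3 or later; the `com.splunk:splunk` Maven coordinates are an assumption):

```xml
<!-- nifi-nar-bundles/nifi-splunk-bundle/pom.xml (sketch; coordinates assumed) -->
<dependency>
    <groupId>com.splunk</groupId>
    <artifactId>splunk</artifactId>
    <!-- was 1.5.0; 1.6.3 and later add multi-cookie session support -->
    <version>1.6.3</version>
</dependency>
```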

  was:
Splunk Rest API server which is load balanced requires the clients must support 
and send cookies with requests. In my corporate environment, Splunk 6.6.6 
Enterprise is provided in load-balanced way which fails when Nifi tried to 
connect to it. 

Multi-cookie support was added in splunk-sdk-java v-1.6.3 onwards. which is 
used in nifi-splunk processor.

 

Proposed resolution:

I have updated the version of splunk-sdk-java in pom.xml of 
[nifi-splunk-bundle|https://github.com/apache/nifi/tree/master/nifi-nar-bundles/nifi-splunk-bundle].
 Also, performed the manual testing from the build. It now works fine and build 
is sane. I will propose the PR with this Jira issue. 

 


> GetSplunk 401 unauthorized :Multi-cookie based session is not supported by 
> splunk-sdk-java 1.5.0
> 
>
> Key: NIFI-5596
> URL: https://issues.apache.org/jira/browse/NIFI-5596
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 1.7.1
> Environment: all
>Reporter: Mohit
>Priority: Major
>
> A load-balanced Splunk REST API server requires that clients support and 
> send cookies with requests. In my corporate environment, Splunk 6.6.6 
> Enterprise is deployed behind a load balancer, and NiFi fails when it tries 
> to connect to it. 
> Multi-cookie support was added in splunk-sdk-java v1.6.3 onwards, which is 
> used in the nifi-splunk processor.
>  
> Proposed resolution:
> I have updated the version of splunk-sdk-java in pom.xml of 
> [nifi-splunk-bundle|https://github.com/apache/nifi/tree/master/nifi-nar-bundles/nifi-splunk-bundle].
>  Also performed manual testing from the build; it now works fine and the 
> build is sane. I will propose a PR for this Jira issue. 
> *Update*
> Submitted : https://github.com/apache/nifi/pull/3007 





[jira] [Commented] (NIFI-5051) Create a LookupService that uses ElasticSearch

2018-09-16 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5051?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16616674#comment-16616674
 ] 

ASF GitHub Bot commented on NIFI-5051:
--

Github user MikeThomsen commented on the issue:

https://github.com/apache/nifi/pull/2615
  

[ES_Lookup_Test.xml.txt](https://github.com/apache/nifi/files/2386507/ES_Lookup_Test.xml.txt)

[5051 Kibana 
Commands](https://github.com/apache/nifi/files/2386508/5051.kibana.txt)


[docker-compose.yml.txt](https://github.com/apache/nifi/files/2386510/docker-compose.yml.txt)





> Create a LookupService that uses ElasticSearch
> --
>
> Key: NIFI-5051
> URL: https://issues.apache.org/jira/browse/NIFI-5051
> Project: Apache NiFi
>  Issue Type: New Feature
>Reporter: Mike Thomsen
>Assignee: Mike Thomsen
>Priority: Major
>






[GitHub] nifi issue #2615: NIFI-5051 Created ElasticSearch lookup service.

2018-09-16 Thread MikeThomsen
Github user MikeThomsen commented on the issue:

https://github.com/apache/nifi/pull/2615
  

[ES_Lookup_Test.xml.txt](https://github.com/apache/nifi/files/2386507/ES_Lookup_Test.xml.txt)

[5051 Kibana 
Commands](https://github.com/apache/nifi/files/2386508/5051.kibana.txt)


[docker-compose.yml.txt](https://github.com/apache/nifi/files/2386510/docker-compose.yml.txt)





---


[GitHub] nifi pull request #3007: Upgraded splunk-sdk-java version

2018-09-16 Thread mohitgargk
GitHub user mohitgargk opened a pull request:

https://github.com/apache/nifi/pull/3007

Upgraded splunk-sdk-java version

Thank you for submitting a contribution to Apache NiFi.

In order to streamline the review of the contribution we ask you
to ensure the following steps have been taken:

### For all changes:
- [ ] Is there a JIRA ticket associated with this PR? Is it referenced 
 in the commit message?

- [ ] Does your PR title start with NIFI-XXXX where XXXX is the JIRA number 
you are trying to resolve? Pay particular attention to the hyphen "-" character.

- [ ] Has your PR been rebased against the latest commit within the target 
branch (typically master)?

- [ ] Is your initial contribution a single, squashed commit?

### For code changes:
- [ ] Have you ensured that the full suite of tests is executed via mvn 
-Pcontrib-check clean install at the root nifi folder?
- [ ] Have you written or updated unit tests to verify your changes?
- [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)? 
- [ ] If applicable, have you updated the LICENSE file, including the main 
LICENSE file under nifi-assembly?
- [ ] If applicable, have you updated the NOTICE file, including the main 
NOTICE file found under nifi-assembly?
- [ ] If adding new Properties, have you added .displayName in addition to 
.name (programmatic access) for each of the new properties?

### For documentation related changes:
- [ ] Have you ensured that format looks appropriate for the output in 
which it is rendered?

### Note:
Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/mohitgargk/nifi NIFI-5596

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/3007.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #3007


commit ec175b285c23cc2325f3f31a830ceaa6704594e2
Author: Mohit Garg 
Date:   2018-09-16T10:23:56Z

Upgraded splunk-sdk-java version




---


[jira] [Commented] (NIFI-5596) GetSplunk 401 unauthorized :Multi-cookie based session is not supported by splunk-sdk-java 1.5.0

2018-09-16 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5596?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16616661#comment-16616661
 ] 

ASF GitHub Bot commented on NIFI-5596:
--

Github user mohitgargk closed the pull request at:

https://github.com/apache/nifi/pull/3003


> GetSplunk 401 unauthorized :Multi-cookie based session is not supported by 
> splunk-sdk-java 1.5.0
> 
>
> Key: NIFI-5596
> URL: https://issues.apache.org/jira/browse/NIFI-5596
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 1.7.1
> Environment: all
>Reporter: Mohit
>Priority: Major
>
> A load-balanced Splunk REST API server requires that clients support and send 
> cookies with requests. In my corporate environment, Splunk 6.6.6 Enterprise is 
> provided behind a load balancer, and NiFi fails when it tries to connect to it. 
> Multi-cookie support was added from splunk-sdk-java 1.6.3 onwards, while the 
> nifi-splunk processor uses 1.5.0.
>  
> Proposed resolution:
> I have updated the version of splunk-sdk-java in the pom.xml of 
> [nifi-splunk-bundle|https://github.com/apache/nifi/tree/master/nifi-nar-bundles/nifi-splunk-bundle]
>  and performed manual testing from the build. It now works fine and the 
> build is sane. I will propose the PR with this Jira issue. 
>  
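
For context, the resolution described above amounts to a one-line version bump 
in the bundle's Maven pom. A minimal sketch of what such a dependency entry 
looks like is below; the coordinates are assumed from the Splunk SDK's 
published Maven artifact, not copied from the actual NiFi pom:

```xml
<!-- illustrative entry; the real pom lives under nifi-splunk-bundle -->
<dependency>
    <groupId>com.splunk</groupId>
    <artifactId>splunk</artifactId>
    <!-- bumped from 1.5.0, which lacks multi-cookie session handling -->
    <version>1.6.3.0</version>
</dependency>
```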



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)




[jira] [Commented] (NIFI-5596) GetSplunk 401 unauthorized :Multi-cookie based session is not supported by splunk-sdk-java 1.5.0

2018-09-16 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5596?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16616660#comment-16616660
 ] 

ASF GitHub Bot commented on NIFI-5596:
--

Github user mohitgargk commented on the issue:

https://github.com/apache/nifi/pull/3003
  
@MikeThomsen I tried with 6.6.6 and 7.1.2 after building NiFi. Besides, 
splunk-sdk-java 1.6.3.0 is released by Splunk and tested against Splunk 5.0 
onwards. I doubt that splunk-sdk-java is causing NiFi's CI tests to fail. Any 
thoughts?






[jira] [Commented] (NIFI-5596) GetSplunk 401 unauthorized :Multi-cookie based session is not supported by splunk-sdk-java 1.5.0

2018-09-16 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5596?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16616654#comment-16616654
 ] 

ASF GitHub Bot commented on NIFI-5596:
--

GitHub user mohitgargk reopened a pull request:

https://github.com/apache/nifi/pull/3003

NIFI-5596 : Upgraded splunk-sdk-java version



You can merge this pull request into a Git repository by running:

$ git pull https://github.com/mohitgargk/nifi NIFI-5596

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/3003.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #3003


commit 3fd80061cb95362aaa02409aa2e2288544bf85e5
Author: Mohit Garg 
Date:   2018-09-14T07:24:57Z

NIFI-5596 : Upgraded splunk-sdk-java version










[jira] [Commented] (NIFI-5596) GetSplunk 401 unauthorized :Multi-cookie based session is not supported by splunk-sdk-java 1.5.0

2018-09-16 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5596?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16616652#comment-16616652
 ] 

ASF GitHub Bot commented on NIFI-5596:
--

Github user mohitgargk closed the pull request at:

https://github.com/apache/nifi/pull/3003




[jira] [Commented] (NIFI-5596) GetSplunk 401 unauthorized :Multi-cookie based session is not supported by splunk-sdk-java 1.5.0

2018-09-16 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5596?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16616653#comment-16616653
 ] 

ASF GitHub Bot commented on NIFI-5596:
--

Github user mohitgargk commented on the issue:

https://github.com/apache/nifi/pull/3003
  
Retrying CI again.



