[jira] [Commented] (NIFI-5562) Upgrade Guava dependencies

2018-08-31 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5562?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16599524#comment-16599524
 ] 

ASF GitHub Bot commented on NIFI-5562:
--

Github user alopresto commented on the issue:

https://github.com/apache/nifi/pull/2977
  
Reviewing...


> Upgrade Guava dependencies
> --
>
> Key: NIFI-5562
> URL: https://issues.apache.org/jira/browse/NIFI-5562
> Project: Apache NiFi
>  Issue Type: Improvement
>Reporter: Nathan Gough
>Assignee: Nathan Gough
>Priority: Major
>
> A lot of the current Guava dependency versions are v18. Upgrade dependencies 
> from v18 to 25.1-jre and test.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5562) Upgrade Guava dependencies

2018-08-31 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5562?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16599502#comment-16599502
 ] 

ASF GitHub Bot commented on NIFI-5562:
--

Github user MikeThomsen commented on the issue:

https://github.com/apache/nifi/pull/2977
  
Saw @bbende's jira comment:

> We should only update Guava dependencies that we directly use in our own 
code, but not transitive dependencies used by client libraries (i.e. do not 
change the Guava version for things like Hadoop processors, or HBase 
processors, etc.).

I saw Kudu, Kite and Ignite get updated, but according to `mvn 
dependency:tree` they're not transitive. So I'm putting this here for 
documentation purposes. Overall LGTM but will take another look tomorrow.


> Upgrade Guava dependencies
> --
>
> Key: NIFI-5562
> URL: https://issues.apache.org/jira/browse/NIFI-5562
> Project: Apache NiFi
>  Issue Type: Improvement
>Reporter: Nathan Gough
>Assignee: Nathan Gough
>Priority: Major
>
> A lot of the current Guava dependency versions are v18. Upgrade dependencies 
> from v18 to 25.1-jre and test.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5562) Upgrade Guava dependencies

2018-08-31 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5562?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16599500#comment-16599500
 ] 

ASF GitHub Bot commented on NIFI-5562:
--

Github user MikeThomsen commented on the issue:

https://github.com/apache/nifi/pull/2977
  
@joewitt just checked the license on google/guava and it's still ASL so I 
think L&N is clear on this.


> Upgrade Guava dependencies
> --
>
> Key: NIFI-5562
> URL: https://issues.apache.org/jira/browse/NIFI-5562
> Project: Apache NiFi
>  Issue Type: Improvement
>Reporter: Nathan Gough
>Assignee: Nathan Gough
>Priority: Major
>
> A lot of the current Guava dependency versions are v18. Upgrade dependencies 
> from v18 to 25.1-jre and test.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5327) NetFlow Processors

2018-08-31 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5327?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16599494#comment-16599494
 ] 

ASF GitHub Bot commented on NIFI-5327:
--

Github user MikeThomsen commented on the issue:

https://github.com/apache/nifi/pull/2820
  
@PrashanthVenkatesan you also have a merge conflict. That'll need to be 
resolved as well.


> NetFlow Processors
> --
>
> Key: NIFI-5327
> URL: https://issues.apache.org/jira/browse/NIFI-5327
> Project: Apache NiFi
>  Issue Type: New Feature
>  Components: Core Framework
>Affects Versions: 1.6.0
>Reporter: Prashanth Venkatesan
>Assignee: Prashanth Venkatesan
>Priority: Major
>
> As network traffic data scopes for the big data use case, would like NiFi to 
> have processors to support parsing of those protocols.
> Netflow is a protocol introduced by Cisco that provides the ability to 
> collect IP network traffic as it enters or exits an interface and is 
> described in detail in here:
> [https://www.cisco.com/c/en/us/td/docs/net_mgmt/netflow_collection_engine/3-6/user/guide/format.html]
>  
> Currently, I have created the following processor:
> *ParseNetflowv5*:  Parses the ingress netflowv5 bytes and ingest as either 
> NiFi flowfile attributes or as a JSON content. This also sends 
> one-time-template.
>  
> Further ahead, we can add many processor specific to network protocols in 
> this nar bundle.
> I will create a pull request.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi pull request #2820: NIFI-5327 Adding Netflowv5 protocol parser

2018-08-31 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2820#discussion_r214501508
  
--- Diff: 
nifi-nar-bundles/nifi-network-bundle/nifi-network-utils/src/main/java/org/apache/nifi/processors/network/parser/Netflowv5Parser.java
 ---
@@ -0,0 +1,134 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ * http://www.apache.org/licenses/LICENSE-2.0
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.network.parser;
+
+import java.util.OptionalInt;
+
+import static 
org.apache.nifi.processors.network.parser.util.ConversionUtil.toShort;
+import static 
org.apache.nifi.processors.network.parser.util.ConversionUtil.toInt;
+import static 
org.apache.nifi.processors.network.parser.util.ConversionUtil.toLong;
+import static 
org.apache.nifi.processors.network.parser.util.ConversionUtil.toIPV4;
+
+/**
+ * Networkv5 is Cisco data export format which contains one header and one 
or more flow records. This Parser parses the netflowv5 format. More 
information: @see
+ * <a href="https://www.cisco.com/c/en/us/td/docs/net_mgmt/netflow_collection_engine/3-6/user/guide/format.html">Netflowv5</a>
+ */
+public final class Netflowv5Parser {
+private static final int HEADER_SIZE = 24;
+private static final int RECORD_SIZE = 48;
+
+private static final int SHORT_TYPE = 0;
+private static final int INTEGER_TYPE = 1;
+private static final int LONG_TYPE = 2;
+private static final int IPV4_TYPE = 3;
+
+private static final String headerField[] = { "version", "count", 
"sys_uptime", "unix_secs", "unix_nsecs", "flow_sequence", "engine_type", 
"engine_id", "sampling_interval" };
+private static final String recordField[] = { "srcaddr", "dstaddr", 
"nexthop", "input", "output", "dPkts", "dOctets", "first", "last", "srcport", 
"dstport", "pad1", "tcp_flags", "prot", "tos",
+"src_as", "dst_as", "src_mask", "dst_mask", "pad2" };
+
+private final int portNumber;
+
+private Object headerData[];
+private Object recordData[][];
+
+public Netflowv5Parser(final OptionalInt portNumber) {
+this.portNumber = (portNumber.isPresent()) ? portNumber.getAsInt() 
: 0;
+}
+
+public final int parse(final byte[] buffer) throws Throwable {
+final int version = toInt(buffer, 0, 2);
+assert version == 5 : "Version mismatch";
+final int count = toInt(buffer, 2, 2);
--- End diff --

Do we need any additional validation of the `buffer` variable like checking 
for a minimum length?
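
A minimal sketch of the kind of length guard being asked about, reusing the 24-byte header and 48-byte record sizes shown in the diff above; the class and method names here are hypothetical and not part of the PR:

    final class NetflowBufferCheck {
        private static final int HEADER_SIZE = 24;  // v5 header size, as in the diff above
        private static final int RECORD_SIZE = 48;  // v5 record size, as in the diff above

        // Reject the datagram before any field is read from it.
        static void requireMinimumLength(final byte[] buffer) {
            if (buffer == null || buffer.length < HEADER_SIZE) {
                throw new IllegalArgumentException("Datagram of "
                        + (buffer == null ? 0 : buffer.length)
                        + " bytes cannot hold a " + HEADER_SIZE + "-byte Netflow v5 header");
            }
            // A well-formed export packet carries a whole number of fixed-size records after the header.
            if ((buffer.length - HEADER_SIZE) % RECORD_SIZE != 0) {
                throw new IllegalArgumentException(
                        "Payload after the header is not a whole number of " + RECORD_SIZE + "-byte records");
            }
        }
    }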


---


[GitHub] nifi pull request #2820: NIFI-5327 Adding Netflowv5 protocol parser

2018-08-31 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2820#discussion_r214501495
  
--- Diff: 
nifi-nar-bundles/nifi-network-bundle/nifi-network-utils/src/main/java/org/apache/nifi/processors/network/parser/Netflowv5Parser.java
 ---
@@ -0,0 +1,134 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ * http://www.apache.org/licenses/LICENSE-2.0
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.network.parser;
+
+import java.util.OptionalInt;
+
+import static 
org.apache.nifi.processors.network.parser.util.ConversionUtil.toShort;
+import static 
org.apache.nifi.processors.network.parser.util.ConversionUtil.toInt;
+import static 
org.apache.nifi.processors.network.parser.util.ConversionUtil.toLong;
+import static 
org.apache.nifi.processors.network.parser.util.ConversionUtil.toIPV4;
+
+/**
+ * Networkv5 is Cisco data export format which contains one header and one 
or more flow records. This Parser parses the netflowv5 format. More 
information: @see
+ * <a href="https://www.cisco.com/c/en/us/td/docs/net_mgmt/netflow_collection_engine/3-6/user/guide/format.html">Netflowv5</a>
+ */
+public final class Netflowv5Parser {
+private static final int HEADER_SIZE = 24;
+private static final int RECORD_SIZE = 48;
+
+private static final int SHORT_TYPE = 0;
+private static final int INTEGER_TYPE = 1;
+private static final int LONG_TYPE = 2;
+private static final int IPV4_TYPE = 3;
+
+private static final String headerField[] = { "version", "count", 
"sys_uptime", "unix_secs", "unix_nsecs", "flow_sequence", "engine_type", 
"engine_id", "sampling_interval" };
+private static final String recordField[] = { "srcaddr", "dstaddr", 
"nexthop", "input", "output", "dPkts", "dOctets", "first", "last", "srcport", 
"dstport", "pad1", "tcp_flags", "prot", "tos",
+"src_as", "dst_as", "src_mask", "dst_mask", "pad2" };
+
+private final int portNumber;
+
+private Object headerData[];
+private Object recordData[][];
+
+public Netflowv5Parser(final OptionalInt portNumber) {
+this.portNumber = (portNumber.isPresent()) ? portNumber.getAsInt() 
: 0;
+}
+
+public final int parse(final byte[] buffer) throws Throwable {
+final int version = toInt(buffer, 0, 2);
+assert version == 5 : "Version mismatch";
--- End diff --

Since assertions are disabled by default, throw `ProcessException` instead.
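
For comparison, a minimal sketch of that version check with an explicit exception in place of the assert; `ProcessException` is the type suggested above and is already imported in the processor diff, while the wrapper class and method names here are hypothetical:

    import org.apache.nifi.processor.exception.ProcessException;

    final class Netflowv5VersionCheck {
        private static final int SUPPORTED_VERSION = 5;

        // Unlike `assert version == 5`, this check runs even when the JVM is started without -ea.
        static void requireVersion5(final int version) {
            if (version != SUPPORTED_VERSION) {
                throw new ProcessException("Unsupported Netflow version: expected "
                        + SUPPORTED_VERSION + " but got " + version);
            }
        }
    }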


---


[jira] [Commented] (NIFI-5327) NetFlow Processors

2018-08-31 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5327?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16599490#comment-16599490
 ] 

ASF GitHub Bot commented on NIFI-5327:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2820#discussion_r214501717
  
--- Diff: 
nifi-nar-bundles/nifi-network-bundle/nifi-network-processors/src/main/java/org/apache/nifi/processors/network/ParseNetflowv5.java
 ---
@@ -0,0 +1,258 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ * http://www.apache.org/licenses/LICENSE-2.0
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.network;
+
+import static 
org.apache.nifi.processors.network.parser.Netflowv5Parser.getHeaderFields;
+import static 
org.apache.nifi.processors.network.parser.Netflowv5Parser.getRecordFields;
+
+import java.io.BufferedOutputStream;
+import java.io.IOException;
+import java.io.InputStream;
+import java.io.OutputStream;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.OptionalInt;
+import java.util.Set;
+
+import org.apache.nifi.annotation.behavior.EventDriven;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.InputRequirement.Requirement;
+import org.apache.nifi.annotation.behavior.ReadsAttribute;
+import org.apache.nifi.annotation.behavior.ReadsAttributes;
+import org.apache.nifi.annotation.behavior.SideEffectFree;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.flowfile.attributes.CoreAttributes;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.io.InputStreamCallback;
+import org.apache.nifi.processor.io.OutputStreamCallback;
+import org.apache.nifi.processors.network.parser.Netflowv5Parser;
+import org.apache.nifi.stream.io.StreamUtils;
+
+import com.fasterxml.jackson.core.JsonProcessingException;
+import com.fasterxml.jackson.databind.ObjectMapper;
+import com.fasterxml.jackson.databind.node.ObjectNode;
+
+@EventDriven
+@SideEffectFree
+@SupportsBatching
+@InputRequirement(Requirement.INPUT_REQUIRED)
+@Tags({ "network", "netflow", "attributes", "datagram", "v5", "packet", 
"byte" })
+@CapabilityDescription("Parses netflowv5 byte ingest and add to NiFi 
flowfile as attributes or JSON content.")
+@ReadsAttributes({ @ReadsAttribute(attribute = "udp.port", description = 
"Optionally read if packets are received from UDP datagrams.") })
+@WritesAttributes({ @WritesAttribute(attribute = "netflowv5.header.*", 
description = "The key and value generated by the parsing of the header 
fields."),
+@WritesAttribute(attribute = "netflowv5.record.*", description = 
"The key and value generated by the parsing of the record fields.") })
+
+public class ParseNetflowv5 extends AbstractProcessor {
+private String destination;
+// Add mapper
+private static final ObjectMapper mapper = new ObjectMapper();
+
+public static final String DESTINATION_CONTENT = "flowfile-content";
+public static final String DESTINATION_ATTRIBUTES = 
"flowfile-attribute";
+public static final PropertyDescriptor FIELDS_DESTINATION = new 
P

[jira] [Commented] (NIFI-5327) NetFlow Processors

2018-08-31 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5327?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16599488#comment-16599488
 ] 

ASF GitHub Bot commented on NIFI-5327:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2820#discussion_r214501991
  
--- Diff: 
nifi-nar-bundles/nifi-network-bundle/nifi-network-processors/src/main/java/org/apache/nifi/processors/network/ParseNetflowv5.java
 ---
@@ -0,0 +1,258 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ * http://www.apache.org/licenses/LICENSE-2.0
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.network;
+
+import static 
org.apache.nifi.processors.network.parser.Netflowv5Parser.getHeaderFields;
+import static 
org.apache.nifi.processors.network.parser.Netflowv5Parser.getRecordFields;
+
+import java.io.BufferedOutputStream;
+import java.io.IOException;
+import java.io.InputStream;
+import java.io.OutputStream;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.OptionalInt;
+import java.util.Set;
+
+import org.apache.nifi.annotation.behavior.EventDriven;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.InputRequirement.Requirement;
+import org.apache.nifi.annotation.behavior.ReadsAttribute;
+import org.apache.nifi.annotation.behavior.ReadsAttributes;
+import org.apache.nifi.annotation.behavior.SideEffectFree;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.flowfile.attributes.CoreAttributes;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.io.InputStreamCallback;
+import org.apache.nifi.processor.io.OutputStreamCallback;
+import org.apache.nifi.processors.network.parser.Netflowv5Parser;
+import org.apache.nifi.stream.io.StreamUtils;
+
+import com.fasterxml.jackson.core.JsonProcessingException;
+import com.fasterxml.jackson.databind.ObjectMapper;
+import com.fasterxml.jackson.databind.node.ObjectNode;
+
+@EventDriven
+@SideEffectFree
+@SupportsBatching
+@InputRequirement(Requirement.INPUT_REQUIRED)
+@Tags({ "network", "netflow", "attributes", "datagram", "v5", "packet", 
"byte" })
+@CapabilityDescription("Parses netflowv5 byte ingest and add to NiFi 
flowfile as attributes or JSON content.")
+@ReadsAttributes({ @ReadsAttribute(attribute = "udp.port", description = 
"Optionally read if packets are received from UDP datagrams.") })
+@WritesAttributes({ @WritesAttribute(attribute = "netflowv5.header.*", 
description = "The key and value generated by the parsing of the header 
fields."),
+@WritesAttribute(attribute = "netflowv5.record.*", description = 
"The key and value generated by the parsing of the record fields.") })
+
+public class ParseNetflowv5 extends AbstractProcessor {
+private String destination;
+// Add mapper
+private static final ObjectMapper mapper = new ObjectMapper();
+
+public static final String DESTINATION_CONTENT = "flowfile-content";
+public static final String DESTINATION_ATTRIBUTES = 
"flowfile-attribute";
+public static final PropertyDescriptor FIELDS_DESTINATION = new 
P

[GitHub] nifi pull request #2820: NIFI-5327 Adding Netflowv5 protocol parser

2018-08-31 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2820#discussion_r214501822
  
--- Diff: 
nifi-nar-bundles/nifi-network-bundle/nifi-network-processors/src/main/java/org/apache/nifi/processors/network/ParseNetflowv5.java
 ---
@@ -0,0 +1,258 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ * http://www.apache.org/licenses/LICENSE-2.0
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.network;
+
+import static 
org.apache.nifi.processors.network.parser.Netflowv5Parser.getHeaderFields;
+import static 
org.apache.nifi.processors.network.parser.Netflowv5Parser.getRecordFields;
+
+import java.io.BufferedOutputStream;
+import java.io.IOException;
+import java.io.InputStream;
+import java.io.OutputStream;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.OptionalInt;
+import java.util.Set;
+
+import org.apache.nifi.annotation.behavior.EventDriven;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.InputRequirement.Requirement;
+import org.apache.nifi.annotation.behavior.ReadsAttribute;
+import org.apache.nifi.annotation.behavior.ReadsAttributes;
+import org.apache.nifi.annotation.behavior.SideEffectFree;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.flowfile.attributes.CoreAttributes;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.io.InputStreamCallback;
+import org.apache.nifi.processor.io.OutputStreamCallback;
+import org.apache.nifi.processors.network.parser.Netflowv5Parser;
+import org.apache.nifi.stream.io.StreamUtils;
+
+import com.fasterxml.jackson.core.JsonProcessingException;
+import com.fasterxml.jackson.databind.ObjectMapper;
+import com.fasterxml.jackson.databind.node.ObjectNode;
+
+@EventDriven
+@SideEffectFree
+@SupportsBatching
+@InputRequirement(Requirement.INPUT_REQUIRED)
+@Tags({ "network", "netflow", "attributes", "datagram", "v5", "packet", 
"byte" })
+@CapabilityDescription("Parses netflowv5 byte ingest and add to NiFi 
flowfile as attributes or JSON content.")
+@ReadsAttributes({ @ReadsAttribute(attribute = "udp.port", description = 
"Optionally read if packets are received from UDP datagrams.") })
+@WritesAttributes({ @WritesAttribute(attribute = "netflowv5.header.*", 
description = "The key and value generated by the parsing of the header 
fields."),
+@WritesAttribute(attribute = "netflowv5.record.*", description = 
"The key and value generated by the parsing of the record fields.") })
+
+public class ParseNetflowv5 extends AbstractProcessor {
+private String destination;
+// Add mapper
+private static final ObjectMapper mapper = new ObjectMapper();
+
+public static final String DESTINATION_CONTENT = "flowfile-content";
+public static final String DESTINATION_ATTRIBUTES = 
"flowfile-attribute";
+public static final PropertyDescriptor FIELDS_DESTINATION = new 
PropertyDescriptor.Builder().name("FIELDS_DESTINATION").displayName("Parsed 
fields destination")
+.description("Indicates whether the results of the parser are 
written " + "to the FlowFile content or a FlowFile attribute; if using " + 
D

[jira] [Commented] (NIFI-5327) NetFlow Processors

2018-08-31 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5327?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16599483#comment-16599483
 ] 

ASF GitHub Bot commented on NIFI-5327:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2820#discussion_r214501605
  
--- Diff: 
nifi-nar-bundles/nifi-network-bundle/nifi-network-processors/pom.xml ---
@@ -0,0 +1,67 @@
+
+
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+   xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+   <modelVersion>4.0.0</modelVersion>
+
+   <parent>
+   <groupId>org.apache.nifi</groupId>
+   <artifactId>nifi-network-bundle</artifactId>
+   <version>1.8.0-SNAPSHOT</version>
+   </parent>
+
+   <artifactId>nifi-network-processors</artifactId>
+   <packaging>jar</packaging>
+
+   <dependencies>
+   <dependency>
+   <groupId>org.apache.nifi</groupId>
+   <artifactId>nifi-api</artifactId>
+   </dependency>
+   <dependency>
+   <groupId>org.apache.nifi</groupId>
+   <artifactId>nifi-utils</artifactId>
+   <version>1.8.0-SNAPSHOT</version>
--- End diff --

We should be able to dispense with these manual version numbers.


> NetFlow Processors
> --
>
> Key: NIFI-5327
> URL: https://issues.apache.org/jira/browse/NIFI-5327
> Project: Apache NiFi
>  Issue Type: New Feature
>  Components: Core Framework
>Affects Versions: 1.6.0
>Reporter: Prashanth Venkatesan
>Assignee: Prashanth Venkatesan
>Priority: Major
>
> As network traffic data scopes for the big data use case, would like NiFi to 
> have processors to support parsing of those protocols.
> Netflow is a protocol introduced by Cisco that provides the ability to 
> collect IP network traffic as it enters or exits an interface and is 
> described in detail in here:
> [https://www.cisco.com/c/en/us/td/docs/net_mgmt/netflow_collection_engine/3-6/user/guide/format.html]
>  
> Currently, I have created the following processor:
> *ParseNetflowv5*:  Parses the ingress netflowv5 bytes and ingest as either 
> NiFi flowfile attributes or as a JSON content. This also sends 
> one-time-template.
>  
> Further ahead, we can add many processor specific to network protocols in 
> this nar bundle.
> I will create a pull request.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi pull request #2820: NIFI-5327 Adding Netflowv5 protocol parser

2018-08-31 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2820#discussion_r214501780
  
--- Diff: 
nifi-nar-bundles/nifi-network-bundle/nifi-network-processors/src/main/java/org/apache/nifi/processors/network/ParseNetflowv5.java
 ---
@@ -0,0 +1,258 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ * http://www.apache.org/licenses/LICENSE-2.0
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.network;
+
+import static 
org.apache.nifi.processors.network.parser.Netflowv5Parser.getHeaderFields;
+import static 
org.apache.nifi.processors.network.parser.Netflowv5Parser.getRecordFields;
+
+import java.io.BufferedOutputStream;
+import java.io.IOException;
+import java.io.InputStream;
+import java.io.OutputStream;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.OptionalInt;
+import java.util.Set;
+
+import org.apache.nifi.annotation.behavior.EventDriven;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.InputRequirement.Requirement;
+import org.apache.nifi.annotation.behavior.ReadsAttribute;
+import org.apache.nifi.annotation.behavior.ReadsAttributes;
+import org.apache.nifi.annotation.behavior.SideEffectFree;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.flowfile.attributes.CoreAttributes;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.io.InputStreamCallback;
+import org.apache.nifi.processor.io.OutputStreamCallback;
+import org.apache.nifi.processors.network.parser.Netflowv5Parser;
+import org.apache.nifi.stream.io.StreamUtils;
+
+import com.fasterxml.jackson.core.JsonProcessingException;
+import com.fasterxml.jackson.databind.ObjectMapper;
+import com.fasterxml.jackson.databind.node.ObjectNode;
+
+@EventDriven
+@SideEffectFree
+@SupportsBatching
+@InputRequirement(Requirement.INPUT_REQUIRED)
+@Tags({ "network", "netflow", "attributes", "datagram", "v5", "packet", 
"byte" })
+@CapabilityDescription("Parses netflowv5 byte ingest and add to NiFi 
flowfile as attributes or JSON content.")
+@ReadsAttributes({ @ReadsAttribute(attribute = "udp.port", description = 
"Optionally read if packets are received from UDP datagrams.") })
+@WritesAttributes({ @WritesAttribute(attribute = "netflowv5.header.*", 
description = "The key and value generated by the parsing of the header 
fields."),
+@WritesAttribute(attribute = "netflowv5.record.*", description = 
"The key and value generated by the parsing of the record fields.") })
+
+public class ParseNetflowv5 extends AbstractProcessor {
+private String destination;
+// Add mapper
+private static final ObjectMapper mapper = new ObjectMapper();
+
+public static final String DESTINATION_CONTENT = "flowfile-content";
+public static final String DESTINATION_ATTRIBUTES = 
"flowfile-attribute";
+public static final PropertyDescriptor FIELDS_DESTINATION = new 
PropertyDescriptor.Builder().name("FIELDS_DESTINATION").displayName("Parsed 
fields destination")
+.description("Indicates whether the results of the parser are 
written " + "to the FlowFile content or a FlowFile attribute; if using " + 
D

[jira] [Commented] (NIFI-5327) NetFlow Processors

2018-08-31 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5327?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16599484#comment-16599484
 ] 

ASF GitHub Bot commented on NIFI-5327:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2820#discussion_r214501669
  
--- Diff: 
nifi-nar-bundles/nifi-network-bundle/nifi-network-processors/src/main/java/org/apache/nifi/processors/network/ParseNetflowv5.java
 ---
@@ -0,0 +1,258 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ * http://www.apache.org/licenses/LICENSE-2.0
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.network;
+
+import static 
org.apache.nifi.processors.network.parser.Netflowv5Parser.getHeaderFields;
+import static 
org.apache.nifi.processors.network.parser.Netflowv5Parser.getRecordFields;
+
+import java.io.BufferedOutputStream;
+import java.io.IOException;
+import java.io.InputStream;
+import java.io.OutputStream;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.OptionalInt;
+import java.util.Set;
+
+import org.apache.nifi.annotation.behavior.EventDriven;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.InputRequirement.Requirement;
+import org.apache.nifi.annotation.behavior.ReadsAttribute;
+import org.apache.nifi.annotation.behavior.ReadsAttributes;
+import org.apache.nifi.annotation.behavior.SideEffectFree;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.flowfile.attributes.CoreAttributes;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.io.InputStreamCallback;
+import org.apache.nifi.processor.io.OutputStreamCallback;
+import org.apache.nifi.processors.network.parser.Netflowv5Parser;
+import org.apache.nifi.stream.io.StreamUtils;
+
+import com.fasterxml.jackson.core.JsonProcessingException;
+import com.fasterxml.jackson.databind.ObjectMapper;
+import com.fasterxml.jackson.databind.node.ObjectNode;
+
+@EventDriven
+@SideEffectFree
+@SupportsBatching
+@InputRequirement(Requirement.INPUT_REQUIRED)
+@Tags({ "network", "netflow", "attributes", "datagram", "v5", "packet", 
"byte" })
+@CapabilityDescription("Parses netflowv5 byte ingest and add to NiFi 
flowfile as attributes or JSON content.")
+@ReadsAttributes({ @ReadsAttribute(attribute = "udp.port", description = 
"Optionally read if packets are received from UDP datagrams.") })
+@WritesAttributes({ @WritesAttribute(attribute = "netflowv5.header.*", 
description = "The key and value generated by the parsing of the header 
fields."),
+@WritesAttribute(attribute = "netflowv5.record.*", description = 
"The key and value generated by the parsing of the record fields.") })
+
+public class ParseNetflowv5 extends AbstractProcessor {
+private String destination;
+// Add mapper
+private static final ObjectMapper mapper = new ObjectMapper();
+
+public static final String DESTINATION_CONTENT = "flowfile-content";
+public static final String DESTINATION_ATTRIBUTES = 
"flowfile-attribute";
+public static final PropertyDescriptor FIELDS_DESTINATION = new 
P

[GitHub] nifi pull request #2820: NIFI-5327 Adding Netflowv5 protocol parser

2018-08-31 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2820#discussion_r214501748
  
--- Diff: 
nifi-nar-bundles/nifi-network-bundle/nifi-network-processors/src/main/java/org/apache/nifi/processors/network/ParseNetflowv5.java
 ---
@@ -0,0 +1,258 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ * http://www.apache.org/licenses/LICENSE-2.0
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.network;
+
+import static 
org.apache.nifi.processors.network.parser.Netflowv5Parser.getHeaderFields;
+import static 
org.apache.nifi.processors.network.parser.Netflowv5Parser.getRecordFields;
+
+import java.io.BufferedOutputStream;
+import java.io.IOException;
+import java.io.InputStream;
+import java.io.OutputStream;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.OptionalInt;
+import java.util.Set;
+
+import org.apache.nifi.annotation.behavior.EventDriven;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.InputRequirement.Requirement;
+import org.apache.nifi.annotation.behavior.ReadsAttribute;
+import org.apache.nifi.annotation.behavior.ReadsAttributes;
+import org.apache.nifi.annotation.behavior.SideEffectFree;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.flowfile.attributes.CoreAttributes;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.io.InputStreamCallback;
+import org.apache.nifi.processor.io.OutputStreamCallback;
+import org.apache.nifi.processors.network.parser.Netflowv5Parser;
+import org.apache.nifi.stream.io.StreamUtils;
+
+import com.fasterxml.jackson.core.JsonProcessingException;
+import com.fasterxml.jackson.databind.ObjectMapper;
+import com.fasterxml.jackson.databind.node.ObjectNode;
+
+@EventDriven
+@SideEffectFree
+@SupportsBatching
+@InputRequirement(Requirement.INPUT_REQUIRED)
+@Tags({ "network", "netflow", "attributes", "datagram", "v5", "packet", 
"byte" })
+@CapabilityDescription("Parses netflowv5 byte ingest and add to NiFi 
flowfile as attributes or JSON content.")
+@ReadsAttributes({ @ReadsAttribute(attribute = "udp.port", description = 
"Optionally read if packets are received from UDP datagrams.") })
+@WritesAttributes({ @WritesAttribute(attribute = "netflowv5.header.*", 
description = "The key and value generated by the parsing of the header 
fields."),
+@WritesAttribute(attribute = "netflowv5.record.*", description = 
"The key and value generated by the parsing of the record fields.") })
+
+public class ParseNetflowv5 extends AbstractProcessor {
+private String destination;
+// Add mapper
+private static final ObjectMapper mapper = new ObjectMapper();
+
+public static final String DESTINATION_CONTENT = "flowfile-content";
+public static final String DESTINATION_ATTRIBUTES = 
"flowfile-attribute";
+public static final PropertyDescriptor FIELDS_DESTINATION = new 
PropertyDescriptor.Builder().name("FIELDS_DESTINATION").displayName("Parsed 
fields destination")
+.description("Indicates whether the results of the parser are 
written " + "to the FlowFile content or a FlowFile attribute; if using " + 
D

[GitHub] nifi pull request #2820: NIFI-5327 Adding Netflowv5 protocol parser

2018-08-31 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2820#discussion_r214501765
  
--- Diff: 
nifi-nar-bundles/nifi-network-bundle/nifi-network-processors/src/main/java/org/apache/nifi/processors/network/ParseNetflowv5.java
 ---
@@ -0,0 +1,258 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ * http://www.apache.org/licenses/LICENSE-2.0
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.network;
+
+import static 
org.apache.nifi.processors.network.parser.Netflowv5Parser.getHeaderFields;
+import static 
org.apache.nifi.processors.network.parser.Netflowv5Parser.getRecordFields;
+
+import java.io.BufferedOutputStream;
+import java.io.IOException;
+import java.io.InputStream;
+import java.io.OutputStream;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.OptionalInt;
+import java.util.Set;
+
+import org.apache.nifi.annotation.behavior.EventDriven;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.InputRequirement.Requirement;
+import org.apache.nifi.annotation.behavior.ReadsAttribute;
+import org.apache.nifi.annotation.behavior.ReadsAttributes;
+import org.apache.nifi.annotation.behavior.SideEffectFree;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.flowfile.attributes.CoreAttributes;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.io.InputStreamCallback;
+import org.apache.nifi.processor.io.OutputStreamCallback;
+import org.apache.nifi.processors.network.parser.Netflowv5Parser;
+import org.apache.nifi.stream.io.StreamUtils;
+
+import com.fasterxml.jackson.core.JsonProcessingException;
+import com.fasterxml.jackson.databind.ObjectMapper;
+import com.fasterxml.jackson.databind.node.ObjectNode;
+
+@EventDriven
+@SideEffectFree
+@SupportsBatching
+@InputRequirement(Requirement.INPUT_REQUIRED)
+@Tags({ "network", "netflow", "attributes", "datagram", "v5", "packet", 
"byte" })
+@CapabilityDescription("Parses netflowv5 byte ingest and add to NiFi 
flowfile as attributes or JSON content.")
+@ReadsAttributes({ @ReadsAttribute(attribute = "udp.port", description = 
"Optionally read if packets are received from UDP datagrams.") })
+@WritesAttributes({ @WritesAttribute(attribute = "netflowv5.header.*", 
description = "The key and value generated by the parsing of the header 
fields."),
+@WritesAttribute(attribute = "netflowv5.record.*", description = 
"The key and value generated by the parsing of the record fields.") })
+
+public class ParseNetflowv5 extends AbstractProcessor {
+private String destination;
+// Add mapper
+private static final ObjectMapper mapper = new ObjectMapper();
+
+public static final String DESTINATION_CONTENT = "flowfile-content";
+public static final String DESTINATION_ATTRIBUTES = 
"flowfile-attribute";
+public static final PropertyDescriptor FIELDS_DESTINATION = new 
PropertyDescriptor.Builder().name("FIELDS_DESTINATION").displayName("Parsed 
fields destination")
+.description("Indicates whether the results of the parser are 
written " + "to the FlowFile content or a FlowFile attribute; if using " + 
D

[GitHub] nifi pull request #2820: NIFI-5327 Adding Netflowv5 protocol parser

2018-08-31 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2820#discussion_r214501429
  
--- Diff: nifi-nar-bundles/nifi-network-bundle/nifi-network-utils/pom.xml 
---
@@ -0,0 +1,43 @@
+
+
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+   xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+   <parent>
+   <artifactId>nifi-network-bundle</artifactId>
+   <groupId>org.apache.nifi</groupId>
+   <version>1.8.0-SNAPSHOT</version>
+   </parent>
+   <modelVersion>4.0.0</modelVersion>
+   <artifactId>nifi-network-utils</artifactId>
+   <packaging>jar</packaging>
+   <dependencies>
+   <dependency>
+   <groupId>com.fasterxml.jackson.core</groupId>
+   <artifactId>jackson-databind</artifactId>
+   <version>2.7.8</version>
--- End diff --

Should be 2.9.5 to keep things consistent.


---


[jira] [Commented] (NIFI-5327) NetFlow Processors

2018-08-31 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5327?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16599485#comment-16599485
 ] 

ASF GitHub Bot commented on NIFI-5327:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2820#discussion_r214501748
  
--- Diff: 
nifi-nar-bundles/nifi-network-bundle/nifi-network-processors/src/main/java/org/apache/nifi/processors/network/ParseNetflowv5.java
 ---
@@ -0,0 +1,258 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ * http://www.apache.org/licenses/LICENSE-2.0
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.network;
+
+import static 
org.apache.nifi.processors.network.parser.Netflowv5Parser.getHeaderFields;
+import static 
org.apache.nifi.processors.network.parser.Netflowv5Parser.getRecordFields;
+
+import java.io.BufferedOutputStream;
+import java.io.IOException;
+import java.io.InputStream;
+import java.io.OutputStream;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.OptionalInt;
+import java.util.Set;
+
+import org.apache.nifi.annotation.behavior.EventDriven;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.InputRequirement.Requirement;
+import org.apache.nifi.annotation.behavior.ReadsAttribute;
+import org.apache.nifi.annotation.behavior.ReadsAttributes;
+import org.apache.nifi.annotation.behavior.SideEffectFree;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.flowfile.attributes.CoreAttributes;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.io.InputStreamCallback;
+import org.apache.nifi.processor.io.OutputStreamCallback;
+import org.apache.nifi.processors.network.parser.Netflowv5Parser;
+import org.apache.nifi.stream.io.StreamUtils;
+
+import com.fasterxml.jackson.core.JsonProcessingException;
+import com.fasterxml.jackson.databind.ObjectMapper;
+import com.fasterxml.jackson.databind.node.ObjectNode;
+
+@EventDriven
+@SideEffectFree
+@SupportsBatching
+@InputRequirement(Requirement.INPUT_REQUIRED)
+@Tags({ "network", "netflow", "attributes", "datagram", "v5", "packet", 
"byte" })
+@CapabilityDescription("Parses netflowv5 byte ingest and add to NiFi 
flowfile as attributes or JSON content.")
+@ReadsAttributes({ @ReadsAttribute(attribute = "udp.port", description = 
"Optionally read if packets are received from UDP datagrams.") })
+@WritesAttributes({ @WritesAttribute(attribute = "netflowv5.header.*", 
description = "The key and value generated by the parsing of the header 
fields."),
+@WritesAttribute(attribute = "netflowv5.record.*", description = 
"The key and value generated by the parsing of the record fields.") })
+
+public class ParseNetflowv5 extends AbstractProcessor {
+private String destination;
+// Add mapper
+private static final ObjectMapper mapper = new ObjectMapper();
+
+public static final String DESTINATION_CONTENT = "flowfile-content";
+public static final String DESTINATION_ATTRIBUTES = 
"flowfile-attribute";
+public static final PropertyDescriptor FIELDS_DESTINATION = new 
P

[GitHub] nifi pull request #2820: NIFI-5327 Adding Netflowv5 protocol parser

2018-08-31 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2820#discussion_r214501717
  
--- Diff: 
nifi-nar-bundles/nifi-network-bundle/nifi-network-processors/src/main/java/org/apache/nifi/processors/network/ParseNetflowv5.java
 ---
@@ -0,0 +1,258 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ * http://www.apache.org/licenses/LICENSE-2.0
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.network;
+
+import static 
org.apache.nifi.processors.network.parser.Netflowv5Parser.getHeaderFields;
+import static 
org.apache.nifi.processors.network.parser.Netflowv5Parser.getRecordFields;
+
+import java.io.BufferedOutputStream;
+import java.io.IOException;
+import java.io.InputStream;
+import java.io.OutputStream;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.OptionalInt;
+import java.util.Set;
+
+import org.apache.nifi.annotation.behavior.EventDriven;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.InputRequirement.Requirement;
+import org.apache.nifi.annotation.behavior.ReadsAttribute;
+import org.apache.nifi.annotation.behavior.ReadsAttributes;
+import org.apache.nifi.annotation.behavior.SideEffectFree;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.flowfile.attributes.CoreAttributes;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.io.InputStreamCallback;
+import org.apache.nifi.processor.io.OutputStreamCallback;
+import org.apache.nifi.processors.network.parser.Netflowv5Parser;
+import org.apache.nifi.stream.io.StreamUtils;
+
+import com.fasterxml.jackson.core.JsonProcessingException;
+import com.fasterxml.jackson.databind.ObjectMapper;
+import com.fasterxml.jackson.databind.node.ObjectNode;
+
+@EventDriven
+@SideEffectFree
+@SupportsBatching
+@InputRequirement(Requirement.INPUT_REQUIRED)
+@Tags({ "network", "netflow", "attributes", "datagram", "v5", "packet", 
"byte" })
+@CapabilityDescription("Parses netflowv5 byte ingest and add to NiFi 
flowfile as attributes or JSON content.")
+@ReadsAttributes({ @ReadsAttribute(attribute = "udp.port", description = 
"Optionally read if packets are received from UDP datagrams.") })
+@WritesAttributes({ @WritesAttribute(attribute = "netflowv5.header.*", 
description = "The key and value generated by the parsing of the header 
fields."),
+@WritesAttribute(attribute = "netflowv5.record.*", description = 
"The key and value generated by the parsing of the record fields.") })
+
+public class ParseNetflowv5 extends AbstractProcessor {
+private String destination;
+// Add mapper
+private static final ObjectMapper mapper = new ObjectMapper();
+
+public static final String DESTINATION_CONTENT = "flowfile-content";
+public static final String DESTINATION_ATTRIBUTES = 
"flowfile-attribute";
+public static final PropertyDescriptor FIELDS_DESTINATION = new 
PropertyDescriptor.Builder().name("FIELDS_DESTINATION").displayName("Parsed 
fields destination")
+.description("Indicates whether the results of the parser are 
written " + "to the FlowFile content or a FlowFile attribute; if using " + 
D

[GitHub] nifi pull request #2820: NIFI-5327 Adding Netflowv5 protocol parser

2018-08-31 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2820#discussion_r214501899
  
--- Diff: 
nifi-nar-bundles/nifi-network-bundle/nifi-network-processors/src/main/java/org/apache/nifi/processors/network/ParseNetflowv5.java
 ---
@@ -0,0 +1,258 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ * http://www.apache.org/licenses/LICENSE-2.0
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.network;
+
+import static 
org.apache.nifi.processors.network.parser.Netflowv5Parser.getHeaderFields;
+import static 
org.apache.nifi.processors.network.parser.Netflowv5Parser.getRecordFields;
+
+import java.io.BufferedOutputStream;
+import java.io.IOException;
+import java.io.InputStream;
+import java.io.OutputStream;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.OptionalInt;
+import java.util.Set;
+
+import org.apache.nifi.annotation.behavior.EventDriven;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.InputRequirement.Requirement;
+import org.apache.nifi.annotation.behavior.ReadsAttribute;
+import org.apache.nifi.annotation.behavior.ReadsAttributes;
+import org.apache.nifi.annotation.behavior.SideEffectFree;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.flowfile.attributes.CoreAttributes;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.io.InputStreamCallback;
+import org.apache.nifi.processor.io.OutputStreamCallback;
+import org.apache.nifi.processors.network.parser.Netflowv5Parser;
+import org.apache.nifi.stream.io.StreamUtils;
+
+import com.fasterxml.jackson.core.JsonProcessingException;
+import com.fasterxml.jackson.databind.ObjectMapper;
+import com.fasterxml.jackson.databind.node.ObjectNode;
+
+@EventDriven
+@SideEffectFree
+@SupportsBatching
+@InputRequirement(Requirement.INPUT_REQUIRED)
+@Tags({ "network", "netflow", "attributes", "datagram", "v5", "packet", 
"byte" })
+@CapabilityDescription("Parses netflowv5 byte ingest and add to NiFi 
flowfile as attributes or JSON content.")
+@ReadsAttributes({ @ReadsAttribute(attribute = "udp.port", description = 
"Optionally read if packets are received from UDP datagrams.") })
+@WritesAttributes({ @WritesAttribute(attribute = "netflowv5.header.*", 
description = "The key and value generated by the parsing of the header 
fields."),
+@WritesAttribute(attribute = "netflowv5.record.*", description = 
"The key and value generated by the parsing of the record fields.") })
+
+public class ParseNetflowv5 extends AbstractProcessor {
+private String destination;
+// Add mapper
+private static final ObjectMapper mapper = new ObjectMapper();
+
+public static final String DESTINATION_CONTENT = "flowfile-content";
+public static final String DESTINATION_ATTRIBUTES = 
"flowfile-attribute";
+public static final PropertyDescriptor FIELDS_DESTINATION = new 
PropertyDescriptor.Builder().name("FIELDS_DESTINATION").displayName("Parsed 
fields destination")
+.description("Indicates whether the results of the parser are 
written " + "to the FlowFile content or a FlowFile attribute; if using " + 
D

[jira] [Commented] (NIFI-5327) NetFlow Processors

2018-08-31 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5327?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16599487#comment-16599487
 ] 

ASF GitHub Bot commented on NIFI-5327:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2820#discussion_r214501822
  
--- Diff: 
nifi-nar-bundles/nifi-network-bundle/nifi-network-processors/src/main/java/org/apache/nifi/processors/network/ParseNetflowv5.java
 ---
@@ -0,0 +1,258 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ * http://www.apache.org/licenses/LICENSE-2.0
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.network;
+
+import static 
org.apache.nifi.processors.network.parser.Netflowv5Parser.getHeaderFields;
+import static 
org.apache.nifi.processors.network.parser.Netflowv5Parser.getRecordFields;
+
+import java.io.BufferedOutputStream;
+import java.io.IOException;
+import java.io.InputStream;
+import java.io.OutputStream;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.OptionalInt;
+import java.util.Set;
+
+import org.apache.nifi.annotation.behavior.EventDriven;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.InputRequirement.Requirement;
+import org.apache.nifi.annotation.behavior.ReadsAttribute;
+import org.apache.nifi.annotation.behavior.ReadsAttributes;
+import org.apache.nifi.annotation.behavior.SideEffectFree;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.flowfile.attributes.CoreAttributes;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.io.InputStreamCallback;
+import org.apache.nifi.processor.io.OutputStreamCallback;
+import org.apache.nifi.processors.network.parser.Netflowv5Parser;
+import org.apache.nifi.stream.io.StreamUtils;
+
+import com.fasterxml.jackson.core.JsonProcessingException;
+import com.fasterxml.jackson.databind.ObjectMapper;
+import com.fasterxml.jackson.databind.node.ObjectNode;
+
+@EventDriven
+@SideEffectFree
+@SupportsBatching
+@InputRequirement(Requirement.INPUT_REQUIRED)
+@Tags({ "network", "netflow", "attributes", "datagram", "v5", "packet", 
"byte" })
+@CapabilityDescription("Parses netflowv5 byte ingest and add to NiFi 
flowfile as attributes or JSON content.")
+@ReadsAttributes({ @ReadsAttribute(attribute = "udp.port", description = 
"Optionally read if packets are received from UDP datagrams.") })
+@WritesAttributes({ @WritesAttribute(attribute = "netflowv5.header.*", 
description = "The key and value generated by the parsing of the header 
fields."),
+@WritesAttribute(attribute = "netflowv5.record.*", description = 
"The key and value generated by the parsing of the record fields.") })
+
+public class ParseNetflowv5 extends AbstractProcessor {
+private String destination;
+// Add mapper
+private static final ObjectMapper mapper = new ObjectMapper();
+
+public static final String DESTINATION_CONTENT = "flowfile-content";
+public static final String DESTINATION_ATTRIBUTES = 
"flowfile-attribute";
+public static final PropertyDescriptor FIELDS_DESTINATION = new 
P

[jira] [Commented] (NIFI-5327) NetFlow Processors

2018-08-31 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5327?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16599479#comment-16599479
 ] 

ASF GitHub Bot commented on NIFI-5327:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2820#discussion_r214501416
  
--- Diff: 
nifi-nar-bundles/nifi-network-bundle/nifi-network-processors/src/main/resources/docs/org.apache.nifi.processors.network.ParseNetflowv5/additionalDetails.html
 ---
@@ -0,0 +1,74 @@
+
+
+
+
+
+<title>Netflowv5Parser</title>
+
+
+
+
+   <p>
+   Netflowv5Parser processor parses the ingress netflowv5 datagram format
+   and transfers it either as flowfile attributes or a JSON object.
+   The Netflowv5 format has a predefined schema named "template" for parsing
+   the netflowv5 record. More information: <a href="https://www.cisco.com/c/en/us/td/docs/net_mgmt/netflow_collection_engine/3-6/user/guide/format.html">RFC-netflowv5</a>
+   </p>
--- End diff --

My $0.02 is that we keep this as-is and have a separate, record-based 
version of this. So I'm fine with this.


> NetFlow Processors
> --
>
> Key: NIFI-5327
> URL: https://issues.apache.org/jira/browse/NIFI-5327
> Project: Apache NiFi
>  Issue Type: New Feature
>  Components: Core Framework
>Affects Versions: 1.6.0
>Reporter: Prashanth Venkatesan
>Assignee: Prashanth Venkatesan
>Priority: Major
>
> As network traffic data becomes relevant to big data use cases, we would like NiFi to 
> have processors that support parsing those protocols.
> Netflow is a protocol introduced by Cisco that provides the ability to 
> collect IP network traffic as it enters or exits an interface; it is 
> described in detail here:
> [https://www.cisco.com/c/en/us/td/docs/net_mgmt/netflow_collection_engine/3-6/user/guide/format.html]
>  
> Currently, I have created the following processor:
> *ParseNetflowv5*: Parses the ingress netflowv5 bytes and ingests them as either 
> NiFi flowfile attributes or JSON content. This also sends a one-time template.
>  
> Further ahead, we can add more processors specific to network protocols in 
> this nar bundle.
> I will create a pull request.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5327) NetFlow Processors

2018-08-31 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5327?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16599486#comment-16599486
 ] 

ASF GitHub Bot commented on NIFI-5327:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2820#discussion_r214501765
  
--- Diff: 
nifi-nar-bundles/nifi-network-bundle/nifi-network-processors/src/main/java/org/apache/nifi/processors/network/ParseNetflowv5.java
 ---
@@ -0,0 +1,258 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ * http://www.apache.org/licenses/LICENSE-2.0
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.network;
+
+import static 
org.apache.nifi.processors.network.parser.Netflowv5Parser.getHeaderFields;
+import static 
org.apache.nifi.processors.network.parser.Netflowv5Parser.getRecordFields;
+
+import java.io.BufferedOutputStream;
+import java.io.IOException;
+import java.io.InputStream;
+import java.io.OutputStream;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.OptionalInt;
+import java.util.Set;
+
+import org.apache.nifi.annotation.behavior.EventDriven;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.InputRequirement.Requirement;
+import org.apache.nifi.annotation.behavior.ReadsAttribute;
+import org.apache.nifi.annotation.behavior.ReadsAttributes;
+import org.apache.nifi.annotation.behavior.SideEffectFree;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.flowfile.attributes.CoreAttributes;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.io.InputStreamCallback;
+import org.apache.nifi.processor.io.OutputStreamCallback;
+import org.apache.nifi.processors.network.parser.Netflowv5Parser;
+import org.apache.nifi.stream.io.StreamUtils;
+
+import com.fasterxml.jackson.core.JsonProcessingException;
+import com.fasterxml.jackson.databind.ObjectMapper;
+import com.fasterxml.jackson.databind.node.ObjectNode;
+
+@EventDriven
+@SideEffectFree
+@SupportsBatching
+@InputRequirement(Requirement.INPUT_REQUIRED)
+@Tags({ "network", "netflow", "attributes", "datagram", "v5", "packet", 
"byte" })
+@CapabilityDescription("Parses netflowv5 byte ingest and add to NiFi 
flowfile as attributes or JSON content.")
+@ReadsAttributes({ @ReadsAttribute(attribute = "udp.port", description = 
"Optionally read if packets are received from UDP datagrams.") })
+@WritesAttributes({ @WritesAttribute(attribute = "netflowv5.header.*", 
description = "The key and value generated by the parsing of the header 
fields."),
+@WritesAttribute(attribute = "netflowv5.record.*", description = 
"The key and value generated by the parsing of the record fields.") })
+
+public class ParseNetflowv5 extends AbstractProcessor {
+private String destination;
+// Add mapper
+private static final ObjectMapper mapper = new ObjectMapper();
+
+public static final String DESTINATION_CONTENT = "flowfile-content";
+public static final String DESTINATION_ATTRIBUTES = 
"flowfile-attribute";
+public static final PropertyDescriptor FIELDS_DESTINATION = new 
P

[GitHub] nifi pull request #2820: NIFI-5327 Adding Netflowv5 protocol parser

2018-08-31 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2820#discussion_r214501605
  
--- Diff: 
nifi-nar-bundles/nifi-network-bundle/nifi-network-processors/pom.xml ---
@@ -0,0 +1,67 @@
+
+
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+   xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+   <modelVersion>4.0.0</modelVersion>
+
+   <parent>
+       <groupId>org.apache.nifi</groupId>
+       <artifactId>nifi-network-bundle</artifactId>
+       <version>1.8.0-SNAPSHOT</version>
+   </parent>
+
+   <artifactId>nifi-network-processors</artifactId>
+   <packaging>jar</packaging>
+
+   <dependencies>
+       <dependency>
+           <groupId>org.apache.nifi</groupId>
+           <artifactId>nifi-api</artifactId>
+       </dependency>
+       <dependency>
+           <groupId>org.apache.nifi</groupId>
+           <artifactId>nifi-utils</artifactId>
+           <version>1.8.0-SNAPSHOT</version>
--- End diff --

We should be able to dispense with these manual version numbers.
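
A sketch of what that could look like for the nifi-utils dependency, assuming the version is either supplied by a dependencyManagement section in a parent POM (not shown here) or taken from the reactor via ${project.version}; both illustrative forms below:

    <!-- version omitted: inherited from dependencyManagement in a parent POM -->
    <dependency>
        <groupId>org.apache.nifi</groupId>
        <artifactId>nifi-utils</artifactId>
    </dependency>

    <!-- or: reference the reactor version instead of hard-coding 1.8.0-SNAPSHOT -->
    <dependency>
        <groupId>org.apache.nifi</groupId>
        <artifactId>nifi-utils</artifactId>
        <version>${project.version}</version>
    </dependency>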


---


[jira] [Commented] (NIFI-5327) NetFlow Processors

2018-08-31 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5327?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16599493#comment-16599493
 ] 

ASF GitHub Bot commented on NIFI-5327:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2820#discussion_r214501899
  
--- Diff: 
nifi-nar-bundles/nifi-network-bundle/nifi-network-processors/src/main/java/org/apache/nifi/processors/network/ParseNetflowv5.java
 ---
@@ -0,0 +1,258 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ * http://www.apache.org/licenses/LICENSE-2.0
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.network;
+
+import static 
org.apache.nifi.processors.network.parser.Netflowv5Parser.getHeaderFields;
+import static 
org.apache.nifi.processors.network.parser.Netflowv5Parser.getRecordFields;
+
+import java.io.BufferedOutputStream;
+import java.io.IOException;
+import java.io.InputStream;
+import java.io.OutputStream;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.OptionalInt;
+import java.util.Set;
+
+import org.apache.nifi.annotation.behavior.EventDriven;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.InputRequirement.Requirement;
+import org.apache.nifi.annotation.behavior.ReadsAttribute;
+import org.apache.nifi.annotation.behavior.ReadsAttributes;
+import org.apache.nifi.annotation.behavior.SideEffectFree;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.flowfile.attributes.CoreAttributes;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.io.InputStreamCallback;
+import org.apache.nifi.processor.io.OutputStreamCallback;
+import org.apache.nifi.processors.network.parser.Netflowv5Parser;
+import org.apache.nifi.stream.io.StreamUtils;
+
+import com.fasterxml.jackson.core.JsonProcessingException;
+import com.fasterxml.jackson.databind.ObjectMapper;
+import com.fasterxml.jackson.databind.node.ObjectNode;
+
+@EventDriven
+@SideEffectFree
+@SupportsBatching
+@InputRequirement(Requirement.INPUT_REQUIRED)
+@Tags({ "network", "netflow", "attributes", "datagram", "v5", "packet", 
"byte" })
+@CapabilityDescription("Parses netflowv5 byte ingest and add to NiFi 
flowfile as attributes or JSON content.")
+@ReadsAttributes({ @ReadsAttribute(attribute = "udp.port", description = 
"Optionally read if packets are received from UDP datagrams.") })
+@WritesAttributes({ @WritesAttribute(attribute = "netflowv5.header.*", 
description = "The key and value generated by the parsing of the header 
fields."),
+@WritesAttribute(attribute = "netflowv5.record.*", description = 
"The key and value generated by the parsing of the record fields.") })
+
+public class ParseNetflowv5 extends AbstractProcessor {
+private String destination;
+// Add mapper
+private static final ObjectMapper mapper = new ObjectMapper();
+
+public static final String DESTINATION_CONTENT = "flowfile-content";
+public static final String DESTINATION_ATTRIBUTES = 
"flowfile-attribute";
+public static final PropertyDescriptor FIELDS_DESTINATION = new 
P

[GitHub] nifi pull request #2820: NIFI-5327 Adding Netflowv5 protocol parser

2018-08-31 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2820#discussion_r214501416
  
--- Diff: 
nifi-nar-bundles/nifi-network-bundle/nifi-network-processors/src/main/resources/docs/org.apache.nifi.processors.network.ParseNetflowv5/additionalDetails.html
 ---
@@ -0,0 +1,74 @@
+
+
+
+
+
+<title>Netflowv5Parser</title>
+
+
+
+
+   <p>
+   Netflowv5Parser processor parses the ingress netflowv5 datagram format
+   and transfers it either as flowfile attributes or a JSON object.
+   The Netflowv5 format has a predefined schema named "template" for parsing
+   the netflowv5 record. More information: <a href="https://www.cisco.com/c/en/us/td/docs/net_mgmt/netflow_collection_engine/3-6/user/guide/format.html">RFC-netflowv5</a>
+   </p>
--- End diff --

My $0.02 is that we keep this as-is and have a separate, record-based 
version of this. So I'm fine with this.


---


[GitHub] nifi pull request #2820: NIFI-5327 Adding Netflowv5 protocol parser

2018-08-31 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2820#discussion_r214501669
  
--- Diff: 
nifi-nar-bundles/nifi-network-bundle/nifi-network-processors/src/main/java/org/apache/nifi/processors/network/ParseNetflowv5.java
 ---
@@ -0,0 +1,258 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ * http://www.apache.org/licenses/LICENSE-2.0
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.network;
+
+import static 
org.apache.nifi.processors.network.parser.Netflowv5Parser.getHeaderFields;
+import static 
org.apache.nifi.processors.network.parser.Netflowv5Parser.getRecordFields;
+
+import java.io.BufferedOutputStream;
+import java.io.IOException;
+import java.io.InputStream;
+import java.io.OutputStream;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.OptionalInt;
+import java.util.Set;
+
+import org.apache.nifi.annotation.behavior.EventDriven;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.InputRequirement.Requirement;
+import org.apache.nifi.annotation.behavior.ReadsAttribute;
+import org.apache.nifi.annotation.behavior.ReadsAttributes;
+import org.apache.nifi.annotation.behavior.SideEffectFree;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.flowfile.attributes.CoreAttributes;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.io.InputStreamCallback;
+import org.apache.nifi.processor.io.OutputStreamCallback;
+import org.apache.nifi.processors.network.parser.Netflowv5Parser;
+import org.apache.nifi.stream.io.StreamUtils;
+
+import com.fasterxml.jackson.core.JsonProcessingException;
+import com.fasterxml.jackson.databind.ObjectMapper;
+import com.fasterxml.jackson.databind.node.ObjectNode;
+
+@EventDriven
+@SideEffectFree
+@SupportsBatching
+@InputRequirement(Requirement.INPUT_REQUIRED)
+@Tags({ "network", "netflow", "attributes", "datagram", "v5", "packet", 
"byte" })
+@CapabilityDescription("Parses netflowv5 byte ingest and add to NiFi 
flowfile as attributes or JSON content.")
+@ReadsAttributes({ @ReadsAttribute(attribute = "udp.port", description = 
"Optionally read if packets are received from UDP datagrams.") })
+@WritesAttributes({ @WritesAttribute(attribute = "netflowv5.header.*", 
description = "The key and value generated by the parsing of the header 
fields."),
+@WritesAttribute(attribute = "netflowv5.record.*", description = 
"The key and value generated by the parsing of the record fields.") })
+
+public class ParseNetflowv5 extends AbstractProcessor {
+private String destination;
+// Add mapper
+private static final ObjectMapper mapper = new ObjectMapper();
+
+public static final String DESTINATION_CONTENT = "flowfile-content";
+public static final String DESTINATION_ATTRIBUTES = 
"flowfile-attribute";
+public static final PropertyDescriptor FIELDS_DESTINATION = new 
PropertyDescriptor.Builder().name("FIELDS_DESTINATION").displayName("Parsed 
fields destination")
+.description("Indicates whether the results of the parser are 
written " + "to the FlowFile content or a FlowFile attribute; if using " + 
D

[GitHub] nifi pull request #2820: NIFI-5327 Adding Netflowv5 protocol parser

2018-08-31 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2820#discussion_r214501991
  
--- Diff: 
nifi-nar-bundles/nifi-network-bundle/nifi-network-processors/src/main/java/org/apache/nifi/processors/network/ParseNetflowv5.java
 ---
@@ -0,0 +1,258 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ * http://www.apache.org/licenses/LICENSE-2.0
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.network;
+
+import static 
org.apache.nifi.processors.network.parser.Netflowv5Parser.getHeaderFields;
+import static 
org.apache.nifi.processors.network.parser.Netflowv5Parser.getRecordFields;
+
+import java.io.BufferedOutputStream;
+import java.io.IOException;
+import java.io.InputStream;
+import java.io.OutputStream;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.OptionalInt;
+import java.util.Set;
+
+import org.apache.nifi.annotation.behavior.EventDriven;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.InputRequirement.Requirement;
+import org.apache.nifi.annotation.behavior.ReadsAttribute;
+import org.apache.nifi.annotation.behavior.ReadsAttributes;
+import org.apache.nifi.annotation.behavior.SideEffectFree;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.flowfile.attributes.CoreAttributes;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.io.InputStreamCallback;
+import org.apache.nifi.processor.io.OutputStreamCallback;
+import org.apache.nifi.processors.network.parser.Netflowv5Parser;
+import org.apache.nifi.stream.io.StreamUtils;
+
+import com.fasterxml.jackson.core.JsonProcessingException;
+import com.fasterxml.jackson.databind.ObjectMapper;
+import com.fasterxml.jackson.databind.node.ObjectNode;
+
+@EventDriven
+@SideEffectFree
+@SupportsBatching
+@InputRequirement(Requirement.INPUT_REQUIRED)
+@Tags({ "network", "netflow", "attributes", "datagram", "v5", "packet", 
"byte" })
+@CapabilityDescription("Parses netflowv5 byte ingest and add to NiFi 
flowfile as attributes or JSON content.")
+@ReadsAttributes({ @ReadsAttribute(attribute = "udp.port", description = 
"Optionally read if packets are received from UDP datagrams.") })
+@WritesAttributes({ @WritesAttribute(attribute = "netflowv5.header.*", 
description = "The key and value generated by the parsing of the header 
fields."),
+@WritesAttribute(attribute = "netflowv5.record.*", description = 
"The key and value generated by the parsing of the record fields.") })
+
+public class ParseNetflowv5 extends AbstractProcessor {
+private String destination;
+// Add mapper
+private static final ObjectMapper mapper = new ObjectMapper();
+
+public static final String DESTINATION_CONTENT = "flowfile-content";
+public static final String DESTINATION_ATTRIBUTES = 
"flowfile-attribute";
+public static final PropertyDescriptor FIELDS_DESTINATION = new 
PropertyDescriptor.Builder().name("FIELDS_DESTINATION").displayName("Parsed 
fields destination")
+.description("Indicates whether the results of the parser are 
written " + "to the FlowFile content or a FlowFile attribute; if using " + 
D

[jira] [Commented] (NIFI-5327) NetFlow Processors

2018-08-31 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5327?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16599491#comment-16599491
 ] 

ASF GitHub Bot commented on NIFI-5327:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2820#discussion_r214502000
  
--- Diff: 
nifi-nar-bundles/nifi-network-bundle/nifi-network-processors/src/main/java/org/apache/nifi/processors/network/ParseNetflowv5.java
 ---
@@ -0,0 +1,258 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ * http://www.apache.org/licenses/LICENSE-2.0
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.network;
+
+import static 
org.apache.nifi.processors.network.parser.Netflowv5Parser.getHeaderFields;
+import static 
org.apache.nifi.processors.network.parser.Netflowv5Parser.getRecordFields;
+
+import java.io.BufferedOutputStream;
+import java.io.IOException;
+import java.io.InputStream;
+import java.io.OutputStream;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.OptionalInt;
+import java.util.Set;
+
+import org.apache.nifi.annotation.behavior.EventDriven;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.InputRequirement.Requirement;
+import org.apache.nifi.annotation.behavior.ReadsAttribute;
+import org.apache.nifi.annotation.behavior.ReadsAttributes;
+import org.apache.nifi.annotation.behavior.SideEffectFree;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.flowfile.attributes.CoreAttributes;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.io.InputStreamCallback;
+import org.apache.nifi.processor.io.OutputStreamCallback;
+import org.apache.nifi.processors.network.parser.Netflowv5Parser;
+import org.apache.nifi.stream.io.StreamUtils;
+
+import com.fasterxml.jackson.core.JsonProcessingException;
+import com.fasterxml.jackson.databind.ObjectMapper;
+import com.fasterxml.jackson.databind.node.ObjectNode;
+
+@EventDriven
+@SideEffectFree
+@SupportsBatching
+@InputRequirement(Requirement.INPUT_REQUIRED)
+@Tags({ "network", "netflow", "attributes", "datagram", "v5", "packet", 
"byte" })
+@CapabilityDescription("Parses netflowv5 byte ingest and add to NiFi 
flowfile as attributes or JSON content.")
+@ReadsAttributes({ @ReadsAttribute(attribute = "udp.port", description = 
"Optionally read if packets are received from UDP datagrams.") })
+@WritesAttributes({ @WritesAttribute(attribute = "netflowv5.header.*", 
description = "The key and value generated by the parsing of the header 
fields."),
+@WritesAttribute(attribute = "netflowv5.record.*", description = 
"The key and value generated by the parsing of the record fields.") })
+
+public class ParseNetflowv5 extends AbstractProcessor {
+private String destination;
+// Add mapper
+private static final ObjectMapper mapper = new ObjectMapper();
+
+public static final String DESTINATION_CONTENT = "flowfile-content";
+public static final String DESTINATION_ATTRIBUTES = 
"flowfile-attribute";
+public static final PropertyDescriptor FIELDS_DESTINATION = new 
P

[jira] [Commented] (NIFI-5327) NetFlow Processors

2018-08-31 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5327?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16599481#comment-16599481
 ] 

ASF GitHub Bot commented on NIFI-5327:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2820#discussion_r214501429
  
--- Diff: nifi-nar-bundles/nifi-network-bundle/nifi-network-utils/pom.xml 
---
@@ -0,0 +1,43 @@
+
+
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+   xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+   <parent>
+       <artifactId>nifi-network-bundle</artifactId>
+       <groupId>org.apache.nifi</groupId>
+       <version>1.8.0-SNAPSHOT</version>
+   </parent>
+   <modelVersion>4.0.0</modelVersion>
+   <artifactId>nifi-network-utils</artifactId>
+   <packaging>jar</packaging>
+   <dependencies>
+       <dependency>
+           <groupId>com.fasterxml.jackson.core</groupId>
+           <artifactId>jackson-databind</artifactId>
+           <version>2.7.8</version>
--- End diff --

Should be 2.9.5 to keep things consistent.
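
For reference, a sketch of the suggested bump, assuming 2.9.5 is the jackson-databind version used elsewhere in the build as the comment implies:

    <dependency>
        <groupId>com.fasterxml.jackson.core</groupId>
        <artifactId>jackson-databind</artifactId>
        <!-- was 2.7.8; aligned with the rest of the build -->
        <version>2.9.5</version>
    </dependency>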


> NetFlow Processors
> --
>
> Key: NIFI-5327
> URL: https://issues.apache.org/jira/browse/NIFI-5327
> Project: Apache NiFi
>  Issue Type: New Feature
>  Components: Core Framework
>Affects Versions: 1.6.0
>Reporter: Prashanth Venkatesan
>Assignee: Prashanth Venkatesan
>Priority: Major
>
> As network traffic data becomes relevant to big data use cases, we would like NiFi to 
> have processors that support parsing those protocols.
> Netflow is a protocol introduced by Cisco that provides the ability to 
> collect IP network traffic as it enters or exits an interface; it is 
> described in detail here:
> [https://www.cisco.com/c/en/us/td/docs/net_mgmt/netflow_collection_engine/3-6/user/guide/format.html]
>  
> Currently, I have created the following processor:
> *ParseNetflowv5*: Parses the ingress netflowv5 bytes and ingests them as either 
> NiFi flowfile attributes or JSON content. This also sends a one-time template.
>  
> Further ahead, we can add more processors specific to network protocols in 
> this nar bundle.
> I will create a pull request.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi pull request #2820: NIFI-5327 Adding Netflowv5 protocol parser

2018-08-31 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2820#discussion_r214501922
  
--- Diff: 
nifi-nar-bundles/nifi-network-bundle/nifi-network-processors/src/main/java/org/apache/nifi/processors/network/ParseNetflowv5.java
 ---
@@ -0,0 +1,258 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ * http://www.apache.org/licenses/LICENSE-2.0
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.network;
+
+import static 
org.apache.nifi.processors.network.parser.Netflowv5Parser.getHeaderFields;
+import static 
org.apache.nifi.processors.network.parser.Netflowv5Parser.getRecordFields;
+
+import java.io.BufferedOutputStream;
+import java.io.IOException;
+import java.io.InputStream;
+import java.io.OutputStream;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.OptionalInt;
+import java.util.Set;
+
+import org.apache.nifi.annotation.behavior.EventDriven;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.InputRequirement.Requirement;
+import org.apache.nifi.annotation.behavior.ReadsAttribute;
+import org.apache.nifi.annotation.behavior.ReadsAttributes;
+import org.apache.nifi.annotation.behavior.SideEffectFree;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.flowfile.attributes.CoreAttributes;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.io.InputStreamCallback;
+import org.apache.nifi.processor.io.OutputStreamCallback;
+import org.apache.nifi.processors.network.parser.Netflowv5Parser;
+import org.apache.nifi.stream.io.StreamUtils;
+
+import com.fasterxml.jackson.core.JsonProcessingException;
+import com.fasterxml.jackson.databind.ObjectMapper;
+import com.fasterxml.jackson.databind.node.ObjectNode;
+
+@EventDriven
+@SideEffectFree
+@SupportsBatching
+@InputRequirement(Requirement.INPUT_REQUIRED)
+@Tags({ "network", "netflow", "attributes", "datagram", "v5", "packet", 
"byte" })
+@CapabilityDescription("Parses netflowv5 byte ingest and add to NiFi 
flowfile as attributes or JSON content.")
+@ReadsAttributes({ @ReadsAttribute(attribute = "udp.port", description = 
"Optionally read if packets are received from UDP datagrams.") })
+@WritesAttributes({ @WritesAttribute(attribute = "netflowv5.header.*", 
description = "The key and value generated by the parsing of the header 
fields."),
+@WritesAttribute(attribute = "netflowv5.record.*", description = 
"The key and value generated by the parsing of the record fields.") })
+
+public class ParseNetflowv5 extends AbstractProcessor {
+private String destination;
+// Add mapper
+private static final ObjectMapper mapper = new ObjectMapper();
+
+public static final String DESTINATION_CONTENT = "flowfile-content";
+public static final String DESTINATION_ATTRIBUTES = 
"flowfile-attribute";
+public static final PropertyDescriptor FIELDS_DESTINATION = new 
PropertyDescriptor.Builder().name("FIELDS_DESTINATION").displayName("Parsed 
fields destination")
+.description("Indicates whether the results of the parser are 
written " + "to the FlowFile content or a FlowFile attribute; if using " + 
D

[jira] [Commented] (NIFI-5327) NetFlow Processors

2018-08-31 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5327?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16599492#comment-16599492
 ] 

ASF GitHub Bot commented on NIFI-5327:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2820#discussion_r214501780
  
--- Diff: 
nifi-nar-bundles/nifi-network-bundle/nifi-network-processors/src/main/java/org/apache/nifi/processors/network/ParseNetflowv5.java
 ---
@@ -0,0 +1,258 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ * http://www.apache.org/licenses/LICENSE-2.0
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.network;
+
+import static 
org.apache.nifi.processors.network.parser.Netflowv5Parser.getHeaderFields;
+import static 
org.apache.nifi.processors.network.parser.Netflowv5Parser.getRecordFields;
+
+import java.io.BufferedOutputStream;
+import java.io.IOException;
+import java.io.InputStream;
+import java.io.OutputStream;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.OptionalInt;
+import java.util.Set;
+
+import org.apache.nifi.annotation.behavior.EventDriven;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.InputRequirement.Requirement;
+import org.apache.nifi.annotation.behavior.ReadsAttribute;
+import org.apache.nifi.annotation.behavior.ReadsAttributes;
+import org.apache.nifi.annotation.behavior.SideEffectFree;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.flowfile.attributes.CoreAttributes;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.io.InputStreamCallback;
+import org.apache.nifi.processor.io.OutputStreamCallback;
+import org.apache.nifi.processors.network.parser.Netflowv5Parser;
+import org.apache.nifi.stream.io.StreamUtils;
+
+import com.fasterxml.jackson.core.JsonProcessingException;
+import com.fasterxml.jackson.databind.ObjectMapper;
+import com.fasterxml.jackson.databind.node.ObjectNode;
+
+@EventDriven
+@SideEffectFree
+@SupportsBatching
+@InputRequirement(Requirement.INPUT_REQUIRED)
+@Tags({ "network", "netflow", "attributes", "datagram", "v5", "packet", 
"byte" })
+@CapabilityDescription("Parses netflowv5 byte ingest and add to NiFi 
flowfile as attributes or JSON content.")
+@ReadsAttributes({ @ReadsAttribute(attribute = "udp.port", description = 
"Optionally read if packets are received from UDP datagrams.") })
+@WritesAttributes({ @WritesAttribute(attribute = "netflowv5.header.*", 
description = "The key and value generated by the parsing of the header 
fields."),
+@WritesAttribute(attribute = "netflowv5.record.*", description = 
"The key and value generated by the parsing of the record fields.") })
+
+public class ParseNetflowv5 extends AbstractProcessor {
+private String destination;
+// Add mapper
+private static final ObjectMapper mapper = new ObjectMapper();
+
+public static final String DESTINATION_CONTENT = "flowfile-content";
+public static final String DESTINATION_ATTRIBUTES = 
"flowfile-attribute";
+public static final PropertyDescriptor FIELDS_DESTINATION = new 
P

[jira] [Commented] (NIFI-5327) NetFlow Processors

2018-08-31 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5327?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16599489#comment-16599489
 ] 

ASF GitHub Bot commented on NIFI-5327:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2820#discussion_r214501922
  
--- Diff: 
nifi-nar-bundles/nifi-network-bundle/nifi-network-processors/src/main/java/org/apache/nifi/processors/network/ParseNetflowv5.java
 ---
@@ -0,0 +1,258 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ * http://www.apache.org/licenses/LICENSE-2.0
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.network;
+
+import static 
org.apache.nifi.processors.network.parser.Netflowv5Parser.getHeaderFields;
+import static 
org.apache.nifi.processors.network.parser.Netflowv5Parser.getRecordFields;
+
+import java.io.BufferedOutputStream;
+import java.io.IOException;
+import java.io.InputStream;
+import java.io.OutputStream;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.OptionalInt;
+import java.util.Set;
+
+import org.apache.nifi.annotation.behavior.EventDriven;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.InputRequirement.Requirement;
+import org.apache.nifi.annotation.behavior.ReadsAttribute;
+import org.apache.nifi.annotation.behavior.ReadsAttributes;
+import org.apache.nifi.annotation.behavior.SideEffectFree;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.flowfile.attributes.CoreAttributes;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.io.InputStreamCallback;
+import org.apache.nifi.processor.io.OutputStreamCallback;
+import org.apache.nifi.processors.network.parser.Netflowv5Parser;
+import org.apache.nifi.stream.io.StreamUtils;
+
+import com.fasterxml.jackson.core.JsonProcessingException;
+import com.fasterxml.jackson.databind.ObjectMapper;
+import com.fasterxml.jackson.databind.node.ObjectNode;
+
+@EventDriven
+@SideEffectFree
+@SupportsBatching
+@InputRequirement(Requirement.INPUT_REQUIRED)
+@Tags({ "network", "netflow", "attributes", "datagram", "v5", "packet", 
"byte" })
+@CapabilityDescription("Parses netflowv5 byte ingest and add to NiFi 
flowfile as attributes or JSON content.")
+@ReadsAttributes({ @ReadsAttribute(attribute = "udp.port", description = 
"Optionally read if packets are received from UDP datagrams.") })
+@WritesAttributes({ @WritesAttribute(attribute = "netflowv5.header.*", 
description = "The key and value generated by the parsing of the header 
fields."),
+@WritesAttribute(attribute = "netflowv5.record.*", description = 
"The key and value generated by the parsing of the record fields.") })
+
+public class ParseNetflowv5 extends AbstractProcessor {
+private String destination;
+// Add mapper
+private static final ObjectMapper mapper = new ObjectMapper();
+
+public static final String DESTINATION_CONTENT = "flowfile-content";
+public static final String DESTINATION_ATTRIBUTES = 
"flowfile-attribute";
+public static final PropertyDescriptor FIELDS_DESTINATION = new 
P

[GitHub] nifi pull request #2820: NIFI-5327 Adding Netflowv5 protocol parser

2018-08-31 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2820#discussion_r214502000
  
--- Diff: 
nifi-nar-bundles/nifi-network-bundle/nifi-network-processors/src/main/java/org/apache/nifi/processors/network/ParseNetflowv5.java
 ---
@@ -0,0 +1,258 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ * http://www.apache.org/licenses/LICENSE-2.0
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.network;
+
+import static 
org.apache.nifi.processors.network.parser.Netflowv5Parser.getHeaderFields;
+import static 
org.apache.nifi.processors.network.parser.Netflowv5Parser.getRecordFields;
+
+import java.io.BufferedOutputStream;
+import java.io.IOException;
+import java.io.InputStream;
+import java.io.OutputStream;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.OptionalInt;
+import java.util.Set;
+
+import org.apache.nifi.annotation.behavior.EventDriven;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.InputRequirement.Requirement;
+import org.apache.nifi.annotation.behavior.ReadsAttribute;
+import org.apache.nifi.annotation.behavior.ReadsAttributes;
+import org.apache.nifi.annotation.behavior.SideEffectFree;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.flowfile.attributes.CoreAttributes;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.io.InputStreamCallback;
+import org.apache.nifi.processor.io.OutputStreamCallback;
+import org.apache.nifi.processors.network.parser.Netflowv5Parser;
+import org.apache.nifi.stream.io.StreamUtils;
+
+import com.fasterxml.jackson.core.JsonProcessingException;
+import com.fasterxml.jackson.databind.ObjectMapper;
+import com.fasterxml.jackson.databind.node.ObjectNode;
+
+@EventDriven
+@SideEffectFree
+@SupportsBatching
+@InputRequirement(Requirement.INPUT_REQUIRED)
+@Tags({ "network", "netflow", "attributes", "datagram", "v5", "packet", 
"byte" })
+@CapabilityDescription("Parses netflowv5 byte ingest and add to NiFi 
flowfile as attributes or JSON content.")
+@ReadsAttributes({ @ReadsAttribute(attribute = "udp.port", description = 
"Optionally read if packets are received from UDP datagrams.") })
+@WritesAttributes({ @WritesAttribute(attribute = "netflowv5.header.*", 
description = "The key and value generated by the parsing of the header 
fields."),
+@WritesAttribute(attribute = "netflowv5.record.*", description = 
"The key and value generated by the parsing of the record fields.") })
+
+public class ParseNetflowv5 extends AbstractProcessor {
+private String destination;
+// Add mapper
+private static final ObjectMapper mapper = new ObjectMapper();
+
+public static final String DESTINATION_CONTENT = "flowfile-content";
+public static final String DESTINATION_ATTRIBUTES = 
"flowfile-attribute";
+public static final PropertyDescriptor FIELDS_DESTINATION = new 
PropertyDescriptor.Builder().name("FIELDS_DESTINATION").displayName("Parsed 
fields destination")
+.description("Indicates whether the results of the parser are 
written " + "to the FlowFile content or a FlowFile attribute; if using " + 
D
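
The DESTINATION_CONTENT / DESTINATION_ATTRIBUTES constants quoted above decide where the 
parsed fields end up. As a rough, hedged sketch only (this is not the PR's actual onTrigger 
logic; the `fields` map, the helper class name and the exact mapping are illustrative), the 
two destinations translate into ProcessSession calls roughly as follows:

    // Sketch only: assumes a parsed Map<String, String> named "fields"; the real
    // ParseNetflowv5 code in PR #2820 may structure this differently.
    import java.util.HashMap;
    import java.util.Map;

    import org.apache.nifi.flowfile.FlowFile;
    import org.apache.nifi.flowfile.attributes.CoreAttributes;
    import org.apache.nifi.processor.ProcessSession;

    import com.fasterxml.jackson.databind.ObjectMapper;
    import com.fasterxml.jackson.databind.node.ObjectNode;

    class DestinationSketch {
        private static final ObjectMapper MAPPER = new ObjectMapper();

        FlowFile write(ProcessSession session, FlowFile flowFile,
                       Map<String, String> fields, String destination) {
            if ("flowfile-attribute".equals(destination)) {
                // Prefix the keys so they show up as netflowv5.record.* attributes.
                Map<String, String> attributes = new HashMap<>();
                fields.forEach((k, v) -> attributes.put("netflowv5.record." + k, v));
                return session.putAllAttributes(flowFile, attributes);
            }
            // Otherwise serialize the parsed fields as JSON flowfile content.
            ObjectNode node = MAPPER.createObjectNode();
            fields.forEach(node::put);
            flowFile = session.write(flowFile, out -> out.write(MAPPER.writeValueAsBytes(node)));
            return session.putAttribute(flowFile, CoreAttributes.MIME_TYPE.key(), "application/json");
        }
    }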

[jira] [Created] (NIFI-5566) Bring HashContent inline with HashService and rename legacy components

2018-08-31 Thread Andy LoPresto (JIRA)
Andy LoPresto created NIFI-5566:
---

 Summary: Bring HashContent inline with HashService and rename 
legacy components
 Key: NIFI-5566
 URL: https://issues.apache.org/jira/browse/NIFI-5566
 Project: Apache NiFi
  Issue Type: Improvement
  Components: Extensions
Affects Versions: 1.7.1
Reporter: Andy LoPresto
Assignee: Andy LoPresto


As documented in [NIFI-5147|https://issues.apache.org/jira/browse/NIFI-5147] 
and [PR 2980|https://github.com/apache/nifi/pull/2980], the {{HashAttribute}} 
processor and {{HashContent}} processor are lacking some features, do not offer 
consistent algorithms across platforms, etc. 

I propose the following:
* Rename {{HashAttribute}} (which does not provide the service of calculating a 
hash over one or more attributes) to {{HashAttributeLegacy}}
* Rename {{CalculateAttributeHash}} to {{HashAttribute}} to make semantic sense
* Rename {{HashContent}} to {{HashContentLegacy}} for users who need obscure 
digest algorithms which may or may not have been offered on their platform
* Implement a processor {{HashContent}} with similar semantics to the existing 
processor but with consistent algorithm offerings and using the common 
{{HashService}} offering

With the new component versioning features provided as part of the flow 
versioning behavior, silently disrupting existing flows which use these 
processors is no longer a concern. Rather, Any flow currently using the 
existing processors will either:

1. continue normal operation
1. require flow manager interaction and provide documentation about the change
  1. migration notes and upgrade instructions will be provided
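
For reference, the consistent cross-platform behaviour asked for above mostly comes down to 
naming the digest algorithm explicitly and hex-encoding the result with the JDK. A minimal 
sketch, assuming plain MessageDigest rather than the actual HashService API (whose method 
names may differ):

    import java.nio.charset.StandardCharsets;
    import java.security.MessageDigest;
    import java.security.NoSuchAlgorithmException;

    final class DigestSketch {
        // Hash the given bytes with an explicitly named algorithm (e.g. "SHA-256", "SHA-512")
        // and return the lowercase hex digest, so results match on every JVM and platform.
        static String hash(String algorithm, byte[] content) throws NoSuchAlgorithmException {
            byte[] digest = MessageDigest.getInstance(algorithm).digest(content);
            StringBuilder hex = new StringBuilder(digest.length * 2);
            for (byte b : digest) {
                hex.append(String.format("%02x", b));
            }
            return hex.toString();
        }

        public static void main(String[] args) throws Exception {
            System.out.println(hash("SHA-512", "hello".getBytes(StandardCharsets.UTF_8)));
        }
    }

The same helper works whether the input is an attribute value or the full flowfile content, 
which is what lets a renamed HashAttribute and a new HashContent share one common service.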



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5318) Implement NiFi test harness

2018-08-31 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5318?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16599376#comment-16599376
 ] 

ASF GitHub Bot commented on NIFI-5318:
--

Github user MikeThomsen commented on the issue:

https://github.com/apache/nifi/pull/2872
  
@peter-gergely-horvath can you push that commit?


> Implement NiFi test harness
> ---
>
> Key: NIFI-5318
> URL: https://issues.apache.org/jira/browse/NIFI-5318
> Project: Apache NiFi
>  Issue Type: New Feature
>Reporter: Peter Horvath
>Priority: Major
>
> Currently, it is not really possible to automatically test the behaviour of a 
> specific NiFi flow and make unit test type asserts if it works as expected. 
> For example, if the expected behaviour of a NiFi flow is that a file placed 
> to a specific directory will trigger some operation after which some output 
> file will appear at another directory, one can currently only do one thing: 
> test the NiFi flow manually. 
> Manual testing is especially hard to manage if a NiFi flow is being actively 
> developed: any change to a complex, existing NiFi flow might require a lot of 
> manual testing just to ensure there are no regressions introduced. 
> Some kind of Java API that allows managing a NiFi instance and manipulating 
> flow deployments like for example, [Codehaus 
> Cargo|https://codehaus-cargo.github.io/] would be of great help. 
>  
>  
>  
>  
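
To make the request concrete, the kind of Java API being described might look something like 
the sketch below. Every type and method name here (NiFiTestHarness, deployFlow, and so on) is 
hypothetical and is not taken from PR #2872:

    import java.nio.file.Path;
    import java.time.Duration;

    // Hypothetical test-harness API, sketched only to illustrate the idea in the ticket.
    public interface NiFiTestHarness extends AutoCloseable {

        // Start a throwaway NiFi instance for the duration of the test.
        void start();

        // Deploy (or replace) the flow definition under test.
        void deployFlow(Path flowDefinition);

        // Drop a file into an input directory watched by the flow.
        void placeInputFile(Path file, Path inputDirectory);

        // Block until an output file appears, or fail the test after the timeout.
        Path awaitOutputFile(Path outputDirectory, Duration timeout) throws InterruptedException;

        @Override
        void close(); // stop and clean up the NiFi instance
    }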



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi issue #2872: NIFI-5318 Implement NiFi test harness: initial commit of n...

2018-08-31 Thread MikeThomsen
Github user MikeThomsen commented on the issue:

https://github.com/apache/nifi/pull/2872
  
@peter-gergely-horvath can you push that commit?


---


[jira] [Resolved] (NIFI-5408) Add documentation for NIFI-4279

2018-08-31 Thread Mike Thomsen (JIRA)


 [ 
https://issues.apache.org/jira/browse/NIFI-5408?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mike Thomsen resolved NIFI-5408.

   Resolution: Fixed
Fix Version/s: 1.8.0

> Add documentation for NIFI-4279
> ---
>
> Key: NIFI-5408
> URL: https://issues.apache.org/jira/browse/NIFI-5408
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Reporter: Lars Francke
>Assignee: Lars Francke
>Priority: Minor
> Fix For: 1.8.0
>
>
> NIFI-4279 changed the order in which columns are read.
> All this patch does is add a javadoc comment to the code pointing to the 
> relevant issue. The issue is not well known and worth documenting



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5408) Add documentation for NIFI-4279

2018-08-31 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5408?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16599375#comment-16599375
 ] 

ASF GitHub Bot commented on NIFI-5408:
--

Github user asfgit closed the pull request at:

https://github.com/apache/nifi/pull/2878


> Add documentation for NIFI-4279
> ---
>
> Key: NIFI-5408
> URL: https://issues.apache.org/jira/browse/NIFI-5408
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Reporter: Lars Francke
>Assignee: Lars Francke
>Priority: Minor
> Fix For: 1.8.0
>
>
> NIFI-4279 changed the order in which columns are read.
> All this patch does is add a javadoc comment to the code pointing to the 
> relevant issue. The issue is not well known and worth documenting



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi pull request #2878: NIFI-5408 Add documentation for NIFI-4279

2018-08-31 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/nifi/pull/2878


---


[jira] [Commented] (NIFI-4279) PutDataBaseRecord and ConvertJSONToSQL stream has already been closed

2018-08-31 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-4279?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16599374#comment-16599374
 ] 

ASF subversion and git services commented on NIFI-4279:
---

Commit 1b73578c48d66f6bab7b90f49cbe837a4d7169c0 in nifi's branch 
refs/heads/master from [~lars_francke]
[ https://git-wip-us.apache.org/repos/asf?p=nifi.git;h=1b73578 ]

NIFI-5408 Add documentation for NIFI-4279

This closes #2878

Signed-off-by: Mike Thomsen 


> PutDataBaseRecord and ConvertJSONToSQL stream has already been closed
> -
>
> Key: NIFI-4279
> URL: https://issues.apache.org/jira/browse/NIFI-4279
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Extensions
> Environment: we have used nifi on openshift cluster and standalone on 
> a linux machine.
> Oracle database with version 11.2.0.2
> linux with rhel 7.0
>Reporter: simon water
>Assignee: Peter Wicks
>Priority: Minor
>  Labels: database, nifi, patch, processor
> Fix For: 1.8.0
>
>   Original Estimate: 24h
>  Remaining Estimate: 24h
>
> We have created a table in an Oracle database with a default sysdate column.
> When I restarted the processor, it started throwing the "stream has already 
> been closed" exception.
> I have used a CSV file with the headers inside of it and a CSV reader that 
> reads the column names from the CSV.
> The database version is Oracle 11.2.0.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5408) Add documentation for NIFI-4279

2018-08-31 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5408?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16599373#comment-16599373
 ] 

ASF subversion and git services commented on NIFI-5408:
---

Commit 1b73578c48d66f6bab7b90f49cbe837a4d7169c0 in nifi's branch 
refs/heads/master from [~lars_francke]
[ https://git-wip-us.apache.org/repos/asf?p=nifi.git;h=1b73578 ]

NIFI-5408 Add documentation for NIFI-4279

This closes #2878

Signed-off-by: Mike Thomsen 


> Add documentation for NIFI-4279
> ---
>
> Key: NIFI-5408
> URL: https://issues.apache.org/jira/browse/NIFI-5408
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Reporter: Lars Francke
>Assignee: Lars Francke
>Priority: Minor
>
> NIFI-4279 changed the order in which columns are read.
> All this patch does is add a javadoc comment to the code pointing to the 
> relevant issue. The issue is not well known and worth documenting



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5366) Implement Content Security Policy frame-ancestors directive

2018-08-31 Thread Nathan Gough (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5366?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16599321#comment-16599321
 ] 

Nathan Gough commented on NIFI-5366:


They appear to work in combination just fine, so perhaps we can leave it with 
both headers for now. 

> Implement Content Security Policy frame-ancestors directive
> ---
>
> Key: NIFI-5366
> URL: https://issues.apache.org/jira/browse/NIFI-5366
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Core Framework
>Affects Versions: 1.7.0
>Reporter: Andy LoPresto
>Assignee: Nathan Gough
>Priority: Major
>  Labels: frame, header, http, security
>
> The {{X-Frame-Options}} headers [1] currently in place to prevent malicious 
> framing / clickjacking [2] are superseded by and should be replaced by the 
> Content Security Policy frame-ancestors [3] directive. 
> [1] https://tools.ietf.org/html/rfc7034
> [2] https://en.wikipedia.org/wiki/Clickjacking
> [3] 
> https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Content-Security-Policy/frame-ancestors



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5239) Make MongoDBControllerService able to act as a configuration source for MongoDB processors

2018-08-31 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5239?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16599219#comment-16599219
 ] 

ASF GitHub Bot commented on NIFI-5239:
--

Github user MikeThomsen commented on the issue:

https://github.com/apache/nifi/pull/2896
  
@zenfenan @mattyb149 can one of you review?

The goal here, btw, is to make it so that admins are able to easily share a 
`MongoClient` between multiple processors. Per the 
[docs](http://mongodb.github.io/mongo-java-driver/3.5/driver/getting-started/quick-start/#make-a-connection)
 `MongoClient` is thread-safe and you should limit how many of them your 
application creates:

> The MongoClient instance represents a pool of connections to the 
database; you will only need one instance of class MongoClient even with 
multiple threads.


> Make MongoDBControllerService able to act as a configuration source for 
> MongoDB processors
> --
>
> Key: NIFI-5239
> URL: https://issues.apache.org/jira/browse/NIFI-5239
> Project: Apache NiFi
>  Issue Type: New Feature
>Reporter: Mike Thomsen
>Assignee: Mike Thomsen
>Priority: Major
>
> The MongoDBControllerService should be able to provide the getDatabase and 
> getCollection functionality that are built into the MongoDB processors 
> through AbstractMongoDBProcessor. Using the controller service with the 
> processors should be optional in the first release it's added and then 
> mandatory going forward.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi issue #2896: NIFI-5239 Made a client service an optional source of conn...

2018-08-31 Thread MikeThomsen
Github user MikeThomsen commented on the issue:

https://github.com/apache/nifi/pull/2896
  
@zenfenan @mattyb149 can one of you review?

The goal here, btw, is to make it so that admins are able to easily share a 
`MongoClient` between multiple processors. Per the 
[docs](http://mongodb.github.io/mongo-java-driver/3.5/driver/getting-started/quick-start/#make-a-connection)
 `MongoClient` is thread-safe and you should limit how many of them your 
application creates:

> The MongoClient instance represents a pool of connections to the 
database; you will only need one instance of class MongoClient even with 
multiple threads.
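
A minimal sketch of that sharing pattern, assuming the MongoDB 3.x Java driver (the class and 
method names below are illustrative and are not the controller service interface from this PR):

    import com.mongodb.MongoClient;
    import com.mongodb.MongoClientURI;
    import com.mongodb.client.MongoCollection;
    import com.mongodb.client.MongoDatabase;
    import org.bson.Document;

    // One MongoClient (a connection pool) shared by many callers, instead of
    // every processor building and tearing down its own client.
    public class SharedMongoClientSketch implements AutoCloseable {
        private final MongoClient client;

        public SharedMongoClientSketch(String uri) {
            this.client = new MongoClient(new MongoClientURI(uri));
        }

        public MongoDatabase getDatabase(String name) {
            return client.getDatabase(name);
        }

        public MongoCollection<Document> getCollection(String db, String collection) {
            return client.getDatabase(db).getCollection(collection);
        }

        @Override
        public void close() {
            client.close();
        }
    }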


---


[jira] [Commented] (NIFI-5495) Allow configuration of DateFormat for Mongo Processors

2018-08-31 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5495?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16599216#comment-16599216
 ] 

ASF GitHub Bot commented on NIFI-5495:
--

Github user MikeThomsen commented on the issue:

https://github.com/apache/nifi/pull/2969
  
@zenfenan @mattyb149 can one of you review?


> Allow configuration of DateFormat for Mongo Processors
> --
>
> Key: NIFI-5495
> URL: https://issues.apache.org/jira/browse/NIFI-5495
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Extensions
>Affects Versions: 1.7.1
> Environment: CentOS 7.5, Java 1.8.0 u172
>Reporter: Ryan Hendrickson
>Assignee: Mike Thomsen
>Priority: Blocker
>
> When using the GetMongo, configured with JSON Type of "Standard JSON", it 
> truncates dates with milliseconds.   
>  
> I've got a document in Mongo that has a date field that looks like the 
> following:
> {
>    ...
>    "date" : ISODate("2018-08-06T16:20:10.912Z"
>    ...
> }
>  
>    When GetMongo spits it out, the date comes out as:  
> "2018-08-06T16:20:10Z", noticeably missing the milliseconds.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi issue #2969: NIFI-5495 Made date format configurable.

2018-08-31 Thread MikeThomsen
Github user MikeThomsen commented on the issue:

https://github.com/apache/nifi/pull/2969
  
@zenfenan @mattyb149 can one of you review?


---


[jira] [Commented] (NIFI-5495) Allow configuration of DateFormat for Mongo Processors

2018-08-31 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5495?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16599215#comment-16599215
 ] 

ASF GitHub Bot commented on NIFI-5495:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2969#discussion_r214460944
  
--- Diff: 
nifi-nar-bundles/nifi-mongodb-bundle/nifi-mongodb-processors/src/main/java/org/apache/nifi/processors/mongodb/AbstractMongoProcessor.java
 ---
@@ -173,6 +175,29 @@
 
.expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES)
 .build();
 
--- End diff --

Just rebased and reran the tests in IntelliJ and got 52/52 for the entire 
processor test package. Don't know what to tell ya.


> Allow configuration of DateFormat for Mongo Processors
> --
>
> Key: NIFI-5495
> URL: https://issues.apache.org/jira/browse/NIFI-5495
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Extensions
>Affects Versions: 1.7.1
> Environment: CentOS 7.5, Java 1.8.0 u172
>Reporter: Ryan Hendrickson
>Assignee: Mike Thomsen
>Priority: Blocker
>
> When using the GetMongo, configured with JSON Type of "Standard JSON", it 
> truncates dates with milliseconds.   
>  
> I've got a document in Mongo that has a date field that looks like the 
> following:
> {
>    ...
>    "date" : ISODate("2018-08-06T16:20:10.912Z"
>    ...
> }
>  
>    When GetMongo spits it out, the date comes out as:  
> "2018-08-06T16:20:10Z", noticeably missing the milliseconds.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi pull request #2969: NIFI-5495 Made date format configurable.

2018-08-31 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2969#discussion_r214460944
  
--- Diff: 
nifi-nar-bundles/nifi-mongodb-bundle/nifi-mongodb-processors/src/main/java/org/apache/nifi/processors/mongodb/AbstractMongoProcessor.java
 ---
@@ -173,6 +175,29 @@
 
.expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES)
 .build();
 
--- End diff --

Just rebased and reran the tests in IntelliJ and got 52/52 for the entire 
processor test package. Don't know what to tell ya.


---


[jira] [Commented] (NIFI-5544) Refactor GetMongo processor codebase

2018-08-31 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5544?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16599198#comment-16599198
 ] 

ASF GitHub Bot commented on NIFI-5544:
--

Github user asfgit closed the pull request at:

https://github.com/apache/nifi/pull/2958


> Refactor GetMongo processor codebase
> 
>
> Key: NIFI-5544
> URL: https://issues.apache.org/jira/browse/NIFI-5544
> Project: Apache NiFi
>  Issue Type: Improvement
>Reporter: Sivaprasanna Sethuraman
>Assignee: Sivaprasanna Sethuraman
>Priority: Minor
>
> Current codebase of the GetMongo processor can be improved.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5544) Refactor GetMongo processor codebase

2018-08-31 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5544?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16599196#comment-16599196
 ] 

ASF subversion and git services commented on NIFI-5544:
---

Commit 05e32cff28b44fcfc6504dc22fe6972d8f79b7eb in nifi's branch 
refs/heads/master from zenfenan
[ https://git-wip-us.apache.org/repos/asf?p=nifi.git;h=05e32cf ]

NIFI-5544: GetMongo refactored
NIFI-5544: PR Review changes

This closes #2958

Signed-off-by: Mike Thomsen 


> Refactor GetMongo processor codebase
> 
>
> Key: NIFI-5544
> URL: https://issues.apache.org/jira/browse/NIFI-5544
> Project: Apache NiFi
>  Issue Type: Improvement
>Reporter: Sivaprasanna Sethuraman
>Assignee: Sivaprasanna Sethuraman
>Priority: Minor
>
> Current codebase of the GetMongo processor can be improved.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5544) Refactor GetMongo processor codebase

2018-08-31 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5544?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16599197#comment-16599197
 ] 

ASF subversion and git services commented on NIFI-5544:
---

Commit 05e32cff28b44fcfc6504dc22fe6972d8f79b7eb in nifi's branch 
refs/heads/master from zenfenan
[ https://git-wip-us.apache.org/repos/asf?p=nifi.git;h=05e32cf ]

NIFI-5544: GetMongo refactored
NIFI-5544: PR Review changes

This closes #2958

Signed-off-by: Mike Thomsen 


> Refactor GetMongo processor codebase
> 
>
> Key: NIFI-5544
> URL: https://issues.apache.org/jira/browse/NIFI-5544
> Project: Apache NiFi
>  Issue Type: Improvement
>Reporter: Sivaprasanna Sethuraman
>Assignee: Sivaprasanna Sethuraman
>Priority: Minor
>
> Current codebase of the GetMongo processor can be improved.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi pull request #2958: NIFI-5544: GetMongo refactored

2018-08-31 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/nifi/pull/2958


---


[jira] [Commented] (NIFI-4914) Implement record model processor for Pulsar, i.e. ConsumePulsarRecord, PublishPulsarRecord

2018-08-31 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-4914?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16599184#comment-16599184
 ] 

ASF GitHub Bot commented on NIFI-4914:
--

Github user david-streamlio commented on the issue:

https://github.com/apache/nifi/pull/2882
  
FWIW, this PR includes the following Record-Oriented processors:

org.apache.nifi.processors.pulsar.pubsub.ConsumePulsarRecord
org.apache.nifi.processors.pulsar.pubsub.PublishPulsarRecord 

As for the security issue you mentioned, we are actually adding an 
additional layer of security between Pulsar and NiFi by enabling connections to 
be secured with user supplied TLS certificates


> Implement record model processor for Pulsar, i.e. ConsumePulsarRecord, 
> PublishPulsarRecord
> --
>
> Key: NIFI-4914
> URL: https://issues.apache.org/jira/browse/NIFI-4914
> Project: Apache NiFi
>  Issue Type: New Feature
>  Components: Extensions
>Affects Versions: 1.6.0
>Reporter: David Kjerrumgaard
>Priority: Minor
>   Original Estimate: 168h
>  Remaining Estimate: 168h
>
> Create record-based processors for Apache Pulsar 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi issue #2882: NIFI-4914

2018-08-31 Thread david-streamlio
Github user david-streamlio commented on the issue:

https://github.com/apache/nifi/pull/2882
  
FWIW, this PR includes the following Record-Oriented processors:

org.apache.nifi.processors.pulsar.pubsub.ConsumePulsarRecord
org.apache.nifi.processors.pulsar.pubsub.PublishPulsarRecord 

As for the security issue you mentioned, we are actually adding an 
additional layer of security between Pulsar and NiFi by enabling connections to 
be secured with user-supplied TLS certificates.


---


[jira] [Commented] (NIFI-4914) Implement record model processor for Pulsar, i.e. ConsumePulsarRecord, PublishPulsarRecord

2018-08-31 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-4914?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16599156#comment-16599156
 ] 

ASF GitHub Bot commented on NIFI-4914:
--

Github user joewitt commented on the issue:

https://github.com/apache/nifi/pull/2882
  
Not clear when we'll be cutting a 1.8 release, so I'm not sure about safety.

But this is clearly a cool capability, and it is just a matter of finding a 
committer to review it with sufficient bandwidth and expertise. Things 
impacting security are super important, and not being record oriented makes it 
less useful for sure. I haven't looked at the details in a while to see if you 
added that. I'd go so far as to recommend not offering a non-record approach, 
but I wouldn't say that is a rule - just a recommendation.


> Implement record model processor for Pulsar, i.e. ConsumePulsarRecord, 
> PublishPulsarRecord
> --
>
> Key: NIFI-4914
> URL: https://issues.apache.org/jira/browse/NIFI-4914
> Project: Apache NiFi
>  Issue Type: New Feature
>  Components: Extensions
>Affects Versions: 1.6.0
>Reporter: David Kjerrumgaard
>Priority: Minor
>   Original Estimate: 168h
>  Remaining Estimate: 168h
>
> Create record-based processors for Apache Pulsar 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi issue #2882: NIFI-4914

2018-08-31 Thread joewitt
Github user joewitt commented on the issue:

https://github.com/apache/nifi/pull/2882
  
Not clear when we'll be cutting a 1.8 release, so I'm not sure about safety.

But this is clearly a cool capability, and it is just a matter of finding a 
committer to review it with sufficient bandwidth and expertise. Things 
impacting security are super important, and not being record oriented makes it 
less useful for sure. I haven't looked at the details in a while to see if you 
added that. I'd go so far as to recommend not offering a non-record approach, 
but I wouldn't say that is a rule - just a recommendation.


---


[jira] [Commented] (NIFI-4914) Implement record model processor for Pulsar, i.e. ConsumePulsarRecord, PublishPulsarRecord

2018-08-31 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-4914?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16599144#comment-16599144
 ] 

ASF GitHub Bot commented on NIFI-4914:
--

Github user david-streamlio commented on the issue:

https://github.com/apache/nifi/pull/2882
  
Any update on this? Am I safe to assume we are going to make the 1.8 
release?


> Implement record model processor for Pulsar, i.e. ConsumePulsarRecord, 
> PublishPulsarRecord
> --
>
> Key: NIFI-4914
> URL: https://issues.apache.org/jira/browse/NIFI-4914
> Project: Apache NiFi
>  Issue Type: New Feature
>  Components: Extensions
>Affects Versions: 1.6.0
>Reporter: David Kjerrumgaard
>Priority: Minor
>   Original Estimate: 168h
>  Remaining Estimate: 168h
>
> Create record-based processors for Apache Pulsar 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi issue #2882: NIFI-4914

2018-08-31 Thread david-streamlio
Github user david-streamlio commented on the issue:

https://github.com/apache/nifi/pull/2882
  
Any update on this? Am I safe to assume we are going to make the 1.8 
release?


---


[jira] [Updated] (NIFI-4878) Update Docker docs to include all environment variables used on startup

2018-08-31 Thread Michael Moser (JIRA)


 [ 
https://issues.apache.org/jira/browse/NIFI-4878?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Michael Moser updated NIFI-4878:

Fix Version/s: (was: 1.5.0)

> Update Docker docs to include all environment variables used on startup
> ---
>
> Key: NIFI-4878
> URL: https://issues.apache.org/jira/browse/NIFI-4878
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Docker, Documentation & Website
>Reporter: Aldrin Piri
>Priority: Major
>
> -NIFI-4824- provided updates to allow specification of the various ports in 
> the container on startup via varying environment variables to aid the issue 
> of the host whitelisting.
> It does not appear this information is readily available in our docs and has 
> caused some confusion by users when trying to connect to an instance.  We 
> need to update the docs to enumerate these scenarios as well as the 
> environment variables that are anticipated.  
> We could additionally enhance the experience via performing some logical 
> checks.  One example could be verifying that if a custom port is set and we 
> are running secure, there is also an environment variable provided.  
> Additionally, providing variables that would be unused, such as 
> NIFI_WEB_HTTP_PORT when we are running secure could also cause an 
> error/warning. 
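
The checks described above would live in the image's start scripts, but the intent can be 
sketched in Java as below. NIFI_WEB_HTTP_PORT comes from the ticket; treating 
NIFI_WEB_HTTPS_PORT as the secure counterpart is an assumption made only for this illustration:

    // Illustrative only: the real validation would run in the Docker start scripts.
    public class DockerEnvCheckSketch {
        public static void main(String[] args) {
            String httpPort  = System.getenv("NIFI_WEB_HTTP_PORT");
            String httpsPort = System.getenv("NIFI_WEB_HTTPS_PORT"); // assumed variable name

            boolean secure = httpsPort != null && !httpsPort.isEmpty();

            if (secure && httpPort != null && !httpPort.isEmpty()) {
                // A variable that is unused when running secure: warn instead of silently ignoring it.
                System.err.println("WARNING: NIFI_WEB_HTTP_PORT is set but the instance is secured; it will be ignored.");
            }
            if (!secure && (httpPort == null || httpPort.isEmpty())) {
                System.err.println("WARNING: no web port variable is set; the image defaults will be used.");
            }
        }
    }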



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Resolved] (NIFI-4558) Populate default keystore/truststore types in SSLContextService

2018-08-31 Thread Andy LoPresto (JIRA)


 [ 
https://issues.apache.org/jira/browse/NIFI-4558?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Andy LoPresto resolved NIFI-4558.
-
   Resolution: Fixed
Fix Version/s: 1.8.0

> Populate default keystore/truststore types in SSLContextService
> ---
>
> Key: NIFI-4558
> URL: https://issues.apache.org/jira/browse/NIFI-4558
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Affects Versions: 1.4.0
>Reporter: Andy LoPresto
>Assignee: Nathan Gough
>Priority: Trivial
>  Labels: controller_services, jks, keystore, pkcs12, security, 
> ssl, tls, truststore, ux
> Fix For: 1.8.0
>
>
> The keystore and truststore type is almost always JKS as opposed to PKCS12 
> when creating SSL controller services. Both {{StandardSSLContextService}} and 
> {{StandardRestrictedSSLContextService}} should have those fields 
> autopopulated to JKS, saving 2-4 clicks per instantiation. 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-4558) Populate default keystore/truststore types in SSLContextService

2018-08-31 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-4558?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16599072#comment-16599072
 ] 

ASF GitHub Bot commented on NIFI-4558:
--

Github user asfgit closed the pull request at:

https://github.com/apache/nifi/pull/2982


> Populate default keystore/truststore types in SSLContextService
> ---
>
> Key: NIFI-4558
> URL: https://issues.apache.org/jira/browse/NIFI-4558
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Affects Versions: 1.4.0
>Reporter: Andy LoPresto
>Assignee: Nathan Gough
>Priority: Trivial
>  Labels: controller_services, jks, keystore, pkcs12, security, 
> ssl, tls, truststore, ux
>
> The keystore and truststore type is almost always JKS as opposed to PKCS12 
> when creating SSL controller services. Both {{StandardSSLContextService}} and 
> {{StandardRestrictedSSLContextService}} should have those fields 
> autopopulated to JKS, saving 2-4 clicks per instantiation. 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-4558) Populate default keystore/truststore types in SSLContextService

2018-08-31 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-4558?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16599070#comment-16599070
 ] 

ASF subversion and git services commented on NIFI-4558:
---

Commit 97e0f6a6a700e425bf9fd39711da78dda74b87c9 in nifi's branch 
refs/heads/master from thenatog
[ https://git-wip-us.apache.org/repos/asf?p=nifi.git;h=97e0f6a ]

NIFI-4558 - Set JKS as the default keystore type and truststore type.

This closes #2982.

Signed-off-by: Andy LoPresto 


> Populate default keystore/truststore types in SSLContextService
> ---
>
> Key: NIFI-4558
> URL: https://issues.apache.org/jira/browse/NIFI-4558
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Affects Versions: 1.4.0
>Reporter: Andy LoPresto
>Assignee: Nathan Gough
>Priority: Trivial
>  Labels: controller_services, jks, keystore, pkcs12, security, 
> ssl, tls, truststore, ux
>
> The keystore and truststore type is almost always JKS as opposed to PKCS12 
> when creating SSL controller services. Both {{StandardSSLContextService}} and 
> {{StandardRestrictedSSLContextService}} should have those fields 
> autopopulated to JKS, saving 2-4 clicks per instantiation. 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi pull request #2982: NIFI-4558 - Set JKS as the default keystore type an...

2018-08-31 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/nifi/pull/2982


---


[jira] [Commented] (NIFI-4558) Populate default keystore/truststore types in SSLContextService

2018-08-31 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-4558?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16599067#comment-16599067
 ] 

ASF GitHub Bot commented on NIFI-4558:
--

Github user alopresto commented on the issue:

https://github.com/apache/nifi/pull/2982
  
Verified that this sets the defaults to "JKS" for both keystore and 
truststore on `StandardSSLContextService` and 
`StandardRestrictedSSLContextService`. 

Ran `contrib-check` and all tests pass. +1, merging. 
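
In NiFi terms the change amounts to a defaultValue on the existing keystore/truststore type 
descriptors. A hedged sketch of the shape of such a property (names abbreviated; the exact 
descriptors in StandardSSLContextService may declare more than this):

    import org.apache.nifi.components.PropertyDescriptor;

    public class KeystoreTypeSketch {
        // A keystore-type property with JKS preselected, saving the clicks noted in the ticket.
        public static final PropertyDescriptor KEYSTORE_TYPE = new PropertyDescriptor.Builder()
                .name("Keystore Type")
                .description("The type of the keystore, e.g. JKS or PKCS12")
                .allowableValues("JKS", "PKCS12")
                .defaultValue("JKS")
                .required(false)
                .build();
    }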


> Populate default keystore/truststore types in SSLContextService
> ---
>
> Key: NIFI-4558
> URL: https://issues.apache.org/jira/browse/NIFI-4558
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Affects Versions: 1.4.0
>Reporter: Andy LoPresto
>Assignee: Nathan Gough
>Priority: Trivial
>  Labels: controller_services, jks, keystore, pkcs12, security, 
> ssl, tls, truststore, ux
>
> The keystore and truststore type is almost always JKS as opposed to PKCS12 
> when creating SSL controller services. Both {{StandardSSLContextService}} and 
> {{StandardRestrictedSSLContextService}} should have those fields 
> autopopulated to JKS, saving 2-4 clicks per instantiation. 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi issue #2982: NIFI-4558 - Set JKS as the default keystore type and trust...

2018-08-31 Thread alopresto
Github user alopresto commented on the issue:

https://github.com/apache/nifi/pull/2982
  
Verified that this sets the defaults to "JKS" for both keystore and 
truststore on `StandardSSLContextService` and 
`StandardRestrictedSSLContextService`. 

Ran `contrib-check` and all tests pass. +1, merging. 


---


[jira] [Commented] (NIFI-5147) Improve HashAttribute processor

2018-08-31 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5147?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16599016#comment-16599016
 ] 

ASF GitHub Bot commented on NIFI-5147:
--

Github user ottobackwards closed the pull request at:

https://github.com/apache/nifi/pull/2836


> Improve HashAttribute processor
> ---
>
> Key: NIFI-5147
> URL: https://issues.apache.org/jira/browse/NIFI-5147
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Affects Versions: 1.6.0
>Reporter: Andy LoPresto
>Assignee: Otto Fowler
>Priority: Major
>  Labels: hash, security
> Fix For: 1.8.0
>
>
> The {{HashAttribute}} processor currently has surprising behavior. Barring 
> familiarity with the processor, a user would expect {{HashAttribute}} to 
> generate a hash value over one or more attributes. Instead, the processor as 
> it is implemented "groups" incoming flowfiles into groups based on regular 
> expressions which match attribute values, and then generates a 
> (non-configurable) MD5 hash over the concatenation of the matching attribute 
> keys and values. 
> In addition:
> * the processor throws an error and routes to failure any incoming flowfile 
> which does not have all attributes specified in the processor
> * the use of MD5 is widely deprecated
> * no other hash algorithms are available
> I am unaware of community use of this processor, but I do not want to break 
> backward compatibility. I propose the following steps:
> * Implement a new {{CalculateAttributeHash}} processor (awkward name, but 
> this processor already has the desired name)
> ** This processor will perform the "standard" use case -- identify an 
> attribute, calculate the specified hash over the value, and write it to an 
> output attribute
> ** This processor will have a required property descriptor allowing a 
> dropdown menu of valid hash algorithms
> ** This processor will accept arbitrary dynamic properties identifying the 
> attributes to be hashed as a key, and the resulting attribute name as a value
> ** Example: I want to generate a SHA-512 hash on the attribute {{username}}, 
> and a flowfile enters the processor with {{username}} value {{alopresto}}. I 
> configure {{algorithm}} with {{SHA-512}} and add a dynamic property 
> {{username}} -- {{username_SHA512}}. The resulting flowfile will have 
> attribute {{username_SHA512}} with value 
> {{739b4f6722fb5de20125751c7a1a358b2a7eb8f07e530e4bf18561fbff93234908aa9d250c876bca9ede5ba784d5ce6081dbbdfe5ddd446678f223b8d632}}
> * Improve the documentation of this processor to explain the goal/expected 
> use case (?)
> * Link in processor documentation to new processor for standard use cases
> * Remove the error alert when an incoming flowfile does not contain all 
> expected attributes. I propose changing the severity to INFO and still 
> routing to failure



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5147) Improve HashAttribute processor

2018-08-31 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5147?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16599015#comment-16599015
 ] 

ASF GitHub Bot commented on NIFI-5147:
--

Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2836
  
Sure.  For Jira's when you want to do it yourself, I suggest assigning it.  
That is my habit on the other Apache things I work with, and it will help guard 
against eager beavers ;)

Thanks for using the work as much as you did.



> Improve HashAttribute processor
> ---
>
> Key: NIFI-5147
> URL: https://issues.apache.org/jira/browse/NIFI-5147
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Affects Versions: 1.6.0
>Reporter: Andy LoPresto
>Assignee: Otto Fowler
>Priority: Major
>  Labels: hash, security
> Fix For: 1.8.0
>
>
> The {{HashAttribute}} processor currently has surprising behavior. Barring 
> familiarity with the processor, a user would expect {{HashAttribute}} to 
> generate a hash value over one or more attributes. Instead, the processor as 
> it is implemented "groups" incoming flowfiles into groups based on regular 
> expressions which match attribute values, and then generates a 
> (non-configurable) MD5 hash over the concatenation of the matching attribute 
> keys and values. 
> In addition:
> * the processor throws an error and routes to failure any incoming flowfile 
> which does not have all attributes specified in the processor
> * the use of MD5 is widely deprecated
> * no other hash algorithms are available
> I am unaware of community use of this processor, but I do not want to break 
> backward compatibility. I propose the following steps:
> * Implement a new {{CalculateAttributeHash}} processor (awkward name, but 
> this processor already has the desired name)
> ** This processor will perform the "standard" use case -- identify an 
> attribute, calculate the specified hash over the value, and write it to an 
> output attribute
> ** This processor will have a required property descriptor allowing a 
> dropdown menu of valid hash algorithms
> ** This processor will accept arbitrary dynamic properties identifying the 
> attributes to be hashed as a key, and the resulting attribute name as a value
> ** Example: I want to generate a SHA-512 hash on the attribute {{username}}, 
> and a flowfile enters the processor with {{username}} value {{alopresto}}. I 
> configure {{algorithm}} with {{SHA-512}} and add a dynamic property 
> {{username}} -- {{username_SHA512}}. The resulting flowfile will have 
> attribute {{username_SHA512}} with value 
> {{739b4f6722fb5de20125751c7a1a358b2a7eb8f07e530e4bf18561fbff93234908aa9d250c876bca9ede5ba784d5ce6081dbbdfe5ddd446678f223b8d632}}
> * Improve the documentation of this processor to explain the goal/expected 
> use case (?)
> * Link in processor documentation to new processor for standard use cases
> * Remove the error alert when an incoming flowfile does not contain all 
> expected attributes. I propose changing the severity to INFO and still 
> routing to failure



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi issue #2836: NIFI-5147 Calculate hash attribute redux

2018-08-31 Thread ottobackwards
Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2836
  
Sure.  For Jira's when you want to do it yourself, I suggest assigning it.  
That is my habit on the other Apache things I work with, and it will help guard 
against eager beavers ;)

Thanks for using the work as much as you did.



---


[GitHub] nifi pull request #2836: NIFI-5147 Calculate hash attribute redux

2018-08-31 Thread ottobackwards
Github user ottobackwards closed the pull request at:

https://github.com/apache/nifi/pull/2836


---


[jira] [Commented] (NIFI-5366) Implement Content Security Policy frame-ancestors directive

2018-08-31 Thread Andy LoPresto (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5366?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16599008#comment-16599008
 ] 

Andy LoPresto commented on NIFI-5366:
-

I'd read [this StackOverflow answer|https://stackoverflow.com/a/40417609/70465] 
to see how they interact. It appears that the {{frame-ancestors}} CSP obsoletes 
the {{X-Frame-Options}} header, but some legacy browsers rely on the header. 
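
A minimal servlet-filter sketch of sending both headers together; this is illustrative only, 
since NiFi's actual header wiring is done in its Jetty setup rather than in a filter like this:

    import java.io.IOException;

    import javax.servlet.Filter;
    import javax.servlet.FilterChain;
    import javax.servlet.FilterConfig;
    import javax.servlet.ServletException;
    import javax.servlet.ServletRequest;
    import javax.servlet.ServletResponse;
    import javax.servlet.http.HttpServletResponse;

    // Send the modern CSP directive and keep X-Frame-Options for legacy browsers.
    public class FrameAncestorsFilterSketch implements Filter {
        @Override
        public void init(FilterConfig filterConfig) {
        }

        @Override
        public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain)
                throws IOException, ServletException {
            HttpServletResponse http = (HttpServletResponse) response;
            http.setHeader("Content-Security-Policy", "frame-ancestors 'self'");
            http.setHeader("X-Frame-Options", "SAMEORIGIN");
            chain.doFilter(request, response);
        }

        @Override
        public void destroy() {
        }
    }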

> Implement Content Security Policy frame-ancestors directive
> ---
>
> Key: NIFI-5366
> URL: https://issues.apache.org/jira/browse/NIFI-5366
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Core Framework
>Affects Versions: 1.7.0
>Reporter: Andy LoPresto
>Assignee: Nathan Gough
>Priority: Major
>  Labels: frame, header, http, security
>
> The {{X-Frame-Options}} headers [1] currently in place to prevent malicious 
> framing / clickjacking [2] are superseded by and should be replaced by the 
> Content Security Policy frame-ancestors [3] directive. 
> [1] https://tools.ietf.org/html/rfc7034
> [2] https://en.wikipedia.org/wiki/Clickjacking
> [3] 
> https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Content-Security-Policy/frame-ancestors



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5147) Improve HashAttribute processor

2018-08-31 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5147?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16599002#comment-16599002
 ] 

ASF GitHub Bot commented on NIFI-5147:
--

Github user alopresto commented on the issue:

https://github.com/apache/nifi/pull/2836
  
I didn't capture every detail of the issue in the Jira because I planned on 
writing it myself. However, some of the behavior (empty input should still 
return a hash, etc.) was standard. Yes, please close this PR and review 2980. 


> Improve HashAttribute processor
> ---
>
> Key: NIFI-5147
> URL: https://issues.apache.org/jira/browse/NIFI-5147
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Affects Versions: 1.6.0
>Reporter: Andy LoPresto
>Assignee: Otto Fowler
>Priority: Major
>  Labels: hash, security
> Fix For: 1.8.0
>
>
> The {{HashAttribute}} processor currently has surprising behavior. Barring 
> familiarity with the processor, a user would expect {{HashAttribute}} to 
> generate a hash value over one or more attributes. Instead, the processor as 
> it is implemented "groups" incoming flowfiles into groups based on regular 
> expressions which match attribute values, and then generates a 
> (non-configurable) MD5 hash over the concatenation of the matching attribute 
> keys and values. 
> In addition:
> * the processor throws an error and routes to failure any incoming flowfile 
> which does not have all attributes specified in the processor
> * the use of MD5 is widely deprecated
> * no other hash algorithms are available
> I am unaware of community use of this processor, but I do not want to break 
> backward compatibility. I propose the following steps:
> * Implement a new {{CalculateAttributeHash}} processor (awkward name, but 
> this processor already has the desired name)
> ** This processor will perform the "standard" use case -- identify an 
> attribute, calculate the specified hash over the value, and write it to an 
> output attribute
> ** This processor will have a required property descriptor allowing a 
> dropdown menu of valid hash algorithms
> ** This processor will accept arbitrary dynamic properties identifying the 
> attributes to be hashed as a key, and the resulting attribute name as a value
> ** Example: I want to generate a SHA-512 hash on the attribute {{username}}, 
> and a flowfile enters the processor with {{username}} value {{alopresto}}. I 
> configure {{algorithm}} with {{SHA-512}} and add a dynamic property 
> {{username}} -- {{username_SHA512}}. The resulting flowfile will have 
> attribute {{username_SHA512}} with value 
> {{739b4f6722fb5de20125751c7a1a358b2a7eb8f07e530e4bf18561fbff93234908aa9d250c876bca9ede5ba784d5ce6081dbbdfe5ddd446678f223b8d632}}
> * Improve the documentation of this processor to explain the goal/expected 
> use case (?)
> * Link in processor documentation to new processor for standard use cases
> * Remove the error alert when an incoming flowfile does not contain all 
> expected attributes. I propose changing the severity to INFO and still 
> routing to failure



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi issue #2836: NIFI-5147 Calculate hash attribute redux

2018-08-31 Thread alopresto
Github user alopresto commented on the issue:

https://github.com/apache/nifi/pull/2836
  
I didn't capture every detail of the issue in the Jira because I planned on 
writing it myself. However, some of the behavior (empty input should still 
return a hash, etc.) was standard. Yes, please close this PR and review 2980. 


---


[jira] [Commented] (NIFI-5147) Improve HashAttribute processor

2018-08-31 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5147?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16598996#comment-16598996
 ] 

ASF GitHub Bot commented on NIFI-5147:
--

Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2836
  
Ok, that is great.  Did I miss something that _was_ in the jira?
Should I just close this PR now then?


> Improve HashAttribute processor
> ---
>
> Key: NIFI-5147
> URL: https://issues.apache.org/jira/browse/NIFI-5147
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Affects Versions: 1.6.0
>Reporter: Andy LoPresto
>Assignee: Otto Fowler
>Priority: Major
>  Labels: hash, security
> Fix For: 1.8.0
>
>
> The {{HashAttribute}} processor currently has surprising behavior. Barring 
> familiarity with the processor, a user would expect {{HashAttribute}} to 
> generate a hash value over one or more attributes. Instead, the processor as 
> it is implemented "groups" incoming flowfiles into groups based on regular 
> expressions which match attribute values, and then generates a 
> (non-configurable) MD5 hash over the concatenation of the matching attribute 
> keys and values. 
> In addition:
> * the processor throws an error and routes to failure any incoming flowfile 
> which does not have all attributes specified in the processor
> * the use of MD5 is widely deprecated
> * no other hash algorithms are available
> I am unaware of community use of this processor, but I do not want to break 
> backward compatibility. I propose the following steps:
> * Implement a new {{CalculateAttributeHash}} processor (awkward name, but 
> this processor already has the desired name)
> ** This processor will perform the "standard" use case -- identify an 
> attribute, calculate the specified hash over the value, and write it to an 
> output attribute
> ** This processor will have a required property descriptor allowing a 
> dropdown menu of valid hash algorithms
> ** This processor will accept arbitrary dynamic properties identifying the 
> attributes to be hashed as a key, and the resulting attribute name as a value
> ** Example: I want to generate a SHA-512 hash on the attribute {{username}}, 
> and a flowfile enters the processor with {{username}} value {{alopresto}}. I 
> configure {{algorithm}} with {{SHA-512}} and add a dynamic property 
> {{username}} -- {{username_SHA512}}. The resulting flowfile will have 
> attribute {{username_SHA512}} with value 
> {{739b4f6722fb5de20125751c7a1a358b2a7eb8f07e530e4bf18561fbff93234908aa9d250c876bca9ede5ba784d5ce6081dbbdfe5ddd446678f223b8d632}}
> * Improve the documentation of this processor to explain the goal/expected 
> use case (?)
> * Link in processor documentation to new processor for standard use cases
> * Remove the error alert when an incoming flowfile does not contain all 
> expected attributes. I propose changing the severity to INFO and still 
> routing to failure



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi issue #2836: NIFI-5147 Calculate hash attribute redux

2018-08-31 Thread ottobackwards
Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2836
  
Ok, that is great.  Did I miss something that _was_ in the jira?
Should I just close this PR now then?


---


[jira] [Commented] (NIFI-5366) Implement Content Security Policy frame-ancestors directive

2018-08-31 Thread Nathan Gough (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5366?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16598986#comment-16598986
 ] 

Nathan Gough commented on NIFI-5366:


Should this be implemented in combination with X-Frame-Options?

> Implement Content Security Policy frame-ancestors directive
> ---
>
> Key: NIFI-5366
> URL: https://issues.apache.org/jira/browse/NIFI-5366
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Core Framework
>Affects Versions: 1.7.0
>Reporter: Andy LoPresto
>Assignee: Nathan Gough
>Priority: Major
>  Labels: frame, header, http, security
>
> The {{X-Frame-Options}} headers [1] currently in place to prevent malicious 
> framing / clickjacking [2] are superseded by and should be replaced by the 
> Content Security Policy frame-ancestors [3] directive. 
> [1] https://tools.ietf.org/html/rfc7034
> [2] https://en.wikipedia.org/wiki/Clickjacking
> [3] 
> https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Content-Security-Policy/frame-ancestors
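
As an illustration of the directive itself, a minimal servlet-filter sketch that emits frame-ancestors; the filter name and policy value are assumptions, not the actual NiFi/Jetty wiring:

    import java.io.IOException;
    import javax.servlet.Filter;
    import javax.servlet.FilterChain;
    import javax.servlet.FilterConfig;
    import javax.servlet.ServletException;
    import javax.servlet.ServletRequest;
    import javax.servlet.ServletResponse;
    import javax.servlet.http.HttpServletResponse;

    // Hypothetical filter adding the CSP frame-ancestors directive to every response.
    public class FrameAncestorsFilter implements Filter {

        @Override
        public void init(FilterConfig filterConfig) {
        }

        @Override
        public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain)
                throws IOException, ServletException {
            HttpServletResponse httpResponse = (HttpServletResponse) response;
            // Restricts framing to the same origin; comparable in intent to X-Frame-Options: SAMEORIGIN.
            httpResponse.setHeader("Content-Security-Policy", "frame-ancestors 'self'");
            chain.doFilter(request, response);
        }

        @Override
        public void destroy() {
        }
    }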



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-4558) Populate default keystore/truststore types in SSLContextService

2018-08-31 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-4558?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16598980#comment-16598980
 ] 

ASF GitHub Bot commented on NIFI-4558:
--

Github user alopresto commented on the issue:

https://github.com/apache/nifi/pull/2982
  
Reviewing...


> Populate default keystore/truststore types in SSLContextService
> ---
>
> Key: NIFI-4558
> URL: https://issues.apache.org/jira/browse/NIFI-4558
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Affects Versions: 1.4.0
>Reporter: Andy LoPresto
>Assignee: Nathan Gough
>Priority: Trivial
>  Labels: controller_services, jks, keystore, pkcs12, security, 
> ssl, tls, truststore, ux
>
> The keystore and truststore type is almost always JKS as opposed to PKCS12 
> when creating SSL controller services. Both {{StandardSSLContextService}} and 
> {{StandardRestrictedSSLContextService}} should have those fields 
> autopopulated to JKS, saving 2-4 clicks per instantiation. 
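
A rough sketch of the kind of descriptor change being proposed, using NiFi's PropertyDescriptor builder; the property name and description here are illustrative, not the actual SSLContextService source:

    import org.apache.nifi.components.PropertyDescriptor;

    public class KeystoreTypeDefaultExample {

        // Same allowable values as before, but with JKS pre-selected as the default.
        public static final PropertyDescriptor KEYSTORE_TYPE = new PropertyDescriptor.Builder()
                .name("Keystore Type")
                .description("The type of the keystore, such as JKS or PKCS12")
                .allowableValues("JKS", "PKCS12")
                .defaultValue("JKS")
                .required(false)
                .build();
    }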



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi issue #2982: NIFI-4558 - Set JKS as the default keystore type and trust...

2018-08-31 Thread alopresto
Github user alopresto commented on the issue:

https://github.com/apache/nifi/pull/2982
  
Reviewing...


---


[jira] [Commented] (NIFI-5147) Improve HashAttribute processor

2018-08-31 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5147?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16598969#comment-16598969
 ] 

ASF GitHub Bot commented on NIFI-5147:
--

Github user alopresto commented on the issue:

https://github.com/apache/nifi/pull/2836
  
I started reviewing this PR but realized it did not implement many of the 
behaviors I had needed in the original ticket. I opened [PR 
2980](https://github.com/apache/nifi/pull/2980) which includes this foundation 
instead. 


> Improve HashAttribute processor
> ---
>
> Key: NIFI-5147
> URL: https://issues.apache.org/jira/browse/NIFI-5147
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Affects Versions: 1.6.0
>Reporter: Andy LoPresto
>Assignee: Otto Fowler
>Priority: Major
>  Labels: hash, security
> Fix For: 1.8.0
>
>
> The {{HashAttribute}} processor currently has surprising behavior. Barring 
> familiarity with the processor, a user would expect {{HashAttribute}} to 
> generate a hash value over one or more attributes. Instead, the processor as 
> it is implemented "groups" incoming flowfiles into groups based on regular 
> expressions which match attribute values, and then generates a 
> (non-configurable) MD5 hash over the concatenation of the matching attribute 
> keys and values. 
> In addition:
> * the processor throws an error and routes to failure any incoming flowfile 
> which does not have all attributes specified in the processor
> * the use of MD5 is widely deprecated
> * no other hash algorithms are available
> I am unaware of community use of this processor, but I do not want to break 
> backward compatibility. I propose the following steps:
> * Implement a new {{CalculateAttributeHash}} processor (awkward name, but 
> this processor already has the desired name)
> ** This processor will perform the "standard" use case -- identify an 
> attribute, calculate the specified hash over the value, and write it to an 
> output attribute
> ** This processor will have a required property descriptor allowing a 
> dropdown menu of valid hash algorithms
> ** This processor will accept arbitrary dynamic properties identifying the 
> attributes to be hashed as a key, and the resulting attribute name as a value
> ** Example: I want to generate a SHA-512 hash on the attribute {{username}}, 
> and a flowfile enters the processor with {{username}} value {{alopresto}}. I 
> configure {{algorithm}} with {{SHA-512}} and add a dynamic property 
> {{username}} -- {{username_SHA512}}. The resulting flowfile will have 
> attribute {{username_SHA512}} with value 
> {{739b4f6722fb5de20125751c7a1a358b2a7eb8f07e530e4bf18561fbff93234908aa9d250c876bca9ede5ba784d5ce6081dbbdfe5ddd446678f223b8d632}}
> * Improve the documentation of this processor to explain the goal/expected 
> use case (?)
> * Link in processor documentation to new processor for standard use cases
> * Remove the error alert when an incoming flowfile does not contain all 
> expected attributes. I propose changing the severity to INFO and still 
> routing to failure



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi issue #2836: NIFI-5147 Calculate hash attribute redux

2018-08-31 Thread alopresto
Github user alopresto commented on the issue:

https://github.com/apache/nifi/pull/2836
  
I started reviewing this PR but realized it did not implement many of the 
behaviors I had needed in the original ticket. I opened [PR 
2980](https://github.com/apache/nifi/pull/2980) which includes this foundation 
instead. 


---


[jira] [Assigned] (NIFI-5366) Implement Content Security Policy frame-ancestors directive

2018-08-31 Thread Nathan Gough (JIRA)


 [ 
https://issues.apache.org/jira/browse/NIFI-5366?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Nathan Gough reassigned NIFI-5366:
--

Assignee: Nathan Gough

> Implement Content Security Policy frame-ancestors directive
> ---
>
> Key: NIFI-5366
> URL: https://issues.apache.org/jira/browse/NIFI-5366
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Core Framework
>Affects Versions: 1.7.0
>Reporter: Andy LoPresto
>Assignee: Nathan Gough
>Priority: Major
>  Labels: frame, header, http, security
>
> The {{X-Frame-Options}} headers [1] currently in place to prevent malicious 
> framing / clickjacking [2] are superseded by and should be replaced by the 
> Content Security Policy frame-ancestors [3] directive. 
> [1] https://tools.ietf.org/html/rfc7034
> [2] https://en.wikipedia.org/wiki/Clickjacking
> [3] 
> https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Content-Security-Policy/frame-ancestors



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5542) Add support for node groups to FileAccessPolicyProvider

2018-08-31 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5542?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16598819#comment-16598819
 ] 

ASF GitHub Bot commented on NIFI-5542:
--

Github user achristianson commented on the issue:

https://github.com/apache/nifi/pull/2970
  
@pepov I took that lookup out. It appeared to be completely unnecessary.


> Add support for node groups to FileAccessPolicyProvider
> ---
>
> Key: NIFI-5542
> URL: https://issues.apache.org/jira/browse/NIFI-5542
> Project: Apache NiFi
>  Issue Type: Improvement
>Reporter: Andrew Christianson
>Priority: Major
>
> Currently in FileAccessPolicyProvider, it is possible to specify a set of 
> node identities, which are given access to /proxy. This works well for static 
> clusters, but does not work so well for dynamic clusters (scaling up/down # 
> of nodes) because we don't know in advance what the node identities will be 
> or how many there will be.
> In order to support dynamic sets of node identities, add support for 
> specifying a "Node Group," for which all identities in the group will be 
> granted access to /proxy. A UserGroupProvider can then be implemented to 
> gather node identities dynamically from the cluster environment.
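
A hypothetical authorizers.xml fragment showing how such a group-based grant might be configured; the "Node Group" property name and the group value are assumptions based on the description above, not necessarily the merged implementation:

    <accessPolicyProvider>
        <identifier>file-access-policy-provider</identifier>
        <class>org.apache.nifi.authorization.FileAccessPolicyProvider</class>
        <property name="User Group Provider">file-user-group-provider</property>
        <property name="Authorizations File">./conf/authorizations.xml</property>
        <property name="Initial Admin Identity">CN=admin, OU=NiFi</property>
        <!-- Static node identities still work for fixed-size clusters. -->
        <property name="Node Identity 1">CN=node1, OU=NiFi</property>
        <!-- Assumed new property: every identity in this group is granted access to /proxy. -->
        <property name="Node Group">cluster-nodes</property>
    </accessPolicyProvider>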



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5282) GCPProcessor with HTTP Proxy with Authentication

2018-08-31 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5282?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16598818#comment-16598818
 ] 

ASF GitHub Bot commented on NIFI-5282:
--

Github user asfgit closed the pull request at:

https://github.com/apache/nifi/pull/2943


> GCPProcessor with HTTP Proxy with Authentication
> 
>
> Key: NIFI-5282
> URL: https://issues.apache.org/jira/browse/NIFI-5282
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Affects Versions: 1.6.0
>Reporter: Julian Gimbel
>Assignee: Sivaprasanna Sethuraman
>Priority: Major
>
> The [AbstractGCPProcessor 
> |https://github.com/apache/nifi/blob/master/nifi-nar-bundles/nifi-gcp-bundle/nifi-gcp-processors/src/main/java/org/apache/nifi/processors/gcp/AbstractGCPProcessor.java]
>  already accepts HTTP proxy settings, but it would be even better if it accepted 
> authenticated proxies with user and password as well.
> In the best case it would support the ProxyService introduced in 
> [NIFI-4199|https://issues.apache.org/jira/projects/NIFI/issues/NIFI-4199] and 
> all of its options.
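
As a general illustration of authenticated HTTP proxying in plain Java (not the NiFi ProxyConfiguration service itself; host, port, and credentials are placeholders):

    import java.net.Authenticator;
    import java.net.InetSocketAddress;
    import java.net.PasswordAuthentication;
    import java.net.Proxy;

    public class AuthenticatedProxyExample {

        public static void main(String[] args) {
            // Placeholder proxy coordinates and credentials.
            final String proxyHost = "proxy.example.com";
            final int proxyPort = 3128;
            final String proxyUser = "user";
            final String proxyPassword = "secret";

            // HTTP proxy that an HTTP transport (e.g. the one used by the GCP SDK) could be pointed at.
            Proxy proxy = new Proxy(Proxy.Type.HTTP, new InetSocketAddress(proxyHost, proxyPort));

            // JVM-wide authenticator that supplies the proxy credentials when challenged.
            Authenticator.setDefault(new Authenticator() {
                @Override
                protected PasswordAuthentication getPasswordAuthentication() {
                    return new PasswordAuthentication(proxyUser, proxyPassword.toCharArray());
                }
            });

            System.out.println("Configured authenticated HTTP proxy: " + proxy);
        }
    }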



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5282) GCPProcessor with HTTP Proxy with Authentication

2018-08-31 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5282?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16598816#comment-16598816
 ] 

ASF GitHub Bot commented on NIFI-5282:
--

Github user mcgilman commented on the issue:

https://github.com/apache/nifi/pull/2981
  
Thanks @ijokarumawak! This has been merged to master.


> GCPProcessor with HTTP Proxy with Authentication
> 
>
> Key: NIFI-5282
> URL: https://issues.apache.org/jira/browse/NIFI-5282
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Affects Versions: 1.6.0
>Reporter: Julian Gimbel
>Assignee: Sivaprasanna Sethuraman
>Priority: Major
>
> The [AbstractGCPProcessor 
> |https://github.com/apache/nifi/blob/master/nifi-nar-bundles/nifi-gcp-bundle/nifi-gcp-processors/src/main/java/org/apache/nifi/processors/gcp/AbstractGCPProcessor.java]
>  already accepts HTTP proxy settings, but it would be even better if it accepted 
> authenticated proxies with user and password as well.
> In the best case it would support the ProxyService introduced in 
> [NIFI-4199|https://issues.apache.org/jira/projects/NIFI/issues/NIFI-4199] and 
> all of its options.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5282) GCPProcessor with HTTP Proxy with Authentication

2018-08-31 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5282?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16598814#comment-16598814
 ] 

ASF subversion and git services commented on NIFI-5282:
---

Commit cdae2b14b3ec596887585f921a9490ded737b345 in nifi's branch 
refs/heads/master from jugi92
[ https://git-wip-us.apache.org/repos/asf?p=nifi.git;h=cdae2b1 ]

NIFI-5282: GCPProcessor with HTTP Proxy with Auth

added http proxy support with authentication for GCP processors
added proxy support for Google Credential Service

This closes #2943.

Signed-off-by: Koji Kawamura 


> GCPProcessor with HTTP Proxy with Authentication
> 
>
> Key: NIFI-5282
> URL: https://issues.apache.org/jira/browse/NIFI-5282
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Affects Versions: 1.6.0
>Reporter: Julian Gimbel
>Assignee: Sivaprasanna Sethuraman
>Priority: Major
>
> The [AbstractGCPProcessor 
> |https://github.com/apache/nifi/blob/master/nifi-nar-bundles/nifi-gcp-bundle/nifi-gcp-processors/src/main/java/org/apache/nifi/processors/gcp/AbstractGCPProcessor.java]
>  already accepts HTTP proxy settings, but it would be even better if it accepted 
> authenticated proxies with user and password as well.
> In the best case it would support the ProxyService introduced in 
> [NIFI-4199|https://issues.apache.org/jira/projects/NIFI/issues/NIFI-4199] and 
> all of its options.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi issue #2970: NIFI-5542 Added support for node groups to FileAccessPolic...

2018-08-31 Thread achristianson
Github user achristianson commented on the issue:

https://github.com/apache/nifi/pull/2970
  
@pepov I took that lookup out. It appeared to be completely unnecessary.


---


[jira] [Commented] (NIFI-5282) GCPProcessor with HTTP Proxy with Authentication

2018-08-31 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5282?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16598817#comment-16598817
 ] 

ASF GitHub Bot commented on NIFI-5282:
--

Github user asfgit closed the pull request at:

https://github.com/apache/nifi/pull/2981


> GCPProcessor with HTTP Proxy with Authentication
> 
>
> Key: NIFI-5282
> URL: https://issues.apache.org/jira/browse/NIFI-5282
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Affects Versions: 1.6.0
>Reporter: Julian Gimbel
>Assignee: Sivaprasanna Sethuraman
>Priority: Major
>
> The [AbstractGCPProcessor 
> |https://github.com/apache/nifi/blob/master/nifi-nar-bundles/nifi-gcp-bundle/nifi-gcp-processors/src/main/java/org/apache/nifi/processors/gcp/AbstractGCPProcessor.java]
>  already accepts HTTP proxy settings, but it would be even better if it accepted 
> authenticated proxies with user and password as well.
> In the best case it would support the ProxyService introduced in 
> [NIFI-4199|https://issues.apache.org/jira/projects/NIFI/issues/NIFI-4199] and 
> all of its options.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Resolved] (NIFI-5282) GCPProcessor with HTTP Proxy with Authentication

2018-08-31 Thread Matt Gilman (JIRA)


 [ 
https://issues.apache.org/jira/browse/NIFI-5282?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Matt Gilman resolved NIFI-5282.
---
Resolution: Fixed

> GCPProcessor with HTTP Proxy with Authentication
> 
>
> Key: NIFI-5282
> URL: https://issues.apache.org/jira/browse/NIFI-5282
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Affects Versions: 1.6.0
>Reporter: Julian Gimbel
>Assignee: Sivaprasanna Sethuraman
>Priority: Major
>
> The [AbstractGCPProcessor 
> |https://github.com/apache/nifi/blob/master/nifi-nar-bundles/nifi-gcp-bundle/nifi-gcp-processors/src/main/java/org/apache/nifi/processors/gcp/AbstractGCPProcessor.java]
>  already accepts HTTP proxy settings, but it would be even better if it accepted 
> authenticated proxies with user and password as well.
> In the best case it would support the ProxyService introduced in 
> [NIFI-4199|https://issues.apache.org/jira/projects/NIFI/issues/NIFI-4199] and 
> all of its options.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5282) GCPProcessor with HTTP Proxy with Authentication

2018-08-31 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5282?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16598815#comment-16598815
 ] 

ASF subversion and git services commented on NIFI-5282:
---

Commit 5a58c9a1715df1da32f053ac8af5c73c593a569e in nifi's branch 
refs/heads/master from [~ijokarumawak]
[ https://git-wip-us.apache.org/repos/asf?p=nifi.git;h=5a58c9a ]

NIFI-5282: Add ProxyConfigurationService to GCSProcessors

This closes #2981


> GCPProcessor with HTTP Proxy with Authentication
> 
>
> Key: NIFI-5282
> URL: https://issues.apache.org/jira/browse/NIFI-5282
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Affects Versions: 1.6.0
>Reporter: Julian Gimbel
>Assignee: Sivaprasanna Sethuraman
>Priority: Major
>
> The [AbstractGCPProcessor 
> |https://github.com/apache/nifi/blob/master/nifi-nar-bundles/nifi-gcp-bundle/nifi-gcp-processors/src/main/java/org/apache/nifi/processors/gcp/AbstractGCPProcessor.java]
>  already accepts HTTP proxy settings, but it would be even better if it accepted 
> authenticated proxies with user and password as well.
> In the best case it would support the ProxyService introduced in 
> [NIFI-4199|https://issues.apache.org/jira/projects/NIFI/issues/NIFI-4199] and 
> all of its options.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi pull request #2943: NIFI-5282 - Proxy support gcp

2018-08-31 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/nifi/pull/2943


---


[GitHub] nifi pull request #2981: NIFI-5282: GCPProcessor with HTTP Proxy with Authen...

2018-08-31 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/nifi/pull/2981


---


[GitHub] nifi issue #2981: NIFI-5282: GCPProcessor with HTTP Proxy with Authenticatio...

2018-08-31 Thread mcgilman
Github user mcgilman commented on the issue:

https://github.com/apache/nifi/pull/2981
  
Thanks @ijokarumawak! This has been merged to master.


---


[jira] [Commented] (NIFI-5542) Add support for node groups to FileAccessPolicyProvider

2018-08-31 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5542?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16598808#comment-16598808
 ] 

ASF GitHub Bot commented on NIFI-5542:
--

Github user pepov commented on the issue:

https://github.com/apache/nifi/pull/2970
  
yes


> Add support for node groups to FileAccessPolicyProvider
> ---
>
> Key: NIFI-5542
> URL: https://issues.apache.org/jira/browse/NIFI-5542
> Project: Apache NiFi
>  Issue Type: Improvement
>Reporter: Andrew Christianson
>Priority: Major
>
> Currently in FileAccessPolicyProvider, it is possible to specify a set of 
> node identities, which are given access to /proxy. This works well for static 
> clusters, but does not work so well for dynamic clusters (scaling up/down # 
> of nodes) because we don't know in advance what the node identities will be 
> or how many there will be.
> In order to support dynamic sets of node identities, add support for 
> specifying a "Node Group," for which all identities in the group will be 
> granted access to /proxy. A UserGroupProvider can then be implemented to 
> gather node identities dynamically from the cluster environment.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi issue #2970: NIFI-5542 Added support for node groups to FileAccessPolic...

2018-08-31 Thread pepov
Github user pepov commented on the issue:

https://github.com/apache/nifi/pull/2970
  
yes


---


[jira] [Assigned] (NIFI-4558) Populate default keystore/truststore types in SSLContextService

2018-08-31 Thread Nathan Gough (JIRA)


 [ 
https://issues.apache.org/jira/browse/NIFI-4558?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Nathan Gough reassigned NIFI-4558:
--

Assignee: Nathan Gough

> Populate default keystore/truststore types in SSLContextService
> ---
>
> Key: NIFI-4558
> URL: https://issues.apache.org/jira/browse/NIFI-4558
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Affects Versions: 1.4.0
>Reporter: Andy LoPresto
>Assignee: Nathan Gough
>Priority: Trivial
>  Labels: controller_services, jks, keystore, pkcs12, security, 
> ssl, tls, truststore, ux
>
> The keystore and truststore type is almost always JKS as opposed to PKCS12 
> when creating SSL controller services. Both {{StandardSSLContextService}} and 
> {{StandardRestrictedSSLContextService}} should have those fields 
> autopopulated to JKS, saving 2-4 clicks per instantiation. 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5282) GCPProcessor with HTTP Proxy with Authentication

2018-08-31 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5282?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16598803#comment-16598803
 ] 

ASF GitHub Bot commented on NIFI-5282:
--

Github user mcgilman commented on the issue:

https://github.com/apache/nifi/pull/2981
  
Reviewing...


> GCPProcessor with HTTP Proxy with Authentication
> 
>
> Key: NIFI-5282
> URL: https://issues.apache.org/jira/browse/NIFI-5282
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Affects Versions: 1.6.0
>Reporter: Julian Gimbel
>Assignee: Sivaprasanna Sethuraman
>Priority: Major
>
> The [AbstractGCPProcessor 
> |https://github.com/apache/nifi/blob/master/nifi-nar-bundles/nifi-gcp-bundle/nifi-gcp-processors/src/main/java/org/apache/nifi/processors/gcp/AbstractGCPProcessor.java]
>  already accepts HTTP proxy settings, but it would be even better if it accepted 
> authenticated proxies with user and password as well.
> In the best case it would support the ProxyService introduced in 
> [NIFI-4199|https://issues.apache.org/jira/projects/NIFI/issues/NIFI-4199] and 
> all of its options.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

