[GitHub] nifi issue #3200: NIFI-5826 WIP Fix back-slash escaping at Lexers
Github user ottobackwards commented on the issue: https://github.com/apache/nifi/pull/3200 There should be new tests that go along with this ---
[GitHub] nifi pull request #3183: NIFI-5826 Fix to escaped backslash
Github user ottobackwards commented on a diff in the pull request: https://github.com/apache/nifi/pull/3183#discussion_r238675406 --- Diff: nifi-commons/nifi-record-path/src/main/java/org/apache/nifi/record/path/util/RecordPathUtils.java --- @@ -39,4 +39,52 @@ public static String getFirstStringValue(final RecordPathSegment segment, final return stringValue; } + +/** + * This method handles backslash sequences after ANTLR parser converts all backslash into double ones + * with exception for \t, \r and \n. See + * org/apache/nifi/record/path/RecordPathParser.g --- End diff -- What would help is one or more clear, failing unit tests against the current code that illustrate the problem. We have regex escape routines in more than one place, some for places without grammars (ReplaceText). Where we _do_ have a grammar, the correct thing, or at least the preferred thing in a vacuum, would be to have the grammar handle this: it is centralized and more maintainable. We may want to evaluate the regression as a community, based on the fix and its implications wrt maintainability and correctness. I would almost say that we would want to have discussion and review of both approaches. That would be my preference. ---
[GitHub] nifi pull request #3183: NIFI-5826 Fix to escaped backslash
Github user ottobackwards commented on a diff in the pull request: https://github.com/apache/nifi/pull/3183#discussion_r238675752 --- Diff: nifi-commons/nifi-record-path/src/main/java/org/apache/nifi/record/path/util/RecordPathUtils.java --- @@ -39,4 +39,52 @@ public static String getFirstStringValue(final RecordPathSegment segment, final return stringValue; } + +/** + * This method handles backslash sequences after ANTLR parser converts all backslash into double ones + * with exception for \t, \r and \n. See + * org/apache/nifi/record/path/RecordPathParser.g --- End diff -- Such discussion would solicit input from other experienced contributors as well. ---
[GitHub] nifi-minifi-cpp pull request #451: MINIFICPP-688 fix small spelling error an...
GitHub user ottobackwards opened a pull request: https://github.com/apache/nifi-minifi-cpp/pull/451 MINIFICPP-688 fix small spelling error and awkward wording Thank you for submitting a contribution to Apache NiFi - MiNiFi C++. In order to streamline the review of the contribution we ask you to ensure the following steps have been taken: ### For all changes: - [x] Is there a JIRA ticket associated with this PR? Is it referenced in the commit message? - [x] Does your PR title start with MINIFICPP-XXXX where XXXX is the JIRA number you are trying to resolve? Pay particular attention to the hyphen "-" character. - [x] Has your PR been rebased against the latest commit within the target branch (typically master)? - [x] Is your initial contribution a single, squashed commit? ### For code changes: - [-] If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under [ASF 2.0](http://www.apache.org/legal/resolved.html#category-a)? - [-] If applicable, have you updated the LICENSE file? - [-] If applicable, have you updated the NOTICE file? ### For documentation related changes: - [x] Have you ensured that format looks appropriate for the output in which it is rendered? ### Note: Please ensure that once the PR is submitted, you check travis-ci for build issues and submit an update to your PR as soon as possible. You can merge this pull request into a Git repository by running: $ git pull https://github.com/ottobackwards/nifi-minifi-cpp python-md-spelling Alternatively you can review and apply these changes as the patch at: https://github.com/apache/nifi-minifi-cpp/pull/451.patch To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message: This closes #451 commit f8b8c5c266446ef444043596c85f0849c626428b Author: Otto Fowler Date: 2018-11-28T18:11:30Z fix small spelling error and awkward wording ---
[GitHub] nifi issue #3183: NIFI-5826 Fix to escaped backslash
Github user ottobackwards commented on the issue: https://github.com/apache/nifi/pull/3183 +1, LGTM ---
[GitHub] nifi pull request #3183: NIFI-5826 Fix to escaped backslash
Github user ottobackwards commented on a diff in the pull request: https://github.com/apache/nifi/pull/3183#discussion_r236830742 --- Diff: nifi-commons/nifi-record-path/src/main/java/org/apache/nifi/record/path/util/RecordPathUtils.java --- @@ -39,4 +39,44 @@ public static String getFirstStringValue(final RecordPathSegment segment, final return stringValue; } + +public static String unescapeBackslash(String value) { +if (value == null || value.isEmpty()) { +return value; +} +// need to escape characters after backslashes +final StringBuilder sb = new StringBuilder(); +boolean lastCharIsBackslash = false; +for (int i = 0; i < value.length(); i++) { +final char c = value.charAt(i); + --- End diff -- I am sorry, I meant if you could document the code. That way someone who goes in can tell that you are escaping these specific characters per some specification ---
[GitHub] nifi pull request #3183: NIFI-5826 Fix to escaped backslash
Github user ottobackwards commented on a diff in the pull request: https://github.com/apache/nifi/pull/3183#discussion_r236785266 --- Diff: nifi-commons/nifi-record-path/src/test/java/org/apache/nifi/record/path/TestRecordPath.java --- @@ -1008,12 +1008,16 @@ public void testReplaceRegex() { final List fields = new ArrayList<>(); fields.add(new RecordField("id", RecordFieldType.INT.getDataType())); fields.add(new RecordField("name", RecordFieldType.STRING.getDataType())); --- End diff -- There should be tests for all the escapable chars ---
[GitHub] nifi pull request #3183: NIFI-5826 Fix to escaped backslash
Github user ottobackwards commented on a diff in the pull request: https://github.com/apache/nifi/pull/3183#discussion_r236784930 --- Diff: nifi-commons/nifi-record-path/src/main/java/org/apache/nifi/record/path/util/RecordPathUtils.java --- @@ -39,4 +39,44 @@ public static String getFirstStringValue(final RecordPathSegment segment, final return stringValue; } + +public static String unescapeBackslash(String value) { +if (value == null || value.isEmpty()) { +return value; +} +// need to escape characters after backslashes +final StringBuilder sb = new StringBuilder(); +boolean lastCharIsBackslash = false; +for (int i = 0; i < value.length(); i++) { +final char c = value.charAt(i); + --- End diff -- Is this the entire list that has to be escaped? Maybe you can reference where that list is specified for maintenance. ---
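The review comments above ask for the escape handling to be documented in code and for the handled characters to be spelled out. A minimal, self-contained sketch of what such a documented routine might look like (illustrative only, not the PR's actual implementation; the behavior is assumed from the javadoc quoted earlier, where the ANTLR lexer doubles every backslash except in \t, \r and \n, so a doubled pair must be collapsed back to one):

```java
public class UnescapeSketch {
    /**
     * Collapses the doubled backslashes produced upstream: "\\" becomes "\",
     * while any other backslash-prefixed pair (e.g. "\t", "\r", "\n") is
     * passed through unchanged. Hypothetical stand-in for the PR's
     * unescapeBackslash; the handled sequences are assumptions taken from
     * the discussion, not a NiFi specification.
     */
    public static String unescapeBackslash(String value) {
        if (value == null || value.isEmpty()) {
            return value;
        }
        StringBuilder sb = new StringBuilder(value.length());
        boolean lastCharIsBackslash = false;
        for (int i = 0; i < value.length(); i++) {
            char c = value.charAt(i);
            if (lastCharIsBackslash) {
                if (c == '\\') {
                    sb.append('\\');            // "\\" collapses to "\"
                } else {
                    sb.append('\\').append(c);  // "\t", "\r", "\n", ... pass through
                }
                lastCharIsBackslash = false;
            } else if (c == '\\') {
                lastCharIsBackslash = true;     // defer until we see the next char
            } else {
                sb.append(c);
            }
        }
        if (lastCharIsBackslash) {
            sb.append('\\');                    // preserve a trailing lone backslash
        }
        return sb.toString();
    }
}
```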
[GitHub] nifi-registry issue #148: NIFIREG-211 Initial work for adding extenion bundl...
Github user ottobackwards commented on the issue: https://github.com/apache/nifi-registry/pull/148 I don't want to pepper you with questions on this; is there some documentation that goes along with it? ---
[GitHub] nifi-registry issue #148: NIFIREG-211 Initial work for adding extenion bundl...
Github user ottobackwards commented on the issue: https://github.com/apache/nifi-registry/pull/148 What are some of the rules that can be tested? Like uploading with duplicate names etc? ---
[GitHub] nifi-registry issue #148: NIFIREG-211 Initial work for adding extenion bundl...
Github user ottobackwards commented on the issue: https://github.com/apache/nifi-registry/pull/148 Would it make sense to have separate areas to store the project nars vs other nars? ---
[GitHub] nifi issue #3130: NIFI-5791: Add Apache Daffodil (incubating) bundle
Github user ottobackwards commented on the issue: https://github.com/apache/nifi/pull/3130 Thanks for taking the time to explain further. I see what you mean. Nifi needs a better way to get across to users when to use what processor in general. ---
[GitHub] nifi issue #3130: NIFI-5791: Add Apache Daffodil (incubating) bundle
Github user ottobackwards commented on the issue: https://github.com/apache/nifi/pull/3130 What I mean is that from a high level, it is a transformational capability, and may be used as an alternative. ---
[GitHub] nifi issue #3130: NIFI-5791: Add Apache Daffodil (incubating) bundle
Github user ottobackwards commented on the issue: https://github.com/apache/nifi/pull/3130 Isn't this like the Jolt capability? ---
[GitHub] nifi issue #3095: NIFI-5673 Fixing connection handling in MQTT processors
Github user ottobackwards commented on the issue: https://github.com/apache/nifi/pull/3095 @bbende can you change the title to refer to the correct jira, you got me all excited about the other jira ;) ---
[GitHub] nifi issue #3090: NIFI-3792 Adding a RetryCount processor to faciliate retry...
Github user ottobackwards commented on the issue: https://github.com/apache/nifi/pull/3090 I think this general subject, and this processor in particular, could use some usage documentation. Maybe an additional documentation page would be a good addition for this? ---
[GitHub] nifi pull request #3084: NIFI-5689 ReplaceText does not handle end of line c...
Github user ottobackwards closed the pull request at: https://github.com/apache/nifi/pull/3084 ---
[GitHub] nifi issue #3084: NIFI-5689 ReplaceText does not handle end of line correctl...
Github user ottobackwards commented on the issue: https://github.com/apache/nifi/pull/3084 Closing until re-write ---
[GitHub] nifi issue #3084: NIFI-5689 ReplaceText does not handle end of line correctl...
Github user ottobackwards commented on the issue: https://github.com/apache/nifi/pull/3084 @patricker ---
[GitHub] nifi pull request #3084: NIFI-5689 ReplaceText does not handle end of line c...
GitHub user ottobackwards opened a pull request: https://github.com/apache/nifi/pull/3084 NIFI-5689 ReplaceText does not handle end of line correctly on buffer… …boundary Thank you for submitting a contribution to Apache NiFi. In order to streamline the review of the contribution we ask you to ensure the following steps have been taken: ### For all changes: - [x] Is there a JIRA ticket associated with this PR? Is it referenced in the commit message? - [x] Does your PR title start with NIFI-XXXX where XXXX is the JIRA number you are trying to resolve? Pay particular attention to the hyphen "-" character. - [x] Has your PR been rebased against the latest commit within the target branch (typically master)? - [x] Is your initial contribution a single, squashed commit? ### For code changes: - [x] Have you ensured that the full suite of tests is executed via mvn -Pcontrib-check clean install at the root nifi folder? - [x] Have you written or updated unit tests to verify your changes? - [-] If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under [ASF 2.0](http://www.apache.org/legal/resolved.html#category-a)? - [-] If applicable, have you updated the LICENSE file, including the main LICENSE file under nifi-assembly? - [-] If applicable, have you updated the NOTICE file, including the main NOTICE file found under nifi-assembly? - [-] If adding new Properties, have you added .displayName in addition to .name (programmatic access) for each of the new properties? ### For documentation related changes: - [-] Have you ensured that format looks appropriate for the output in which it is rendered? ### Note: Please ensure that once the PR is submitted, you check travis-ci for build issues and submit an update to your PR as soon as possible. 
You can merge this pull request into a Git repository by running: $ git pull https://github.com/ottobackwards/nifi replacetext-line-endings Alternatively you can review and apply these changes as the patch at: https://github.com/apache/nifi/pull/3084.patch To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message: This closes #3084 commit 451c11e2eff0a9416563119ce8df639c30f12674 Author: Otto Fowler Date: 2018-10-16T19:45:05Z NIFI-5689 ReplaceText does not handle end of line correctly on buffer boundary ---
[GitHub] nifi issue #3070: NIFI-5695: Fixed bug that caused ports to not properly map...
Github user ottobackwards commented on the issue: https://github.com/apache/nifi/pull/3070 Is there a way that the duplicated code can be shared? ---
[GitHub] nifi pull request #3060: NIFI-5678: Fixed MAP type support of MapRecord obje...
Github user ottobackwards commented on a diff in the pull request: https://github.com/apache/nifi/pull/3060#discussion_r224208847 --- Diff: nifi-nar-bundles/nifi-extension-utils/nifi-record-utils/nifi-standard-record-utils/src/main/java/org/apache/nifi/schema/validation/StandardSchemaValidator.java --- @@ -196,21 +196,32 @@ private boolean isTypeCorrect(final Object value, final DataType dataType) { return true; case MAP: -if (!(value instanceof Map)) { -return false; -} - -final MapDataType mapDataType = (MapDataType) dataType; -final DataType valueDataType = mapDataType.getValueType(); -final Map map = (Map) value; - -for (final Object mapValue : map.values()) { -if (!isTypeCorrect(mapValue, valueDataType)) { -return false; +if (value instanceof Map) { +final MapDataType mapDataType = (MapDataType) dataType; +final DataType valueDataType = mapDataType.getValueType(); +final Map map = (Map) value; + +for (final Object mapValue : map.values()) { +if (!isTypeCorrect(mapValue, valueDataType)) { +return false; +} } +return true; --- End diff -- Sorry, I looked through the code and understand. excuse me ---
[GitHub] nifi pull request #3060: NIFI-5678: Fixed MAP type support of MapRecord obje...
Github user ottobackwards commented on a diff in the pull request: https://github.com/apache/nifi/pull/3060#discussion_r224205938 --- Diff: nifi-nar-bundles/nifi-extension-utils/nifi-record-utils/nifi-standard-record-utils/src/main/java/org/apache/nifi/schema/validation/StandardSchemaValidator.java --- @@ -196,21 +196,32 @@ private boolean isTypeCorrect(final Object value, final DataType dataType) { return true; case MAP: -if (!(value instanceof Map)) { -return false; -} - -final MapDataType mapDataType = (MapDataType) dataType; -final DataType valueDataType = mapDataType.getValueType(); -final Map map = (Map) value; - -for (final Object mapValue : map.values()) { -if (!isTypeCorrect(mapValue, valueDataType)) { -return false; +if (value instanceof Map) { +final MapDataType mapDataType = (MapDataType) dataType; +final DataType valueDataType = mapDataType.getValueType(); +final Map map = (Map) value; + +for (final Object mapValue : map.values()) { +if (!isTypeCorrect(mapValue, valueDataType)) { +return false; +} } +return true; --- End diff -- Would this be the same if it was a MapRecord? ---
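For readers following the diff above: the refactor keeps the same logic, namely that every value in the Map must recursively satisfy the map's declared value type, and anything that is not a java.util.Map fails the check. The MapRecord question is pointed because a MapRecord is a Record rather than a java.util.Map, so it would not match an instanceof Map test. A self-contained sketch of the shape of that branch (a Class token stands in for NiFi's DataType here; this is illustrative, not the NiFi source):

```java
import java.util.Map;

public class MapTypeCheckSketch {
    // Returns true only if value is a Map and every value in it is an
    // instance of the expected element type. A MapRecord-like object that
    // is not a java.util.Map would fail the instanceof test and return false.
    public static boolean isMapTypeCorrect(Object value, Class<?> valueType) {
        if (!(value instanceof Map)) {
            return false;
        }
        for (Object mapValue : ((Map<?, ?>) value).values()) {
            if (!valueType.isInstance(mapValue)) {
                return false;
            }
        }
        return true;
    }
}
```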
[GitHub] nifi issue #3049: NIFI-5664 Support List in DataTypeUtils#toArray
Github user ottobackwards commented on the issue: https://github.com/apache/nifi/pull/3049 This looks great to me. I thought there was a specific use case for this; if so, it might be good to have a unit or integration test that shows that case. ---
[GitHub] nifi pull request #3049: NIFI-5664 Support ArrayList in DataTypeUtils#toArra...
Github user ottobackwards commented on a diff in the pull request: https://github.com/apache/nifi/pull/3049#discussion_r223141579 --- Diff: nifi-commons/nifi-record/src/main/java/org/apache/nifi/serialization/record/util/DataTypeUtils.java --- @@ -339,6 +339,11 @@ public static boolean isRecordTypeCompatible(final Object value) { return dest; } --- End diff -- Why wouldn't this be any List implementation? ---
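The question above is about a method that special-cases ArrayList. A hedged sketch of the more general alternative being suggested, coding against the List interface so any implementation (ArrayList, LinkedList, List.of(...)) converts; the names here are illustrative, not the actual DataTypeUtils signature:

```java
import java.util.List;

public class ToArraySketch {
    // Coerces a value to Object[]. Checking against the List interface
    // rather than a concrete class accepts any List implementation.
    public static Object[] toArray(Object value) {
        if (value instanceof Object[]) {
            return (Object[]) value;
        }
        if (value instanceof List) {
            return ((List<?>) value).toArray();
        }
        throw new IllegalArgumentException("Cannot coerce " + value + " to an array");
    }
}
```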
[GitHub] nifi-minifi-cpp pull request #405: MINIFICPP-618: Add C2 triggers, first of ...
Github user ottobackwards commented on a diff in the pull request: https://github.com/apache/nifi-minifi-cpp/pull/405#discussion_r221070316 --- Diff: libminifi/include/properties/Properties.h --- @@ -60,6 +60,9 @@ class Properties { // Get the config value bool get(std::string key, std::string ); + // Get the config value --- End diff -- sorry ---
[GitHub] nifi-minifi-cpp pull request #405: MINIFICPP-618: Add C2 triggers, first of ...
Github user ottobackwards commented on a diff in the pull request: https://github.com/apache/nifi-minifi-cpp/pull/405#discussion_r221068265 --- Diff: libminifi/include/properties/Properties.h --- @@ -60,6 +60,9 @@ class Properties { // Get the config value bool get(std::string key, std::string ); + // Get the config value --- End diff -- This should document the alternate keys and why it is different than the above ---
[GitHub] nifi-minifi-cpp pull request #405: MINIFICPP-618: Add C2 triggers, first of ...
Github user ottobackwards commented on a diff in the pull request: https://github.com/apache/nifi-minifi-cpp/pull/405#discussion_r220403308 --- Diff: libminifi/include/c2/C2Trigger.h --- @@ -0,0 +1,63 @@ +/** + * + * Licensed to the Apache Software Foundation (ASF) under one or more + * contributor license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright ownership. + * The ASF licenses this file to You under the Apache License, Version 2.0 + * (the "License"); you may not use this file except in compliance with + * the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + */ +#ifndef LIBMINIFI_INCLUDE_C2_C2TRIGGER_H_ +#define LIBMINIFI_INCLUDE_C2_C2TRIGGER_H_ + +#include "core/Connectable.h" +#include "c2/C2Payload.h" +#include "properties/Configure.h" + +namespace org { +namespace apache { +namespace nifi { +namespace minifi { +namespace c2 { + +class C2Trigger : public core::Connectable{ + public: + + C2Trigger(std::string name, utils::Identifier uuid) +: core::Connectable(name, uuid){ + + } + virtual ~C2Trigger() { + } + + + /** --- End diff -- I think this comment is for a different function ---
[GitHub] nifi issue #2820: NIFI-5327 Adding Netflowv5 protocol parser
Github user ottobackwards commented on the issue: https://github.com/apache/nifi/pull/2820 You can trigger a build by closing and reopening the pr. ---
[GitHub] nifi issue #2920: NIFI-5449: Added Base64 Encode/Decode functions to RecordP...
Github user ottobackwards commented on the issue: https://github.com/apache/nifi/pull/2920 Yeah thanks @mattyb149 ---
[GitHub] nifi issue #2983: NIFI-5566 Improve HashContent processor and standardize Ha...
Github user ottobackwards commented on the issue: https://github.com/apache/nifi/pull/2983 There are checkstyle errors not related to my pr on this pr as well.
```bash
[INFO] --- maven-checkstyle-plugin:2.17:check (check-style) @ nifi-standard-processors ---
[WARNING] src/main/java/org/apache/nifi/processors/standard/CryptographicHashAttribute.java:[202] (sizes) LineLength: Line is longer than 200 characters (found 202).
[WARNING] src/main/java/org/apache/nifi/security/util/crypto/HashService.java:[84] (sizes) LineLength: Line is longer than 200 characters (found 207).
[INFO]
```
---
[GitHub] nifi issue #2983: NIFI-5566 Improve HashContent processor and standardize Ha...
Github user ottobackwards commented on the issue: https://github.com/apache/nifi/pull/2983 https://github.com/alopresto/nifi/pull/6 ---
[GitHub] nifi issue #2983: NIFI-5566 Improve HashContent processor and standardize Ha...
Github user ottobackwards commented on the issue: https://github.com/apache/nifi/pull/2983 https://github.com/apache/nifi/pull/2802#discussion_r199346844 ---
[GitHub] nifi issue #2983: NIFI-5566 Improve HashContent processor and standardize Ha...
Github user ottobackwards commented on the issue: https://github.com/apache/nifi/pull/2983 Travis has the same issue : https://api.travis-ci.org/v3/job/425055118/log.txt ---
[GitHub] nifi issue #2956: NIFI-5537 Create Neo4J cypher execution processor
Github user ottobackwards commented on the issue: https://github.com/apache/nifi/pull/2956 @mans2singh This is a great PR. +1 from me. You still need a committer +1 though. ---
[GitHub] nifi issue #2983: NIFI-5566 Improve HashContent processor and standardize Ha...
Github user ottobackwards commented on the issue: https://github.com/apache/nifi/pull/2983 https://github.com/alopresto/nifi/tree/NIFI-5566/nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/test/resources doesn't have that directory either. Isn't there something with git and empty directories? ---
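The git-and-empty-directories hunch checks out: git tracks file content only, so an empty directory is simply invisible to it, which would explain a missing test-resource directory. A quick self-contained demonstration (paths are illustrative; ".gitkeep" is a convention, not a name git treats specially):

```shell
set -e
repo=$(mktemp -d)
git -C "$repo" init -q
mkdir -p "$repo/src/test/resources/HashServiceTest"
# The empty directory does not register -- status prints nothing:
git -C "$repo" status --porcelain
# The usual workaround is to commit a placeholder file:
touch "$repo/src/test/resources/HashServiceTest/.gitkeep"
git -C "$repo" status --porcelain   # now lists the untracked path
```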
[GitHub] nifi issue #2983: NIFI-5566 Improve HashContent processor and standardize Ha...
Github user ottobackwards commented on the issue: https://github.com/apache/nifi/pull/2983
```
~/tmp/apache-nifi-pr-2983/nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/test (pr-2983)
$ ls resources
CharacterSetConversionSamples  TestEncryptContent     TestJson                    TestSplitText      localhost-ts.jks
CompressedData                 TestExtractGrok        TestMergeContent            TestTransformXml   localhost.cer
ExecuteCommand                 TestForkRecord         TestModifyBytes             TestUnpackContent  logback-test.xml
META-INF                       TestGetHTTP            TestPostHTTP                TestUpdateRecord   randombytes-1
ScanAttribute                  TestIdentifyMimeType   TestReplaceTextLineByLine   TestXml            simple.jpg
TestConvertJSONToSQL           TestInvokeHttp         TestReplaceTextWithMapping  hello.txt          xxe_from_report.xml
TestCountText                  TestJoltTransformJson  TestScanContent             localhost-ks.jks   xxe_template.xml
$ git status
On branch pr-2983
nothing to commit, working tree clean
```
I do a clean checkout in my tmp directory using: https://github.com/ottobackwards/Metron-and-Nifi-Scripts/blob/master/nifi/checkout-nifi-pr ---
[GitHub] nifi pull request #2956: NIFI-5537 Create Neo4J cypher execution processor
Github user ottobackwards commented on a diff in the pull request: https://github.com/apache/nifi/pull/2956#discussion_r215271708 --- Diff: nifi-nar-bundles/nifi-neo4j-bundle/nifi-neo4j-processors/src/test/java/org/apache/nifi/processors/neo4j/ITNeo4JCyperExecutor.java --- @@ -0,0 +1,203 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one or more + * contributor license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright ownership. + * The ASF licenses this file to You under the Apache License, Version 2.0 + * (the "License"); you may not use this file except in compliance with + * the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + */ +package org.apache.nifi.processors.neo4j; + +import org.apache.nifi.util.MockFlowFile; +import org.apache.nifi.util.TestRunner; +import org.apache.nifi.util.TestRunners; +import org.neo4j.driver.v1.AuthTokens; +import org.neo4j.driver.v1.Driver; +import org.neo4j.driver.v1.GraphDatabase; +import org.neo4j.driver.v1.Session; +import org.neo4j.driver.v1.StatementResult; +import org.junit.After; +import org.junit.Before; +import org.junit.Test; + +import java.nio.charset.Charset; +import java.util.HashMap; +import java.util.List; +import java.util.Map; + +import static org.junit.Assert.assertEquals; + --- End diff -- Maybe we want to describe how to basically setup neo4j, or reference your project? ---
[GitHub] nifi pull request #2956: NIFI-5537 Create Neo4J cypher execution processor
Github user ottobackwards commented on a diff in the pull request: https://github.com/apache/nifi/pull/2956#discussion_r215271238 --- Diff: nifi-nar-bundles/nifi-neo4j-bundle/nifi-neo4j-processors/src/main/java/org/apache/nifi/processors/neo4j/Neo4JCypherExecutor.java --- @@ -0,0 +1,203 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one or more + * contributor license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright ownership. + * The ASF licenses this file to You under the Apache License, Version 2.0 + * (the "License"); you may not use this file except in compliance with + * the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ */ +package org.apache.nifi.processors.neo4j; + +import java.io.ByteArrayInputStream; +import java.nio.charset.Charset; +import java.util.ArrayList; +import java.util.Collection; +import java.util.Collections; +import java.util.HashMap; +import java.util.HashSet; +import java.util.List; +import java.util.Map; +import java.util.Set; +import java.util.stream.Collectors; + +import org.apache.nifi.annotation.behavior.EventDriven; +import org.apache.nifi.annotation.behavior.InputRequirement; +import org.apache.nifi.annotation.behavior.SupportsBatching; +import org.apache.nifi.annotation.behavior.WritesAttribute; +import org.apache.nifi.annotation.behavior.WritesAttributes; +import org.apache.nifi.annotation.documentation.CapabilityDescription; +import org.apache.nifi.annotation.documentation.Tags; +import org.apache.nifi.annotation.lifecycle.OnScheduled; +import org.apache.nifi.annotation.lifecycle.OnStopped; +import org.apache.nifi.components.PropertyDescriptor; +import org.apache.nifi.components.PropertyValue; +import org.apache.nifi.components.ValidationContext; +import org.apache.nifi.components.ValidationResult; +import org.apache.nifi.flowfile.FlowFile; +import org.apache.nifi.processor.ProcessContext; +import org.apache.nifi.processor.ProcessSession; +import org.apache.nifi.processor.Relationship; +import org.apache.nifi.processor.exception.ProcessException; +import org.neo4j.driver.v1.Session; +import org.neo4j.driver.v1.StatementResult; +import org.neo4j.driver.v1.summary.ResultSummary; +import org.neo4j.driver.v1.summary.SummaryCounters; + +import com.google.gson.Gson; + +@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED) +@EventDriven +@SupportsBatching +@Tags({"neo4j", "graph", "network", "insert", "update", "delete", "put", "get", "node", "relationship", "connection", "executor"}) +@CapabilityDescription("This processor executes a Neo4J Query (https://www.neo4j.com/) defined in the 'Neo4j Query' property of the " ++ "FlowFile and writes the 
result to the FlowFile body in JSON format.") +@WritesAttributes({ +@WritesAttribute(attribute = AbstractNeo4JCypherExecutor.ERROR_MESSAGE, description = "Neo4J error message"), +@WritesAttribute(attribute = AbstractNeo4JCypherExecutor.LABELS_ADDED, description = "Number of labels added"), +@WritesAttribute(attribute = AbstractNeo4JCypherExecutor.NODES_CREATED, description = "Number of nodes created"), +@WritesAttribute(attribute = AbstractNeo4JCypherExecutor.NODES_DELETED, description = "Number of nodes deleted"), +@WritesAttribute(attribute = AbstractNeo4JCypherExecutor.PROPERTIES_SET, description = "Number of properties set"), +@WritesAttribute(attribute = AbstractNeo4JCypherExecutor.RELATIONS_CREATED, description = "Number of relationships created"), +@WritesAttribute(attribute = AbstractNeo4JCypherExecutor.RELATIONS_DELETED, description = "Number of relationships deleted"), +@WritesAttribute(attribute = AbstractNeo4JCypherExecutor.ROWS_RETURNED, description = "Number of rows returned"), +}) +public class Neo4JCypherExecutor extends AbstractNeo4JCypherExecutor { + +private static final Set relationships; +private static final List propertyDescriptors; + +static { +final Set tempRelationships = new HashSet&l
[GitHub] nifi issue #2983: NIFI-5566 Improve HashContent processor and standardize Ha...
Github user ottobackwards commented on the issue: https://github.com/apache/nifi/pull/2983
```
[INFO] Running org.apache.nifi.security.util.crypto.HashServiceTest
[ERROR] Tests run: 9, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 0.179 s <<< FAILURE! - in org.apache.nifi.security.util.crypto.HashServiceTest
[ERROR] testShouldHashValueFromStream(org.apache.nifi.security.util.crypto.HashServiceTest)  Time elapsed: 0.022 s  <<< ERROR!
java.io.FileNotFoundException: src/test/resources/HashServiceTest/largefile.txt (No such file or directory)
	at org.apache.nifi.security.util.crypto.HashServiceTest.testShouldHashValueFromStream(HashServiceTest.groovy:320)
```
---
[GitHub] nifi issue #2956: NIFI-5537 Create Neo4J cypher execution processor
Github user ottobackwards commented on the issue: https://github.com/apache/nifi/pull/2956 One question I have, do we have to be concerned with which version of neo4j works with the driver version we are using? Does that need to be documented? "Known to work with versions ."? ---
[GitHub] nifi issue #2956: NIFI-5537 Create Neo4J cypher execution processor
Github user ottobackwards commented on the issue: https://github.com/apache/nifi/pull/2956 Thanks @mans2singh. I'll definitely try to run through this. ---
[GitHub] nifi pull request #2983: NIFI-5566 Improve HashContent processor and standar...
Github user ottobackwards commented on a diff in the pull request: https://github.com/apache/nifi/pull/2983#discussion_r215020413 --- Diff: nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/HashAttribute.java --- @@ -51,6 +50,9 @@ import org.apache.nifi.processor.util.StandardValidators; /** --- End diff -- Yeah, I'll see about adding that ---
[GitHub] nifi pull request #2806: NIFI-5068 Script to automate parts of RC validation
Github user ottobackwards closed the pull request at: https://github.com/apache/nifi/pull/2806 ---
[GitHub] nifi issue #2806: NIFI-5068 Script to automate parts of RC validation
Github user ottobackwards commented on the issue: https://github.com/apache/nifi/pull/2806 closing, not much demand I guess. ---
[GitHub] nifi pull request #2983: NIFI-5566 Improve HashContent processor and standar...
Github user ottobackwards commented on a diff in the pull request: https://github.com/apache/nifi/pull/2983#discussion_r214887528 --- Diff: nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/HashAttribute.java --- @@ -51,6 +50,9 @@ import org.apache.nifi.processor.util.StandardValidators; /** --- End diff -- It is too bad there isn't a nifi deprecated annotation, or a way for the system to pick up the standard deprecated annotation and give a visual clue in the ui ---
[GitHub] nifi pull request #2980: NIFI-5147 Implement CalculateAttributeHash processo...
Github user ottobackwards commented on a diff in the pull request: https://github.com/apache/nifi/pull/2980#discussion_r214883268 --- Diff: nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/security/util/crypto/HashService.java --- @@ -0,0 +1,121 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one or more + * contributor license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright ownership. + * The ASF licenses this file to You under the Apache License, Version 2.0 + * (the "License"); you may not use this file except in compliance with + * the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + */ +package org.apache.nifi.security.util.crypto; + +import java.nio.charset.Charset; +import java.nio.charset.StandardCharsets; +import org.apache.commons.codec.binary.Hex; +import org.apache.commons.codec.digest.DigestUtils; +import org.bouncycastle.crypto.digests.Blake2bDigest; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +/** + * This class provides a generic service for cryptographic hashing. It is used in + * {@link org.apache.nifi.processors.standard.CalculateAttributeHash} and + * {@link org.apache.nifi.processors.standard.HashContent}. + * + * See also: + * * {@link HashAlgorithm} + */ +public class HashService { +private static final Logger logger = LoggerFactory.getLogger(HashService.class); + +/** + * Returns the hex-encoded hash of the specified value. 
+ * + * @param algorithm the hash algorithm to use + * @param value the value to hash (cannot be {@code null} but can be an empty String) + * @param charset the charset to use + * @return the hash value in hex + */ +public static String hashValue(HashAlgorithm algorithm, String value, Charset charset) { +byte[] rawHash = hashValueRaw(algorithm, value, charset); +return Hex.encodeHexString(rawHash); +} + +/** + * Returns the hex-encoded hash of the specified value. The default charset ({@code StandardCharsets.UTF_8}) is used. + * + * @param algorithm the hash algorithm to use + * @param value the value to hash (cannot be {@code null} but can be an empty String) + * @return the hash value in hex + */ +public static String hashValue(HashAlgorithm algorithm, String value) { +return hashValue(algorithm, value, StandardCharsets.UTF_8); +} + +/** + * Returns the raw {@code byte[]} hash of the specified value. + * + * @param algorithm the hash algorithm to use + * @param value the value to hash (cannot be {@code null} but can be an empty String) + * @param charset the charset to use + * @return the hash value in bytes + */ +public static byte[] hashValueRaw(HashAlgorithm algorithm, String value, Charset charset) { +if (value == null) { +throw new IllegalArgumentException("The value cannot be null"); +} +return hashValueRaw(algorithm, value.getBytes(charset)); +} + +/** + * Returns the raw {@code byte[]} hash of the specified value. The default charset ({@code StandardCharsets.UTF_8}) is used. + * + * @param algorithm the hash algorithm to use + * @param value the value to hash (cannot be {@code null} but can be an empty String) + * @return the hash value in bytes + */ +public static byte[] hashValueRaw(HashAlgorithm algorithm, String value) { +return hashValueRaw(algorithm, value, StandardCharsets.UTF_8); +} + +/** + * Returns the raw {@code byte[]} hash of the specified value. 
+ * + * @param algorithm the hash algorithm to use + * @param value the value to hash + * @return the hash value in bytes + */ +public static byte[] hashValueRaw(HashAlgorithm algorithm, byte[] value) { +if (algorithm == null) { +throw new IllegalArgumentException("The hash algorithm cannot be null"); +} +if (value == null) { +throw new IllegalArg
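The hex-encoding contract described in the Javadoc above (null value rejected, empty string allowed, hex output) can be illustrated with a standalone sketch. Note this is a simplified stand-in, not the PR's actual HashService — the real class delegates to commons-codec and BouncyCastle, while the helper below uses only the JDK's MessageDigest and takes the algorithm as a plain String:

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class HashSketch {
    // Simplified analogue of HashService.hashValue: digest the value, then hex-encode
    static String hashValue(String algorithm, String value) {
        if (value == null) {
            throw new IllegalArgumentException("The value cannot be null");
        }
        try {
            byte[] raw = MessageDigest.getInstance(algorithm)
                    .digest(value.getBytes(StandardCharsets.UTF_8));
            StringBuilder hex = new StringBuilder(raw.length * 2);
            for (byte b : raw) {
                hex.append(String.format("%02x", b));
            }
            return hex.toString();
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalArgumentException("Unsupported algorithm: " + algorithm, e);
        }
    }

    public static void main(String[] args) {
        // An empty string is a valid input; only null is rejected
        System.out.println(hashValue("SHA-256", ""));
    }
}
```

Running it prints the well-known SHA-256 digest of the empty string.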
[GitHub] nifi pull request #2980: NIFI-5147 Implement CalculateAttributeHash processo...
Github user ottobackwards commented on a diff in the pull request: https://github.com/apache/nifi/pull/2980#discussion_r214882758 --- Diff: nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/security/util/crypto/HashAlgorithm.java --- @@ -0,0 +1,151 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one or more + * contributor license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright ownership. + * The ASF licenses this file to You under the Apache License, Version 2.0 + * (the "License"); you may not use this file except in compliance with + * the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + */ +package org.apache.nifi.security.util.crypto; + +import java.util.Arrays; +import java.util.List; +import org.apache.commons.lang3.StringUtils; +import org.apache.commons.lang3.builder.ToStringBuilder; +import org.apache.commons.lang3.builder.ToStringStyle; + +/** + * Enumeration capturing information about the cryptographic hash algorithms used in + * {@link org.apache.nifi.processors.standard.CalculateAttributeHash} and + * {@link org.apache.nifi.processors.standard.HashContent} processors. 
+ */ +public enum HashAlgorithm { + +MD2("MD2", 16, "Cryptographically broken due to collisions"), +MD5("MD5", 16, "Cryptographically broken due to collisions"), +SHA1("SHA-1", 20, "Cryptographically broken due to collisions"), +SHA224("SHA-224", 28, "SHA-2 family"), +SHA256("SHA-256", 32, "SHA-2 family"), +SHA384("SHA-384", 48, "SHA-2 family"), +SHA512("SHA-512", 64, "SHA-2 family"), +SHA512_224("SHA-512/224", 28, "SHA-2 using SHA-512 with truncated output"), +SHA512_256("SHA-512/256", 32, "SHA-2 using SHA-512 with truncated output"), +SHA3_224("SHA3-224", 28, "Keccak-based SHA3 family"), +SHA3_256("SHA3-256", 32, "Keccak-based SHA3 family"), +SHA3_384("SHA3-384", 48, "Keccak-based SHA3 family"), +SHA3_512("SHA3-512", 64, "Keccak-based SHA3 family"), +BLAKE2_160("BLAKE2-160", 20, "Also known as Blake2b"), +BLAKE2_256("BLAKE2-256", 32, "Also known as Blake2b"), +BLAKE2_384("BLAKE2-384", 48, "Also known as Blake2b"), +BLAKE2_512("BLAKE2-512", 64, "Also known as Blake2b"); + +private final String name; +private final int digestBytesLength; +private final String description; + +private static final List BROKEN_ALGORITHMS = Arrays.asList(MD2.name, MD5.name, SHA1.name); + +HashAlgorithm(String name, int digestBytesLength, String description) { +this.name = name; +this.digestBytesLength = digestBytesLength; +this.description = description; +} + +public String getName() { +return name; +} + +public int getDigestBytesLength() { +return digestBytesLength; +} + +public String getDescription() { +return description; +} + +/** + * Returns {@code true} if this algorithm is considered cryptographically secure. These determinations were made as of 2018-08-30. 
+ * + * Current strong algorithms: + * + * * SHA-224 (SHA2) + * * SHA-256 (SHA2) + * * SHA-384 (SHA2) + * * SHA-512 (SHA2) + * * SHA-512/224 (SHA2) + * * SHA-512/256 (SHA2) + * * SHA3-256 + * * SHA3-384 + * * SHA3-512 + * * Blake2b-256 + * * Blake2b-384 + * * Blake2b-512 + * + * Current broken algorithms: + * + * * MD2 + * * MD5 + * * SHA-1 + * + * @return true if the algorithm is considered strong + */ +public boolean isStrongAlgorithm() { +return (!BROKEN_ALGORITHMS.contains(name)); +} + --- End diff -- What is the isBlake2 check about? Is there a way to make it more general? It seems strange to call out by the name as opposed to the "why" ---
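One way to address the reviewer's "name vs. why" concern is to carry the capability on each enum constant itself, rather than consulting a separate list of names (as the `BROKEN_ALGORITHMS` list above does). This is a hypothetical sketch of that design, not the PR's actual code, with a reduced set of constants:

```java
public class AlgorithmSketch {
    // Hypothetical: each constant declares its own "broken" status,
    // so the "why" lives on the constant instead of in a name lookup
    enum Algo {
        MD5(true), SHA1(true), SHA256(false), BLAKE2_256(false);

        private final boolean broken;

        Algo(boolean broken) {
            this.broken = broken;
        }

        boolean isStrongAlgorithm() {
            return !broken;
        }
    }

    public static void main(String[] args) {
        System.out.println(Algo.MD5.isStrongAlgorithm());    // false
        System.out.println(Algo.SHA256.isStrongAlgorithm()); // true
    }
}
```

The same pattern would generalize to other per-algorithm traits (for example, "requires a custom digest implementation") without string-matching on names.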
[GitHub] nifi pull request #2920: NIFI-5449: Added Base64 Encode/Decode functions to ...
Github user ottobackwards commented on a diff in the pull request: https://github.com/apache/nifi/pull/2920#discussion_r214872646 --- Diff: nifi-commons/nifi-record-path/src/main/java/org/apache/nifi/record/path/functions/Base64Encode.java --- @@ -0,0 +1,58 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one or more + * contributor license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright ownership. + * The ASF licenses this file to You under the Apache License, Version 2.0 + * (the "License"); you may not use this file except in compliance with + * the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ */ + +package org.apache.nifi.record.path.functions; + +import org.apache.nifi.record.path.FieldValue; +import org.apache.nifi.record.path.RecordPathEvaluationContext; +import org.apache.nifi.record.path.StandardFieldValue; +import org.apache.nifi.record.path.paths.RecordPathSegment; + +import java.io.UnsupportedEncodingException; +import java.util.Base64; +import java.util.stream.Stream; + +public class Base64Encode extends RecordPathSegment { +private final RecordPathSegment recordPath; + +public Base64Encode(final RecordPathSegment recordPath, final boolean absolute) { +super("base64Encode", null, absolute); +this.recordPath = recordPath; +} + +@Override +public Stream evaluate(final RecordPathEvaluationContext context) { +final Stream fieldValues = recordPath.evaluate(context); +return fieldValues.filter(fv -> fv.getValue() != null) +.map(fv -> { + +Object value = fv.getValue(); +if (value instanceof String) { +try { +return new StandardFieldValue(Base64.getEncoder().encodeToString(value.toString().getBytes("UTF-8")), fv.getField(), fv.getParent().orElse(null)); +} catch (final UnsupportedEncodingException e) { +return null;// won't happen. +} +} else if (value instanceof byte[]) { --- End diff -- I don't think there is a problem starting with string and byte[] ---
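The try/catch around the "won't happen" UnsupportedEncodingException in the diff above can be avoided entirely: `String.getBytes(Charset)` never throws a checked exception, unlike `getBytes(String charsetName)`. A minimal standalone sketch of the encode step (not the record-path class itself):

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class Base64EncodeSketch {
    // Passing a Charset constant instead of the name "UTF-8"
    // removes the impossible checked exception
    static String encode(String value) {
        return Base64.getEncoder()
                .encodeToString(value.getBytes(StandardCharsets.UTF_8));
    }

    public static void main(String[] args) {
        System.out.println(encode("hello")); // aGVsbG8=
    }
}
```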
[GitHub] nifi issue #2836: NIFI-5147 Calculate hash attribute redux
Github user ottobackwards commented on the issue: https://github.com/apache/nifi/pull/2836 Sure. For JIRAs you want to do yourself, I suggest assigning them to yourself. That is my habit on the other Apache projects I work with, and it will help guard against eager beavers ;) Thanks for using the work as much as you did. ---
[GitHub] nifi pull request #2836: NIFI-5147 Calculate hash attribute redux
Github user ottobackwards closed the pull request at: https://github.com/apache/nifi/pull/2836 ---
[GitHub] nifi issue #2836: NIFI-5147 Calculate hash attribute redux
Github user ottobackwards commented on the issue: https://github.com/apache/nifi/pull/2836 Ok, that is great. Did I miss something that _was_ in the jira? Should I just close this PR now then? ---
[GitHub] nifi issue #2836: NIFI-5147 Calculate hash attribute redux
Github user ottobackwards commented on the issue: https://github.com/apache/nifi/pull/2836 Rebased on master, and fixed typos ---
[GitHub] nifi issue #2836: NIFI-5147 Calculate hash attribute redux
Github user ottobackwards commented on the issue: https://github.com/apache/nifi/pull/2836 @alopresto, just a ping. It has been a while ---
[GitHub] nifi pull request #2969: NIFI-5495 Made date format configurable.
Github user ottobackwards commented on a diff in the pull request: https://github.com/apache/nifi/pull/2969#discussion_r213383530 --- Diff: nifi-nar-bundles/nifi-mongodb-bundle/nifi-mongodb-processors/src/main/java/org/apache/nifi/processors/mongodb/AbstractMongoProcessor.java --- @@ -173,6 +175,29 @@ .expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES) .build(); --- End diff -- When I run the tests in intellij ( I did do a PR, but there was too much shrapnel ), the tests that compare the content to the known values fail, because the json is pretty printed in the result but not the expected. ---
[GitHub] nifi pull request #2969: NIFI-5495 Made date format configurable.
Github user ottobackwards commented on a diff in the pull request: https://github.com/apache/nifi/pull/2969#discussion_r213357727 --- Diff: nifi-nar-bundles/nifi-mongodb-bundle/nifi-mongodb-processors/src/main/java/org/apache/nifi/processors/mongodb/AbstractMongoProcessor.java --- @@ -173,6 +175,29 @@ .expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES) .build(); --- End diff -- Also, some of the IT tests fail because pretty printing causes comparison failures. ---
[GitHub] nifi pull request #2969: NIFI-5495 Made date format configurable.
Github user ottobackwards commented on a diff in the pull request: https://github.com/apache/nifi/pull/2969#discussion_r213357507 --- Diff: nifi-nar-bundles/nifi-mongodb-bundle/nifi-mongodb-processors/src/main/java/org/apache/nifi/processors/mongodb/AbstractMongoProcessor.java --- @@ -173,6 +175,29 @@ .expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES) .build(); --- End diff -- Actually, this gets into the weeds if you are going to keep it required. Here is another idea - Can we rename the Display Name of the property to "Date Format For Standard JSON"? ---
[GitHub] nifi pull request #2969: NIFI-5495 Made date format configurable.
Github user ottobackwards commented on a diff in the pull request: https://github.com/apache/nifi/pull/2969#discussion_r213317913 --- Diff: nifi-nar-bundles/nifi-mongodb-bundle/nifi-mongodb-processors/src/main/java/org/apache/nifi/processors/mongodb/AbstractMongoProcessor.java --- @@ -173,6 +175,29 @@ .expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES) .build(); --- End diff -- would you accept a PR on your ... PR? ---
[GitHub] nifi pull request #2969: NIFI-5495 Made date format configurable.
Github user ottobackwards commented on a diff in the pull request: https://github.com/apache/nifi/pull/2969#discussion_r213312718 --- Diff: nifi-nar-bundles/nifi-mongodb-bundle/nifi-mongodb-processors/src/main/java/org/apache/nifi/processors/mongodb/AbstractMongoProcessor.java --- @@ -173,6 +175,29 @@ .expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES) .build(); --- End diff -- Right, the code will work. The user however may be a bit surprised. "Why let me configure it, and then ignore what I put in without telling me?" ---
[GitHub] nifi pull request #2969: NIFI-5495 Made date format configurable.
Github user ottobackwards commented on a diff in the pull request: https://github.com/apache/nifi/pull/2969#discussion_r213049169 --- Diff: nifi-nar-bundles/nifi-mongodb-bundle/nifi-mongodb-processors/src/main/java/org/apache/nifi/processors/mongodb/AbstractMongoProcessor.java --- @@ -173,6 +175,29 @@ .expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES) .build(); --- End diff -- I think the documentation is good @MikeThomsen, my gut thinks it would be better to validate dynamically that they don't have extended and date format set at the same time, but I wouldn't go to the mattresses on that. ---
[GitHub] nifi issue #2968: NIFI-5456: AWS clients now work with private link endpoint...
Github user ottobackwards commented on the issue: https://github.com/apache/nifi/pull/2968 I tested to make sure it didn't regress normal working with the Web Api Gateway processor, but that processor doesn't modify the endpoint post client creation anyway. Works fine, no regression from this, as expected. I'm afraid I don't have a setup for testing the rest :( ---
[GitHub] nifi issue #2956: NIFI-5537 Create Neo4J cypher execution processor
Github user ottobackwards commented on the issue: https://github.com/apache/nifi/pull/2956 @mans2singh I'm not sure how to test this. I don't use/have neo4j. If you know a reviewer who does, you may want to tag them. I won't be able to review without some testing instructions in the PR, including simple running of neo4j with data to test statements against. ---
[GitHub] nifi pull request #2969: NIFI-5495 Made date format configurable.
Github user ottobackwards commented on a diff in the pull request: https://github.com/apache/nifi/pull/2969#discussion_r212960991 --- Diff: nifi-nar-bundles/nifi-mongodb-bundle/nifi-mongodb-services/src/main/java/org/apache/nifi/mongodb/MongoDBLookupService.java --- @@ -83,6 +84,7 @@ .displayName("Projection") .description("Specifies a projection for limiting which fields will be returned.") .required(false) --- End diff -- OK, Can you explain what you are changing and why then? ---
[GitHub] nifi pull request #2969: NIFI-5495 Made date format configurable.
Github user ottobackwards commented on a diff in the pull request: https://github.com/apache/nifi/pull/2969#discussion_r212960490 --- Diff: nifi-nar-bundles/nifi-mongodb-bundle/nifi-mongodb-processors/src/main/java/org/apache/nifi/processors/mongodb/AbstractMongoProcessor.java --- @@ -173,6 +175,29 @@ .expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES) .build(); --- End diff -- This property only applies to JSON_STANDARD. - Is there a reason it can't apply to extended? - If you use extended what do you get? - That should be in the description I think. - Do we want to validate that you don't set DATE_FORMAT with JSON_EXTENDED? ---
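The dynamic-validation suggestion above (reject a Date Format when Extended JSON is selected) amounts to a mutual-exclusion check across two properties. The sketch below shows the rule in isolation with hypothetical property keys and values; in NiFi the natural hook would be the processor's custom validation, which this standalone code does not use:

```java
import java.util.Map;
import java.util.Optional;

public class MongoValidationSketch {
    // Hypothetical property keys, for illustration only
    static final String JSON_TYPE = "json-type";
    static final String DATE_FORMAT = "date-format";

    // Returns a validation error message when the combination is invalid
    static Optional<String> validate(Map<String, String> props) {
        boolean extended = "Extended JSON".equals(props.get(JSON_TYPE));
        boolean dateFormatSet = props.containsKey(DATE_FORMAT);
        if (extended && dateFormatSet) {
            return Optional.of("Date Format only applies to Standard JSON"
                    + " and would be ignored with Extended JSON");
        }
        return Optional.empty();
    }

    public static void main(String[] args) {
        System.out.println(validate(
                Map.of(JSON_TYPE, "Extended JSON", DATE_FORMAT, "yyyy-MM-dd")));
    }
}
```

Surfacing the conflict at validation time avoids the "why let me configure it, and then ignore what I put in" surprise raised earlier in the thread.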
[GitHub] nifi pull request #2969: NIFI-5495 Made date format configurable.
Github user ottobackwards commented on a diff in the pull request: https://github.com/apache/nifi/pull/2969#discussion_r212959015 --- Diff: nifi-nar-bundles/nifi-mongodb-bundle/nifi-mongodb-services/src/main/java/org/apache/nifi/mongodb/MongoDBLookupService.java --- @@ -83,6 +84,7 @@ .displayName("Projection") .description("Specifies a projection for limiting which fields will be returned.") .required(false) --- End diff -- Is this change from another PR? ---
[GitHub] nifi issue #2963: NIFI-5541 - Added OWASP profile for dependency check
Github user ottobackwards commented on the issue: https://github.com/apache/nifi/pull/2963 I ran this and got the html file in the target directory. This probably needs to be documented somewhere though. ---
[GitHub] nifi issue #2956: NIFI-5537 Create Neo4J cypher execution processor
Github user ottobackwards commented on the issue: https://github.com/apache/nifi/pull/2956 That seems to be the convention from what I can see, and then there would be multiple neo4j processors, maybe services, service-api, common utils under there. ---
[GitHub] nifi issue #2956: NIFI-5537 Create Neo4J cypher execution processor
Github user ottobackwards commented on the issue: https://github.com/apache/nifi/pull/2956 Just a quick comment. Why not just a neo4j bundle? I think that is the convention ---
[GitHub] nifi issue #2929: NIFI-5474 ReplaceText RegexReplace evaluates payload as Ex...
Github user ottobackwards commented on the issue: https://github.com/apache/nifi/pull/2929 I have reviewed https://github.com/apache/nifi/pull/2748, I believe that is the better fix for these issues. I'm going to close this PR. ---
[GitHub] nifi pull request #2929: NIFI-5474 ReplaceText RegexReplace evaluates payloa...
Github user ottobackwards closed the pull request at: https://github.com/apache/nifi/pull/2929 ---
[GitHub] nifi issue #2951: NIFI-5474: When using Regex Replace with ReplaceText, and ...
Github user ottobackwards commented on the issue: https://github.com/apache/nifi/pull/2951 Ran with contrib-check, and stepped through. @markap14 this is certainly a better fix for the issues than my other PR's. I learned something I didn't know about the el evaluation and the additional attributes. I certainly tackled this from the wrong angle. I enthusiastically support this code's replacement of my initial PR and closing out my current one when this lands. +1 @mosermw, since you reviewed my other pr, can you take a look at this? ---
[GitHub] nifi pull request #2929: NIFI-5474 ReplaceText RegexReplace evaluates payloa...
Github user ottobackwards commented on a diff in the pull request: https://github.com/apache/nifi/pull/2929#discussion_r210089927 --- Diff: nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/test/java/org/apache/nifi/processors/standard/TestReplaceText.java --- @@ -404,6 +404,27 @@ public void testRegexWithExpressionLanguageIsEscaped() throws IOException { out.assertContentEquals("Hello, World!"); } +/** + * Test for NIFI-5474 + */ +@Test +public void testExpressionLanguageInContentIsNotEvaluated() throws IOException { +final TestRunner runner = getRunner(); +runner.setProperty(ReplaceText.SEARCH_VALUE, "${replaceKey}"); +runner.setProperty(ReplaceText.REPLACEMENT_VALUE, "${replaceValue}"); + +final Map attributes = new HashMap<>(); +attributes.put("replaceKey", "Hello"); +attributes.put("replaceValue", "Good-bye"); + runner.enqueue(Paths.get("src/test/resources/hello_with_expression_like_text.txt"), attributes); --- End diff -- done, thanks ---
[GitHub] nifi pull request #2929: NIFI-5474 ReplaceText RegexReplace evaluates payloa...
Github user ottobackwards commented on a diff in the pull request: https://github.com/apache/nifi/pull/2929#discussion_r210089885 --- Diff: nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/test/java/org/apache/nifi/processors/standard/TestReplaceText.java --- @@ -404,6 +404,27 @@ public void testRegexWithExpressionLanguageIsEscaped() throws IOException { out.assertContentEquals("Hello, World!"); } +/** + * Test for NIFI-5474 --- End diff -- done ---
[GitHub] nifi pull request #2929: NIFI-5474 ReplaceText RegexReplace evaluates payloa...
Github user ottobackwards commented on a diff in the pull request: https://github.com/apache/nifi/pull/2929#discussion_r210073665 --- Diff: nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/ReplaceText.java --- @@ -701,6 +709,27 @@ private static String wrapLiterals(String possibleLiteral) { return replacementFinal; } +/** + * Escapes Expression Language like text from content Strings. + * + * Since we do regular expression replacement on the content and then do Expression Language + * evaluations afterwards, it is possible that if there are Expression Language like text + * in the content that they will be evaluated when they should not be. + * + * + * This function is called to escape any such construct by prefixing a second $ to the ${...} text. + * + * + * @param content the content that may contain Expression Language like text + * @return A {@code String} with any Expression Language text escaped with a $. + */ +private static String escapeExpressionsInContent(String content) { +if (!content.contains("${")) { +return content; +} +return content.replaceAll("(\\$\\{.*\\})","\\$$1"); --- End diff -- What would be the correct way to escape this? ---
[GitHub] nifi pull request #2929: NIFI-5474 ReplaceText RegexReplace evaluates payloa...
Github user ottobackwards commented on a diff in the pull request: https://github.com/apache/nifi/pull/2929#discussion_r210073535 --- Diff: nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/ReplaceText.java --- @@ -701,6 +709,27 @@ private static String wrapLiterals(String possibleLiteral) { return replacementFinal; } +/** + * Escapes Expression Language like text from content Strings. + * + * Since we do regular expression replacement on the content and then do Expression Language + * evaluations afterwards, it is possible that if there are Expression Language like text + * in the content that they will be evaluated when they should not be. + * + * + * This function is called to escape any such construct by prefixing a second $ to the ${...} text. + * + * + * @param content the content that may contain Expression Language like text + * @return A {@code String} with any Expression Language text escaped with a $. + */ +private static String escapeExpressionsInContent(String content) { +if (!content.contains("${")) { +return content; +} +return content.replaceAll("(\\$\\{.*\\})","\\$$1"); --- End diff -- If you look at https://github.com/apache/nifi/pull/2748 and https://issues.apache.org/jira/browse/NIFI-4272, evaluation of the EL before the regex operations resulted in incorrect behavior. So it is evaluated after now. ---
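The `replaceAll` from the diff above can be exercised in isolation. One thing worth noting: the greedy `.*` spans from the first `${` to the last `}` on a line, so two expression-like spans get a single leading `$` rather than one each — a reluctant `.*?` escapes each span individually. Both variants are shown side by side using plain JDK regex:

```java
public class EscapeSketch {
    // The PR's replacement: prefix a literal '$' before the matched span.
    // Greedy '.*' matches from the first "${" to the last "}".
    static String escapeGreedy(String content) {
        if (!content.contains("${")) {
            return content;
        }
        return content.replaceAll("(\\$\\{.*\\})", "\\$$1");
    }

    // Reluctant variant: each "${...}" span is escaped on its own.
    static String escapeReluctant(String content) {
        if (!content.contains("${")) {
            return content;
        }
        return content.replaceAll("(\\$\\{.*?\\})", "\\$$1");
    }

    public static void main(String[] args) {
        String content = "${a} and ${b}";
        System.out.println(escapeGreedy(content));    // $${a} and ${b}
        System.out.println(escapeReluctant(content)); // $${a} and $${b}
    }
}
```

Which behavior is correct depends on how the downstream Expression Language evaluator treats the doubled `$` — the sketch only illustrates the difference the quantifier makes.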
[GitHub] nifi issue #2937: NIFI-4434 Fixed recursive listing with a custom regex filt...
Github user ottobackwards commented on the issue: https://github.com/apache/nifi/pull/2937 Built with -Pcontrib-check, verified the documentation, +1 from me, nice work ---
[GitHub] nifi issue #2937: NIFI-4434 Fixed recursive listing with a custom regex filt...
Github user ottobackwards commented on the issue: https://github.com/apache/nifi/pull/2937 @jtstorck I think that examples would help the user get the right one the first time. Not a deal breaker though. ---
[GitHub] nifi pull request #2937: NIFI-4434 Fixed recursive listing with a custom reg...
Github user ottobackwards commented on a diff in the pull request: https://github.com/apache/nifi/pull/2937#discussion_r208222852 --- Diff: nifi-nar-bundles/nifi-hadoop-bundle/nifi-hdfs-processors/src/main/java/org/apache/nifi/processors/hadoop/ListHDFS.java --- @@ -462,11 +523,15 @@ private String getPerms(final FsAction action) { private PathFilter createPathFilter(final ProcessContext context) { final Pattern filePattern = Pattern.compile(context.getProperty(FILE_FILTER).getValue()); --- End diff -- fair enough ---
[GitHub] nifi pull request #2937: NIFI-4434 Fixed recursive listing with a custom reg...
Github user ottobackwards commented on a diff in the pull request: https://github.com/apache/nifi/pull/2937#discussion_r207983251 --- Diff: nifi-nar-bundles/nifi-hadoop-bundle/nifi-hdfs-processors/src/main/java/org/apache/nifi/processors/hadoop/ListHDFS.java --- @@ -462,11 +523,15 @@ private String getPerms(final FsAction action) { private PathFilter createPathFilter(final ProcessContext context) { final Pattern filePattern = Pattern.compile(context.getProperty(FILE_FILTER).getValue()); --- End diff -- Does this need to support expression language? ---
[GitHub] nifi issue #2937: NIFI-4434 Fixed recursive listing with a custom regex filt...
Github user ottobackwards commented on the issue: https://github.com/apache/nifi/pull/2937 @jtstorck will review. First quick thing is to ask if you have considered adding an additionDetails talking about why you would choose one strategy over another, maybe with simple examples? ---
[GitHub] nifi issue #2933: NIFI-5479 Upgraded Jetty. Moved where we unpack bundled de...
Github user ottobackwards commented on the issue: https://github.com/apache/nifi/pull/2933 Would it help if you would load NAR's without unpacking them to disk? ---
[GitHub] nifi issue #2930: NIFI-4434 Fixed recursive listing with a custom regex filt...
Github user ottobackwards commented on the issue: https://github.com/apache/nifi/pull/2930 I don't know if ListHDFS has additional documentation, but you may want to think about adding an explanation of when you would want to use which, the problems, etc. ---
[GitHub] nifi issue #2930: NIFI-4434 Fixed recursive listing with a custom regex filt...
Github user ottobackwards commented on the issue: https://github.com/apache/nifi/pull/2930 We would have to make sure that prior tests and maybe new tests prove that out @bbende as part of acceptance. ---
[GitHub] nifi issue #2930: NIFI-4434 Fixed recursive listing with a custom regex filt...
Github user ottobackwards commented on the issue: https://github.com/apache/nifi/pull/2930 Yeah, that sounds good, then have a PathFilter implementation for each ---
[GitHub] nifi issue #2930: NIFI-4434 Fixed recursive listing with a custom regex filt...
Github user ottobackwards commented on the issue: https://github.com/apache/nifi/pull/2930 Why not have a new Property (defaulting to the status quo ante) for the search strategy, and have the recursive strategy use full paths and make it opt-in? ---
[GitHub] nifi issue #2929: NIFI-5474 ReplaceText RegexReplace evaluates payload as Ex...
Github user ottobackwards commented on the issue: https://github.com/apache/nifi/pull/2929 @JPercivall ---
[GitHub] nifi pull request #2929: NIFI-5474 ReplaceText RegexReplace evaluates payloa...
GitHub user ottobackwards opened a pull request: https://github.com/apache/nifi/pull/2929 NIFI-5474 ReplaceText RegexReplace evaluates payload as Expression language Thank you for submitting a contribution to Apache NiFi. In order to streamline the review of the contribution we ask you to ensure the following steps have been taken: ### For all changes: - [x] Is there a JIRA ticket associated with this PR? Is it referenced in the commit message? - [x] Does your PR title start with NIFI- where is the JIRA number you are trying to resolve? Pay particular attention to the hyphen "-" character. - [x] Has your PR been rebased against the latest commit within the target branch (typically master)? - [x] Is your initial contribution a single, squashed commit? ### For code changes: - [x] Have you ensured that the full suite of tests is executed via mvn -Pcontrib-check clean install at the root nifi folder? - [x] Have you written or updated unit tests to verify your changes? - [-] If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under [ASF 2.0](http://www.apache.org/legal/resolved.html#category-a)? - [-] If applicable, have you updated the LICENSE file, including the main LICENSE file under nifi-assembly? - [-] If applicable, have you updated the NOTICE file, including the main NOTICE file found under nifi-assembly? - [-] If adding new Properties, have you added .displayName in addition to .name (programmatic access) for each of the new properties? ### For documentation related changes: - [-] Have you ensured that format looks appropriate for the output in which it is rendered? ### Note: Please ensure that once the PR is submitted, you check travis-ci for build issues and submit an update to your PR as soon as possible. 
You can merge this pull request into a Git repository by running: $ git pull https://github.com/ottobackwards/nifi replacetext-content-el Alternatively you can review and apply these changes as the patch at: https://github.com/apache/nifi/pull/2929.patch To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message: This closes #2929 commit 6316b1cf324a2c0087d8554aaf5f9a4997e5a543 Author: Otto Fowler Date: 2018-07-31T18:33:49Z Account for Expression Language type text in the content ---
[GitHub] nifi issue #2820: NIFI-5327 Adding Netflowv5 protocol parser
Github user ottobackwards commented on the issue: https://github.com/apache/nifi/pull/2820 As far as I can see, this PR is in great shape. Good work @PrashanthVenkatesan. Tests and contrib tests are great. I manually tested with the sample generator and the output looks like it will be useful. Also, the factoring of the submittal will make the record reader follow-up a breeze ;) +1 from me. @MikeThomsen, @bbende, @mattyb149 ? Anyone have time to take a look at this? ---
[GitHub] nifi pull request #2820: NIFI-5327 Adding Netflowv5 protocol parser
Github user ottobackwards commented on a diff in the pull request: https://github.com/apache/nifi/pull/2820#discussion_r206290668 --- Diff: nifi-nar-bundles/nifi-network-bundle/nifi-network-utils/src/main/java/org/apache/nifi/processors/network/parser/Netflowv5Parser.java --- @@ -29,6 +32,7 @@ private static final int SHORT_TYPE = 0; private static final int INTEGER_TYPE = 1; private static final int LONG_TYPE = 2; --- End diff -- I did some research, it doesn't support it in this version. ---
[GitHub] nifi pull request #2820: NIFI-5327 Adding Netflowv5 protocol parser
Github user ottobackwards commented on a diff in the pull request: https://github.com/apache/nifi/pull/2820#discussion_r206164076 --- Diff: nifi-nar-bundles/nifi-network-bundle/nifi-network-utils/src/main/java/org/apache/nifi/processors/network/parser/Netflowv5Parser.java --- @@ -29,6 +32,7 @@ private static final int SHORT_TYPE = 0; private static final int INTEGER_TYPE = 1; private static final int LONG_TYPE = 2; --- End diff -- So netflow5 doesn't support IPV6? If not then cool ---
[GitHub] nifi pull request #2820: NIFI-5327 Adding Netflowv5 protocol parser
Github user ottobackwards commented on a diff in the pull request: https://github.com/apache/nifi/pull/2820#discussion_r206125096 --- Diff: nifi-nar-bundles/nifi-network-bundle/nifi-network-utils/src/main/java/org/apache/nifi/processors/network/parser/Netflowv5Parser.java --- @@ -29,6 +32,7 @@ private static final int SHORT_TYPE = 0; private static final int INTEGER_TYPE = 1; private static final int LONG_TYPE = 2; --- End diff -- I hate to ask, but is there no IPV 6 that could be in there? ---
[GitHub] nifi issue #2820: NIFI-5327 Adding Netflowv5 protocol parser
Github user ottobackwards commented on the issue: https://github.com/apache/nifi/pull/2820 Also, thinking of it, the name of the Nar you have should be network processors nar, not just network nar. When we do the controller services (record readers) they will need a nar and that is the convention. ---
[GitHub] nifi pull request #2920: NIFI-5449: Added Base64 Encode/Decode functions to ...
Github user ottobackwards commented on a diff in the pull request: https://github.com/apache/nifi/pull/2920#discussion_r205752245 --- Diff: nifi-commons/nifi-record-path/src/main/java/org/apache/nifi/record/path/functions/Base64Decode.java ---
@@ -0,0 +1,50 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.record.path.functions;
+
+import org.apache.nifi.record.path.FieldValue;
+import org.apache.nifi.record.path.RecordPathEvaluationContext;
+import org.apache.nifi.record.path.StandardFieldValue;
+import org.apache.nifi.record.path.paths.RecordPathSegment;
+
+import java.util.Base64;
+import java.util.stream.Stream;
+
+public class Base64Decode extends RecordPathSegment {
+    private final RecordPathSegment recordPath;
+
+    public Base64Decode(final RecordPathSegment recordPath, final boolean absolute) {
+        super("base64Decode", null, absolute);
+        this.recordPath = recordPath;
+    }
+
+    @Override
+    public Stream<FieldValue> evaluate(final RecordPathEvaluationContext context) {
+        final Stream<FieldValue> fieldValues = recordPath.evaluate(context);
+        return fieldValues.filter(fv -> fv.getValue() != null)
+            .map(fv -> {
+
+                if (!(fv.getValue() instanceof String)) {
+                    throw new IllegalArgumentException("Argument supplied to base64Encode must be a String");
--- End diff -- Could fv.getValue() ever return byte[]? If so, that should be supported, shouldn't it? ---
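To make the byte[] question above concrete, here is a minimal, hypothetical sketch of a decode step that accepts either a String or a byte[] value. The class and method names (`Base64DecodeSketch`, `decodeValue`) are invented for illustration and are not part of the NiFi code under review; only the `java.util.Base64` calls are standard-library API.

```java
import java.util.Base64;

// Hypothetical sketch only: shows how the decode branch could accept byte[]
// in addition to String, as raised in the review. Names are invented, not
// taken from the PR.
public class Base64DecodeSketch {

    public static byte[] decodeValue(final Object value) {
        if (value instanceof String) {
            // The path the PR already handles: a base64-encoded String
            return Base64.getDecoder().decode((String) value);
        }
        if (value instanceof byte[]) {
            // The case the review asks about: base64-encoded raw bytes
            return Base64.getDecoder().decode((byte[]) value);
        }
        // Side note: the quoted diff's message says "base64Encode" even
        // though the class is Base64Decode; a decode function would
        // presumably name itself here.
        throw new IllegalArgumentException(
                "Argument supplied to base64Decode must be a String or byte[]");
    }
}
```

Whether byte[] should be supported is exactly the design question the review raises; this sketch only shows that `java.util.Base64` makes both branches equally cheap.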
[GitHub] nifi issue #2820: NIFI-5327 Adding Netflowv5 protocol parser
Github user ottobackwards commented on the issue: https://github.com/apache/nifi/pull/2820 OK. Contrib check and build pass. Everything runs the way it should. My question is on the data: the src and dest addresses are emitted as raw numbers, but they are supposed to be IP addresses. I think formatting them as IP addresses rather than as plain numbers would be more usable. Is there something I'm missing? I don't have an example of anything else that displays netflow data. Are there any other fields output 'raw' that could be formatted? ---
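For illustration of the formatting being suggested, converting a raw 32-bit NetFlow v5 address field into dotted-quad form could look like the following sketch. The class and method names (`IpFormatSketch`, `toIpv4`) are hypothetical, not from the PR; holding the value in a `long` sidesteps Java's lack of unsigned 32-bit ints.

```java
// Hypothetical sketch: format a NetFlow v5 32-bit srcaddr/dstaddr value
// (carried in a long so the high bit is not misread as a sign bit) as a
// dotted-quad IPv4 string.
public class IpFormatSketch {

    public static String toIpv4(final long raw) {
        // Mask each byte of the 32-bit value, most significant octet first
        return String.format("%d.%d.%d.%d",
                (raw >>> 24) & 0xFF,
                (raw >>> 16) & 0xFF,
                (raw >>> 8) & 0xFF,
                raw & 0xFF);
    }
}
```

For example, the raw value 0xC0A80001 would render as "192.168.0.1" instead of the bare number 3232235521.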
[GitHub] nifi pull request #2820: NIFI-5327 Adding Netflowv5 protocol parser
Github user ottobackwards commented on a diff in the pull request: https://github.com/apache/nifi/pull/2820#discussion_r205314387 --- Diff: nifi-nar-bundles/nifi-network-bundle/nifi-network-processors/src/main/resources/docs/org.apache.nifi.processors.network.ParseNetflowv5/additionalDetails.html --- @@ -0,0 +1,74 @@ + + + + + +Netflowv5Parser + + + + + + Netflowv5Parser processor parses the ingress netflowv5 datagram format + and transfers it either as flowfile attributes or JSON object. + Netflowv5 format has predefined schema named "template" for parsing + the netflowv5 record. More information: https://www.cisco.com/c/en/us/td/docs/net_mgmt/netflow_collection_engine/3-6/user/guide/format.html (RFC-netflowv5) + --- End diff -- I am sorry @PrashanthVenkatesan, I've been on vacation the last couple of days. I'm fine with the review, I just want to run everything again. I will try as soon as I can. I am not a committer though, so even with my potential +1 you will still need a committer to sign off and merge. ---
[GitHub] nifi pull request #2820: NIFI-5327 Adding Netflowv5 protocol parser
Github user ottobackwards commented on a diff in the pull request: https://github.com/apache/nifi/pull/2820#discussion_r204404608 --- Diff: nifi-nar-bundles/nifi-network-bundle/nifi-network-processors/src/main/resources/docs/org.apache.nifi.processors.network.ParseNetflowv5/additionalDetails.html --- @@ -0,0 +1,74 @@ + + + + + +Netflowv5Parser + + + + + + Netflowv5Parser processor parses the ingress netflowv5 datagram format + and transfers it either as flowfile attributes or JSON object. + Netflowv5 format has predefined schema named "template" for parsing + the netflowv5 record. More information: https://www.cisco.com/c/en/us/td/docs/net_mgmt/netflow_collection_engine/3-6/user/guide/format.html (RFC-netflowv5) + --- End diff -- I'll put the Avro schema in the record reader's additionalDetails. I think it is confusing to have it with the processor. ---
[GitHub] nifi pull request #2820: NIFI-5327 Adding Netflowv5 protocol parser
Github user ottobackwards commented on a diff in the pull request: https://github.com/apache/nifi/pull/2820#discussion_r204266885 --- Diff: nifi-nar-bundles/nifi-network-bundle/nifi-network-processors/src/main/resources/docs/org.apache.nifi.processors.network.ParseNetflowv5/additionalDetails.html --- @@ -0,0 +1,74 @@ + + + + + +Netflowv5Parser + + + + + + Netflowv5Parser processor parses the ingress netflowv5 datagram format + and transfers it either as flowfile attributes or JSON object. + Netflowv5 format has predefined schema named "template" for parsing + the netflowv5 record. More information: https://www.cisco.com/c/en/us/td/docs/net_mgmt/netflow_collection_engine/3-6/user/guide/format.html (RFC-netflowv5) + --- End diff -- @bbende do you have an opinion on how to document the JSON? ---
[GitHub] nifi pull request #2820: NIFI-5327 Adding Netflowv5 protocol parser
Github user ottobackwards commented on a diff in the pull request: https://github.com/apache/nifi/pull/2820#discussion_r204250561 --- Diff: nifi-nar-bundles/nifi-network-bundle/nifi-network-processors/src/main/resources/docs/org.apache.nifi.processors.network.ParseNetflowv5/additionalDetails.html --- @@ -0,0 +1,74 @@ + + + + + +Netflowv5Parser + + + + + + Netflowv5Parser processor parses the ingress netflowv5 datagram format + and transfers it either as flowfile attributes or JSON object. + Netflowv5 format has predefined schema named "template" for parsing + the netflowv5 record. More information: https://www.cisco.com/c/en/us/td/docs/net_mgmt/netflow_collection_engine/3-6/user/guide/format.html (RFC-netflowv5) + --- End diff -- Yes, that is the type of thing I'm referring to. If you want to include the description, maybe put it outside. ---
[GitHub] nifi pull request #2820: NIFI-5327 Adding Netflowv5 protocol parser
Github user ottobackwards commented on a diff in the pull request: https://github.com/apache/nifi/pull/2820#discussion_r204246930 --- Diff: nifi-nar-bundles/nifi-network-bundle/nifi-network-processors/src/main/resources/docs/org.apache.nifi.processors.network.ParseNetflowv5/additionalDetails.html --- @@ -0,0 +1,74 @@ + + + + + +Netflowv5Parser + + + + + + Netflowv5Parser processor parses the ingress netflowv5 datagram format + and transfers it either as flowfile attributes or JSON object. + Netflowv5 format has predefined schema named "template" for parsing + the netflowv5 record. More information: https://www.cisco.com/c/en/us/td/docs/net_mgmt/netflow_collection_engine/3-6/user/guide/format.html (RFC-netflowv5) + --- End diff -- So, since this isn't a record processor/reader, I wouldn't put the Avro schema there; we'll put the Avro schema in that one. I would put the exact JSON that you output, with the values being the 'types' from your schema, and above it, in the description, just say that that is what you are doing. Maybe follow that with an example: "Here is the structure of the output netflow JSON, with the types:" the JSON, then "for example:" the JSON with data. ---
[GitHub] nifi pull request #2820: NIFI-5327 Adding Netflowv5 protocol parser
Github user ottobackwards commented on a diff in the pull request: https://github.com/apache/nifi/pull/2820#discussion_r204240871 --- Diff: nifi-nar-bundles/nifi-network-bundle/nifi-network-processors/src/main/resources/docs/org.apache.nifi.processors.network.ParseNetflowv5/additionalDetails.html --- @@ -0,0 +1,74 @@ + + + + + +Netflowv5Parser + + + + + + Netflowv5Parser processor parses the ingress netflowv5 datagram format + and transfers it either as flowfile attributes or JSON object. + Netflowv5 format has predefined schema named "template" for parsing + the netflowv5 record. More information: https://www.cisco.com/c/en/us/td/docs/net_mgmt/netflow_collection_engine/3-6/user/guide/format.html (RFC-netflowv5) + --- End diff -- The schema should match what is sent in the relationship. Does the data that gets sent out (the JSON you generate) have the word "template" in it? ---