[GitHub] [nifi] r-sidd commented on pull request #7319: NIFI-11587: Update questdb to 6.7
r-sidd commented on PR #7319: URL: https://github.com/apache/nifi/pull/7319#issuecomment-1571307118 @pvillard31 @MikeThomsen @exceptionfactory can someone review this -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[jira] [Commented] (NIFI-11250) InvokeHTTP drops the Body when using the DELETE method
[ https://issues.apache.org/jira/browse/NIFI-11250?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17728180#comment-17728180 ] David Handermann commented on NIFI-11250: - Thanks for the reply [~benj_928381923], and the pointer back to the Keycloak REST API reference, that is helpful. Keycloak is certainly widely used, and with InvokeHTTP supporting such a wide range of use cases, it seems like making an adjustment here would be warranted. > InvokeHTTP drops the Body when using the DELETE method > -- > > Key: NIFI-11250 > URL: https://issues.apache.org/jira/browse/NIFI-11250 > Project: Apache NiFi > Issue Type: Improvement > Components: Extensions >Reporter: Benji Benning >Assignee: David Handermann >Priority: Major > > Today, InvokeHTTP drops the Body when the method used isn't PUT, POST, or > PATCH (as stated in the documentation). The RFC states that DELETE with a body > isn't generally used, but doesn't disallow it. > In my case, I'm using InvokeHTTP to interact with Keycloak's Admin REST API. > They use DELETE with a body in quite a few cases, for example in my specific > use case: > [https://www.keycloak.org/docs-api/21.0.1/rest-api/#_role_mapper_resource] > (referring to: Delete realm-level role mappings) > Additional information: > {noformat} > Although request message framing is independent of the method used, content > received in a DELETE request has no generally defined semantics, cannot alter > the meaning or target of the request, and might lead some implementations to > reject the request and close the connection because of its potential as a > request smuggling attack (Section 11.2 of [HTTP/1.1]). A client SHOULD NOT > generate content in a DELETE request unless it is made directly to an origin > server that has previously indicated, in or out of band, that such a request > has a purpose and will be adequately supported. 
An origin server SHOULD NOT > rely on private agreements to receive content, since participants in HTTP > communication are often unaware of intermediaries along the request > chain.{noformat} > [https://www.rfc-editor.org/rfc/rfc9110.html#name-delete] > > During discussion with Otto Fowler, he stated that this is disabled in the > [HTTPMethod > enum|https://github.com/apache/nifi/blob/7a47c8cfbd458ab037275762c385d50372c130a3/nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/http/HttpMethod.java]. -- This message was sent by Atlassian Jira (v8.20.10#820010)
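The RFC text quoted above permits content in a DELETE request when the origin server supports it. As a point of comparison, Java's standard `java.net.http.HttpRequest` builder will construct such a request without complaint; the endpoint URL and JSON payload below are hypothetical, loosely modeled on the Keycloak role-mapper call referenced in the issue.

```java
import java.net.URI;
import java.net.http.HttpRequest;

class DeleteWithBody {
    public static void main(String[] args) {
        // Hypothetical Keycloak-style payload: the roles to remove from the mapping.
        String json = "[{\"id\":\"role-id\",\"name\":\"my-role\"}]";

        // Build a DELETE request that carries a body; the JDK imposes no
        // restriction here, unlike InvokeHTTP's current method handling.
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://keycloak.example.com/admin/realms/demo/users/u1/role-mappings/realm"))
                .method("DELETE", HttpRequest.BodyPublishers.ofString(json))
                .header("Content-Type", "application/json")
                .build();

        System.out.println(request.method());                    // prints DELETE
        System.out.println(request.bodyPublisher().isPresent()); // prints true
    }
}
```

Whether an intermediary or origin server accepts such a request is a separate question, as the RFC warns; the client side is straightforward.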
[jira] [Created] (NIFI-11622) Create Bulletin Counter Metric
Yolanda M. Davis created NIFI-11622: --- Summary: Create Bulletin Counter Metric Key: NIFI-11622 URL: https://issues.apache.org/jira/browse/NIFI-11622 Project: Apache NiFi Issue Type: Improvement Components: Core Framework Affects Versions: 1.21.0 Reporter: Yolanda M. Davis Assignee: Manju Kalavala Currently NiFi publishes Prometheus metrics which include bulletin gauges that reflect the existence of a specific bulletin. This request is to enhance the available metrics to include a counter metric that tracks the incidence of bulletins by category (e.g. ERROR, WARN, INFO). This will allow systems consuming metrics to determine the increase or decrease of bulletins over time.
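The requested metric could be backed by per-category monotonic counters, the shape a Prometheus counter with a `category` label exposes. A minimal sketch (hypothetical, not NiFi's actual metrics registry) follows:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.LongAdder;

// Hypothetical sketch: monotonic bulletin counters keyed by category
// (ERROR, WARN, INFO), analogous to a labelled Prometheus counter.
class BulletinCounters {
    private final Map<String, LongAdder> counts = new ConcurrentHashMap<>();

    // Counters only ever increase, so consumers can compute rates over time.
    void increment(String category) {
        counts.computeIfAbsent(category, c -> new LongAdder()).increment();
    }

    long total(String category) {
        final LongAdder adder = counts.get(category);
        return adder == null ? 0L : adder.sum();
    }

    public static void main(String[] args) {
        BulletinCounters counters = new BulletinCounters();
        counters.increment("ERROR");
        counters.increment("ERROR");
        counters.increment("WARN");
        System.out.println(counters.total("ERROR")); // prints 2
        System.out.println(counters.total("INFO"));  // prints 0
    }
}
```

Unlike the existing gauges, which only reflect whether a bulletin currently exists, a counter like this lets a monitoring system detect growth in bulletin volume between scrapes.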
[jira] [Updated] (NIFI-11619) Upgrade Azure BOM to 1.2.13
[ https://issues.apache.org/jira/browse/NIFI-11619?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] David Handermann updated NIFI-11619: Fix Version/s: 2.0.0 1.22.0 (was: 1.latest) (was: 2.latest) Resolution: Fixed Status: Resolved (was: Patch Available) > Upgrade Azure BOM to 1.2.13 > --- > > Key: NIFI-11619 > URL: https://issues.apache.org/jira/browse/NIFI-11619 > Project: Apache NiFi > Issue Type: Task > Components: Extensions >Reporter: Pierre Villard >Assignee: Pierre Villard >Priority: Major > Fix For: 2.0.0, 1.22.0 > > Time Spent: 20m > Remaining Estimate: 0h > > Upgrade Azure BOM to 1.2.13 -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Updated] (NIFI-11618) Upgrade AWS SDK to 2.20.75 and 1.12.478
[ https://issues.apache.org/jira/browse/NIFI-11618?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] David Handermann updated NIFI-11618: Summary: Upgrade AWS SDK to 2.20.75 and 1.12.478 (was: Upgrade AWS SDK) > Upgrade AWS SDK to 2.20.75 and 1.12.478 > --- > > Key: NIFI-11618 > URL: https://issues.apache.org/jira/browse/NIFI-11618 > Project: Apache NiFi > Issue Type: Task > Components: Extensions >Reporter: Pierre Villard >Assignee: Pierre Villard >Priority: Major > Fix For: 2.0.0, 1.22.0 > > Time Spent: 20m > Remaining Estimate: 0h > > Upgrade AWS SDK from 2.20.41 to 2.20.75 and from 1.12.444 to 1.12.478 -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Resolved] (NIFI-11617) Update jackson.bom.version to 2.15.2
[ https://issues.apache.org/jira/browse/NIFI-11617?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] David Handermann resolved NIFI-11617. - Fix Version/s: 2.0.0 1.22.0 Resolution: Fixed > Update jackson.bom.version to 2.15.2 > > > Key: NIFI-11617 > URL: https://issues.apache.org/jira/browse/NIFI-11617 > Project: Apache NiFi > Issue Type: Improvement >Affects Versions: 1.21.0 >Reporter: Mike R >Assignee: Mike R >Priority: Major > Fix For: 2.0.0, 1.22.0 > > Time Spent: 1h 10m > Remaining Estimate: 0h > > [Jackson Release 2.15.2 · FasterXML/jackson Wiki > (github.com)|https://github.com/FasterXML/jackson/wiki/Jackson-Release-2.15.2] > Update jackson.bom.version to 2.15.2 -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Updated] (NIFI-11618) Upgrade AWS SDK
[ https://issues.apache.org/jira/browse/NIFI-11618?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] David Handermann updated NIFI-11618: Fix Version/s: 2.0.0 1.22.0 (was: 1.latest) (was: 2.latest) Resolution: Fixed Status: Resolved (was: Patch Available) > Upgrade AWS SDK > --- > > Key: NIFI-11618 > URL: https://issues.apache.org/jira/browse/NIFI-11618 > Project: Apache NiFi > Issue Type: Task > Components: Extensions >Reporter: Pierre Villard >Assignee: Pierre Villard >Priority: Major > Fix For: 2.0.0, 1.22.0 > > Time Spent: 20m > Remaining Estimate: 0h > > Upgrade AWS SDK from 2.20.41 to 2.20.75 and from 1.12.444 to 1.12.478 -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Commented] (NIFI-11619) Upgrade Azure BOM to 1.2.13
[ https://issues.apache.org/jira/browse/NIFI-11619?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17728147#comment-17728147 ] ASF subversion and git services commented on NIFI-11619: Commit 06b49cbc32f4b3e59075e42b82314c1947cfffe4 in nifi's branch refs/heads/main from Pierre Villard [ https://gitbox.apache.org/repos/asf?p=nifi.git;h=06b49cbc32 ] NIFI-11619 Upgraded Azure BOM from 1.2.11 to 1.2.13 This closes #7320 Signed-off-by: David Handermann > Upgrade Azure BOM to 1.2.13 > --- > > Key: NIFI-11619 > URL: https://issues.apache.org/jira/browse/NIFI-11619 > Project: Apache NiFi > Issue Type: Task > Components: Extensions >Reporter: Pierre Villard >Assignee: Pierre Villard >Priority: Major > Fix For: 1.latest, 2.latest > > Time Spent: 20m > Remaining Estimate: 0h > > Upgrade Azure BOM to 1.2.13 -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Commented] (NIFI-11620) Upgrade Google libraries-bom to 26.15.0
[ https://issues.apache.org/jira/browse/NIFI-11620?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17728148#comment-17728148 ] ASF subversion and git services commented on NIFI-11620: Commit d4f301f473f26cd4661fee13a1b7286f57b23bba in nifi's branch refs/heads/main from Pierre Villard [ https://gitbox.apache.org/repos/asf?p=nifi.git;h=d4f301f473 ] NIFI-11620 Upgraded Google libraries-bom from 26.12.0 to 26.15.0 This closes #7321 Signed-off-by: David Handermann > Upgrade Google libraries-bom to 26.15.0 > --- > > Key: NIFI-11620 > URL: https://issues.apache.org/jira/browse/NIFI-11620 > Project: Apache NiFi > Issue Type: Task > Components: Extensions >Reporter: Pierre Villard >Assignee: Pierre Villard >Priority: Major > Fix For: 1.latest, 2.latest > > Time Spent: 20m > Remaining Estimate: 0h > > Upgrade Google libraries-bom to 26.15.0 -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Commented] (NIFI-11619) Upgrade Azure BOM to 1.2.13
[ https://issues.apache.org/jira/browse/NIFI-11619?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17728151#comment-17728151 ] ASF subversion and git services commented on NIFI-11619: Commit cb144e266f35372470646e780e2ef00bcc3a114c in nifi's branch refs/heads/support/nifi-1.x from Pierre Villard [ https://gitbox.apache.org/repos/asf?p=nifi.git;h=cb144e266f ] NIFI-11619 Upgraded Azure BOM from 1.2.11 to 1.2.13 This closes #7320 Signed-off-by: David Handermann (cherry picked from commit 06b49cbc32f4b3e59075e42b82314c1947cfffe4) > Upgrade Azure BOM to 1.2.13 > --- > > Key: NIFI-11619 > URL: https://issues.apache.org/jira/browse/NIFI-11619 > Project: Apache NiFi > Issue Type: Task > Components: Extensions >Reporter: Pierre Villard >Assignee: Pierre Villard >Priority: Major > Fix For: 1.latest, 2.latest > > Time Spent: 20m > Remaining Estimate: 0h > > Upgrade Azure BOM to 1.2.13 -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Commented] (NIFI-11618) Upgrade AWS SDK
[ https://issues.apache.org/jira/browse/NIFI-11618?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17728146#comment-17728146 ] ASF subversion and git services commented on NIFI-11618: Commit 6649fdcd1fbb7ffbd8c79f6f5a51fd22e0f1e337 in nifi's branch refs/heads/main from Pierre Villard [ https://gitbox.apache.org/repos/asf?p=nifi.git;h=6649fdcd1f ] NIFI-11618 Upgraded AWS SDK from 2.20.41 to 2.20.75 - Upgraded AWS SDK 1.12.444 to 1.12.478 This closes #7318 Signed-off-by: David Handermann > Upgrade AWS SDK > --- > > Key: NIFI-11618 > URL: https://issues.apache.org/jira/browse/NIFI-11618 > Project: Apache NiFi > Issue Type: Task > Components: Extensions >Reporter: Pierre Villard >Assignee: Pierre Villard >Priority: Major > Fix For: 1.latest, 2.latest > > Time Spent: 20m > Remaining Estimate: 0h > > Upgrade AWS SDK from 2.20.41 to 2.20.75 and from 1.12.444 to 1.12.478 -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Commented] (NIFI-11618) Upgrade AWS SDK
[ https://issues.apache.org/jira/browse/NIFI-11618?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17728150#comment-17728150 ] ASF subversion and git services commented on NIFI-11618: Commit c30b51154319c6c3eeec99095dfb0876b090dc41 in nifi's branch refs/heads/support/nifi-1.x from Pierre Villard [ https://gitbox.apache.org/repos/asf?p=nifi.git;h=c30b511543 ] NIFI-11618 Upgraded AWS SDK from 2.20.41 to 2.20.75 - Upgraded AWS SDK 1.12.444 to 1.12.478 This closes #7318 Signed-off-by: David Handermann (cherry picked from commit 6649fdcd1fbb7ffbd8c79f6f5a51fd22e0f1e337) > Upgrade AWS SDK > --- > > Key: NIFI-11618 > URL: https://issues.apache.org/jira/browse/NIFI-11618 > Project: Apache NiFi > Issue Type: Task > Components: Extensions >Reporter: Pierre Villard >Assignee: Pierre Villard >Priority: Major > Fix For: 1.latest, 2.latest > > Time Spent: 20m > Remaining Estimate: 0h > > Upgrade AWS SDK from 2.20.41 to 2.20.75 and from 1.12.444 to 1.12.478 -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Commented] (NIFI-11617) Update jackson.bom.version to 2.15.2
[ https://issues.apache.org/jira/browse/NIFI-11617?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17728149#comment-17728149 ] ASF subversion and git services commented on NIFI-11617: Commit db3e92f8af13617aeb67a70f9d47e3b8e52a9c3a in nifi's branch refs/heads/main from mr1716 [ https://gitbox.apache.org/repos/asf?p=nifi.git;h=db3e92f8af ] NIFI-11617 Upgraded Jackson from 2.15.1 to 2.15.2 This closes #7317 Signed-off-by: David Handermann > Update jackson.bom.version to 2.15.2 > > > Key: NIFI-11617 > URL: https://issues.apache.org/jira/browse/NIFI-11617 > Project: Apache NiFi > Issue Type: Improvement >Affects Versions: 1.21.0 >Reporter: Mike R >Assignee: Mike R >Priority: Major > Time Spent: 1h 10m > Remaining Estimate: 0h > > [Jackson Release 2.15.2 · FasterXML/jackson Wiki > (github.com)|https://github.com/FasterXML/jackson/wiki/Jackson-Release-2.15.2] > Update jackson.bom.version to 2.15.2 -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Commented] (NIFI-11617) Update jackson.bom.version to 2.15.2
[ https://issues.apache.org/jira/browse/NIFI-11617?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17728153#comment-17728153 ] ASF subversion and git services commented on NIFI-11617: Commit 1c9f977f49177c47a9ee6ba623acc00944890f26 in nifi's branch refs/heads/support/nifi-1.x from mr1716 [ https://gitbox.apache.org/repos/asf?p=nifi.git;h=1c9f977f49 ] NIFI-11617 Upgraded Jackson from 2.15.1 to 2.15.2 This closes #7317 Signed-off-by: David Handermann (cherry picked from commit db3e92f8af13617aeb67a70f9d47e3b8e52a9c3a) > Update jackson.bom.version to 2.15.2 > > > Key: NIFI-11617 > URL: https://issues.apache.org/jira/browse/NIFI-11617 > Project: Apache NiFi > Issue Type: Improvement >Affects Versions: 1.21.0 >Reporter: Mike R >Assignee: Mike R >Priority: Major > Time Spent: 1h 10m > Remaining Estimate: 0h > > [Jackson Release 2.15.2 · FasterXML/jackson Wiki > (github.com)|https://github.com/FasterXML/jackson/wiki/Jackson-Release-2.15.2] > Update jackson.bom.version to 2.15.2 -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Commented] (NIFI-11620) Upgrade Google libraries-bom to 26.15.0
[ https://issues.apache.org/jira/browse/NIFI-11620?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17728152#comment-17728152 ] ASF subversion and git services commented on NIFI-11620: Commit 08cd14cfc4636c291950c7fd06fbf0e05641f1d2 in nifi's branch refs/heads/support/nifi-1.x from Pierre Villard [ https://gitbox.apache.org/repos/asf?p=nifi.git;h=08cd14cfc4 ] NIFI-11620 Upgraded Google libraries-bom from 26.12.0 to 26.15.0 This closes #7321 Signed-off-by: David Handermann (cherry picked from commit d4f301f473f26cd4661fee13a1b7286f57b23bba) > Upgrade Google libraries-bom to 26.15.0 > --- > > Key: NIFI-11620 > URL: https://issues.apache.org/jira/browse/NIFI-11620 > Project: Apache NiFi > Issue Type: Task > Components: Extensions >Reporter: Pierre Villard >Assignee: Pierre Villard >Priority: Major > Fix For: 1.latest, 2.latest > > Time Spent: 20m > Remaining Estimate: 0h > > Upgrade Google libraries-bom to 26.15.0 -- This message was sent by Atlassian Jira (v8.20.10#820010)
[GitHub] [nifi] exceptionfactory closed pull request #7317: NIFI-11617 Update jackson.bom to 2.15.2
exceptionfactory closed pull request #7317: NIFI-11617 Update jackson.bom to 2.15.2 URL: https://github.com/apache/nifi/pull/7317
[jira] [Resolved] (NIFI-11591) Address failing system tests
[ https://issues.apache.org/jira/browse/NIFI-11591?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] David Handermann resolved NIFI-11591. - Resolution: Fixed Closing following additional logging. There seems to be a remaining intermittent issue, which can be evaluated with the additional logs. > Address failing system tests > > > Key: NIFI-11591 > URL: https://issues.apache.org/jira/browse/NIFI-11591 > Project: Apache NiFi > Issue Type: Bug > Components: Core Framework, Tools and Build >Reporter: Mark Payne >Assignee: Mark Payne >Priority: Major > Fix For: 2.0.0, 1.22.0 > > Time Spent: 40m > Remaining Estimate: 0h > > We've been seeing intermittent failures in the following system tests: > - DynamicClasspathModificationIT > - RegistryClientIT#testControllerServiceUpdateWhileRunning > - ClusteredRegistryClientIT#testControllerServiceUpdateWhileRunning > Additionally, NIFI-11557 introduced a surefire-report step in the system > tests to attempt to capture more logs from failures, but that step is not > working as expected and additionally it appears that the log output from the > tests themselves is already captured into the diagnostic dump that is > included. -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Commented] (NIFI-11591) Address failing system tests
[ https://issues.apache.org/jira/browse/NIFI-11591?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17728142#comment-17728142 ] ASF subversion and git services commented on NIFI-11591: Commit bd26936f2432509d40b55008b547cdc8143429fb in nifi's branch refs/heads/support/nifi-1.x from Mark Payne [ https://gitbox.apache.org/repos/asf?p=nifi.git;h=bd26936f24 ] NIFI-11591: Added additional logging for DynamicallyModifyClasspath system test This closes #7312 Signed-off-by: David Handermann (cherry picked from commit 4b7e20740e4777f6e64e69db7b0c597cc0fb7e5e) > Address failing system tests > > > Key: NIFI-11591 > URL: https://issues.apache.org/jira/browse/NIFI-11591 > Project: Apache NiFi > Issue Type: Bug > Components: Core Framework, Tools and Build >Reporter: Mark Payne >Assignee: Mark Payne >Priority: Major > Fix For: 2.0.0, 1.22.0 > > Time Spent: 40m > Remaining Estimate: 0h > > We've been seeing intermittent failures in the following system tests: > - DynamicClasspathModificationIT > - RegistryClientIT#testControllerServiceUpdateWhileRunning > - ClusteredRegistryClientIT#testControllerServiceUpdateWhileRunning > Additionally, NIFI-11557 introduced a surefire-report step in the system > tests to attempt to capture more logs from failures, but that step is not > working as expected and additionally it appears that the log output from the > tests themselves is already captured into the diagnostic dump that is > included. -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Commented] (NIFI-11591) Address failing system tests
[ https://issues.apache.org/jira/browse/NIFI-11591?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17728141#comment-17728141 ] ASF subversion and git services commented on NIFI-11591: Commit 4b7e20740e4777f6e64e69db7b0c597cc0fb7e5e in nifi's branch refs/heads/main from Mark Payne [ https://gitbox.apache.org/repos/asf?p=nifi.git;h=4b7e20740e ] NIFI-11591: Added additional logging for DynamicallyModifyClasspath system test This closes #7312 Signed-off-by: David Handermann > Address failing system tests > > > Key: NIFI-11591 > URL: https://issues.apache.org/jira/browse/NIFI-11591 > Project: Apache NiFi > Issue Type: Bug > Components: Core Framework, Tools and Build >Reporter: Mark Payne >Assignee: Mark Payne >Priority: Major > Fix For: 2.0.0, 1.22.0 > > Time Spent: 40m > Remaining Estimate: 0h > > We've been seeing intermittent failures in the following system tests: > - DynamicClasspathModificationIT > - RegistryClientIT#testControllerServiceUpdateWhileRunning > - ClusteredRegistryClientIT#testControllerServiceUpdateWhileRunning > Additionally, NIFI-11557 introduced a surefire-report step in the system > tests to attempt to capture more logs from failures, but that step is not > working as expected and additionally it appears that the log output from the > tests themselves is already captured into the diagnostic dump that is > included. -- This message was sent by Atlassian Jira (v8.20.10#820010)
[GitHub] [nifi] exceptionfactory closed pull request #7312: NIFI-11591: Added additional logging to narrow down why DynamicallyMo…
exceptionfactory closed pull request #7312: NIFI-11591: Added additional logging to narrow down why DynamicallyMo… URL: https://github.com/apache/nifi/pull/7312
[GitHub] [nifi] turcsanyip commented on a diff in pull request #7315: NIFI-3065 Per Process Group logging
turcsanyip commented on code in PR #7315: URL: https://github.com/apache/nifi/pull/7315#discussion_r1212320791

## nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-framework-components/src/main/java/org/apache/nifi/processor/SimpleProcessLogger.java:

```
@@ -559,4 +564,33 @@ private Throwable findLastThrowable(final Object[] arguments) {
         }
         return lastThrowable;
     }
+
+    private String getDiscriminatorKey() {
+        return loggingContext.getDiscriminatorKey();
+    }
+
+    private String getLogFileSuffix() {
+        return loggingContext.getLogFileSuffix().orElse(null);
+    }
+
+    private void log(final Level level, final String message, final Object argument) {
+        log(level, message, argument, null);
+    }
+
+    private void log(final Level level, final String message, final Object argument, final Throwable throwable) {
+        if (throwable == null) {
+            logger.makeLoggingEventBuilder(level)
+                    .setMessage(message)
+                    .addArgument(argument)
+                    .addKeyValue(getDiscriminatorKey(), getLogFileSuffix())
+                    .log();
```

Review Comment:
@timeabarna Thanks for implementing the PG-level logging! Without reviewing the whole feature, I would just like to add some comments on the `SimpleProcessLogger` log methods. It seems `LoggingEventBuilder.addArgument()` cannot receive multiple arguments (in our case `Object argument` is an array with multiple arguments). As a result, the log messages are not rendered properly: the whole argument array is substituted into the first `{}` placeholder, while the remaining `{}` placeholders stay unresolved:

```
2023-05-31 23:08:39,074 ERROR [Timer-Driven Process Thread-7] o.a.n.p.helloworld.HelloWorldProcessor [HelloWorldProcessor[id=01881001-eeed-1379-569e-d0e8dd187a7d], foo, bar] Fatal error, arg1={}, arg2={}
```

`addArgument()` could be called in a loop for each item in the array, but I think there is a simpler solution: using a vararg parameter. The `Throwable` does not need to be handled separately either; it can be the last item in the vararg, since the underlying logging framework already handles it.
So the method would simply look like this:

```
private void log(final Level level, final String message, final Object... arguments) {
    logger.makeLoggingEventBuilder(level)
            .addKeyValue(getDiscriminatorKey(), getLogFileSuffix())
            .log(message, arguments);
}
```
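The placeholder mismatch described in this review can be reproduced without SLF4J. The sketch below mimics its `{}` substitution with a simplified stand-in (not the real `MessageFormatter`) to show why passing the argument array as a single `Object` leaves trailing placeholders unresolved, while spreading it as varargs resolves each one.

```java
class PlaceholderDemo {
    // Replace each "{}" in order with the next argument, as SLF4J-style
    // formatters do. Simplified: no escaping, arrays render via toString().
    static String format(String message, Object... args) {
        StringBuilder sb = new StringBuilder();
        int argIndex = 0, from = 0, at;
        while ((at = message.indexOf("{}", from)) >= 0 && argIndex < args.length) {
            sb.append(message, from, at).append(args[argIndex++]);
            from = at + 2;
        }
        sb.append(message.substring(from));
        return sb.toString();
    }

    public static void main(String[] args) {
        Object[] logArgs = {"foo", "bar"};

        // Broken: the cast makes the whole array ONE argument, so it fills
        // only the first {} (as its toString) and the second stays "{}".
        System.out.println(format("arg1={}, arg2={}", (Object) logArgs));

        // Correct: the array spreads into the varargs, one per placeholder.
        System.out.println(format("arg1={}, arg2={}", logArgs));
        // prints "arg1=foo, arg2=bar"
    }
}
```

This is the same distinction as between `addArgument(argument)` receiving an array as one value and a vararg `log(message, arguments)` spreading it.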
[jira] [Commented] (NIFI-11621) Inferring schema for JSON fails when there's a CHOICE of different ARRAY types
[ https://issues.apache.org/jira/browse/NIFI-11621?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17728137#comment-17728137 ] ASF subversion and git services commented on NIFI-11621: Commit ac810671c5ad4d5b6a1d4b996d3b9a0da929105f in nifi's branch refs/heads/main from Mark Payne [ https://gitbox.apache.org/repos/asf?p=nifi.git;h=ac810671c5 ] NIFI-11621: Handle the case of CHOICE fields when inferring the type of ARRAY elements. E.g., support ARRAY> Signed-off-by: Matt Burgess This closes #7322 > Inferring schema for JSON fails when there's a CHOICE of different ARRAY types > -- > > Key: NIFI-11621 > URL: https://issues.apache.org/jira/browse/NIFI-11621 > Project: Apache NiFi > Issue Type: Bug > Components: Extensions >Reporter: Mark Payne >Assignee: Mark Payne >Priority: Major > Fix For: 1.latest, 2.latest > > Time Spent: 40m > Remaining Estimate: 0h > > From Apache Slack: > https://apachenifi.slack.com/archives/C0L9VCD47/p1685553667778359?thread_ts=1685461745.470939=C0L9VCD47 > When using ConvertRecord with a JSON Reader and an Avro Writer, when > inferring the JSON schema, each of the following two records works properly: > {code} > {"test_record":{"array_test_record":{"test_array":[]}}} > {code} > {code} > {"test_record":{"array_test_record":{"test_array":["test"]}}} > {code} > However, when combined into a single FlowFile: > {code} > {"test_record":{"array_test_record":{"test_array":[]}}} > {"test_record":{"array_test_record":{"test_array":["test"]}}} > {code} > It fails with a NullPointerException: > {code} > 2023-05-31 13:51:35,632 ERROR [Timer-Driven Process Thread-8] > o.a.n.processors.standard.ConvertRecord > ConvertRecord[id=72e564dc-0188-1000-360a-9f86b50ec8ac] Failed to process > StandardFlowFileRecord[uuid=9bf4f0fb-0942-48ba-8a16-0dbd98db3f97,claim=StandardContentClaim > [resourceClaim=StandardResourceClaim[id=1685554864966-1, container=default, > section=1], offset=3278, > 
length=117],offset=0,name=9bf4f0fb-0942-48ba-8a16-0dbd98db3f97,size=117]; > will route to failure > org.apache.nifi.processor.exception.ProcessException: Could not determine the > Avro Schema to use for writing the content > at > org.apache.nifi.avro.AvroRecordSetWriter.createWriter(AvroRecordSetWriter.java:154) > at > java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) > at > java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) > at > java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) > at java.base/java.lang.reflect.Method.invoke(Method.java:566) > at > org.apache.nifi.controller.service.StandardControllerServiceInvocationHandler.invoke(StandardControllerServiceInvocationHandler.java:254) > at > org.apache.nifi.controller.service.StandardControllerServiceInvocationHandler.invoke(StandardControllerServiceInvocationHandler.java:105) > at com.sun.proxy.$Proxy177.createWriter(Unknown Source) > at > org.apache.nifi.processors.standard.AbstractRecordProcessor$1.process(AbstractRecordProcessor.java:150) > at > org.apache.nifi.controller.repository.StandardProcessSession.write(StandardProcessSession.java:3441) > at > org.apache.nifi.processors.standard.AbstractRecordProcessor.onTrigger(AbstractRecordProcessor.java:122) > at > org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27) > at > org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1360) > at > org.apache.nifi.controller.tasks.ConnectableTask.invoke(ConnectableTask.java:243) > at > org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:102) > at org.apache.nifi.engine.FlowEngine$2.run(FlowEngine.java:110) > at > java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) > at > java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) > at > 
java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) > at > java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) > at > java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) > at java.base/java.lang.Thread.run(Thread.java:829) > Caused by: org.apache.nifi.schema.access.SchemaNotFoundException: Failed to > compile Avro Schema > at > org.apache.nifi.avro.AvroRecordSetWriter.createWriter(AvroRecordSetWriter.java:145) > ... 21
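The failure mode reported in NIFI-11621 boils down to merging the element type inferred from an empty array, which is absent, with a concrete one. A hedged sketch (illustrative only; the real fix lives in NiFi's schema-inference code, and the type names here are hypothetical):

```java
class ArrayElementMerge {
    // Merge two inferred array element types. An empty array contributes no
    // element type (null); a naive merge that dereferences it unconditionally
    // would throw the NullPointerException seen in the issue.
    static String mergeElementTypes(String a, String b) {
        if (a == null) return b;   // first record's array was empty
        if (b == null) return a;   // second record's array was empty
        return a.equals(b) ? a : "CHOICE<" + a + ", " + b + ">";
    }

    public static void main(String[] args) {
        // {"test_array": []}       -> element type unknown (null)
        // {"test_array": ["test"]} -> element type STRING
        System.out.println(mergeElementTypes(null, "STRING")); // prints STRING
    }
}
```

With the null guard, the single-FlowFile case combining an empty and a non-empty array yields a usable element type instead of failing during Avro schema compilation.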
[jira] [Updated] (NIFI-11621) Inferring schema for JSON fails when there's a CHOICE of different ARRAY types
[ https://issues.apache.org/jira/browse/NIFI-11621?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Matt Burgess updated NIFI-11621: Fix Version/s: 2.0.0 1.22.0 (was: 1.latest) (was: 2.latest) Resolution: Fixed Status: Resolved (was: Patch Available) > Inferring schema for JSON fails when there's a CHOICE of different ARRAY types > -- > > Key: NIFI-11621 > URL: https://issues.apache.org/jira/browse/NIFI-11621 > Project: Apache NiFi > Issue Type: Bug > Components: Extensions >Reporter: Mark Payne >Assignee: Mark Payne >Priority: Major > Fix For: 2.0.0, 1.22.0 > > Time Spent: 40m > Remaining Estimate: 0h > > From Apache Slack: > https://apachenifi.slack.com/archives/C0L9VCD47/p1685553667778359?thread_ts=1685461745.470939=C0L9VCD47 > When using ConvertRecord with a JSON Reader and an Avro Writer, when > inferring the JSON schema, each of the following two records works properly: > {code} > {"test_record":{"array_test_record":{"test_array":[]}}} > {code} > {code} > {"test_record":{"array_test_record":{"test_array":["test"]}}} > {code} > However, when combined into a single FlowFile: > {code} > {"test_record":{"array_test_record":{"test_array":[]}}} > {"test_record":{"array_test_record":{"test_array":["test"]}}} > {code} > It fails with a NullPointerException: > {code} > 2023-05-31 13:51:35,632 ERROR [Timer-Driven Process Thread-8] > o.a.n.processors.standard.ConvertRecord > ConvertRecord[id=72e564dc-0188-1000-360a-9f86b50ec8ac] Failed to process > StandardFlowFileRecord[uuid=9bf4f0fb-0942-48ba-8a16-0dbd98db3f97,claim=StandardContentClaim > [resourceClaim=StandardResourceClaim[id=1685554864966-1, container=default, > section=1], offset=3278, > length=117],offset=0,name=9bf4f0fb-0942-48ba-8a16-0dbd98db3f97,size=117]; > will route to failure > org.apache.nifi.processor.exception.ProcessException: Could not determine the > Avro Schema to use for writing the content > at > org.apache.nifi.avro.AvroRecordSetWriter.createWriter(AvroRecordSetWriter.java:154) 
> at > java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) > at > java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) > at > java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) > at java.base/java.lang.reflect.Method.invoke(Method.java:566) > at > org.apache.nifi.controller.service.StandardControllerServiceInvocationHandler.invoke(StandardControllerServiceInvocationHandler.java:254) > at > org.apache.nifi.controller.service.StandardControllerServiceInvocationHandler.invoke(StandardControllerServiceInvocationHandler.java:105) > at com.sun.proxy.$Proxy177.createWriter(Unknown Source) > at > org.apache.nifi.processors.standard.AbstractRecordProcessor$1.process(AbstractRecordProcessor.java:150) > at > org.apache.nifi.controller.repository.StandardProcessSession.write(StandardProcessSession.java:3441) > at > org.apache.nifi.processors.standard.AbstractRecordProcessor.onTrigger(AbstractRecordProcessor.java:122) > at > org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27) > at > org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1360) > at > org.apache.nifi.controller.tasks.ConnectableTask.invoke(ConnectableTask.java:243) > at > org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:102) > at org.apache.nifi.engine.FlowEngine$2.run(FlowEngine.java:110) > at > java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) > at > java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) > at > java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) > at > java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) > at > java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) > at 
java.base/java.lang.Thread.run(Thread.java:829) > Caused by: org.apache.nifi.schema.access.SchemaNotFoundException: Failed to > compile Avro Schema > at > org.apache.nifi.avro.AvroRecordSetWriter.createWriter(AvroRecordSetWriter.java:145) > ... 21 common frames omitted > Caused by: java.lang.NullPointerException: null > at > org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:208) > at >
[jira] [Commented] (NIFI-11621) Inferring schema for JSON fails when there's a CHOICE of different ARRAY types
[ https://issues.apache.org/jira/browse/NIFI-11621?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17728136#comment-17728136 ] ASF subversion and git services commented on NIFI-11621: Commit 649494f7c109992e066d7e115b4953a566d37f6d in nifi's branch refs/heads/support/nifi-1.x from Mark Payne [ https://gitbox.apache.org/repos/asf?p=nifi.git;h=649494f7c1 ] NIFI-11621: Handle the case of CHOICE fields when inferring the type of ARRAY elements. E.g., support ARRAY> Signed-off-by: Matt Burgess > Inferring schema for JSON fails when there's a CHOICE of different ARRAY types > -- > > Key: NIFI-11621 > URL: https://issues.apache.org/jira/browse/NIFI-11621 > Project: Apache NiFi > Issue Type: Bug > Components: Extensions >Reporter: Mark Payne >Assignee: Mark Payne >Priority: Major > Fix For: 1.latest, 2.latest > > Time Spent: 20m > Remaining Estimate: 0h > > From Apache Slack: > https://apachenifi.slack.com/archives/C0L9VCD47/p1685553667778359?thread_ts=1685461745.470939=C0L9VCD47 > When using ConvertRecord with a JSON Reader and an Avro Writer, when > inferring the JSON schema, each of the following two records works properly: > {code} > {"test_record":{"array_test_record":{"test_array":[]}}} > {code} > {code} > {"test_record":{"array_test_record":{"test_array":["test"]}}} > {code} > However, when combined into a single FlowFile: > {code} > {"test_record":{"array_test_record":{"test_array":[]}}} > {"test_record":{"array_test_record":{"test_array":["test"]}}} > {code} > It fails with a NullPointerException: > {code} > 2023-05-31 13:51:35,632 ERROR [Timer-Driven Process Thread-8] > o.a.n.processors.standard.ConvertRecord > ConvertRecord[id=72e564dc-0188-1000-360a-9f86b50ec8ac] Failed to process > StandardFlowFileRecord[uuid=9bf4f0fb-0942-48ba-8a16-0dbd98db3f97,claim=StandardContentClaim > [resourceClaim=StandardResourceClaim[id=1685554864966-1, container=default, > section=1], offset=3278, > 
length=117],offset=0,name=9bf4f0fb-0942-48ba-8a16-0dbd98db3f97,size=117]; > will route to failure > org.apache.nifi.processor.exception.ProcessException: Could not determine the > Avro Schema to use for writing the content > at > org.apache.nifi.avro.AvroRecordSetWriter.createWriter(AvroRecordSetWriter.java:154) > at > java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) > at > java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) > at > java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) > at java.base/java.lang.reflect.Method.invoke(Method.java:566) > at > org.apache.nifi.controller.service.StandardControllerServiceInvocationHandler.invoke(StandardControllerServiceInvocationHandler.java:254) > at > org.apache.nifi.controller.service.StandardControllerServiceInvocationHandler.invoke(StandardControllerServiceInvocationHandler.java:105) > at com.sun.proxy.$Proxy177.createWriter(Unknown Source) > at > org.apache.nifi.processors.standard.AbstractRecordProcessor$1.process(AbstractRecordProcessor.java:150) > at > org.apache.nifi.controller.repository.StandardProcessSession.write(StandardProcessSession.java:3441) > at > org.apache.nifi.processors.standard.AbstractRecordProcessor.onTrigger(AbstractRecordProcessor.java:122) > at > org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27) > at > org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1360) > at > org.apache.nifi.controller.tasks.ConnectableTask.invoke(ConnectableTask.java:243) > at > org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:102) > at org.apache.nifi.engine.FlowEngine$2.run(FlowEngine.java:110) > at > java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) > at > java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) > at > 
java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) > at > java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) > at > java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) > at java.base/java.lang.Thread.run(Thread.java:829) > Caused by: org.apache.nifi.schema.access.SchemaNotFoundException: Failed to > compile Avro Schema > at > org.apache.nifi.avro.AvroRecordSetWriter.createWriter(AvroRecordSetWriter.java:145) > ... 21 common
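The NullPointerException above is triggered when the element type inferred from an empty array (effectively unknown) is merged with the STRING type inferred from the second record. The following is a minimal, self-contained sketch of the kind of merge rule such a fix needs; the type names are illustrative strings, not NiFi's DataType/FieldTypeInference classes, and this is not the actual patch.

```java
import java.util.LinkedHashSet;
import java.util.Set;

public class ArrayTypeMergeSketch {

    // Merge the element types inferred from two ARRAY samples.
    // An empty array contributes no element type (null here); code that
    // dereferences that null unconditionally throws a NullPointerException.
    public static String mergeElementTypes(String a, String b) {
        if (a == null) {
            return b; // empty array: adopt the other sample's element type
        }
        if (b == null) {
            return a;
        }
        if (a.equals(b)) {
            return a;
        }
        // Differing element types: widen to a CHOICE over both
        Set<String> options = new LinkedHashSet<>();
        options.add(a);
        options.add(b);
        return "CHOICE<" + String.join(", ", options) + ">";
    }

    public static void main(String[] args) {
        // {"test_array":[]} followed by {"test_array":["test"]}
        System.out.println(mergeElementTypes(null, "STRING"));  // STRING
        // Arrays whose samples carry genuinely different element types
        System.out.println(mergeElementTypes("STRING", "INT")); // CHOICE<STRING, INT>
    }
}
```

Handling the null case explicitly is what lets the two sample records in the report coexist in one FlowFile.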
[GitHub] [nifi] markap14 commented on a diff in pull request #7299: NIFI-11603: Removed NetworkUtils.getAvailableUdpPort, NetworkUtils.ge…
markap14 commented on code in PR #7299: URL: https://github.com/apache/nifi/pull/7299#discussion_r1212309579 ## nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/test/java/org/apache/nifi/processors/standard/TestListenUDP.java: ## @@ -173,7 +170,7 @@ public void testRunWhenNoEventsAvailable() { @Test public void testWithSendingHostAndPortSameAsSender() throws IOException, InterruptedException { -final Integer sendingPort = NetworkUtils.getAvailableUdpPort(); +final int sendingPort = 27911; Review Comment: Good catch, I meant to come back to that one, as I wasn't 100% sure what was happening there. Will address, though. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[GitHub] [nifi] exceptionfactory commented on a diff in pull request #7194: NIFI-11167 - Add Excel Record Reader
exceptionfactory commented on code in PR #7194: URL: https://github.com/apache/nifi/pull/7194#discussion_r1212254476 ## nifi-nar-bundles/nifi-standard-services/nifi-record-serialization-services-bundle/nifi-record-serialization-services/src/main/java/org/apache/nifi/excel/ExcelHeaderSchemaStrategy.java: ## @@ -0,0 +1,116 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one or more + * contributor license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright ownership. + * The ASF licenses this file to You under the Apache License, Version 2.0 + * (the "License"); you may not use this file except in compliance with + * the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ */ +package org.apache.nifi.excel; + +import org.apache.nifi.context.PropertyContext; +import org.apache.nifi.logging.ComponentLog; +import org.apache.nifi.schema.access.SchemaAccessStrategy; +import org.apache.nifi.schema.access.SchemaField; +import org.apache.nifi.schema.access.SchemaNotFoundException; +import org.apache.nifi.schema.inference.FieldTypeInference; +import org.apache.nifi.schema.inference.RecordSource; +import org.apache.nifi.schema.inference.TimeValueInference; +import org.apache.nifi.serialization.DateTimeUtils; +import org.apache.nifi.serialization.SimpleRecordSchema; +import org.apache.nifi.serialization.record.DataType; +import org.apache.nifi.serialization.record.RecordField; +import org.apache.nifi.serialization.record.RecordSchema; +import org.apache.nifi.util.SchemaInferenceUtil; +import org.apache.poi.ss.usermodel.Cell; +import org.apache.poi.ss.usermodel.DataFormatter; +import org.apache.poi.ss.usermodel.Row; + +import java.io.IOException; +import java.io.InputStream; +import java.util.EnumSet; +import java.util.LinkedHashMap; +import java.util.List; +import java.util.Locale; +import java.util.Map; +import java.util.Set; +import java.util.stream.Collectors; +import java.util.stream.IntStream; + +public class ExcelHeaderSchemaStrategy implements SchemaAccessStrategy { +private static final Set<SchemaField> schemaFields = EnumSet.noneOf(SchemaField.class); + +private final PropertyContext context; + +private final DataFormatter dataFormatter; + +private final ComponentLog logger; + +public ExcelHeaderSchemaStrategy(PropertyContext context, ComponentLog logger) { +this(context, logger, null); +} + +public ExcelHeaderSchemaStrategy(PropertyContext context, ComponentLog logger, Locale locale) { +this.context = context; +this.logger = logger; +this.dataFormatter = locale == null ? 
new DataFormatter() : new DataFormatter(locale); +} + +@Override +public RecordSchema getSchema(Map<String, String> variables, InputStream contentStream, RecordSchema readSchema) throws SchemaNotFoundException, IOException { +if (this.context == null) { +throw new SchemaNotFoundException("Schema Access Strategy intended only for validation purposes and cannot obtain schema"); +} + +String errMsg = "Failed to read Header line from Excel worksheet"; +RecordSource<Row> recordSource; +try { +recordSource = new ExcelRecordSource(contentStream, context, variables, logger); +} catch (Exception e) { +throw new SchemaNotFoundException(errMsg, e); +} + +Row headerRow = recordSource.next(); +if (!ExcelUtils.hasCells(headerRow)) { +throw new SchemaNotFoundException("The chosen header line in the Excel worksheet had no cells"); +} + +try { +String dateFormat = context.getProperty(DateTimeUtils.DATE_FORMAT).getValue(); +String timeFormat = context.getProperty(DateTimeUtils.TIME_FORMAT).getValue(); +String timestampFormat = context.getProperty(DateTimeUtils.TIMESTAMP_FORMAT).getValue(); +final TimeValueInference timeValueInference = new TimeValueInference(dateFormat, timeFormat, timestampFormat); +final Map<String, FieldTypeInference> typeMap = new LinkedHashMap<>(); +IntStream.range(0, headerRow.getLastCellNum()) +.forEach(index -> { +final Cell cell = headerRow.getCell(index); +final String fieldName = Integer.toString(index); Review Comment: The string typing is also a problem as you noted. Theoretically that could also be changed to follow an approach similar to the Infer Schema strategy, reading all rows to determine potential values. In the interest of completing a first
[GitHub] [nifi] dan-s1 commented on a diff in pull request #7194: NIFI-11167 - Add Excel Record Reader
dan-s1 commented on code in PR #7194: URL: https://github.com/apache/nifi/pull/7194#discussion_r1212250185 ## nifi-nar-bundles/nifi-standard-services/nifi-record-serialization-services-bundle/nifi-record-serialization-services/src/main/java/org/apache/nifi/excel/ExcelHeaderSchemaStrategy.java ## (same hunk as quoted above) Review Comment: @exceptionfactory Also the other problem or "feature" with that is that all the field types will be strings. I am open to whatever you prefer. Let me know whether we should remove it or modify it. -- This is an automated message from the Apache
[GitHub] [nifi] exceptionfactory commented on a diff in pull request #7194: NIFI-11167 - Add Excel Record Reader
exceptionfactory commented on code in PR #7194: URL: https://github.com/apache/nifi/pull/7194#discussion_r1212240524 ## nifi-nar-bundles/nifi-standard-services/nifi-record-serialization-services-bundle/nifi-record-serialization-services/src/main/java/org/apache/nifi/excel/ExcelHeaderSchemaStrategy.java ## (same hunk as quoted above) Review Comment: @dan-s1, although the header column does not have to be a string, the expectation for this strategy is that the first row cell values should be used as the field names, analogous to the CSV implementation. In its current state, I agree the
[GitHub] [nifi] dan-s1 commented on a diff in pull request #7194: NIFI-11167 - Add Excel Record Reader
dan-s1 commented on code in PR #7194: URL: https://github.com/apache/nifi/pull/7194#discussion_r1212232519 ## nifi-nar-bundles/nifi-standard-services/nifi-record-serialization-services-bundle/nifi-record-serialization-services/src/test/java/org/apache/nifi/excel/TestExcelHeaderSchemaStrategy.java: ## @@ -0,0 +1,123 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one or more + * contributor license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright ownership. + * The ASF licenses this file to You under the Apache License, Version 2.0 + * (the "License"); you may not use this file except in compliance with + * the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ */ +package org.apache.nifi.excel; + +import org.apache.nifi.components.PropertyDescriptor; +import org.apache.nifi.context.PropertyContext; +import org.apache.nifi.logging.ComponentLog; +import org.apache.nifi.schema.access.SchemaNotFoundException; +import org.apache.nifi.serialization.record.RecordFieldType; +import org.apache.nifi.serialization.record.RecordSchema; +import org.apache.nifi.util.MockConfigurationContext; +import org.junit.jupiter.api.Test; +import org.junit.jupiter.api.extension.ExtendWith; +import org.junit.jupiter.params.ParameterizedTest; +import org.junit.jupiter.params.provider.Arguments; +import org.junit.jupiter.params.provider.MethodSource; +import org.mockito.Mock; +import org.mockito.junit.jupiter.MockitoExtension; + +import java.io.IOException; +import java.io.InputStream; +import java.nio.file.Files; +import java.nio.file.Path; +import java.nio.file.Paths; +import java.util.Arrays; +import java.util.Collections; +import java.util.HashMap; +import java.util.List; +import java.util.Locale; +import java.util.Map; +import java.util.stream.Stream; + +import static org.junit.jupiter.api.Assertions.assertEquals; + +@ExtendWith(MockitoExtension.class) +public class TestExcelHeaderSchemaStrategy { +@Mock +ComponentLog logger; + +@ParameterizedTest +@MethodSource("getLocales") +public void testInferenceAgainstDifferentLocales(Locale locale) throws IOException, SchemaNotFoundException { +final Map<PropertyDescriptor, String> properties = new HashMap<>(); +new ExcelReader().getSupportedPropertyDescriptors().forEach(prop -> properties.put(prop, prop.getDefaultValue())); +final PropertyContext context = new MockConfigurationContext(properties, null); +ExcelHeaderSchemaStrategy headerSchemaStrategy = new ExcelHeaderSchemaStrategy(context, logger, locale); + +RecordSchema schema = headerSchemaStrategy.getSchema(null, getInputStream("numbers.xlsx"), null); + +final List<String> fieldNames = schema.getFieldNames(); +assertEquals(Collections.singletonList("0"), fieldNames); +if 
(Locale.FRENCH.equals(locale)) { +assertEquals(RecordFieldType.STRING, schema.getDataType("0").get().getFieldType()); +} else { +assertEquals(RecordFieldType.FLOAT, schema.getDataType("0").get().getFieldType()); +} +} + +private static Stream<Arguments> getLocales() { +Locale hindi = new Locale("hin"); +return Stream.of( +Arguments.of(Locale.ENGLISH), +Arguments.of(hindi), +Arguments.of(Locale.JAPANESE), +Arguments.of(Locale.FRENCH) +); +} + +private InputStream getInputStream(String excelFile) throws IOException { +String excelResourcesDir = "src/test/resources/excel"; +Path excelDoc = Paths.get(excelResourcesDir, excelFile); +return Files.newInputStream(excelDoc); +} + +@Test +public void testColumnHeaders() throws IOException, SchemaNotFoundException { +final Map<PropertyDescriptor, String> properties = new HashMap<>(); +new ExcelReader().getSupportedPropertyDescriptors().forEach(prop -> properties.put(prop, prop.getDefaultValue())); +final PropertyContext context = new MockConfigurationContext(properties, null); +ExcelHeaderSchemaStrategy headerSchemaStrategy = new ExcelHeaderSchemaStrategy(context, logger); + +RecordSchema schema = headerSchemaStrategy.getSchema(null, getInputStream("simpleDataFormatting.xlsx"), null); + +final List<String> fieldNames = schema.getFieldNames(); +assertEquals(Arrays.asList("0", "1", "2", "3"), fieldNames); +assertEquals(RecordFieldType.STRING.getDataType(), schema.getDataType("0").get()); +assertEquals(RecordFieldType.STRING.getDataType(), schema.getDataType("1").get()); +
[GitHub] [nifi] dan-s1 commented on a diff in pull request #7194: NIFI-11167 - Add Excel Record Reader
dan-s1 commented on code in PR #7194: URL: https://github.com/apache/nifi/pull/7194#discussion_r1212232125 ## nifi-nar-bundles/nifi-standard-services/nifi-record-serialization-services-bundle/nifi-record-serialization-services/src/main/java/org/apache/nifi/excel/ExcelHeaderSchemaStrategy.java ## (same hunk as quoted above) Review Comment: @exceptionfactory Actually that is the expected behavior as this is not CSV hence the values in the header column do not necessarily have to be strings. I used the field index since it is predictable and unchanging. Personally I am really not sure why
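The review thread above turns on whether field names should come from the header row's cell values (the CSV-style behavior exceptionfactory describes) or from the column index dan-s1 chose for predictability. A toy sketch of the CSV-style naming with an index fallback for blank cells — plain strings stand in for POI Row/Cell objects, and this is not the PR's actual code:

```java
import java.util.ArrayList;
import java.util.List;

public class HeaderNameSketch {

    // Derive field names from a header row: use each cell's value when
    // present, fall back to the stable column index for blank cells.
    public static List<String> fieldNames(String[] headerCells) {
        List<String> names = new ArrayList<>();
        for (int i = 0; i < headerCells.length; i++) {
            String value = headerCells[i];
            names.add(value == null || value.isEmpty() ? Integer.toString(i) : value);
        }
        return names;
    }

    public static void main(String[] args) {
        // A blank third header cell falls back to its index
        System.out.println(fieldNames(new String[] {"id", "name", null})); // [id, name, 2]
    }
}
```

The fallback keeps names predictable for sparse headers while still matching the CSV reader's expectation for populated ones.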
[jira] [Commented] (NIFI-11608) PutBigQuery: Missing Flowfile Attribute Evaluation
[ https://issues.apache.org/jira/browse/NIFI-11608?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17728121#comment-17728121 ] ASF subversion and git services commented on NIFI-11608: Commit 0344bd3e25c91e265d48a0410081af51ecb49092 in nifi's branch refs/heads/support/nifi-1.x from Steven Matison [ https://gitbox.apache.org/repos/asf?p=nifi.git;h=0344bd3e25 ] NIFI-11608 Fixed Expression Language Evaluation in PutBigQuery This closes #7316 Signed-off-by: David Handermann (cherry picked from commit 645618a6095a94488f845cb427a4d4768161953d) > PutBigQuery: Missing Flowfile Attribute Evaluation > -- > > Key: NIFI-11608 > URL: https://issues.apache.org/jira/browse/NIFI-11608 > Project: Apache NiFi > Issue Type: Bug > Components: Core Framework >Affects Versions: 1.21.0 >Reporter: Juan Ignacio Saitua >Assignee: Steven Matison >Priority: Major > Time Spent: 3h 10m > Remaining Estimate: 0h > > Although the documentation states that PutBigQuery supports expression > language for DATASET, TABLE_NAME, and SKIP_INVALID_ROWS, this is not actually the case. An examination of the code reveals that these expressions are not > evaluated in the current implementation ([see code > here](https://github.com/apache/nifi/blob/cfd62c9511e43d5010fbfbb12b98b40bdfdb3fc2/nifi-nar-bundles/nifi-gcp-bundle/nifi-gcp-processors/src/main/java/org/apache/nifi/processors/gcp/bigquery/PutBigQuery.java#L193)). > Notably, the issue does not exist in the PutBigQueryBatch processor. While > both processors extend the AbstractBigQueryProcessor class, only > PutBigQueryBatch utilizes the AbstractBigQueryProcessor.getTableId() method, > which correctly evaluates and retrieves the tableId. -- This message was sent by Atlassian Jira (v8.20.10#820010)
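In NiFi, "evaluating expression language against a FlowFile" means placeholders such as ${attr} in a property value are resolved from that FlowFile's attribute map before use. The toy regex resolver below illustrates only that semantics; it is not NiFi's expression-language engine, and the ${dataset} attribute name is hypothetical.

```java
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class ElSketch {

    // Toy placeholder syntax: ${name}, resolved from the attribute map.
    private static final Pattern PLACEHOLDER = Pattern.compile("\\$\\{(\\w+)}");

    public static String evaluate(String property, Map<String, String> attributes) {
        Matcher matcher = PLACEHOLDER.matcher(property);
        StringBuffer result = new StringBuffer();
        while (matcher.find()) {
            // Unresolvable attributes become the empty string
            String value = attributes.getOrDefault(matcher.group(1), "");
            matcher.appendReplacement(result, Matcher.quoteReplacement(value));
        }
        matcher.appendTail(result);
        return result.toString();
    }

    public static void main(String[] args) {
        // A TABLE_NAME-style property value referencing a FlowFile attribute
        System.out.println(evaluate("${dataset}_events", Map.of("dataset", "sales"))); // sales_events
    }
}
```

The bug was that PutBigQuery read these property values without performing this per-FlowFile resolution step, so the raw placeholder text was used instead.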
[jira] [Resolved] (NIFI-11608) PutBigQuery: Missing Flowfile Attribute Evaluation
[ https://issues.apache.org/jira/browse/NIFI-11608?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] David Handermann resolved NIFI-11608. - Fix Version/s: 2.0.0, 1.22.0 Resolution: Fixed -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Commented] (NIFI-11608) PutBigQuery: Missing Flowfile Attribute Evaluation
[ https://issues.apache.org/jira/browse/NIFI-11608?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17728120#comment-17728120 ] ASF subversion and git services commented on NIFI-11608: Commit 645618a6095a94488f845cb427a4d4768161953d in nifi's branch refs/heads/main from Steven Matison [ https://gitbox.apache.org/repos/asf?p=nifi.git;h=645618a609 ] NIFI-11608 Fixed Expression Language Evaluation in PutBigQuery This closes #7316 Signed-off-by: David Handermann
[jira] [Updated] (NIFI-11595) StateProvider.replace() cannot create the initial state
[ https://issues.apache.org/jira/browse/NIFI-11595?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Peter Turcsanyi updated NIFI-11595: --- Status: Patch Available (was: Open) > StateProvider.replace() cannot create the initial state > --- > > Key: NIFI-11595 > URL: https://issues.apache.org/jira/browse/NIFI-11595 > Project: Apache NiFi > Issue Type: Bug > Components: Core Framework >Reporter: Peter Turcsanyi >Assignee: Peter Turcsanyi >Priority: Major > Time Spent: 10m > Remaining Estimate: 0h > > The {{StateProvider.replace()}} method works properly when the state already > exists and is persisted in storage. However, it cannot create the state on the > first run. > As a workaround, {{setState()}} can be used, but it does not provide the > same compare-and-swap mechanism as {{replace()}}, so it is the caller's > responsibility to handle concurrency. > To lift this responsibility from clients and to provide a more > consistent API, {{replace()}} should support creating the initial state: it > should be able to move the state "from nothing to X", not only "from X1 to X2", > while providing the same compare-and-swap guarantee. > Affected {{StateProvider}} implementations: > - {{ZooKeeperStateProvider}} > - {{RedisStateProvider}} > - {{KubernetesConfigMapStateProvider}} > - {{WriteAheadLocalStateProvider}}
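The proposed semantics — `replace()` acting as a compare-and-swap that also covers the "from nothing to X" transition — can be sketched with a minimal in-memory stand-in. `MiniStateProvider` below is hypothetical and only illustrates the semantics; the real NiFi `StateProvider` API and its implementations differ:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicReference;

// Minimal sketch of compare-and-swap state replacement that also
// supports creating the initial state ("from nothing to X").
public class MiniStateProvider {
    // null means "no state has ever been stored".
    private final AtomicReference<Map<String, String>> state = new AtomicReference<>(null);

    // Returns true if the swap succeeded. Passing null as oldState
    // means "only create if no state exists yet", which is the
    // first-run case that NIFI-11595 asks replace() to support.
    public boolean replace(Map<String, String> oldState, Map<String, String> newState) {
        return state.compareAndSet(oldState, new ConcurrentHashMap<>(newState));
    }

    public Map<String, String> get() {
        return state.get();
    }

    public static void main(String[] args) {
        MiniStateProvider provider = new MiniStateProvider();
        // First run: no state exists, so replace(null, ...) creates it.
        System.out.println("created=" + provider.replace(null, Map.of("offset", "0")));
        // A stale caller that still believes no state exists loses the race.
        System.out.println("stale=" + provider.replace(null, Map.of("offset", "99")));
    }
}
```

With this shape, callers no longer need a separate `setState()` path for the first run, and concurrency is handled by the same compare-and-swap check in both cases.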
[GitHub] [nifi] exceptionfactory closed pull request #7316: NIFI-11608 Fixing Expression Language Evaluation in dataset and tablename.
exceptionfactory closed pull request #7316: NIFI-11608 Fixing Expression Language Evaluation in dataset and tablename. URL: https://github.com/apache/nifi/pull/7316 -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[GitHub] [nifi] turcsanyip opened a new pull request, #7324: NIFI-11595: StateProvider.replace() supports creating the initial state
turcsanyip opened a new pull request, #7324: URL: https://github.com/apache/nifi/pull/7324 # Summary [NIFI-11595](https://issues.apache.org/jira/browse/NIFI-11595) # Tracking Please complete the following tracking steps prior to pull request creation. ### Issue Tracking - [ ] [Apache NiFi Jira](https://issues.apache.org/jira/browse/NIFI) issue created ### Pull Request Tracking - [ ] Pull Request title starts with Apache NiFi Jira issue number, such as `NIFI-0` - [ ] Pull Request commit message starts with Apache NiFi Jira issue number, as such `NIFI-0` ### Pull Request Formatting - [ ] Pull Request based on current revision of the `main` branch - [ ] Pull Request refers to a feature branch with one commit containing changes # Verification Please indicate the verification steps performed prior to pull request creation. ### Build - [ ] Build completed using `mvn clean install -P contrib-check` - [ ] JDK 11 - [ ] JDK 17 ### Licensing - [ ] New dependencies are compatible with the [Apache License 2.0](https://apache.org/licenses/LICENSE-2.0) according to the [License Policy](https://www.apache.org/legal/resolved.html) - [ ] New dependencies are documented in applicable `LICENSE` and `NOTICE` files ### Documentation - [ ] Documentation formatting appears as expected in rendered files -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[GitHub] [nifi] ohnoitsyou commented on a diff in pull request #7242: NIFI-11471: Define new stateless configuration points
ohnoitsyou commented on code in PR #7242: URL: https://github.com/apache/nifi/pull/7242#discussion_r1212209884 ## nifi-stateless/nifi-stateless-api/src/main/java/org/apache/nifi/stateless/config/PropertiesFileEngineConfigurationParser.java: ## @@ -111,6 +115,9 @@ public StatelessEngineConfiguration parseEngineConfiguration(final File properti final String statusTaskInterval = properties.getProperty(STATUS_TASK_INTERVAL, "1 min"); +final long processorStartTimeout = TimeUnit.SECONDS.toMillis(Long.parseLong(properties.getProperty(PROCESSOR_START_TIMEOUT, "10"))); +final long componentEnableTimeout = TimeUnit.SECONDS.toMillis(Long.parseLong(properties.getProperty(COMPONENT_ENABLE_TIMEOUT, "10"))); Review Comment: @markap14 After a quick chat with David H. I brought up that FormatUtils is located within `nifi-utils` which is something stateless doesn't already import. What would be your opinion on including this? If the answer is "don't", how would you recommend converting from strings to `Duration`? -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org
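For context, the question raised in the review is how to turn strings such as `10 secs` into a `Duration` without pulling `FormatUtils` (and thus `nifi-utils`) into stateless. One possible shape for a small, dependency-free parser — an illustrative sketch only, not the approach the PR ultimately takes:

```java
import java.time.Duration;
import java.util.Locale;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Sketch of a tiny "<amount> <unit>" duration parser built only on
// java.time, as one alternative to depending on nifi-utils FormatUtils.
public final class SimpleDurationParser {
    private static final Pattern PERIOD = Pattern.compile("(\\d+)\\s*([a-z]+)");

    public static Duration parse(String value) {
        Matcher matcher = PERIOD.matcher(value.trim().toLowerCase(Locale.ROOT));
        if (!matcher.matches()) {
            throw new IllegalArgumentException("Unrecognized duration: " + value);
        }
        long amount = Long.parseLong(matcher.group(1));
        String unit = matcher.group(2);
        // Check millisecond prefixes before the bare "s" and "m" cases.
        if (unit.startsWith("ms") || unit.startsWith("milli")) return Duration.ofMillis(amount);
        if (unit.startsWith("s")) return Duration.ofSeconds(amount);
        if (unit.startsWith("m")) return Duration.ofMinutes(amount);
        if (unit.startsWith("h")) return Duration.ofHours(amount);
        throw new IllegalArgumentException("Unrecognized unit: " + unit);
    }

    public static void main(String[] args) {
        System.out.println(parse("10 secs").toMillis()); // 10000
        System.out.println(parse("1 min").toMillis());   // 60000
    }
}
```

The trade-off is exactly the one raised in the comment: a local parser avoids the new module dependency but duplicates a subset of what `FormatUtils` already handles.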
[jira] [Updated] (NIFI-11243) Implement Dependent Properties in Python Processors
[ https://issues.apache.org/jira/browse/NIFI-11243?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Mark Payne updated NIFI-11243: -- Status: Patch Available (was: Open) > Implement Dependent Properties in Python Processors > --- > > Key: NIFI-11243 > URL: https://issues.apache.org/jira/browse/NIFI-11243 > Project: Apache NiFi > Issue Type: Sub-task > Components: NiFi Stateless >Reporter: Mark Payne >Assignee: Mark Payne >Priority: Major > Fix For: 2.0.0 > > Time Spent: 10m > Remaining Estimate: 0h > > The Python Processor API allows for declaring a property dependent on another > but doesn't yet convey this information to the Java side, so it doesn't take > effect. -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Updated] (NIFI-11243) Implement Dependent Properties in Python Processors
[ https://issues.apache.org/jira/browse/NIFI-11243?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Mark Payne updated NIFI-11243: -- Fix Version/s: 2.0.0
[jira] [Assigned] (NIFI-11243) Implement Dependent Properties in Python Processors
[ https://issues.apache.org/jira/browse/NIFI-11243?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Mark Payne reassigned NIFI-11243: - Assignee: Mark Payne
[GitHub] [nifi] markap14 opened a new pull request, #7323: NIFI-11243: Implemented Dependent Properties on the Python side
markap14 opened a new pull request, #7323: URL: https://github.com/apache/nifi/pull/7323 # Summary [NIFI-0](https://issues.apache.org/jira/browse/NIFI-0)
[GitHub] [nifi] exceptionfactory commented on pull request #7317: NIFI-11617 Update jackson.bom to 2.15.2
exceptionfactory commented on PR #7317: URL: https://github.com/apache/nifi/pull/7317#issuecomment-1570790283 That is for `jackson-databind`, not `jackson-module-scala`. It looks like `jackson-module-scala` was just tagged, so it may become available on Maven Central soon. https://github.com/FasterXML/jackson-module-scala/releases/tag/v2.15.2
[GitHub] [nifi] exceptionfactory commented on a diff in pull request #7194: NIFI-11167 - Add Excel Record Reader
exceptionfactory commented on code in PR #7194: URL: https://github.com/apache/nifi/pull/7194#discussion_r1212152365 ## nifi-nar-bundles/nifi-standard-services/nifi-record-serialization-services-bundle/nifi-record-serialization-services/src/main/java/org/apache/nifi/excel/ExcelHeaderSchemaStrategy.java: ## @@ -0,0 +1,116 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one or more + * contributor license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright ownership. + * The ASF licenses this file to You under the Apache License, Version 2.0 + * (the "License"); you may not use this file except in compliance with + * the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ */ +package org.apache.nifi.excel; + +import org.apache.nifi.context.PropertyContext; +import org.apache.nifi.logging.ComponentLog; +import org.apache.nifi.schema.access.SchemaAccessStrategy; +import org.apache.nifi.schema.access.SchemaField; +import org.apache.nifi.schema.access.SchemaNotFoundException; +import org.apache.nifi.schema.inference.FieldTypeInference; +import org.apache.nifi.schema.inference.RecordSource; +import org.apache.nifi.schema.inference.TimeValueInference; +import org.apache.nifi.serialization.DateTimeUtils; +import org.apache.nifi.serialization.SimpleRecordSchema; +import org.apache.nifi.serialization.record.DataType; +import org.apache.nifi.serialization.record.RecordField; +import org.apache.nifi.serialization.record.RecordSchema; +import org.apache.nifi.util.SchemaInferenceUtil; +import org.apache.poi.ss.usermodel.Cell; +import org.apache.poi.ss.usermodel.DataFormatter; +import org.apache.poi.ss.usermodel.Row; + +import java.io.IOException; +import java.io.InputStream; +import java.util.EnumSet; +import java.util.LinkedHashMap; +import java.util.List; +import java.util.Locale; +import java.util.Map; +import java.util.Set; +import java.util.stream.Collectors; +import java.util.stream.IntStream; + +public class ExcelHeaderSchemaStrategy implements SchemaAccessStrategy { +private static final Set<SchemaField> schemaFields = EnumSet.noneOf(SchemaField.class); + +private final PropertyContext context; + +private final DataFormatter dataFormatter; + +private final ComponentLog logger; + +public ExcelHeaderSchemaStrategy(PropertyContext context, ComponentLog logger) { +this(context, logger, null); +} + +public ExcelHeaderSchemaStrategy(PropertyContext context, ComponentLog logger, Locale locale) { +this.context = context; +this.logger = logger; +this.dataFormatter = locale == null ?
new DataFormatter() : new DataFormatter(locale); +} + +@Override +public RecordSchema getSchema(Map<String, String> variables, InputStream contentStream, RecordSchema readSchema) throws SchemaNotFoundException, IOException { +if (this.context == null) { +throw new SchemaNotFoundException("Schema Access Strategy intended only for validation purposes and cannot obtain schema"); +} + +String errMsg = "Failed to read Header line from Excel worksheet"; +RecordSource<Row> recordSource; +try { +recordSource = new ExcelRecordSource(contentStream, context, variables, logger); +} catch (Exception e) { +throw new SchemaNotFoundException(errMsg, e); +} + +Row headerRow = recordSource.next(); +if (!ExcelUtils.hasCells(headerRow)) { +throw new SchemaNotFoundException("The chosen header line in the Excel worksheet had no cells"); +} + +try { +String dateFormat = context.getProperty(DateTimeUtils.DATE_FORMAT).getValue(); +String timeFormat = context.getProperty(DateTimeUtils.TIME_FORMAT).getValue(); +String timestampFormat = context.getProperty(DateTimeUtils.TIMESTAMP_FORMAT).getValue(); +final TimeValueInference timeValueInference = new TimeValueInference(dateFormat, timeFormat, timestampFormat); +final Map<String, FieldTypeInference> typeMap = new LinkedHashMap<>(); +IntStream.range(0, headerRow.getLastCellNum()) +.forEach(index -> { +final Cell cell = headerRow.getCell(index); +final String fieldName = Integer.toString(index); Review Comment: This approach uses the field index as the field name, instead of using the cell value as the field name, which would be the expected behavior. ##
[jira] [Updated] (NIFI-11621) Inferring schema for JSON fails when there's a CHOICE of different ARRAY types
[ https://issues.apache.org/jira/browse/NIFI-11621?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Mark Payne updated NIFI-11621: -- Status: Patch Available (was: Open) > Inferring schema for JSON fails when there's a CHOICE of different ARRAY types > -- > > Key: NIFI-11621 > URL: https://issues.apache.org/jira/browse/NIFI-11621 > Project: Apache NiFi > Issue Type: Bug > Components: Extensions >Reporter: Mark Payne >Assignee: Mark Payne >Priority: Major > Fix For: 1.latest, 2.latest > > Time Spent: 10m > Remaining Estimate: 0h > > From Apache Slack: > https://apachenifi.slack.com/archives/C0L9VCD47/p1685553667778359?thread_ts=1685461745.470939=C0L9VCD47 > When using ConvertRecord with a JSON Reader and an Avro Writer, when > inferring the JSON schema, each of the following two records works properly: > {code} > {"test_record":{"array_test_record":{"test_array":[]}}} > {code} > {code} > {"test_record":{"array_test_record":{"test_array":["test"]}}} > {code} > However, when combined into a single FlowFile: > {code} > {"test_record":{"array_test_record":{"test_array":[]}}} > {"test_record":{"array_test_record":{"test_array":["test"]}}} > {code} > It fails with a NullPointerException: > {code} > 2023-05-31 13:51:35,632 ERROR [Timer-Driven Process Thread-8] > o.a.n.processors.standard.ConvertRecord > ConvertRecord[id=72e564dc-0188-1000-360a-9f86b50ec8ac] Failed to process > StandardFlowFileRecord[uuid=9bf4f0fb-0942-48ba-8a16-0dbd98db3f97,claim=StandardContentClaim > [resourceClaim=StandardResourceClaim[id=1685554864966-1, container=default, > section=1], offset=3278, > length=117],offset=0,name=9bf4f0fb-0942-48ba-8a16-0dbd98db3f97,size=117]; > will route to failure > org.apache.nifi.processor.exception.ProcessException: Could not determine the > Avro Schema to use for writing the content > at > org.apache.nifi.avro.AvroRecordSetWriter.createWriter(AvroRecordSetWriter.java:154) > at > 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) > at > java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) > at > java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) > at java.base/java.lang.reflect.Method.invoke(Method.java:566) > at > org.apache.nifi.controller.service.StandardControllerServiceInvocationHandler.invoke(StandardControllerServiceInvocationHandler.java:254) > at > org.apache.nifi.controller.service.StandardControllerServiceInvocationHandler.invoke(StandardControllerServiceInvocationHandler.java:105) > at com.sun.proxy.$Proxy177.createWriter(Unknown Source) > at > org.apache.nifi.processors.standard.AbstractRecordProcessor$1.process(AbstractRecordProcessor.java:150) > at > org.apache.nifi.controller.repository.StandardProcessSession.write(StandardProcessSession.java:3441) > at > org.apache.nifi.processors.standard.AbstractRecordProcessor.onTrigger(AbstractRecordProcessor.java:122) > at > org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27) > at > org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1360) > at > org.apache.nifi.controller.tasks.ConnectableTask.invoke(ConnectableTask.java:243) > at > org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:102) > at org.apache.nifi.engine.FlowEngine$2.run(FlowEngine.java:110) > at > java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) > at > java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) > at > java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) > at > java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) > at > java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) > at 
java.base/java.lang.Thread.run(Thread.java:829) > Caused by: org.apache.nifi.schema.access.SchemaNotFoundException: Failed to > compile Avro Schema > at > org.apache.nifi.avro.AvroRecordSetWriter.createWriter(AvroRecordSetWriter.java:145) > ... 21 common frames omitted > Caused by: java.lang.NullPointerException: null > at > org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:208) > at > org.apache.nifi.avro.AvroTypeUtil.buildAvroField(AvroTypeUtil.java:130) > at > org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:284) > at >
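The failure mode behind the stack trace is that the empty array in the first record infers no element type, and the later merge of the two inferred array types dereferences that missing type. A null-tolerant merge along these lines illustrates the shape of a fix — simplified strings stand in for NiFi's `DataType`/`ChoiceDataType` model, so this is a sketch of the idea, not the patch itself:

```java
// Sketch of merging inferred array element types across records:
// an empty JSON array contributes no element type (null here), and a
// naive merge that assumes both sides are non-null produces the NPE
// described in NIFI-11621.
public final class ArrayTypeMerge {
    public static String mergeElementTypes(String left, String right) {
        if (left == null) return right;   // [] contributed no element type
        if (right == null) return left;
        if (left.equals(right)) return left;
        return "CHOICE(" + left + ", " + right + ")"; // differing element types
    }

    public static void main(String[] args) {
        // Record 1: {"test_array": []}       -> element type unknown (null)
        // Record 2: {"test_array": ["test"]} -> element type STRING
        System.out.println(mergeElementTypes(null, "STRING")); // STRING
    }
}
```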
[GitHub] [nifi] markap14 opened a new pull request, #7322: NIFI-11621: Handle the case of CHOICE fields when inferring the type …
markap14 opened a new pull request, #7322: URL: https://github.com/apache/nifi/pull/7322 …of ARRAY elements. E.g., support ARRAY> # Summary [NIFI-0](https://issues.apache.org/jira/browse/NIFI-0)
[GitHub] [nifi] mr1716 commented on pull request #7317: NIFI-11617 Update jackson.bom to 2.15.2
mr1716 commented on PR #7317: URL: https://github.com/apache/nifi/pull/7317#issuecomment-1570710897 @exceptionfactory https://github.com/FasterXML/jackson-databind/releases/tag/jackson-databind-2.15.2 what about this release?
[jira] [Created] (NIFI-11621) Inferring schema for JSON fails when there's a CHOICE of different ARRAY types
Mark Payne created NIFI-11621: - Summary: Inferring schema for JSON fails when there's a CHOICE of different ARRAY types Key: NIFI-11621 URL: https://issues.apache.org/jira/browse/NIFI-11621 Project: Apache NiFi Issue Type: Bug Components: Extensions Reporter: Mark Payne Assignee: Mark Payne Fix For: 1.latest, 2.latest
[GitHub] [nifi] dan-s1 commented on pull request #7194: NIFI-11167 - Add Excel Record Reader
dan-s1 commented on PR #7194: URL: https://github.com/apache/nifi/pull/7194#issuecomment-1570599292 @exceptionfactory Finally that Windows build succeeded.
[jira] [Updated] (NIFI-11620) Upgrade Google libraries-bom to 26.15.0
[ https://issues.apache.org/jira/browse/NIFI-11620?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Pierre Villard updated NIFI-11620: -- Status: Patch Available (was: Open) > Upgrade Google libraries-bom to 26.15.0 > --- > > Key: NIFI-11620 > URL: https://issues.apache.org/jira/browse/NIFI-11620 > Project: Apache NiFi > Issue Type: Task > Components: Extensions >Reporter: Pierre Villard >Assignee: Pierre Villard >Priority: Major > Fix For: 1.latest, 2.latest > > Time Spent: 10m > Remaining Estimate: 0h > > Upgrade Google libraries-bom to 26.15.0
[GitHub] [nifi] pvillard31 opened a new pull request, #7321: NIFI-11620 - Upgrade Google libraries-bom to 26.15.0
pvillard31 opened a new pull request, #7321: URL: https://github.com/apache/nifi/pull/7321 # Summary [NIFI-11620](https://issues.apache.org/jira/browse/NIFI-11620) Upgrade Google libraries-bom to 26.15.0
[jira] [Created] (NIFI-11620) Upgrade Google libraries-bom to 26.15.0
Pierre Villard created NIFI-11620: - Summary: Upgrade Google libraries-bom to 26.15.0 Key: NIFI-11620 URL: https://issues.apache.org/jira/browse/NIFI-11620 Project: Apache NiFi Issue Type: Task Components: Extensions Reporter: Pierre Villard Assignee: Pierre Villard Fix For: 1.latest, 2.latest
[jira] [Updated] (NIFI-11619) Upgrade Azure BOM to 1.2.13
[ https://issues.apache.org/jira/browse/NIFI-11619?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Pierre Villard updated NIFI-11619: -- Status: Patch Available (was: Open) > Upgrade Azure BOM to 1.2.13 > --- > > Key: NIFI-11619 > URL: https://issues.apache.org/jira/browse/NIFI-11619 > Project: Apache NiFi > Issue Type: Task > Components: Extensions >Reporter: Pierre Villard >Assignee: Pierre Villard >Priority: Major > Fix For: 1.latest, 2.latest
[GitHub] [nifi] pvillard31 opened a new pull request, #7320: NIFI-11619 - Upgrade Azure BOM to 1.2.13
pvillard31 opened a new pull request, #7320: URL: https://github.com/apache/nifi/pull/7320 # Summary [NIFI-11619](https://issues.apache.org/jira/browse/NIFI-11619) - Upgrade Azure BOM to 1.2.13
[GitHub] [nifi] r-sidd opened a new pull request, #7319: NIFI-11587: Update questdb to 6.7
r-sidd opened a new pull request, #7319: URL: https://github.com/apache/nifi/pull/7319 # Summary [NIFI-11587](https://issues.apache.org/jira/browse/NIFI-11587)
[jira] [Updated] (NIFI-11618) Upgrade AWS SDK
[ https://issues.apache.org/jira/browse/NIFI-11618?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Pierre Villard updated NIFI-11618: -- Status: Patch Available (was: Open) > Upgrade AWS SDK > --- > > Key: NIFI-11618 > URL: https://issues.apache.org/jira/browse/NIFI-11618 > Project: Apache NiFi > Issue Type: Task > Components: Extensions >Reporter: Pierre Villard >Assignee: Pierre Villard >Priority: Major > Fix For: 1.latest, 2.latest > > Time Spent: 10m > Remaining Estimate: 0h > > Upgrade AWS SDK from 2.20.41 to 2.20.75 and from 1.12.444 to 1.12.478
[jira] [Created] (NIFI-11619) Upgrade Azure BOM to 1.2.13
Pierre Villard created NIFI-11619: - Summary: Upgrade Azure BOM to 1.2.13 Key: NIFI-11619 URL: https://issues.apache.org/jira/browse/NIFI-11619 Project: Apache NiFi Issue Type: Task Components: Extensions Reporter: Pierre Villard Assignee: Pierre Villard Fix For: 1.latest, 2.latest Upgrade Azure BOM to 1.2.13
[GitHub] [nifi] pvillard31 opened a new pull request, #7318: NIFI-11618 - Upgrade AWS SDK
pvillard31 opened a new pull request, #7318: URL: https://github.com/apache/nifi/pull/7318 # Summary [NIFI-11618](https://issues.apache.org/jira/browse/NIFI-11618) Upgrade AWS SDK Upgrade AWS SDK from 2.20.41 to 2.20.75 and from 1.12.444 to 1.12.478
[jira] [Created] (NIFI-11618) Upgrade AWS SDK
Pierre Villard created NIFI-11618: - Summary: Upgrade AWS SDK Key: NIFI-11618 URL: https://issues.apache.org/jira/browse/NIFI-11618 Project: Apache NiFi Issue Type: Task Components: Extensions Reporter: Pierre Villard Assignee: Pierre Villard Fix For: 1.latest, 2.latest Upgrade AWS SDK from 2.20.41 to 2.20.75 and from 1.12.444 to 1.12.478
[jira] [Updated] (MINIFICPP-2119) fix CronTests error with C.UTF-8 locale
[ https://issues.apache.org/jira/browse/MINIFICPP-2119?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Marton Szasz updated MINIFICPP-2119: Resolution: Fixed Status: Resolved (was: Patch Available) > fix CronTests error with C.UTF-8 locale > --- > > Key: MINIFICPP-2119 > URL: https://issues.apache.org/jira/browse/MINIFICPP-2119 > Project: Apache NiFi MiNiFi C++ > Issue Type: Bug >Reporter: Marton Szasz >Assignee: Martin Zink >Priority: Major > Time Spent: 20m > Remaining Estimate: 0h > > Didn't happen with en_US.UTF-8. > {noformat} > --- > Cron expression ctor tests > --- > /home/szaszm/nifi-minifi-cpp/libminifi/test/unit/CronTests.cpp:35 > ... > /home/szaszm/nifi-minifi-cpp/libminifi/test/unit/CronTests.cpp:104: FAILED: > REQUIRE_NOTHROW( Cron("0 10,44 14 ? 3 WED") ) > due to unexpected exception with message: > Couldn't parse cron field: 3 locale::facet::_S_create_c_locale name not > valid > --- > Cron allowed nonnumerical inputs > --- > /home/szaszm/nifi-minifi-cpp/libminifi/test/unit/CronTests.cpp:125 > ... > /home/szaszm/nifi-minifi-cpp/libminifi/test/unit/CronTests.cpp:126: FAILED: > REQUIRE_NOTHROW( Cron("* * * * > Jan,fEb,MAR,Apr,May,jun,Jul,Aug,Sep,Oct,Nov,Dec * *") ) > due to unexpected exception with message: > Couldn't parse cron field: Jan,fEb,MAR,Apr,May,jun,Jul,Aug,Sep,Oct,Nov,Dec > locale::facet::_S_create_c_locale name not valid > --- > Cron::calculateNextTrigger > --- > /home/szaszm/nifi-minifi-cpp/libminifi/test/unit/CronTests.cpp:130 > ... > /home/szaszm/nifi-minifi-cpp/libminifi/test/unit/CronTests.cpp:130: FAILED: > {Unknown expression after the reported line} > due to unexpected exception with message: > Couldn't parse cron field: 11 locale::facet::_S_create_c_locale name not > valid > {noformat}
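The MINIFICPP-2119 failures show a general class of bug: parsing cron month and weekday abbreviations such as "WED" or "fEb" depends on the process locale, so tests pass under en_US.UTF-8 but fail under C.UTF-8. The actual fix lives in MiNiFi C++'s Cron parser; as a language-neutral illustration of the pattern, here is a Java sketch that pins an explicit locale so parsing is deterministic regardless of the default locale:

```java
import java.time.Month;
import java.time.format.DateTimeFormatter;
import java.time.format.DateTimeFormatterBuilder;
import java.util.Locale;

public class CronMonthParsing {
    // Parse a three-letter month abbreviation ("Mar", "mar", "MAR") with an
    // explicitly pinned locale. Using Locale.getDefault() instead would make
    // the result depend on the environment, which is the class of failure
    // seen with the C.UTF-8 locale in the report above.
    static Month parseMonth(String token) {
        DateTimeFormatter fmt = new DateTimeFormatterBuilder()
                .parseCaseInsensitive()
                .appendPattern("MMM")
                .toFormatter(Locale.ENGLISH); // pinned, not the default locale
        return Month.from(fmt.parse(token));
    }

    public static void main(String[] args) {
        System.out.println(parseMonth("fEb")); // FEBRUARY on any system locale
    }
}
```

The same idea in C++ is constructing a `std::locale::classic()` (or an explicitly named) locale for the parsing facets instead of relying on the global locale.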
[jira] [Resolved] (MINIFICPP-2123) Upgrade curl to version 8.1.0
[ https://issues.apache.org/jira/browse/MINIFICPP-2123?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Marton Szasz resolved MINIFICPP-2123. - Resolution: Done > Upgrade curl to version 8.1.0 > - > > Key: MINIFICPP-2123 > URL: https://issues.apache.org/jira/browse/MINIFICPP-2123 > Project: Apache NiFi MiNiFi C++ > Issue Type: Improvement >Reporter: Gábor Gyimesi >Assignee: Gábor Gyimesi >Priority: Major > Time Spent: 0.5h > Remaining Estimate: 0h > > There has been a major version change in curl since our current version > 7.64.0. We should upgrade to the latest released version.
[jira] [Resolved] (MINIFICPP-2112) MiNiFi controller crashes on flow update
[ https://issues.apache.org/jira/browse/MINIFICPP-2112?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Marton Szasz resolved MINIFICPP-2112. - Resolution: Fixed > MiNiFi controller crashes on flow update > > > Key: MINIFICPP-2112 > URL: https://issues.apache.org/jira/browse/MINIFICPP-2112 > Project: Apache NiFi MiNiFi C++ > Issue Type: Bug >Reporter: Gábor Gyimesi >Assignee: Gábor Gyimesi >Priority: Major > Time Spent: 3h 20m > Remaining Estimate: 0h > > MiNiFi controller crashes using the flow update option with resource deadlock. > {code:java} > (gdb) bt > #0 __pthread_kill_implementation (no_tid=0, signo=6, > threadid=140737018590784) at ./nptl/pthread_kill.c:44 > #1 __pthread_kill_internal (signo=6, threadid=140737018590784) at > ./nptl/pthread_kill.c:78 > #2 __GI___pthread_kill (threadid=140737018590784, signo=signo@entry=6) at > ./nptl/pthread_kill.c:89 > #3 0x76042476 in __GI_raise (sig=sig@entry=6) at > ../sysdeps/posix/raise.c:26 > #4 0x760287f3 in __GI_abort () at ./stdlib/abort.c:79 > #5 0x764a2bbe in ?? () from /lib/x86_64-linux-gnu/libstdc++.so.6 > #6 0x764ae24c in ?? () from /lib/x86_64-linux-gnu/libstdc++.so.6 > #7 0x764ad229 in ?? () from /lib/x86_64-linux-gnu/libstdc++.so.6 > #8 0x764ad999 in __gxx_personality_v0 () from > /lib/x86_64-linux-gnu/libstdc++.so.6 > #9 0x77f8fc64 in ?? 
() from /lib/x86_64-linux-gnu/libgcc_s.so.1 > #10 0x77f90321 in _Unwind_RaiseException () from > /lib/x86_64-linux-gnu/libgcc_s.so.1 > #11 0x764ae50b in __cxa_throw () from > /lib/x86_64-linux-gnu/libstdc++.so.6 > #12 0x764a589f in std::__throw_system_error(int) () from > /lib/x86_64-linux-gnu/libstdc++.so.6 > #13 0x764dc340 in std::thread::join() () from > /lib/x86_64-linux-gnu/libstdc++.so.6 > #14 0x776d8864 in > org::apache::nifi::minifi::io::ServerSocket::~ServerSocket > (this=0x55a54dd0, __in_chrg=, __vtt_parm=) > at > /home/ggyimesi/projects/nifi-minifi-cpp-fork/libminifi/src/io/ServerSocket.cpp:52 > #15 0x776d890c in > org::apache::nifi::minifi::io::ServerSocket::~ServerSocket > (this=0x55a54dd0, __in_chrg=, __vtt_parm=) > at > /home/ggyimesi/projects/nifi-minifi-cpp-fork/libminifi/src/io/ServerSocket.cpp:53 > #16 0x7745782e in > std::default_delete::operator() > (this=0x559fb590, __ptr=0x55a54dd0) at > /usr/include/c++/11/bits/unique_ptr.h:85 > #17 0x7751ac1e in > std::__uniq_ptr_impl std::default_delete >::reset > (this=0x559fb590, __p=0x7fffc0010500) at > /usr/include/c++/11/bits/unique_ptr.h:182 > #18 0x77519f9e in > std::__uniq_ptr_impl std::default_delete > >::operator= (this=0x559fb590, __u=...) 
at > /usr/include/c++/11/bits/unique_ptr.h:167 > #19 0x77519903 in > std::__uniq_ptr_data std::default_delete, true, > true>::operator= (this=0x559fb590) at > /usr/include/c++/11/bits/unique_ptr.h:212 > #20 0x77519931 in > std::unique_ptr std::default_delete > >::operator= (this=0x559fb590) at > /usr/include/c++/11/bits/unique_ptr.h:371 > #21 0x77515f72 in > org::apache::nifi::minifi::c2::ControllerSocketProtocol::initialize > (this=0x559fb580) at > /home/ggyimesi/projects/nifi-minifi-cpp-fork/libminifi/src/c2/ControllerSocketProtocol.cpp:78 > #22 0x77439037 in org::apache::nifi::minifi::FlowController::start > (this=0x55a56cc0) at > /home/ggyimesi/projects/nifi-minifi-cpp-fork/libminifi/src/FlowController.cpp:337 > #23 0x774377f8 in > org::apache::nifi::minifi::FlowController::applyConfiguration > (this=0x55a56cc0, source="ControllerSocketProtocol", > configurePayload="# Licensed to the Apache Software Foundation (ASF) > under one or more\n# contributor license agreements. See the NOTICE file > distributed with\n# this work for additional information regarding copyright > "..., > flow_id=std::optional [no contained value]) at > /home/ggyimesi/projects/nifi-minifi-cpp-fork/libminifi/src/FlowController.cpp:142 > #24 0x77439551 in > org::apache::nifi::minifi::FlowController::applyUpdate (this=0x55a56cc0, > source="ControllerSocketProtocol", > configuration="# Licensed to the Apache Software Foundation (ASF) under > one or more\n# contributor license agreements. See the NOTICE file > distributed with\n# this work for additional information regarding copyright > "..., persist=false, > flow_id=std::optional [no contained value]) at > /home/ggyimesi/projects/nifi-minifi-cpp-fork/libminifi/src/FlowController.cpp:390 > #25 0x77516954 in > org::apache::nifi::minifi::c2::ControllerSocketProtocol::handleUpdate > (this=0x559fb580, stream=0x7fffe3ffec40) at >
[GitHub] [nifi-minifi-cpp] szaszm closed pull request #1578: MINIFICPP-2123 Upgrade to curl to v8.1.0
szaszm closed pull request #1578: MINIFICPP-2123 Upgrade to curl to v8.1.0 URL: https://github.com/apache/nifi-minifi-cpp/pull/1578
[GitHub] [nifi-minifi-cpp] szaszm closed pull request #1579: MINIFICPP-2119 fix CronTests error with classic locale
szaszm closed pull request #1579: MINIFICPP-2119 fix CronTests error with classic locale URL: https://github.com/apache/nifi-minifi-cpp/pull/1579
[GitHub] [nifi-minifi-cpp] szaszm closed pull request #1568: MINIFICPP-2112 Fix flow update and restart with minifi controller
szaszm closed pull request #1568: MINIFICPP-2112 Fix flow update and restart with minifi controller URL: https://github.com/apache/nifi-minifi-cpp/pull/1568
[GitHub] [nifi] exceptionfactory commented on pull request #6816: NIFI-9206: Add RemoveRecordField processor and implement the ability to remove fields from records
exceptionfactory commented on PR #6816: URL: https://github.com/apache/nifi/pull/6816#issuecomment-1570527950 Thanks for the reply and summary @ChrisSamo632. On further review, the changes to those other components make sense together with the adjustments necessary for this PR, so breaking things up is not necessary. I think it would be worthwhile to review the scope of the test files and reduce the number of test files if sufficient coverage can be achieved.
[jira] [Updated] (NIFI-11596) Add FlowRegistryClient to Reference Values for Swagger Model Property
[ https://issues.apache.org/jira/browse/NIFI-11596?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] David Handermann updated NIFI-11596: Summary: Add FlowRegistryClient to Reference Values for Swagger Model Property (was: Adjust Swagger contract) > Add FlowRegistryClient to Reference Values for Swagger Model Property > - > > Key: NIFI-11596 > URL: https://issues.apache.org/jira/browse/NIFI-11596 > Project: Apache NiFi > Issue Type: Bug > Components: Core Framework >Reporter: Simon Bence >Assignee: Simon Bence >Priority: Minor > Time Spent: 20m > Remaining Estimate: 0h > > With changes around the Registry, ControllerServices now might be referenced > by Registry Clients. As of this the referencing entry value set of > ControllerServiceReferencingComponentDTO must be extended.
[jira] [Resolved] (NIFI-11596) Add FlowRegistryClient to Reference Values for Swagger Model Property
[ https://issues.apache.org/jira/browse/NIFI-11596?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] David Handermann resolved NIFI-11596. - Fix Version/s: 2.0.0 1.22.0 Resolution: Fixed > Add FlowRegistryClient to Reference Values for Swagger Model Property > - > > Key: NIFI-11596 > URL: https://issues.apache.org/jira/browse/NIFI-11596 > Project: Apache NiFi > Issue Type: Bug > Components: Core Framework >Reporter: Simon Bence >Assignee: Simon Bence >Priority: Minor > Fix For: 2.0.0, 1.22.0 > > Time Spent: 20m > Remaining Estimate: 0h > > With changes around the Registry, ControllerServices now might be referenced > by Registry Clients. As of this the referencing entry value set of > ControllerServiceReferencingComponentDTO must be extended.
[jira] [Commented] (NIFI-11596) Adjust Swagger contract
[ https://issues.apache.org/jira/browse/NIFI-11596?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17728045#comment-17728045 ] ASF subversion and git services commented on NIFI-11596: Commit c8fbad875d981634df1a6cb4e89e7d58cf485fed in nifi's branch refs/heads/support/nifi-1.x from Simon Bence [ https://gitbox.apache.org/repos/asf?p=nifi.git;h=c8fbad875d ] NIFI-11596 Adjusting Swagger contract to cover RegistryClients as possible reference type for Controller Services This closes #7295 Signed-off-by: David Handermann (cherry picked from commit 007bf3bcec8c1dcc898abe0673d55465df37ee29) > Adjust Swagger contract > --- > > Key: NIFI-11596 > URL: https://issues.apache.org/jira/browse/NIFI-11596 > Project: Apache NiFi > Issue Type: Bug > Components: Core Framework >Reporter: Simon Bence >Assignee: Simon Bence >Priority: Minor > Time Spent: 20m > Remaining Estimate: 0h > > With changes around the Registry, ControllerServices now might be referenced > by Registry Clients. As of this the referencing entry value set of > ControllerServiceReferencingComponentDTO must be extended.
[jira] [Commented] (NIFI-11596) Adjust Swagger contract
[ https://issues.apache.org/jira/browse/NIFI-11596?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17728044#comment-17728044 ] ASF subversion and git services commented on NIFI-11596: Commit 007bf3bcec8c1dcc898abe0673d55465df37ee29 in nifi's branch refs/heads/main from Simon Bence [ https://gitbox.apache.org/repos/asf?p=nifi.git;h=007bf3bcec ] NIFI-11596 Adjusting Swagger contract to cover RegistryClients as possible reference type for Controller Services This closes #7295 Signed-off-by: David Handermann > Adjust Swagger contract > --- > > Key: NIFI-11596 > URL: https://issues.apache.org/jira/browse/NIFI-11596 > Project: Apache NiFi > Issue Type: Bug > Components: Core Framework >Reporter: Simon Bence >Assignee: Simon Bence >Priority: Minor > Time Spent: 20m > Remaining Estimate: 0h > > With changes around the Registry, ControllerServices now might be referenced > by Registry Clients. As of this the referencing entry value set of > ControllerServiceReferencingComponentDTO must be extended.
[GitHub] [nifi] exceptionfactory closed pull request #7295: NIFI-11596 Adjusting Swagger contract to cover RegistryClients as possible reference type for ControllerServices
exceptionfactory closed pull request #7295: NIFI-11596 Adjusting Swagger contract to cover RegistryClients as possible reference type for ControllerServices URL: https://github.com/apache/nifi/pull/7295
[GitHub] [nifi] exceptionfactory commented on a diff in pull request #7299: NIFI-11603: Removed NetworkUtils.getAvailableUdpPort, NetworkUtils.ge…
exceptionfactory commented on code in PR #7299: URL: https://github.com/apache/nifi/pull/7299#discussion_r1211922678 ## nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/test/java/org/apache/nifi/processors/standard/TestListenUDP.java: ## @@ -173,7 +170,7 @@ public void testRunWhenNoEventsAvailable() { @Test public void testWithSendingHostAndPortSameAsSender() throws IOException, InterruptedException { -final Integer sendingPort = NetworkUtils.getAvailableUdpPort(); +final int sendingPort = 27911; Review Comment: Will this hard-coded port result in intermittent failures? Can it be set to `0` for constructing the `DatagramSocket`? ## nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/ListenSyslog.java: ## @@ -216,6 +216,7 @@ public class ListenSyslog extends AbstractSyslogProcessor { private volatile SyslogParser parser; private volatile BlockingQueue syslogEvents = new LinkedBlockingQueue<>(); private volatile byte[] messageDemarcatorBytes; //it is only the array reference that is volatile - not the contents. +private volatile int listeningPort; Review Comment: This value does not appear to be assigned.
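The review comment about the hard-coded port 27911 points at the standard remedy: bind to port 0 so the OS assigns a free ephemeral port, then read the assigned port back. A minimal sketch of that idea (not NiFi's actual test code):

```java
import java.net.DatagramSocket;
import java.net.SocketException;

public class EphemeralPort {
    // Binding to port 0 asks the OS for any free ephemeral port, avoiding
    // the intermittent test failures a hard-coded port (e.g. 27911) can
    // cause when that port happens to be in use on the build machine.
    static int findFreeUdpPort() throws SocketException {
        try (DatagramSocket socket = new DatagramSocket(0)) {
            return socket.getLocalPort(); // the port the OS actually assigned
        }
    }

    public static void main(String[] args) throws SocketException {
        System.out.println("bound to port " + findFreeUdpPort());
    }
}
```

In a test, the socket would typically be kept open and used directly rather than closed and its port reused, since reusing the number reintroduces a (smaller) race.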
[GitHub] [nifi-minifi-cpp] lordgamez opened a new pull request, #1583: MINIFICPP-1719 Replace LibreSSL with OpenSSL 3.1
lordgamez opened a new pull request, #1583: URL: https://github.com/apache/nifi-minifi-cpp/pull/1583 https://issues.apache.org/jira/browse/MINIFICPP-1719 Thank you for submitting a contribution to Apache NiFi - MiNiFi C++. In order to streamline the review of the contribution we ask you to ensure the following steps have been taken: ### For all changes: - [ ] Is there a JIRA ticket associated with this PR? Is it referenced in the commit message? - [ ] Does your PR title start with MINIFICPP- where is the JIRA number you are trying to resolve? Pay particular attention to the hyphen "-" character. - [ ] Has your PR been rebased against the latest commit within the target branch (typically main)? - [ ] Is your initial contribution a single, squashed commit? ### For code changes: - [ ] If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under [ASF 2.0](http://www.apache.org/legal/resolved.html#category-a)? - [ ] If applicable, have you updated the LICENSE file? - [ ] If applicable, have you updated the NOTICE file? ### For documentation related changes: - [ ] Have you ensured that format looks appropriate for the output in which it is rendered? ### Note: Please ensure that once the PR is submitted, you check GitHub Actions CI results for build issues and submit an update to your PR as soon as possible.
[jira] [Commented] (NIFI-11608) PutBigQuery: Missing Flowfile Attribute Evaluation
[ https://issues.apache.org/jira/browse/NIFI-11608?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17728036#comment-17728036 ] Steven Matison commented on NIFI-11608: --- [~jisaitua] working on this now! Thanks! > PutBigQuery: Missing Flowfile Attribute Evaluation > -- > > Key: NIFI-11608 > URL: https://issues.apache.org/jira/browse/NIFI-11608 > Project: Apache NiFi > Issue Type: Bug > Components: Core Framework >Affects Versions: 1.21.0 >Reporter: Juan Ignacio Saitua >Assignee: Steven Matison >Priority: Major > Time Spent: 2.5h > Remaining Estimate: 0h > > While the documentation states that PutBigQuery should support expression > language for DATASET, TABLE_NAME, and SKIP_INVALID_ROWS, that's not really > happening. An examination of the code reveals that these expressions are not > evaluated in the current implementation ([see code > here](https://github.com/apache/nifi/blob/cfd62c9511e43d5010fbfbb12b98b40bdfdb3fc2/nifi-nar-bundles/nifi-gcp-bundle/nifi-gcp-processors/src/main/java/org/apache/nifi/processors/gcp/bigquery/PutBigQuery.java#L193)). > Notably, the issue does not exist in the PutBigQueryBatch processor. While > both processors extends the AbstractBigQueryProcessor class, only > PutBigQueryBatch utilizes the AbstractBigQueryProcessor.getTableId() method, > which correctly evaluates and retrieves the tableId.
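The NIFI-11608 bug is that PutBigQuery reads property values such as DATASET verbatim, so an expression like `${bq.dataset}` is never resolved against each flowfile's attributes; in a NiFi processor the per-flowfile evaluation is done with `context.getProperty(DATASET).evaluateAttributeExpressions(flowFile).getValue()`. To show the difference without the NiFi dependency, the sketch below uses a simplified stand-in that only substitutes `${attribute}` references (real NiFi expression language also supports functions and nesting):

```java
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class AttributeExpressionDemo {
    private static final Pattern EXPR = Pattern.compile("\\$\\{([^}]+)}");

    // Toy stand-in for NiFi's expression-language evaluation: replaces
    // ${attr} references with values from the flowfile's attribute map.
    // Without this step, a processor would send the literal "${bq.dataset}"
    // to BigQuery instead of the per-flowfile dataset name.
    static String evaluate(String property, Map<String, String> attributes) {
        Matcher m = EXPR.matcher(property);
        StringBuilder out = new StringBuilder();
        while (m.find()) {
            m.appendReplacement(out, Matcher.quoteReplacement(
                    attributes.getOrDefault(m.group(1), "")));
        }
        m.appendTail(out);
        return out.toString();
    }

    public static void main(String[] args) {
        Map<String, String> attrs = Map.of("bq.dataset", "sales");
        System.out.println(evaluate("${bq.dataset}", attrs)); // sales
    }
}
```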
[GitHub] [nifi-minifi-cpp] martinzink commented on a diff in pull request #1572: MINIFICPP-2027 Upgrade Google Cloud library to version 2.10.1
martinzink commented on code in PR #1572: URL: https://github.com/apache/nifi-minifi-cpp/pull/1572#discussion_r1211914535 ## extensions/gcp/tests/GCPCredentialsControllerServiceTests.cpp: ## @@ -89,7 +105,7 @@ TEST_F(GCPCredentialsTests, DefaultGCPCredentialsWithEnv) { auto temp_directory = test_controller_.createTempDirectory(); auto path = create_mock_json_file(temp_directory); ASSERT_TRUE(path.has_value()); - google::cloud::internal::SetEnv("GOOGLE_APPLICATION_CREDENTIALS", path->string()); + setGoogleEnvironmentCredentials(path->string().c_str()); Review Comment: Now that you mention it, I remember something like this, maybe that's why I was using the internal function. Thanks for explaining :)
[jira] [Commented] (NIFI-11591) Address failing system tests
[ https://issues.apache.org/jira/browse/NIFI-11591?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17728022#comment-17728022 ] David Handermann commented on NIFI-11591: - Thanks for continuing to work on this issue [~markap14]. I reopened this issue after the second PR since it ran to completion on the PR, but failed again on the main branch. I see PR 7312 is a draft with additional logging, hopefully that will help track down the true source of the problem. > Address failing system tests > > > Key: NIFI-11591 > URL: https://issues.apache.org/jira/browse/NIFI-11591 > Project: Apache NiFi > Issue Type: Bug > Components: Core Framework, Tools and Build >Reporter: Mark Payne >Assignee: Mark Payne >Priority: Major > Fix For: 2.0.0, 1.22.0 > > Time Spent: 0.5h > Remaining Estimate: 0h > > We've been seeing intermittent failures in the following system tests: > - DynamicClasspathModificationIT > - RegistryClientIT#testControllerServiceUpdateWhileRunning > - ClusteredRegistryClientIT#testControllerServiceUpdateWhileRunning > Additionally, NIFI-11557 introduced a surefire-report step in the system > tests to attempt to capture more logs from failures, but that step is not > working as expected and additionally it appears that the log output from the > tests themselves is already captured into the diagnostic dump that is > included.
[jira] [Updated] (NIFI-11608) PutBigQuery: Missing Flowfile Attribute Evaluation
[ https://issues.apache.org/jira/browse/NIFI-11608?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Joe Witt updated NIFI-11608: Fix Version/s: (was: 1.22.0) > PutBigQuery: Missing Flowfile Attribute Evaluation > -- > > Key: NIFI-11608 > URL: https://issues.apache.org/jira/browse/NIFI-11608 > Project: Apache NiFi > Issue Type: Bug > Components: Core Framework >Affects Versions: 1.21.0 >Reporter: Juan Ignacio Saitua >Assignee: Steven Matison >Priority: Major > Time Spent: 2.5h > Remaining Estimate: 0h > > While the documentation states that PutBigQuery should support expression > language for DATASET, TABLE_NAME, and SKIP_INVALID_ROWS, that's not really > happening. An examination of the code reveals that these expressions are not > evaluated in the current implementation ([see code > here](https://github.com/apache/nifi/blob/cfd62c9511e43d5010fbfbb12b98b40bdfdb3fc2/nifi-nar-bundles/nifi-gcp-bundle/nifi-gcp-processors/src/main/java/org/apache/nifi/processors/gcp/bigquery/PutBigQuery.java#L193)). > Notably, the issue does not exist in the PutBigQueryBatch processor. While > both processors extends the AbstractBigQueryProcessor class, only > PutBigQueryBatch utilizes the AbstractBigQueryProcessor.getTableId() method, > which correctly evaluates and retrieves the tableId.
[GitHub] [nifi] exceptionfactory closed pull request #7317: NIFI-11617 Update jackson.bom to 2.15.2
exceptionfactory closed pull request #7317: NIFI-11617 Update jackson.bom to 2.15.2 URL: https://github.com/apache/nifi/pull/7317
[GitHub] [nifi] exceptionfactory commented on pull request #7317: NIFI-11617 Update jackson.bom to 2.15.2
exceptionfactory commented on PR #7317: URL: https://github.com/apache/nifi/pull/7317#issuecomment-1570429102 Thanks for the upgrade @mr1716. Although this would normally be a straightforward incremental upgrade, it appears that the Jackson BOM references a Jackson Scala module version that does not exist or was not published when released. Maven Central shows 2.15.1 as the latest version of that module, which accounts for the build failures: https://central.sonatype.com/artifact/com.fasterxml.jackson.module/jackson-module-scala_2.13/2.15.1/versions Once this is corrected, this should be a simple upgrade, but I am closing this pull request for now until this is corrected. If you find that the Scala module is released, or a new Jackson version is released, feel free to submit a new PR.
[GitHub] [nifi-minifi-cpp] lordgamez commented on a diff in pull request #1572: MINIFICPP-2027 Upgrade Google Cloud library to version 2.10.1
lordgamez commented on code in PR #1572: URL: https://github.com/apache/nifi-minifi-cpp/pull/1572#discussion_r1211872671 ## extensions/gcp/tests/GCPCredentialsControllerServiceTests.cpp: ## @@ -89,7 +105,7 @@ TEST_F(GCPCredentialsTests, DefaultGCPCredentialsWithEnv) { auto temp_directory = test_controller_.createTempDirectory(); auto path = create_mock_json_file(temp_directory); ASSERT_TRUE(path.has_value()); - google::cloud::internal::SetEnv("GOOGLE_APPLICATION_CREDENTIALS", path->string()); + setGoogleEnvironmentCredentials(path->string().c_str()); Review Comment: That was my first thought, but unfortunately it does not work on Windows. In `setEnvironmentVariable`, `SetEnvironmentVariableA` is called on Windows, which is different from `_putenv_s`. For the changes to be visible to the Google library we need to use `_putenv_s`. The difference is explained here: https://github.com/googleapis/google-cloud-cpp/issues/100#issuecomment-357246853
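The Windows distinction above can be sketched in a small helper. This is an illustration, not the exact minifi-cpp code: on Windows the CRT function `_putenv_s` updates the environment copy that `getenv()` reads (which is what google-cloud-cpp uses internally), whereas `SetEnvironmentVariableA` only updates the Win32 environment block.

```cpp
#include <cstdlib>
#include <string>

// Hypothetical helper mirroring the test change discussed above. On Windows
// the CRT's _putenv_s must be used, because SetEnvironmentVariableA updates
// the Win32 environment block but not the CRT copy that getenv() reads; on
// POSIX, setenv() is sufficient.
inline void setGoogleEnvironmentCredentials(const char* value) {
#ifdef _WIN32
  _putenv_s("GOOGLE_APPLICATION_CREDENTIALS", value);
#else
  setenv("GOOGLE_APPLICATION_CREDENTIALS", value, /*overwrite=*/1);
#endif
}

// Read the variable back through the same CRT channel (getenv) that
// libraries such as google-cloud-cpp use internally.
inline std::string readCredentialsPath() {
  const char* value = std::getenv("GOOGLE_APPLICATION_CREDENTIALS");
  return value ? value : "";
}
```

A test that sets the variable with this helper and reads it back via `getenv()` observes the change on both platforms, which is exactly why the PR introduces such a wrapper instead of reusing `setEnvironmentVariable`.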
[jira] [Updated] (NIFI-11537) Add support for Iceberg tables to UpdateHive3Table
[ https://issues.apache.org/jira/browse/NIFI-11537?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Matt Burgess updated NIFI-11537: Fix Version/s: (was: 1.latest) Status: Open (was: Patch Available) > Add support for Iceberg tables to UpdateHive3Table > --- > > Key: NIFI-11537 > URL: https://issues.apache.org/jira/browse/NIFI-11537 > Project: Apache NiFi > Issue Type: Improvement > Components: Extensions >Reporter: Matt Burgess >Assignee: Matt Burgess >Priority: Major > Fix For: 2.latest > > Time Spent: 10m > Remaining Estimate: 0h > > UpdateHive3Table currently adds columns to Iceberg-backed tables > successfully, but Iceberg needs a special CREATE TABLE command to specify the > Iceberg Storage Handler and table properties. > This Jira proposes to add a Create Table Storage Handler property with > Default and Iceberg as the initial choices. Default does not generate a > STORED BY clause, and Iceberg will generate the appropriate STORED BY clause > and set the necessary table properties. This approach can be used in the > future to add support for HBase- and Kudu-backed Hive tables. -- This message was sent by Atlassian Jira (v8.20.10#820010)
[GitHub] [nifi-minifi-cpp] lordgamez commented on a diff in pull request #1580: MINIFICPP-2126 Use sccache and Ninja in Windows CI
lordgamez commented on code in PR #1580: URL: https://github.com/apache/nifi-minifi-cpp/pull/1580#discussion_r1211858683 ## .github/workflows/ci.yml: ## @@ -73,8 +74,15 @@ jobs: run: git config --system core.longpaths true - id: checkout uses: actions/checkout@v3 - - name: Set up MSBuild -uses: microsoft/setup-msbuild@v1.1 + - name: Run sccache-cache +uses: mozilla-actions/sccache-action@v0.0.3 + - name: sccache cache +uses: actions/cache@v3 +with: + path: ~/AppData/Local/Mozilla/sccache/cache + key: ${{ runner.os }}-sccache Review Comment: We could use github ref and hash specific caches here as well similarly to other build caches, that would probably result in more cache hits for job runs on specific branches: ``` key: windows-sccache-${{github.ref}}-${{github.sha}} restore-keys: | windows-sccache-${{github.ref}}- windows-sccache-refs/heads/main- ``` -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[jira] [Commented] (NIFI-11608) PutBigQuery: Missing Flowfile Attribute Evaluation
[ https://issues.apache.org/jira/browse/NIFI-11608?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17728012#comment-17728012 ] Juan Ignacio Saitua commented on NIFI-11608: Thank you [~stevenmatison] for the patch. One comment: I think that, per the docs, SKIP_INVALID_ROWS should also evaluate the flowfile attributes. > PutBigQuery: Missing Flowfile Attribute Evaluation > -- > > Key: NIFI-11608 > URL: https://issues.apache.org/jira/browse/NIFI-11608 > Project: Apache NiFi > Issue Type: Bug > Components: Core Framework >Affects Versions: 1.21.0 >Reporter: Juan Ignacio Saitua >Assignee: Steven Matison >Priority: Major > Fix For: 1.22.0 > > Time Spent: 2.5h > Remaining Estimate: 0h > > While the documentation states that PutBigQuery should support expression > language for DATASET, TABLE_NAME, and SKIP_INVALID_ROWS, that is not actually > happening. An examination of the code reveals that these expressions are not > evaluated in the current implementation ([see code > here](https://github.com/apache/nifi/blob/cfd62c9511e43d5010fbfbb12b98b40bdfdb3fc2/nifi-nar-bundles/nifi-gcp-bundle/nifi-gcp-processors/src/main/java/org/apache/nifi/processors/gcp/bigquery/PutBigQuery.java#L193)). > Notably, the issue does not exist in the PutBigQueryBatch processor. While > both processors extend the AbstractBigQueryProcessor class, only > PutBigQueryBatch utilizes the AbstractBigQueryProcessor.getTableId() method, > which correctly evaluates and retrieves the tableId.
[GitHub] [nifi] exceptionfactory commented on a diff in pull request #7313: NIFI-11614 Improve Validation for JndiJmsConnectionFactoryProvider
exceptionfactory commented on code in PR #7313: URL: https://github.com/apache/nifi/pull/7313#discussion_r1211851261 ## nifi-nar-bundles/nifi-jms-bundle/nifi-jms-processors/src/main/java/org/apache/nifi/jms/cf/JndiJmsConnectionFactoryProperties.java: ## @@ -114,4 +119,67 @@ public static PropertyDescriptor getDynamicPropertyDescriptor(final String prope .build(); } +private static class JndiJmsContextFactoryValidator implements Validator { +private static final String DISALLOWED_CONTEXT_FACTORY = "LdapCtxFactory"; + +@Override +public ValidationResult validate(final String subject, final String input, final ValidationContext context) { +final ValidationResult.Builder builder = new ValidationResult.Builder().subject(subject).input(input); + +if (input == null || input.isEmpty()) { +builder.valid(false); +builder.explanation("Context Factory is required"); +} else if (input.endsWith(DISALLOWED_CONTEXT_FACTORY)) { +builder.valid(false); +builder.explanation(String.format("Context Factory [%s] not allowed", DISALLOWED_CONTEXT_FACTORY)); +} else { +builder.valid(true); +builder.explanation("Context Factory allowed"); +} + +return builder.build(); +} +} + +private static class JndiJmsProviderUrlValidator implements Validator { +/** JNDI JMS URL Allowed Schemes based on ActiveMQ Connection Factory */ +private static final Set<String> ALLOWED_SCHEMES = Collections.unmodifiableSet(new LinkedHashSet<>(Arrays.asList( +"jgroups", +"tcp", +"udp", +"vm" +))); Review Comment: Thanks for highlighting these concerns @turcsanyip. The restrictive approach to URL schemes does present some potential limitations. I expanded the default list of allowed schemes, and I also introduced a Java System property that supports overriding the default values. I updated the additional details documentation to list the property name and describe how it can be configured. 
This approach will allow an administrator to customize validation for specific environments and custom drivers while providing default validation for general deployments.
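The allow-list idea being discussed can be shown language-agnostically. This is a hedged sketch (the scheme list and function name are illustrative, not NiFi's actual Java validator): extract the scheme before `://` and accept it only if it appears in a known set.

```cpp
#include <algorithm>
#include <array>
#include <string_view>

// Illustrative allow-list of JNDI provider URL schemes, modeled on the
// ActiveMQ-based defaults mentioned in the review thread.
constexpr std::array<std::string_view, 4> kAllowedSchemes{"jgroups", "tcp", "udp", "vm"};

// Accept a provider URL only when its scheme is on the allow-list.
inline bool isProviderUrlAllowed(std::string_view url) {
  const auto pos = url.find("://");
  if (pos == std::string_view::npos) {
    return false;  // no scheme separator: reject outright
  }
  const std::string_view scheme = url.substr(0, pos);
  return std::find(kAllowedSchemes.begin(), kAllowedSchemes.end(), scheme)
      != kAllowedSchemes.end();
}
```

With this shape, `tcp://broker:61616` passes while `ldap://...` is rejected; the override mechanism described above would simply replace `kAllowedSchemes` with an administrator-supplied set.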
[GitHub] [nifi-minifi-cpp] martinzink commented on a diff in pull request #1572: MINIFICPP-2027 Upgrade Google Cloud library to version 2.10.1
martinzink commented on code in PR #1572: URL: https://github.com/apache/nifi-minifi-cpp/pull/1572#discussion_r1211837353 ## extensions/gcp/tests/GCPCredentialsControllerServiceTests.cpp: ## @@ -79,7 +95,7 @@ class GCPCredentialsTests : public ::testing::Test { }; TEST_F(GCPCredentialsTests, DefaultGCPCredentialsWithoutEnv) { - google::cloud::internal::UnsetEnv("GOOGLE_APPLICATION_CREDENTIALS"); + unsetGoogleEnvironmentCredentials(); Review Comment: ```suggestion org::apache::nifi::minifi::utils::Environment::unsetEnvironmentVariable("GOOGLE_APPLICATION_CREDENTIALS"); ``` ## extensions/gcp/tests/GCPCredentialsControllerServiceTests.cpp: ## @@ -89,7 +105,7 @@ TEST_F(GCPCredentialsTests, DefaultGCPCredentialsWithEnv) { auto temp_directory = test_controller_.createTempDirectory(); auto path = create_mock_json_file(temp_directory); ASSERT_TRUE(path.has_value()); - google::cloud::internal::SetEnv("GOOGLE_APPLICATION_CREDENTIALS", path->string()); + setGoogleEnvironmentCredentials(path->string().c_str()); Review Comment: ```suggestion org::apache::nifi::minifi::utils::Environment::setEnvironmentVariable("GOOGLE_APPLICATION_CREDENTIALS", path->c_str()); ``` ## extensions/gcp/tests/GCPCredentialsControllerServiceTests.cpp: ## @@ -63,6 +63,22 @@ std::optional create_mock_json_file(const std::filesystem p.close(); return path; } + +void setGoogleEnvironmentCredentials(char const* value) { +#ifdef WIN32 +_putenv_s("GOOGLE_APPLICATION_CREDENTIALS", value); +#else + setenv("GOOGLE_APPLICATION_CREDENTIALS", value, 1); +#endif +} + +void unsetGoogleEnvironmentCredentials() { +#ifdef _WIN32 + (void)_putenv_s("GOOGLE_APPLICATION_CREDENTIALS", ""); +#else + unsetenv("GOOGLE_APPLICATION_CREDENTIALS"); +#endif +} Review Comment: ```suggestion ``` -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. 
[GitHub] [nifi-minifi-cpp] szaszm commented on a diff in pull request #1581: MINIFICPP-2125 fix for waking up prematurely after processor yields
szaszm commented on code in PR #1581: URL: https://github.com/apache/nifi-minifi-cpp/pull/1581#discussion_r1211836375 ## libminifi/test/unit/ThreadPoolTests.cpp: ## @@ -84,7 +81,33 @@ TEST_CASE("ThreadPoolTest2", "[TPT2]") { utils::Worker functor(f_ex, "id", std::move(after_execute)); pool.start(); std::future fut; - pool.execute(std::move(functor), fut); // NOLINT(bugprone-use-after-move) + pool.execute(std::move(functor), fut); fut.wait(); REQUIRE(20 == fut.get()); } + +TEST_CASE("Worker wait time should be relative to the last run") { + std::mutex cv_m; + std::condition_variable cv; Review Comment: You could use `std::promise` or `std::atomic_flag` (like #1576) for simpler code.
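The reviewer's suggestion can be sketched as follows (an illustration, not the minifi-cpp test itself): signaling completion with a `std::promise`/`std::future` pair avoids the spurious-wakeup and lost-notify pitfalls of a raw mutex plus condition variable.

```cpp
#include <chrono>
#include <future>
#include <thread>

// Run a worker thread and wait for its completion via a promise/future
// pair instead of a mutex + condition_variable.
inline bool runWorkerAndWait() {
  std::promise<void> done;
  std::future<void> done_future = done.get_future();
  std::thread worker([&done] {
    done.set_value();  // the task body would run here, then signal completion
  });
  // A generous timeout keeps a broken test from hanging forever.
  const auto status = done_future.wait_for(std::chrono::seconds(5));
  worker.join();
  return status == std::future_status::ready;
}
```

The waiting side needs no lock and no predicate re-check; the future becomes ready exactly once, when `set_value()` is called.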
[jira] [Updated] (NIFI-11595) StateProvider.replace() cannot create the initial state
[ https://issues.apache.org/jira/browse/NIFI-11595?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Peter Turcsanyi updated NIFI-11595: --- Description: The {{StateProvider.replace()}} method works properly when the state already exists and is persisted in the storage. However, it cannot create the state at first run. As a workaround, {{setState()}} needs to be used, but it does not provide the same compare-and-swap mechanism as {{replace()}}, so it is the caller's responsibility to handle concurrency. To lift this responsibility from the clients and to provide a more consistent API, {{replace()}} should support creating the initial state: it should be able to move the state "from nothing to X", not only "from X1 to X2", while providing the same compare-and-swap logic. Affected {{StateProvider}} implementations: - {{ZooKeeperStateProvider}} - {{RedisStateProvider}} - {{KubernetesConfigMapStateProvider}} - {{WriteAheadLocalStateProvider}} was:ZooKeeperStateProvider.replace() method works properly when the state already exists and is persisted in Zookeeper. However, it cannot create the state (ZK node) at first run. > StateProvider.replace() cannot create the initial state > --- > > Key: NIFI-11595 > URL: https://issues.apache.org/jira/browse/NIFI-11595 > Project: Apache NiFi > Issue Type: Bug > Components: Core Framework >Reporter: Peter Turcsanyi >Assignee: Peter Turcsanyi >Priority: Major > > The {{StateProvider.replace()}} method works properly when the state already > exists and is persisted in the storage. However, it cannot create the state at > first run. > As a workaround, {{setState()}} needs to be used, but it does not provide the > same compare-and-swap mechanism as {{replace()}}, so it is the caller's > responsibility to handle concurrency. > To lift this responsibility from the clients and to provide a more > consistent API, {{replace()}} should support creating the initial state: it > should be able to move the state "from nothing to X", not only "from X1 to X2", > while providing the same compare-and-swap logic. > Affected {{StateProvider}} implementations: > - {{ZooKeeperStateProvider}} > - {{RedisStateProvider}} > - {{KubernetesConfigMapStateProvider}} > - {{WriteAheadLocalStateProvider}}
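The compare-and-swap semantics the ticket asks for, including the "from nothing to X" case, can be sketched with `std::optional` modeling an absent initial state. This is an illustration of the technique, not NiFi's Java StateProvider API.

```cpp
#include <optional>
#include <string>

// Replace `stored` with `desired` only if it currently equals `expected`.
// An empty optional models "state does not exist yet", so the same call
// covers both "from nothing to X" and "from X1 to X2".
inline bool replaceState(std::optional<std::string>& stored,
                         const std::optional<std::string>& expected,
                         const std::string& desired) {
  if (stored != expected) {
    return false;  // a concurrent writer won; the caller re-reads and retries
  }
  stored = desired;  // swap succeeds, creating the state when it was absent
  return true;
}
```

With this contract, callers never need a separate `setState()` path for first-run initialization: passing an empty `expected` performs the initial creation under the same compare-and-swap guarantee.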
[GitHub] [nifi-minifi-cpp] adamdebreceni opened a new pull request, #1582: MINIFICPP-2128 - Verify thirdparty test utility hash
adamdebreceni opened a new pull request, #1582: URL: https://github.com/apache/nifi-minifi-cpp/pull/1582 Thank you for submitting a contribution to Apache NiFi - MiNiFi C++. In order to streamline the review of the contribution we ask you to ensure the following steps have been taken: ### For all changes: - [ ] Is there a JIRA ticket associated with this PR? Is it referenced in the commit message? - [ ] Does your PR title start with MINIFICPP- where is the JIRA number you are trying to resolve? Pay particular attention to the hyphen "-" character. - [ ] Has your PR been rebased against the latest commit within the target branch (typically main)? - [ ] Is your initial contribution a single, squashed commit? ### For code changes: - [ ] If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under [ASF 2.0](http://www.apache.org/legal/resolved.html#category-a)? - [ ] If applicable, have you updated the LICENSE file? - [ ] If applicable, have you updated the NOTICE file? ### For documentation related changes: - [ ] Have you ensured that format looks appropriate for the output in which it is rendered? ### Note: Please ensure that once the PR is submitted, you check GitHub Actions CI results for build issues and submit an update to your PR as soon as possible. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[GitHub] [nifi-minifi-cpp] martinzink commented on a diff in pull request #1581: MINIFICPP-2125 fix for waking up prematurely after processor yields
martinzink commented on code in PR #1581: URL: https://github.com/apache/nifi-minifi-cpp/pull/1581#discussion_r1211807969 ## libminifi/src/utils/ThreadPool.cpp: ## @@ -151,10 +153,10 @@ void ThreadPool::manageWorkers() { if (nullptr != thread_manager_) { while (running_) { - auto waitperiod = std::chrono::milliseconds(500); + auto wait_period = 500ms; { -std::unique_lock lock(manager_mutex_, std::try_to_lock); -if (!lock.owns_lock()) { +std::unique_lock manager_lock(manager_mutex_, std::try_to_lock); +if (!manager_lock.owns_lock()) { Review Comment: I think you are right. Since we couldn't find anything indicating this was a conscious choice 3 years ago, I think it's safe to modify it so that it retries the lock after this 10 ms. https://github.com/apache/nifi-minifi-cpp/pull/1581/commits/4f5a1c9f14b7e49eeed18d2dfcb0582ef35e8186#diff-7bc3087dc395ad862c065930e775dfbaf79006c55df75e6a4c055b03cd4586b6R159-R160
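The retry-after-failed-`try_to_lock` pattern agreed on above can be sketched like this (a simplified model, not the ThreadPool code; the attempt bound and 10 ms sleep are illustrative):

```cpp
#include <chrono>
#include <mutex>
#include <thread>

// When try_to_lock fails on a contended mutex, sleep briefly and retry
// instead of skipping the whole management pass. Bounded attempts keep
// this sketch from spinning forever.
inline bool runManagementPass(std::mutex& manager_mutex) {
  for (int attempt = 0; attempt < 100; ++attempt) {
    std::unique_lock<std::mutex> lock(manager_mutex, std::try_to_lock);
    if (!lock.owns_lock()) {
      std::this_thread::sleep_for(std::chrono::milliseconds(10));
      continue;  // retry the lock after a short wait
    }
    return true;  // lock held: the worker-management logic would run here
  }
  return false;  // could not acquire the lock within the attempt budget
}
```

The key difference from the original code is the `continue`: a single failed `try_to_lock` no longer causes the loop iteration's work to be dropped.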
[GitHub] [nifi] dan-s1 commented on pull request #7194: NIFI-11167 - Add Excel Record Reader
dan-s1 commented on PR #7194: URL: https://github.com/apache/nifi/pull/7194#issuecomment-1570338648 @exceptionfactory Can you please restart the Windows Zulu JDK 11 FR? It had timed out yesterday. Thanks!
[GitHub] [nifi-minifi-cpp] martinzink commented on a diff in pull request #1581: MINIFICPP-2125 fix for waking up prematurely after processor yields
martinzink commented on code in PR #1581: URL: https://github.com/apache/nifi-minifi-cpp/pull/1581#discussion_r1211802167 ## libminifi/src/core/Processor.cpp: ## @@ -372,26 +372,26 @@ void Processor::setMaxConcurrentTasks(const uint8_t tasks) { } void Processor::yield() { - yield_expiration_ = std::chrono::system_clock::now() + yield_period_msec_.load(); + yield_expiration_ = std::chrono::steady_clock::now() + yield_period_msec_.load(); } void Processor::yield(std::chrono::milliseconds delta_time) { - yield_expiration_ = std::chrono::system_clock::now() + delta_time; + yield_expiration_ = std::chrono::steady_clock::now() + delta_time; } bool Processor::isYield() { - return yield_expiration_.load() >= std::chrono::system_clock::now(); + return getYieldTime() > 0ms; } void Processor::clearYield() { - yield_expiration_ = std::chrono::system_clock::time_point(); + yield_expiration_ = std::chrono::steady_clock::time_point(); } std::chrono::milliseconds Processor::getYieldTime() const { auto yield_expiration = yield_expiration_.load(); - auto current_time = std::chrono::system_clock::now(); + auto current_time = std::chrono::steady_clock::now(); Review Comment: AFAIK we don't serialize these anywhere
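The clock switch in the diff above has a concrete rationale worth spelling out: `system_clock` can jump backwards or forwards (NTP sync, manual time changes), which would make a yield expire too early or too late, while `steady_clock` is monotonic. A minimal sketch of the expiration bookkeeping (names mirror the Processor code only loosely):

```cpp
#include <chrono>

// Monotonic clock for yield bookkeeping: intervals measured against it
// are immune to wall-clock adjustments.
using YieldClock = std::chrono::steady_clock;

// Remaining yield time, clamped at zero once the expiration has passed.
inline std::chrono::milliseconds remainingYield(YieldClock::time_point yield_expiration) {
  const auto now = YieldClock::now();
  if (yield_expiration < now) {
    return std::chrono::milliseconds(0);  // yield already expired
  }
  return std::chrono::duration_cast<std::chrono::milliseconds>(yield_expiration - now);
}
```

The reviewer's note matters because `steady_clock` time points are only meaningful within one process run: they must never be serialized, which the thread confirms is the case here.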
[GitHub] [nifi-minifi-cpp] martinzink commented on a diff in pull request #1581: MINIFICPP-2125 fix for waking up prematurely after processor yields
martinzink commented on code in PR #1581: URL: https://github.com/apache/nifi-minifi-cpp/pull/1581#discussion_r1211800505 ## libminifi/include/utils/ThreadPool.h: ## @@ -94,7 +94,7 @@ class Worker { promise->set_value(result); return false; } -next_exec_time_ = std::max(next_exec_time_ + run_determinant_->wait_time(), std::chrono::steady_clock::now()); +next_exec_time_ = std::max(next_exec_time_, std::chrono::steady_clock::now() + run_determinant_->wait_time()); Review Comment: You are right, but I think this should be TimerDrivenSchedulingAgent's responsibility. I've changed it in https://github.com/apache/nifi-minifi-cpp/pull/1581/commits/64abd78c460099378682ac8bef8a734ec44e70a3#diff-dad78983913a0ec9fc95090e0322f6535e7d690b3ad5a14d47ea27c89006ccc3R36-R37
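The formula change in the diff above can be illustrated with a simplified model that uses plain millisecond counts as time points (an illustration only, not the ThreadPool code). After a task finishes at `now`, the next run should be `wait` after the *last run*, not `wait` after a stale previously scheduled slot.

```cpp
#include <algorithm>
#include <chrono>

using Ms = std::chrono::milliseconds;

// Old formula: drifts relative to the previously scheduled slot; after a
// long-running task it can fire again immediately.
inline Ms oldNextExecTime(Ms next, Ms now, Ms wait) { return std::max(next + wait, now); }

// New formula: waits relative to the time the last run actually finished.
inline Ms newNextExecTime(Ms next, Ms now, Ms wait) { return std::max(next, now + wait); }
```

With a stale slot `next = 200`, `now = 1000`, and `wait = 500`, the old formula yields 1000 (immediate re-run, the premature wake-up this PR fixes), while the new one yields 1500.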
[GitHub] [nifi-minifi-cpp] martinzink commented on a diff in pull request #1581: MINIFICPP-2125 fix for waking up prematurely after processor yields
martinzink commented on code in PR #1581: URL: https://github.com/apache/nifi-minifi-cpp/pull/1581#discussion_r1211801649 ## libminifi/test/unit/ThreadPoolTests.cpp: ## Review Comment: Replaced these with lambdas in https://github.com/apache/nifi-minifi-cpp/pull/1581/commits/64abd78c460099378682ac8bef8a734ec44e70a3#diff-c5171c6414a98ddf1b6b7a2b62956d80c96251c29bbec9278c478cf8fc9cbb9bR57
[GitHub] [nifi] turcsanyip commented on a diff in pull request #7313: NIFI-11614 Improve Validation for JndiJmsConnectionFactoryProvider
turcsanyip commented on code in PR #7313: URL: https://github.com/apache/nifi/pull/7313#discussion_r1211762688 ## nifi-nar-bundles/nifi-jms-bundle/nifi-jms-processors/src/main/java/org/apache/nifi/jms/cf/JndiJmsConnectionFactoryProperties.java: ## @@ -114,4 +119,67 @@ public static PropertyDescriptor getDynamicPropertyDescriptor(final String prope .build(); } +private static class JndiJmsContextFactoryValidator implements Validator { +private static final String DISALLOWED_CONTEXT_FACTORY = "LdapCtxFactory"; + +@Override +public ValidationResult validate(final String subject, final String input, final ValidationContext context) { +final ValidationResult.Builder builder = new ValidationResult.Builder().subject(subject).input(input); + +if (input == null || input.isEmpty()) { +builder.valid(false); +builder.explanation("Context Factory is required"); +} else if (input.endsWith(DISALLOWED_CONTEXT_FACTORY)) { +builder.valid(false); +builder.explanation(String.format("Context Factory [%s] not allowed", DISALLOWED_CONTEXT_FACTORY)); +} else { +builder.valid(true); +builder.explanation("Context Factory allowed"); +} + +return builder.build(); +} +} + +private static class JndiJmsProviderUrlValidator implements Validator { +/** JNDI JMS URL Allowed Schemes based on ActiveMQ Connection Factory */ +private static final Set<String> ALLOWED_SCHEMES = Collections.unmodifiableSet(new LinkedHashSet<>(Arrays.asList( +"jgroups", +"tcp", +"udp", +"vm" +))); Review Comment: @exceptionfactory I have concerns about whether we can enumerate all supported schemes, because they are vendor-specific. E.g. WebLogic uses `t3(s)://`. Basically, it can be an arbitrary name. I also understand that `ldap://` is problematic, but WebSphere MQ relies on it when using JNDI (or the local `file://` can be used). Do you have any idea how to handle these shortcomings while keeping the validation logic?
[GitHub] [nifi] steven-matison commented on a diff in pull request #7316: NIFI-11608 Fixing Expression Language Evaluation in dataset and tablename.
steven-matison commented on code in PR #7316: URL: https://github.com/apache/nifi/pull/7316#discussion_r1211726531 ## nifi-nar-bundles/nifi-gcp-bundle/nifi-gcp-processors/src/main/java/org/apache/nifi/processors/gcp/bigquery/PutBigQuery.java: ## @@ -203,20 +201,24 @@ public void onUnscheduled() { public void onTrigger(ProcessContext context, ProcessSession session) { WriteStream writeStream; Descriptors.Descriptor protoDescriptor; + +FlowFile flowFile = session.get(); +if (flowFile == null) { +return; +} + +final TableName tableName = TableName.of(context.getProperty(PROJECT_ID).getValue(), context.getProperty(DATASET).evaluateAttributeExpressions(flowFile).getValue(), context.getProperty(TABLE_NAME).evaluateAttributeExpressions(flowFile).getValue()); Review Comment: Done
[jira] [Assigned] (NIFI-11617) Update jackson.bom.version to 2.15.2
[ https://issues.apache.org/jira/browse/NIFI-11617?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Mike R reassigned NIFI-11617: - Assignee: Mike R > Update jackson.bom.version to 2.15.2 > > > Key: NIFI-11617 > URL: https://issues.apache.org/jira/browse/NIFI-11617 > Project: Apache NiFi > Issue Type: Improvement >Affects Versions: 1.21.0 >Reporter: Mike R >Assignee: Mike R >Priority: Major > Time Spent: 10m > Remaining Estimate: 0h > > [Jackson Release 2.15.2 · FasterXML/jackson Wiki > (github.com)|https://github.com/FasterXML/jackson/wiki/Jackson-Release-2.15.2] > Update jackson.bom.version to 2.15.2
[GitHub] [nifi] mr1716 opened a new pull request, #7317: NIFI-11617 Update jackson.bom to 2.15.2
mr1716 opened a new pull request, #7317: URL: https://github.com/apache/nifi/pull/7317 # Summary [NIFI-11617](https://issues.apache.org/jira/browse/NIFI-11617) # Tracking Please complete the following tracking steps prior to pull request creation. ### Issue Tracking - [X] [Apache NiFi Jira](https://issues.apache.org/jira/browse/NIFI-11617) issue created ### Pull Request Tracking - [X] Pull Request title starts with Apache NiFi Jira issue number, such as `NIFI-0` - [X] Pull Request commit message starts with Apache NiFi Jira issue number, as such `NIFI-0` ### Pull Request Formatting - [X] Pull Request based on current revision of the `main` branch - [X] Pull Request refers to a feature branch with one commit containing changes # Verification Please indicate the verification steps performed prior to pull request creation. ### Build - [ ] Build completed using `mvn clean install -P contrib-check` - [ ] JDK 11 - [ ] JDK 17 ### Licensing - [ ] New dependencies are compatible with the [Apache License 2.0](https://apache.org/licenses/LICENSE-2.0) according to the [License Policy](https://www.apache.org/legal/resolved.html) - [ ] New dependencies are documented in applicable `LICENSE` and `NOTICE` files ### Documentation - [ ] Documentation formatting appears as expected in rendered files -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[jira] [Created] (NIFI-11617) Update jackson.bom.version to 2.15.2
Mike R created NIFI-11617: - Summary: Update jackson.bom.version to 2.15.2 Key: NIFI-11617 URL: https://issues.apache.org/jira/browse/NIFI-11617 Project: Apache NiFi Issue Type: Improvement Affects Versions: 1.21.0 Reporter: Mike R [Jackson Release 2.15.2 · FasterXML/jackson Wiki (github.com)|https://github.com/FasterXML/jackson/wiki/Jackson-Release-2.15.2] Update jackson.bom.version to 2.15.2
[jira] [Commented] (NIFI-11604) AbstractKerberosUser should check if ticket is renewable
[ https://issues.apache.org/jira/browse/NIFI-11604?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17727956#comment-17727956 ] ASF subversion and git services commented on NIFI-11604: Commit 5f5bf48d7429b2123c00f99834b14cac2d40e3f6 in nifi's branch refs/heads/support/nifi-1.x from Bryan Bende [ https://gitbox.apache.org/repos/asf?p=nifi.git;h=5f5bf48d74 ] NIFI-11604 Improve handling of non-renewable tickets in AbstractKerberosUser Signed-off-by: Pierre Villard This closes #7301. > AbstractKerberosUser should check if ticket is renewable > > > Key: NIFI-11604 > URL: https://issues.apache.org/jira/browse/NIFI-11604 > Project: Apache NiFi > Issue Type: Improvement >Reporter: Bryan Bende >Assignee: Bryan Bende >Priority: Major > Time Spent: 20m > Remaining Estimate: 0h > > Currently the checkTgtAndRelogin method assumes all tickets are renewable and > will attempt to call refresh() on the KerberosTicket when reaching > 80% of > the ticket lifetime. If the ticket is not renewable the refresh will fail and > it will do a full logout/login, same as if the ticket was no longer > renewable, but it could be handled more gracefully by not even calling > refresh when knowing it won't work. -- This message was sent by Atlassian Jira (v8.20.10#820010)
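The 80%-of-lifetime heuristic described in the ticket can be sketched as a small predicate. This is an illustration of the logic only; the parameter types are not the JAAS `KerberosTicket` API.

```cpp
#include <chrono>

// Attempt a refresh only when the ticket is renewable AND at least 80% of
// its lifetime has elapsed; non-renewable tickets skip refresh() entirely
// and go straight to a full logout/login, as the improvement describes.
inline bool shouldRefreshTicket(std::chrono::seconds elapsed,
                                std::chrono::seconds lifetime,
                                bool renewable) {
  return renewable && elapsed >= (lifetime * 8) / 10;
}
```

For example, at 81 seconds into a 100-second renewable ticket the refresh fires; with the same timing but a non-renewable ticket it does not, which is exactly the doomed `refresh()` call the change avoids.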
[jira] [Updated] (NIFI-11604) AbstractKerberosUser should check if ticket is renewable
[ https://issues.apache.org/jira/browse/NIFI-11604?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Pierre Villard updated NIFI-11604: -- Fix Version/s: 2.0.0 1.22.0 Resolution: Fixed Status: Resolved (was: Patch Available) > AbstractKerberosUser should check if ticket is renewable > > > Key: NIFI-11604 > URL: https://issues.apache.org/jira/browse/NIFI-11604 > Project: Apache NiFi > Issue Type: Improvement >Reporter: Bryan Bende >Assignee: Bryan Bende >Priority: Major > Fix For: 2.0.0, 1.22.0 > > Time Spent: 20m > Remaining Estimate: 0h > > Currently the checkTgtAndRelogin method assumes all tickets are renewable and > will attempt to call refresh() on the KerberosTicket when reaching > 80% of > the ticket lifetime. If the ticket is not renewable the refresh will fail and > it will do a full logout/login, same as if the ticket was no longer > renewable, but it could be handled more gracefully by not even calling > refresh when knowing it won't work. -- This message was sent by Atlassian Jira (v8.20.10#820010)