[jira] [Commented] (NIFI-4092) ClassCastException Warning during cluster sync
[ https://issues.apache.org/jira/browse/NIFI-4092?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16235075#comment-16235075 ] Tijo Thomas commented on NIFI-4092: --- I am also getting the same error. It is happening in one of the NiFi clusters.
```
java.io.IOException: org.apache.nifi.controller.serialization.FlowSerializationException: java.lang.ClassCastException: org.apache.nifi.web.api.dto.TemplateDTO$JaxbAccessorM_getDescription_setDescription_java_lang_String cannot be cast to com.sun.xml.internal.bind.v2.runtime.reflect.Accessor
    at org.apache.nifi.persistence.StandardXMLFlowConfigurationDAO.save(StandardXMLFlowConfigurationDAO.java:143)
    at org.apache.nifi.controller.StandardFlowService.createDataFlowFromController(StandardFlowService.java:607)
    at org.apache.nifi.controller.StandardFlowService.createDataFlowFromController(StandardFlowService.java:100)
    at org.apache.nifi.cluster.coordination.node.NodeClusterCoordinator$2.run(NodeClusterCoordinator.java:706)
    at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.nifi.controller.serialization.FlowSerializationException: java.lang.ClassCastException: org.apache.nifi.web.api.dto.TemplateDTO$JaxbAccessorM_getDescription_setDescription_java_lang_String cannot be cast to com.sun.xml.internal.bind.v2.runtime.reflect.Accessor
    at org.apache.nifi.controller.serialization.StandardFlowSerializer.addTemplate(StandardFlowSerializer.java:546)
    at org.apache.nifi.controller.serialization.StandardFlowSerializer.addProcessGroup(StandardFlowSerializer.java:203)
    at org.apache.nifi.controller.serialization.StandardFlowSerializer.addProcessGroup(StandardFlowSerializer.java:187)
    at org.apache.nifi.controller.serialization.StandardFlowSerializer.addProcessGroup(StandardFlowSerializer.java:187)
    at org.apache.nifi.controller.serialization.StandardFlowSerializer.addProcessGroup(StandardFlowSerializer.java:187)
    at org.apache.nifi.controller.serialization.StandardFlowSerializer.addProcessGroup(StandardFlowSerializer.java:187)
    at org.apache.nifi.controller.serialization.StandardFlowSerializer.serialize(StandardFlowSerializer.java:97)
    at org.apache.nifi.controller.FlowController.serialize(FlowController.java:1554)
    at org.apache.nifi.persistence.StandardXMLFlowConfigurationDAO.save(StandardXMLFlowConfigurationDAO.java:141)
    ... 4 common frames omitted
Caused by: java.lang.ClassCastException: org.apache.nifi.web.api.dto.TemplateDTO$JaxbAccessorM_getDescription_setDescription_java_lang_String cannot be cast to com.sun.xml.internal.bind.v2.runtime.reflect.Accessor
    at com.sun.xml.internal.bind.v2.runtime.reflect.opt.OptimizedAccessorFactory.instanciate(OptimizedAccessorFactory.java:190)
    at com.sun.xml.internal.bind.v2.runtime.reflect.opt.OptimizedAccessorFactory.get(OptimizedAccessorFactory.java:129)
    at com.sun.xml.internal.bind.v2.runtime.reflect.Accessor$GetterSetterReflection.optimize(Accessor.java:388)
    at com.sun.xml.internal.bind.v2.runtime.property.SingleElementLeafProperty.<init>(SingleElementLeafProperty.java:77)
    at sun.reflect.GeneratedConstructorAccessor909.newInstance(Unknown Source)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at com.sun.xml.internal.bind.v2.runtime.property.PropertyFactory.create(PropertyFactory.java:113)
    at com.sun.xml.internal.bind.v2.runtime.ClassBeanInfoImpl.<init>(ClassBeanInfoImpl.java:166)
    at com.sun.xml.internal.bind.v2.runtime.JAXBContextImpl.getOrCreate(JAXBContextImpl.java:488)
    at com.sun.xml.internal.bind.v2.runtime.JAXBContextImpl.<init>(JAXBContextImpl.java:305)
    at com.sun.xml.internal.bind.v2.runtime.JAXBContextImpl.<init>(JAXBContextImpl.java:124)
    at com.sun.xml.internal.bind.v2.runtime.JAXBContextImpl$JAXBContextBuilder.build(JAXBContextImpl.java:1123)
    at com.sun.xml.internal.bind.v2.ContextFactory.createContext(ContextFactory.java:147)
    at sun.reflect.GeneratedMethodAccessor1654.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at javax.xml.bind.ContextFinder.newInstance(ContextFinder.java:247)
    at javax.xml.bind.ContextFinder.newInstance(ContextFinder.java:234)
    at javax.xml.bind.ContextFinder.find(ContextFinder.java:462)
    at javax.xml.bind.JAXBContext.newInstance(JAXBContext.java:641)
    at javax.xml.bind.JAXBContext.newInstance(JAXBContext.java:584)
    at org.apache.nifi.persistence.TemplateSerializer.serialize(TemplateSerializer.java:47)
    at org.apache.nifi.controller.serialization.StandardFlowSerial
```
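For context on the failure above: the cast fails inside the JAXB runtime's bytecode-optimized accessor generation while TemplateSerializer builds a JAXBContext for TemplateDTO. The sketch below is not NiFi's code or fix; it only reproduces the serialization path and shows the commonly cited mitigation for this family of "cannot be cast to ...reflect.Accessor" errors, assuming the JDK-bundled JAXB RI (verify the property name against the JAXB runtime actually on the classpath).

```groovy
// Minimal sketch (assumption: not NiFi's actual implementation) of marshalling a TemplateDTO,
// with the JAXB RI's bytecode-optimized accessors disabled so it falls back to plain reflection.
import javax.xml.bind.JAXBContext
import javax.xml.bind.JAXBElement
import javax.xml.bind.Marshaller
import javax.xml.namespace.QName
import org.apache.nifi.web.api.dto.TemplateDTO

// Property name for the JDK-bundled (com.sun.xml.internal.*) JAXB RI; the standalone RI
// uses the same name without ".internal". Must be set before the JAXBContext is created.
System.setProperty('com.sun.xml.internal.bind.v2.bytecode.ClassTailor.noOptimize', 'true')

def dto = new TemplateDTO(name: 'example', description: 'example template')
def ctx = JAXBContext.newInstance(TemplateDTO)
Marshaller marshaller = ctx.createMarshaller()
marshaller.setProperty(Marshaller.JAXB_FORMATTED_OUTPUT, true)

// Wrap in a JAXBElement so the marshal call does not depend on an @XmlRootElement annotation
marshaller.marshal(new JAXBElement<TemplateDTO>(new QName('template'), TemplateDTO, dto), System.out)
```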
[jira] [Commented] (NIFI-3688) Create extended groovy scripting processor
[ https://issues.apache.org/jira/browse/NIFI-3688?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16234971#comment-16234971 ] ASF GitHub Bot commented on NIFI-3688: -- Github user mattyb149 commented on a diff in the pull request: https://github.com/apache/nifi/pull/1662#discussion_r148413690 --- Diff: nifi-nar-bundles/nifi-groovyx-bundle/nifi-groovyx-processors/src/main/java/org/apache/nifi/processors/groovyx/ExecuteGroovyScript.java --- @@ -0,0 +1,453 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one or more + * contributor license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright ownership. + * The ASF licenses this file to You under the Apache License, Version 2.0 + * (the "License"); you may not use this file except in compliance with + * the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + */ +package org.apache.nifi.processors.groovyx; + +import java.io.File; +import java.lang.reflect.Method; +import java.sql.SQLException; +import java.util.ArrayList; +import java.util.Collections; +import java.util.Collection; +import java.util.HashMap; +import java.util.HashSet; +import java.util.List; +import java.util.Map; +import java.util.Set; + +import org.apache.nifi.annotation.behavior.Restricted; +import org.apache.nifi.annotation.behavior.DynamicProperty; +import org.apache.nifi.annotation.behavior.EventDriven; +import org.apache.nifi.annotation.behavior.InputRequirement; +import org.apache.nifi.annotation.documentation.CapabilityDescription; +import org.apache.nifi.annotation.documentation.SeeAlso; +import org.apache.nifi.annotation.documentation.Tags; +import org.apache.nifi.annotation.lifecycle.OnScheduled; +import org.apache.nifi.annotation.lifecycle.OnStopped; +import org.apache.nifi.components.PropertyDescriptor; +import org.apache.nifi.controller.ControllerService; +import org.apache.nifi.dbcp.DBCPService; +import org.apache.nifi.flowfile.FlowFile; --- End diff -- Checkstyle (via the -Pcontrib-check Maven profile) says this is unused, so it should be removed > Create extended groovy scripting processor > -- > > Key: NIFI-3688 > URL: https://issues.apache.org/jira/browse/NIFI-3688 > Project: Apache NiFi > Issue Type: New Feature > Components: Extensions >Reporter: Dmitry Lukyanov >Priority: Minor > > The idea is to simplify groovy scripting. > Main targets: > - to be compatible with existing groovy scripting > - simplify read/write attributes > - simplify read/write content > - avoid closure casting to nifi types like `StreamCallback` > - simplify and provide visibility when accessing to controller services from > script -- This message was sent by Atlassian JIRA (v6.4.14#64029)
[GitHub] nifi pull request #1662: NIFI-3688 Extended Groovy Nifi Processor
Github user mattyb149 commented on a diff in the pull request: https://github.com/apache/nifi/pull/1662#discussion_r148413690 --- Diff: nifi-nar-bundles/nifi-groovyx-bundle/nifi-groovyx-processors/src/main/java/org/apache/nifi/processors/groovyx/ExecuteGroovyScript.java --- @@ -0,0 +1,453 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one or more + * contributor license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright ownership. + * The ASF licenses this file to You under the Apache License, Version 2.0 + * (the "License"); you may not use this file except in compliance with + * the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + */ +package org.apache.nifi.processors.groovyx; + +import java.io.File; +import java.lang.reflect.Method; +import java.sql.SQLException; +import java.util.ArrayList; +import java.util.Collections; +import java.util.Collection; +import java.util.HashMap; +import java.util.HashSet; +import java.util.List; +import java.util.Map; +import java.util.Set; + +import org.apache.nifi.annotation.behavior.Restricted; +import org.apache.nifi.annotation.behavior.DynamicProperty; +import org.apache.nifi.annotation.behavior.EventDriven; +import org.apache.nifi.annotation.behavior.InputRequirement; +import org.apache.nifi.annotation.documentation.CapabilityDescription; +import org.apache.nifi.annotation.documentation.SeeAlso; +import org.apache.nifi.annotation.documentation.Tags; +import org.apache.nifi.annotation.lifecycle.OnScheduled; +import org.apache.nifi.annotation.lifecycle.OnStopped; +import org.apache.nifi.components.PropertyDescriptor; +import org.apache.nifi.controller.ControllerService; +import org.apache.nifi.dbcp.DBCPService; +import org.apache.nifi.flowfile.FlowFile; --- End diff -- Checkstyle (via the -Pcontrib-check Maven profile) says this is unused, so it should be removed ---
[jira] [Commented] (NIFI-3688) Create extended groovy scripting processor
[ https://issues.apache.org/jira/browse/NIFI-3688?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16234961#comment-16234961 ] ASF GitHub Bot commented on NIFI-3688: -- Github user mattyb149 commented on a diff in the pull request: https://github.com/apache/nifi/pull/1662#discussion_r148412300 --- Diff: nifi-nar-bundles/nifi-groovyx-bundle/nifi-groovyx-processors/src/main/resources/docs/org.apache.nifi.processors.groovyx.ExecuteGroovyScript/additionalDetails.html --- @@ -0,0 +1,202 @@ + + + + + +Groovy + + + + + +Summary +This is a grooviest groovy script :) +Script Bindings: + +variabletypedescription + + session + org.apache.nifi.processor.ProcessSession + the session that is used to get, change, and transfer input files + + + context + org.apache.nifi.processor.ProcessContext + the context (almost unusefull) + + + log + org.apache.nifi.logging.ComponentLog + the logger for this processor instance + + + REL_SUCCESS + org.apache.nifi.processor.Relationship + the success relationship + + + REL_FAILURE + org.apache.nifi.processor.Relationship + the failure relationship + + + flowFile + org.apache.nifi.flowfile.FlowFile + Binded only if the property `Require flow file`=true for the processor + + + CTL + java.util.HashMap + Map populated with controller services binded through `CTL.*` processor properties + + + Dynamic processor properties + org.apache.nifi.components.PropertyDescriptor + All processor properties not started with `CTL.` are binded to script variables + + + +CTL map + +CTL.* objects accessible if corresponding processor property defined. +Example: if you defined property `CTL.cache` to DistributedMapCacheClientService, then you can access it from code CTL.cache --- End diff -- Can you explain (at least to me here) more about linking with CTL? Does the name of the service have to be "cache" in this case? Or does there need to be a user-defined property called "CTL.cache" added to the service? If the latter, what if the service itself uses the user-defined properties? Seems like there might be conflict in processing? > Create extended groovy scripting processor > -- > > Key: NIFI-3688 > URL: https://issues.apache.org/jira/browse/NIFI-3688 > Project: Apache NiFi > Issue Type: New Feature > Components: Extensions >Reporter: Dmitry Lukyanov >Priority: Minor > > The idea is to simplify groovy scripting. > Main targets: > - to be compatible with existing groovy scripting > - simplify read/write attributes > - simplify read/write content > - avoid closure casting to nifi types like `StreamCallback` > - simplify and provide visibility when accessing to controller services from > script -- This message was sent by Atlassian JIRA (v6.4.14#64029)
[jira] [Commented] (NIFI-3688) Create extended groovy scripting processor
[ https://issues.apache.org/jira/browse/NIFI-3688?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16234963#comment-16234963 ] ASF GitHub Bot commented on NIFI-3688: -- Github user mattyb149 commented on a diff in the pull request: https://github.com/apache/nifi/pull/1662#discussion_r148412458 --- Diff: nifi-nar-bundles/pom.xml --- @@ -94,7 +95,7 @@ - + > Key: NIFI-3688 > URL: https://issues.apache.org/jira/browse/NIFI-3688 > Project: Apache NiFi > Issue Type: New Feature > Components: Extensions >Reporter: Dmitry Lukyanov >Priority: Minor > > The idea is to simplify groovy scripting. > Main targets: > - to be compatible with existing groovy scripting > - simplify read/write attributes > - simplify read/write content > - avoid closure casting to nifi types like `StreamCallback` > - simplify and provide visibility when accessing to controller services from > script -- This message was sent by Atlassian JIRA (v6.4.14#64029)
[GitHub] nifi pull request #1662: NIFI-3688 Extended Groovy Nifi Processor
Github user mattyb149 commented on a diff in the pull request: https://github.com/apache/nifi/pull/1662#discussion_r148412300 --- Diff: nifi-nar-bundles/nifi-groovyx-bundle/nifi-groovyx-processors/src/main/resources/docs/org.apache.nifi.processors.groovyx.ExecuteGroovyScript/additionalDetails.html --- @@ -0,0 +1,202 @@ + + + + + +Groovy + + + + + +Summary +This is a grooviest groovy script :) +Script Bindings: + +variabletypedescription + + session + org.apache.nifi.processor.ProcessSession + the session that is used to get, change, and transfer input files + + + context + org.apache.nifi.processor.ProcessContext + the context (almost unusefull) + + + log + org.apache.nifi.logging.ComponentLog + the logger for this processor instance + + + REL_SUCCESS + org.apache.nifi.processor.Relationship + the success relationship + + + REL_FAILURE + org.apache.nifi.processor.Relationship + the failure relationship + + + flowFile + org.apache.nifi.flowfile.FlowFile + Binded only if the property `Require flow file`=true for the processor + + + CTL + java.util.HashMap + Map populated with controller services binded through `CTL.*` processor properties + + + Dynamic processor properties + org.apache.nifi.components.PropertyDescriptor + All processor properties not started with `CTL.` are binded to script variables + + + +CTL map + +CTL.* objects accessible if corresponding processor property defined. +Example: if you defined property `CTL.cache` to DistributedMapCacheClientService, then you can access it from code CTL.cache --- End diff -- Can you explain (at least to me here) more about linking with CTL? Does the name of the service have to be "cache" in this case? Or does there need to be a user-defined property called "CTL.cache" added to the service? If the latter, what if the service itself uses the user-defined properties? Seems like there might be conflict in processing? ---
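To make the CTL question above concrete, here is a hypothetical script body for the processor under review, based only on the additionalDetails.html quoted above (not on the PR's test suite). It assumes a dynamic processor property literally named `CTL.cache` whose value is set to a DistributedMapCacheClientService instance; the part after the `CTL.` prefix becomes the key into the CTL map, and the cache key and attribute name below are illustrative only.

```groovy
// Hypothetical ExecuteGroovyScript body (assumption: dynamic property "CTL.cache" references
// a DistributedMapCacheClientService controller service).
import org.apache.nifi.distributed.cache.client.Deserializer
import org.apache.nifi.distributed.cache.client.Serializer
import java.nio.charset.StandardCharsets

def keySerializer = { String key, OutputStream out ->
    out.write(key.getBytes(StandardCharsets.UTF_8))
} as Serializer<String>
def valueDeserializer = { byte[] bytes ->
    bytes == null ? null : new String(bytes, StandardCharsets.UTF_8)
} as Deserializer<String>

def flowFile = session.get()
if (flowFile == null) return

// "cache" is the map key because the dynamic property was named CTL.cache
def cached = CTL.cache.get('last-seen-id', keySerializer, valueDeserializer)
flowFile.cachedValue = cached ?: 'none'   // SessionFile shorthand for writing an attribute
session.transfer(flowFile, REL_SUCCESS)
```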
[GitHub] nifi pull request #1662: NIFI-3688 Extended Groovy Nifi Processor
Github user mattyb149 commented on a diff in the pull request: https://github.com/apache/nifi/pull/1662#discussion_r148412458 --- Diff: nifi-nar-bundles/pom.xml --- @@ -94,7 +95,7 @@ - +
[GitHub] nifi pull request #1662: NIFI-3688 Extended Groovy Nifi Processor
Github user mattyb149 commented on a diff in the pull request: https://github.com/apache/nifi/pull/1662#discussion_r148412024 --- Diff: nifi-nar-bundles/nifi-groovyx-bundle/nifi-groovyx-processors/src/main/resources/docs/org.apache.nifi.processors.groovyx.ExecuteGroovyScript/additionalDetails.html --- @@ -0,0 +1,202 @@ + + + + + +Groovy + + + + + +Summary +This is a grooviest groovy script :) +Script Bindings: + +variabletypedescription + + session + org.apache.nifi.processor.ProcessSession + the session that is used to get, change, and transfer input files + + + context + org.apache.nifi.processor.ProcessContext + the context (almost unusefull) + + + log + org.apache.nifi.logging.ComponentLog + the logger for this processor instance + + + REL_SUCCESS + org.apache.nifi.processor.Relationship + the success relationship + + + REL_FAILURE + org.apache.nifi.processor.Relationship + the failure relationship + + + flowFile + org.apache.nifi.flowfile.FlowFile + Binded only if the property `Require flow file`=true for the processor + + + CTL + java.util.HashMap + Map populated with controller services binded through `CTL.*` processor properties + + + Dynamic processor properties + org.apache.nifi.components.PropertyDescriptor + All processor properties not started with `CTL.` are binded to script variables + + + +CTL map + +CTL.* objects accessible if corresponding processor property defined. +Example: if you defined property `CTL.cache` to DistributedMapCacheClientService, then you can access it from code CTL.cache +If CTL property references to Database connection pool, then corresponding CTL entry will contain groovy.sql.Sql object connected to database with autocommit=false. +CTL - Database transactions automatically rolled back on script exception and committed on success. Script must not disconnect connection. + + + + +SessionFile - flow file extension + + The (org.apache.nifi.processors.groovyx.flow.SessionFile) is an actual object returned by session in Extended Groovy processor. + This flow file is a container that references session and the real flow file. + This allows to use simplified syntax to work with file attributes and content: + +set new attribute value + + flowFile.ATTRIBUTE_NAME = ATTRIBUTE_VALUE --- End diff -- These features are awesome! ---
[jira] [Commented] (NIFI-3688) Create extended groovy scripting processor
[ https://issues.apache.org/jira/browse/NIFI-3688?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16234962#comment-16234962 ] ASF GitHub Bot commented on NIFI-3688: -- Github user mattyb149 commented on a diff in the pull request: https://github.com/apache/nifi/pull/1662#discussion_r148411913 --- Diff: nifi-nar-bundles/nifi-groovyx-bundle/nifi-groovyx-processors/src/main/resources/docs/org.apache.nifi.processors.groovyx.ExecuteGroovyScript/additionalDetails.html --- @@ -0,0 +1,202 @@ + + + + + +Groovy + + + + + +Summary +This is a grooviest groovy script :) +Script Bindings: + +variabletypedescription + + session + org.apache.nifi.processor.ProcessSession + the session that is used to get, change, and transfer input files + + + context + org.apache.nifi.processor.ProcessContext + the context (almost unusefull) + + + log + org.apache.nifi.logging.ComponentLog + the logger for this processor instance + + + REL_SUCCESS + org.apache.nifi.processor.Relationship + the success relationship + + + REL_FAILURE + org.apache.nifi.processor.Relationship + the failure relationship + + + flowFile + org.apache.nifi.flowfile.FlowFile + Binded only if the property `Require flow file`=true for the processor + + + CTL + java.util.HashMap + Map populated with controller services binded through `CTL.*` processor properties + + + Dynamic processor properties + org.apache.nifi.components.PropertyDescriptor + All processor properties not started with `CTL.` are binded to script variables + + + +CTL map + +CTL.* objects accessible if corresponding processor property defined. +Example: if you defined property `CTL.cache` to DistributedMapCacheClientService, then you can access it from code CTL.cache +If CTL property references to Database connection pool, then corresponding CTL entry will contain groovy.sql.Sql object connected to database with autocommit=false. --- End diff -- How is it guaranteed that autocommit will be false? If setAutoCommit is called somewhere (either by your code -- which I couldn't find a reference to -- or by groovy's Sql class), it can cause problems in a couple of scenarios, one being Oracle if the DBA has disallowed changing auto-commit, and another is Hive (since HiveConnectionPool extends DBCPService, it should be available via CTL right?) > Create extended groovy scripting processor > -- > > Key: NIFI-3688 > URL: https://issues.apache.org/jira/browse/NIFI-3688 > Project: Apache NiFi > Issue Type: New Feature > Components: Extensions >Reporter: Dmitry Lukyanov >Priority: Minor > > The idea is to simplify groovy scripting. > Main targets: > - to be compatible with existing groovy scripting > - simplify read/write attributes > - simplify read/write content > - avoid closure casting to nifi types like `StreamCallback` > - simplify and provide visibility when accessing to controller services from > script -- This message was sent by Atlassian JIRA (v6.4.14#64029)
[jira] [Commented] (NIFI-3688) Create extended groovy scripting processor
[ https://issues.apache.org/jira/browse/NIFI-3688?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16234964#comment-16234964 ] ASF GitHub Bot commented on NIFI-3688: -- Github user mattyb149 commented on a diff in the pull request: https://github.com/apache/nifi/pull/1662#discussion_r148412024 --- Diff: nifi-nar-bundles/nifi-groovyx-bundle/nifi-groovyx-processors/src/main/resources/docs/org.apache.nifi.processors.groovyx.ExecuteGroovyScript/additionalDetails.html --- @@ -0,0 +1,202 @@ + + + + + +Groovy + + + + + +Summary +This is a grooviest groovy script :) +Script Bindings: + +variabletypedescription + + session + org.apache.nifi.processor.ProcessSession + the session that is used to get, change, and transfer input files + + + context + org.apache.nifi.processor.ProcessContext + the context (almost unusefull) + + + log + org.apache.nifi.logging.ComponentLog + the logger for this processor instance + + + REL_SUCCESS + org.apache.nifi.processor.Relationship + the success relationship + + + REL_FAILURE + org.apache.nifi.processor.Relationship + the failure relationship + + + flowFile + org.apache.nifi.flowfile.FlowFile + Binded only if the property `Require flow file`=true for the processor + + + CTL + java.util.HashMap + Map populated with controller services binded through `CTL.*` processor properties + + + Dynamic processor properties + org.apache.nifi.components.PropertyDescriptor + All processor properties not started with `CTL.` are binded to script variables + + + +CTL map + +CTL.* objects accessible if corresponding processor property defined. +Example: if you defined property `CTL.cache` to DistributedMapCacheClientService, then you can access it from code CTL.cache +If CTL property references to Database connection pool, then corresponding CTL entry will contain groovy.sql.Sql object connected to database with autocommit=false. +CTL - Database transactions automatically rolled back on script exception and committed on success. Script must not disconnect connection. + + + + +SessionFile - flow file extension + + The (org.apache.nifi.processors.groovyx.flow.SessionFile) is an actual object returned by session in Extended Groovy processor. + This flow file is a container that references session and the real flow file. + This allows to use simplified syntax to work with file attributes and content: + +set new attribute value + + flowFile.ATTRIBUTE_NAME = ATTRIBUTE_VALUE --- End diff -- These features are awesome! > Create extended groovy scripting processor > -- > > Key: NIFI-3688 > URL: https://issues.apache.org/jira/browse/NIFI-3688 > Project: Apache NiFi > Issue Type: New Feature > Components: Extensions >Reporter: Dmitry Lukyanov >Priority: Minor > > The idea is to simplify groovy scripting. > Main targets: > - to be compatible with existing groovy scripting > - simplify read/write attributes > - simplify read/write content > - avoid closure casting to nifi types like `StreamCallback` > - simplify and provide visibility when accessing to controller services from > script -- This message was sent by Atlassian JIRA (v6.4.14#64029)
[GitHub] nifi pull request #1662: NIFI-3688 Extended Groovy Nifi Processor
Github user mattyb149 commented on a diff in the pull request: https://github.com/apache/nifi/pull/1662#discussion_r148411913 --- Diff: nifi-nar-bundles/nifi-groovyx-bundle/nifi-groovyx-processors/src/main/resources/docs/org.apache.nifi.processors.groovyx.ExecuteGroovyScript/additionalDetails.html --- @@ -0,0 +1,202 @@ + + + + + +Groovy + + + + + +Summary +This is a grooviest groovy script :) +Script Bindings: + +variabletypedescription + + session + org.apache.nifi.processor.ProcessSession + the session that is used to get, change, and transfer input files + + + context + org.apache.nifi.processor.ProcessContext + the context (almost unusefull) + + + log + org.apache.nifi.logging.ComponentLog + the logger for this processor instance + + + REL_SUCCESS + org.apache.nifi.processor.Relationship + the success relationship + + + REL_FAILURE + org.apache.nifi.processor.Relationship + the failure relationship + + + flowFile + org.apache.nifi.flowfile.FlowFile + Binded only if the property `Require flow file`=true for the processor + + + CTL + java.util.HashMap + Map populated with controller services binded through `CTL.*` processor properties + + + Dynamic processor properties + org.apache.nifi.components.PropertyDescriptor + All processor properties not started with `CTL.` are binded to script variables + + + +CTL map + +CTL.* objects accessible if corresponding processor property defined. +Example: if you defined property `CTL.cache` to DistributedMapCacheClientService, then you can access it from code CTL.cache +If CTL property references to Database connection pool, then corresponding CTL entry will contain groovy.sql.Sql object connected to database with autocommit=false. --- End diff -- How is it guaranteed that autocommit will be false? If setAutoCommit is called somewhere (either by your code -- which I couldn't find a reference to -- or by groovy's Sql class), it can cause problems in a couple of scenarios, one being Oracle if the DBA has disallowed changing auto-commit, and another is Hive (since HiveConnectionPool extends DBCPService, it should be available via CTL right?) ---
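Since two of the review questions above concern the SessionFile attribute shorthand and the groovy.sql.Sql binding, a second hypothetical sketch follows. It assumes a dynamic property named `CTL.db` that references a DBCPService, and it takes the quoted documentation at its word that the processor commits on success and rolls back when the script throws; it is illustrative only, not taken from the PR.

```groovy
// Hypothetical ExecuteGroovyScript body (assumption: dynamic property "CTL.db" references a
// DBCPService, exposed to the script as a groovy.sql.Sql instance per the quoted docs).
def flowFile = session.get()
if (flowFile == null) return

// SessionFile shorthand described in the quoted docs: attributes are set as properties
flowFile.status = 'processed'

// Do not commit, roll back, or close the connection here -- per the quoted docs the processor
// commits the transaction on success and rolls it back if the script throws an exception.
CTL.db.eachRow('SELECT id, name FROM example_table') { row ->
    log.info("row ${row.id}: ${row.name}")
}

session.transfer(flowFile, REL_SUCCESS)
```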
[jira] [Resolved] (MINIFICPP-60) Support HTTP(s) as transport mechanism for Site to Site
[ https://issues.apache.org/jira/browse/MINIFICPP-60?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] marco polo resolved MINIFICPP-60. - Resolution: Fixed Closed before [~aldrin] could > Support HTTP(s) as transport mechanism for Site to Site > --- > > Key: MINIFICPP-60 > URL: https://issues.apache.org/jira/browse/MINIFICPP-60 > Project: NiFi MiNiFi C++ > Issue Type: Improvement >Reporter: Aldrin Piri >Assignee: marco polo >Priority: Major > > The C++ implementation would benefit from having an HTTP(S) implementation of > Site to Site as was described in > https://cwiki.apache.org/confluence/display/NIFI/Support+HTTP(S)+as+a+transport+mechanism+for+Site-to-Site > and performed in NIFI-1857 -- This message was sent by Atlassian JIRA (v6.4.14#64029)
[jira] [Commented] (MINIFICPP-60) Support HTTP(s) as transport mechanism for Site to Site
[ https://issues.apache.org/jira/browse/MINIFICPP-60?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16234687#comment-16234687 ] ASF GitHub Bot commented on MINIFICPP-60: - Github user asfgit closed the pull request at: https://github.com/apache/nifi-minifi-cpp/pull/158 > Support HTTP(s) as transport mechanism for Site to Site > --- > > Key: MINIFICPP-60 > URL: https://issues.apache.org/jira/browse/MINIFICPP-60 > Project: NiFi MiNiFi C++ > Issue Type: Improvement >Reporter: Aldrin Piri >Assignee: marco polo >Priority: Major > > The C++ implementation would benefit from having an HTTP(S) implementation of > Site to Site as was described in > https://cwiki.apache.org/confluence/display/NIFI/Support+HTTP(S)+as+a+transport+mechanism+for+Site-to-Site > and performed in NIFI-1857 -- This message was sent by Atlassian JIRA (v6.4.14#64029)
[GitHub] nifi-minifi-cpp pull request #158: MINIFICPP-60: Add initial implementation ...
Github user asfgit closed the pull request at: https://github.com/apache/nifi-minifi-cpp/pull/158 ---
[jira] [Commented] (MINIFICPP-110) Implement ExecuteScript
[ https://issues.apache.org/jira/browse/MINIFICPP-110?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16234649#comment-16234649 ] ASF GitHub Bot commented on MINIFICPP-110: -- Github user achristianson commented on the issue: https://github.com/apache/nifi-minifi-cpp/pull/163 Just pushed out one tiny README.md update and a small fix to the test CMakeLists.txt. Everything is in an extension now. Should be good to go. > Implement ExecuteScript > --- > > Key: MINIFICPP-110 > URL: https://issues.apache.org/jira/browse/MINIFICPP-110 > Project: NiFi MiNiFi C++ > Issue Type: New Feature >Reporter: Andrew Christianson >Assignee: Andrew Christianson >Priority: Major > > Initially support python and lua. -- This message was sent by Atlassian JIRA (v6.4.14#64029)
[GitHub] nifi-minifi-cpp issue #163: MINIFICPP-110 Add ExecuteScript processor with s...
Github user achristianson commented on the issue: https://github.com/apache/nifi-minifi-cpp/pull/163 Just pushed out one tiny README.md update and a small fix to the test CMakeLists.txt. Everything is in an extension now. Should be good to go. ---
[jira] [Updated] (MINIFICPP-281) Improve portability of tar.gz package
[ https://issues.apache.org/jira/browse/MINIFICPP-281?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Andrew Christianson updated MINIFICPP-281: -- Issue Type: Improvement (was: Bug) > Improve portability of tar.gz package > - > > Key: MINIFICPP-281 > URL: https://issues.apache.org/jira/browse/MINIFICPP-281 > Project: NiFi MiNiFi C++ > Issue Type: Improvement >Reporter: Andrew Christianson >Priority: Major > > The minifi tar.gz distribution produced by "make package" should be more > portable by bundling dependency shared objects or statically-linking > dependencies. Link paths should work regardless of where the package is > extracted (possibly by setting LD_LIBRARY_PATH relative to MINIFI_HOME in > minifi.sh?) > This is somewhat related/similar to MINIFICPP-277. -- This message was sent by Atlassian JIRA (v6.4.14#64029)
[jira] [Created] (MINIFICPP-281) Improve portability of tar.gz package
Andrew Christianson created MINIFICPP-281: - Summary: Improve portability of tar.gz package Key: MINIFICPP-281 URL: https://issues.apache.org/jira/browse/MINIFICPP-281 Project: NiFi MiNiFi C++ Issue Type: Bug Reporter: Andrew Christianson Priority: Major The minifi tar.gz distribution produced by "make package" should be more portable by bundling dependency shared objects or statically-linking dependencies. Link paths should work regardless of where the package is extracted (possibly by setting LD_LIBRARY_PATH relative to MINIFI_HOME in minifi.sh?) This is somewhat related/similar to MINIFICPP-277. -- This message was sent by Atlassian JIRA (v6.4.14#64029)
[jira] [Updated] (MINIFICPP-277) Produce system packages in build process
[ https://issues.apache.org/jira/browse/MINIFICPP-277?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Andrew Christianson updated MINIFICPP-277: -- Description: Users have reported issues with portability of built MiNiFi - C++ binaries. While this issue is caused by multiple factors, one factor is the lack of system packages built to be compatible with standard runtime environments/OSes (e.g. CentOS 6). We should add build targets which produce system packages, and ideally have repeatable builds & release artifacts for major target OSes such that the deployment/installation practice is a simple apt-get or yum install. Initial yum OS packages: * CentOS 6 * CentOS 7 Initial deb OS packages: * Ubuntu 14.04 * Ubuntu 16.04 was:Users have reported issues with portability of built MiNiFi - C++ binaries. While this issue is caused by multiple factors, one factor is the lack of system packages built to be compatible with standard runtime environments/OSes (e.g. CentOS 6). We should add build targets which produce system packages, and ideally have repeatable builds & release artifacts for major target OSes such that the deployment/installation practice is a simple apt-get or yum install. > Produce system packages in build process > > > Key: MINIFICPP-277 > URL: https://issues.apache.org/jira/browse/MINIFICPP-277 > Project: NiFi MiNiFi C++ > Issue Type: Improvement >Reporter: Andrew Christianson >Priority: Major > > Users have reported issues with portability of built MiNiFi - C++ binaries. > While this issue is caused by multiple factors, one factor is the lack of > system packages built to be compatible with standard runtime > environments/OSes (e.g. CentOS 6). We should add build targets which produce > system packages, and ideally have repeatable builds & release artifacts for > major target OSes such that the deployment/installation practice is a simple > apt-get or yum install. > Initial yum OS packages: > * CentOS 6 > * CentOS 7 > Initial deb OS packages: > * Ubuntu 14.04 > * Ubuntu 16.04 -- This message was sent by Atlassian JIRA (v6.4.14#64029)
[jira] [Commented] (MINIFICPP-280) Various refactoring and improvements
[ https://issues.apache.org/jira/browse/MINIFICPP-280?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16234634#comment-16234634 ] ASF GitHub Bot commented on MINIFICPP-280: -- Github user phrocker commented on a diff in the pull request: https://github.com/apache/nifi-minifi-cpp/pull/168#discussion_r148365280 --- Diff: thirdparty/google-styleguide/run_linter.sh --- @@ -14,16 +15,31 @@ # See the License for the specific language governing permissions and # limitations under the License. # -# ./run_linter -#!/bin/bash -if [ "$(uname)" == "Darwin" ]; then -SCRIPT_DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )" +# ./run_linter ... -- ... --- End diff -- I like this. I was hoping to make these linter changes following our upcoming release. I also like what you did with tests and was toying with that, but didn't want to jump to that because I had a full plate with other features and cmake additions. I really appreciate this PR as it helps us tremendously. I'll run through it. It may not make it through to this release, but we'll get this reviewed and merged as it's super helpful. > Various refactoring and improvements > > > Key: MINIFICPP-280 > URL: https://issues.apache.org/jira/browse/MINIFICPP-280 > Project: NiFi MiNiFi C++ > Issue Type: Improvement >Reporter: Caleb Johnson >Priority: Minor > > * move extension tests into their respective folders > * separate source and header files > * remove unnecessary or nonexisting include directories > * run linter on extension source files as part of linter target > * clean up extensions according to linter > * add ability to specify more than one include and source folder for linter > * build catch main() and spdlib as shared objects for all tests (faster > build!) > * cmake doesn't know PRIVATE BEFORE, only BEFORE PRIVATE > * borrow port changes to tests from MINIFICPP-60 for parallel testing > * enable parallel testing in travis config -- This message was sent by Atlassian JIRA (v6.4.14#64029)
[GitHub] nifi-minifi-cpp pull request #168: MINIFICPP-280 Refactoring and various imp...
Github user phrocker commented on a diff in the pull request: https://github.com/apache/nifi-minifi-cpp/pull/168#discussion_r148365280 --- Diff: thirdparty/google-styleguide/run_linter.sh --- @@ -14,16 +15,31 @@ # See the License for the specific language governing permissions and # limitations under the License. # -# ./run_linter -#!/bin/bash -if [ "$(uname)" == "Darwin" ]; then -SCRIPT_DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )" +# ./run_linter ... -- ... --- End diff -- I like this. I was hoping to make these linter changes following our upcoming release. I also like what you did with tests and was toying with that, but didn't want to jump to that because I had a full plate with other features and cmake additions. I really appreciate this PR as it helps us tremendously. I'll run through it. It may not make it through to this release, but we'll get this reviewed and merged as it's super helpful. ---
[jira] [Commented] (MINIFICPP-280) Various refactoring and improvements
[ https://issues.apache.org/jira/browse/MINIFICPP-280?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16234633#comment-16234633 ] ASF GitHub Bot commented on MINIFICPP-280: -- Github user phrocker commented on a diff in the pull request: https://github.com/apache/nifi-minifi-cpp/pull/168#discussion_r148364843 --- Diff: extensions/http-curl/tests/integration/C2NullConfiguration.cpp --- @@ -29,26 +29,27 @@ #include #include #include -#include "HTTPClient.h" -#include "InvokeHTTP.h" -#include "../TestBase.h" +#include "client/HTTPClient.h" +#include "processors/InvokeHTTP.h" #include "utils/StringUtils.h" #include "core/Core.h" -#include "../include/core/logging/Logger.h" --- End diff -- Oh man, every time I see these I get upset. Eclipse does this to me. I use Eclipse CDT and IntelliJ for Java, but I can never understand why Eclipse does this on a fresh workspace. > Various refactoring and improvements > > > Key: MINIFICPP-280 > URL: https://issues.apache.org/jira/browse/MINIFICPP-280 > Project: NiFi MiNiFi C++ > Issue Type: Improvement >Reporter: Caleb Johnson >Priority: Minor > > * move extension tests into their respective folders > * separate source and header files > * remove unnecessary or nonexisting include directories > * run linter on extension source files as part of linter target > * clean up extensions according to linter > * add ability to specify more than one include and source folder for linter > * build catch main() and spdlib as shared objects for all tests (faster > build!) > * cmake doesn't know PRIVATE BEFORE, only BEFORE PRIVATE > * borrow port changes to tests from MINIFICPP-60 for parallel testing > * enable parallel testing in travis config -- This message was sent by Atlassian JIRA (v6.4.14#64029)
[GitHub] nifi-minifi-cpp pull request #168: MINIFICPP-280 Refactoring and various imp...
Github user phrocker commented on a diff in the pull request: https://github.com/apache/nifi-minifi-cpp/pull/168#discussion_r148364843 --- Diff: extensions/http-curl/tests/integration/C2NullConfiguration.cpp --- @@ -29,26 +29,27 @@ #include #include #include -#include "HTTPClient.h" -#include "InvokeHTTP.h" -#include "../TestBase.h" +#include "client/HTTPClient.h" +#include "processors/InvokeHTTP.h" #include "utils/StringUtils.h" #include "core/Core.h" -#include "../include/core/logging/Logger.h" --- End diff -- Oh man, every time I see these I get upset. Eclipse does this to me. I use Eclipse CDT and IntelliJ for Java, but I can never understand why Eclipse does this on a fresh workspace. ---
[jira] [Commented] (NIFIREG-46) Make it easy to discover the buckets accessible by current user
[ https://issues.apache.org/jira/browse/NIFIREG-46?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16234600#comment-16234600 ] ASF GitHub Bot commented on NIFIREG-46: --- GitHub user kevdoran opened a pull request: https://github.com/apache/nifi-registry/pull/30 NIFIREG-46 Add authorizedActions field to Bucket This builds upon PR #29 (NIFIREG-33), so only the commits on top of that need to be considered. The API now returns an additional field for a Bucket, telling the client what actions they are authorized to perform on that bucket. For example: ``` "authorizedActions": ["read", "write", "delete"] ``` With this change, `/buckets` is now a convenient initial endpoint to use to both check authentication of the client identity and discover buckets and authorizations available to the client. You can merge this pull request into a Git repository by running: $ git pull https://github.com/kevdoran/nifi-registry NIFIREG-46 Alternatively you can review and apply these changes as the patch at: https://github.com/apache/nifi-registry/pull/30.patch To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message: This closes #30 commit c0048e5c5ee975470df69012be32b3dac88f023f Author: Kevin Doran Date: 2017-10-12T17:54:34Z NIFIREG-33 Add LDAP and JWT auth support - Adds LdapIdentityProvider for authentication - Adds /access/token endpoint for generating JWT for users that can authenticate with a configured IdenitiyProvider - Adds JwtAuthenticationProvider for authentication - Adds KeyService for key generation and tracking for signing JWTs - Adds LdapUserGroupProvider for authorization - Adds LDAP integration tests - Refactors nifi-registry-security-api-impl into nifi-registry-framework - Refactors all security related packages, such as o.a.n.r.authorization and o.a.n.r.authentication, under org.apache.nifi.registry.security commit f70f02a79d62701eb57bb9504f7ed1851fe3d04e Author: Kevin Doran Date: 2017-11-01T19:02:19Z NIFIREG-46 Add authorizedActions field to Bucket > Make it easy to discover the buckets accessible by current user > --- > > Key: NIFIREG-46 > URL: https://issues.apache.org/jira/browse/NIFIREG-46 > Project: NiFi Registry > Issue Type: Improvement >Reporter: Kevin Doran >Assignee: Kevin Doran >Priority: Major > > As the NiFi UI will want to provide a selection for the user in the context > of "which bucket would you like to use", it would be nice if there were a > convenient way to discover, in a single call to registry, which buckets the > user has access to, and what actions they can perform (read, write, etc). > This ticket is to add a new endpoint, or extend a current endpoint, to > provide this information in an easily consumable format. -- This message was sent by Atlassian JIRA (v6.4.14#64029)
[GitHub] nifi-registry pull request #30: NIFIREG-46 Add authorizedActions field to Bu...
GitHub user kevdoran opened a pull request: https://github.com/apache/nifi-registry/pull/30 NIFIREG-46 Add authorizedActions field to Bucket This builds upon PR #29 (NIFIREG-33), so only the commits on top of that need to be considered. The API now returns an additional field for a Bucket, telling the client what actions they are authorized to perform on that bucket. For example: ``` "authorizedActions": ["read", "write", "delete"] ``` With this change, `/buckets` is now a convenient initial endpoint to use to both check authentication of the client identity and discover buckets and authorizations available to the client. You can merge this pull request into a Git repository by running: $ git pull https://github.com/kevdoran/nifi-registry NIFIREG-46 Alternatively you can review and apply these changes as the patch at: https://github.com/apache/nifi-registry/pull/30.patch To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message: This closes #30 commit c0048e5c5ee975470df69012be32b3dac88f023f Author: Kevin Doran Date: 2017-10-12T17:54:34Z NIFIREG-33 Add LDAP and JWT auth support - Adds LdapIdentityProvider for authentication - Adds /access/token endpoint for generating JWT for users that can authenticate with a configured IdenitiyProvider - Adds JwtAuthenticationProvider for authentication - Adds KeyService for key generation and tracking for signing JWTs - Adds LdapUserGroupProvider for authorization - Adds LDAP integration tests - Refactors nifi-registry-security-api-impl into nifi-registry-framework - Refactors all security related packages, such as o.a.n.r.authorization and o.a.n.r.authentication, under org.apache.nifi.registry.security commit f70f02a79d62701eb57bb9504f7ed1851fe3d04e Author: Kevin Doran Date: 2017-11-01T19:02:19Z NIFIREG-46 Add authorizedActions field to Bucket ---
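A hypothetical client-side sketch of the flow the PR description outlines: authenticate, then call `/buckets` and read each bucket's `authorizedActions`. The base URL, port, token handling, and the `name` field are assumptions for illustration; only the `authorizedActions` field comes from the PR description.

```groovy
// Hypothetical NiFi Registry client sketch (illustrative; paths and fields other than
// "authorizedActions" are assumptions, not taken from the PR).
import groovy.json.JsonSlurper

def registryApi = 'http://localhost:18080/nifi-registry-api'   // assumed base URL
def jwt = System.getenv('REGISTRY_TOKEN')                       // JWT previously obtained from POST /access/token

def conn = new URL("${registryApi}/buckets").openConnection()
conn.setRequestProperty('Authorization', "Bearer ${jwt}")

def buckets = new JsonSlurper().parse(conn.inputStream)
buckets.each { bucket ->
    // Tells the client which actions it may perform on this bucket, e.g. ["read", "write", "delete"]
    println "${bucket.name}: ${bucket.authorizedActions}"
}
```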
[jira] [Assigned] (MINIFICPP-72) Add tar and compression support for MergeContent
[ https://issues.apache.org/jira/browse/MINIFICPP-72?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Aldrin Piri reassigned MINIFICPP-72: Assignee: bqiu > Add tar and compression support for MergeContent > > > Key: MINIFICPP-72 > URL: https://issues.apache.org/jira/browse/MINIFICPP-72 > Project: NiFi MiNiFi C++ > Issue Type: New Feature >Affects Versions: 0.2.0 >Reporter: bqiu >Assignee: bqiu >Priority: Major > Fix For: 0.3.0 > > > Add tar and compression support for MergeContent > will use the https://www.libarchive.org -- This message was sent by Atlassian JIRA (v6.4.14#64029)
[jira] [Resolved] (MINIFICPP-52) Implement ExtractText processor
[ https://issues.apache.org/jira/browse/MINIFICPP-52?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Aldrin Piri resolved MINIFICPP-52. -- Resolution: Fixed Fix Version/s: 0.3.0 > Implement ExtractText processor > --- > > Key: MINIFICPP-52 > URL: https://issues.apache.org/jira/browse/MINIFICPP-52 > Project: NiFi MiNiFi C++ > Issue Type: Improvement >Reporter: Andrew Christianson >Assignee: Andrew Christianson >Priority: Major > Fix For: 0.3.0 > > > Implement the ExtractText processor as it has clearly utility in simple > endpoint flows. -- This message was sent by Atlassian JIRA (v6.4.14#64029)
[jira] [Resolved] (MINIFICPP-72) Add tar and compression support for MergeContent
[ https://issues.apache.org/jira/browse/MINIFICPP-72?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Aldrin Piri resolved MINIFICPP-72. -- Resolution: Fixed Fix Version/s: (was: 1.0.0) 0.3.0 > Add tar and compression support for MergeContent > > > Key: MINIFICPP-72 > URL: https://issues.apache.org/jira/browse/MINIFICPP-72 > Project: NiFi MiNiFi C++ > Issue Type: New Feature >Affects Versions: 0.2.0 >Reporter: bqiu >Assignee: bqiu >Priority: Major > Fix For: 0.3.0 > > > Add tar and compression support for MergeContent > will use the https://www.libarchive.org -- This message was sent by Atlassian JIRA (v6.4.14#64029)
[jira] [Updated] (MINIFICPP-72) Add tar and compression support for MergeContent
[ https://issues.apache.org/jira/browse/MINIFICPP-72?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Aldrin Piri updated MINIFICPP-72: - Affects Version/s: (was: 1.0.0) 0.2.0 > Add tar and compression support for MergeContent > > > Key: MINIFICPP-72 > URL: https://issues.apache.org/jira/browse/MINIFICPP-72 > Project: NiFi MiNiFi C++ > Issue Type: New Feature >Affects Versions: 0.2.0 >Reporter: bqiu >Assignee: bqiu >Priority: Major > Fix For: 0.3.0 > > > Add tar and compression support for MergeContent > will use the https://www.libarchive.org -- This message was sent by Atlassian JIRA (v6.4.14#64029)
[jira] [Resolved] (MINIFICPP-276) Add EXCLUDE_BOOST option to cmake.
[ https://issues.apache.org/jira/browse/MINIFICPP-276?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Aldrin Piri resolved MINIFICPP-276. --- Resolution: Fixed Fix Version/s: 0.3.0 > Add EXCLUDE_BOOST option to cmake. > --- > > Key: MINIFICPP-276 > URL: https://issues.apache.org/jira/browse/MINIFICPP-276 > Project: NiFi MiNiFi C++ > Issue Type: Improvement >Reporter: marco polo >Assignee: marco polo >Priority: Major > Fix For: 0.3.0 > > -- This message was sent by Atlassian JIRA (v6.4.14#64029)
[jira] [Resolved] (MINIFICPP-266) C2 Threading uses a wait time of 0
[ https://issues.apache.org/jira/browse/MINIFICPP-266?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Aldrin Piri resolved MINIFICPP-266. --- Resolution: Fixed > C2 Threading uses a wait time of 0 > -- > > Key: MINIFICPP-266 > URL: https://issues.apache.org/jira/browse/MINIFICPP-266 > Project: NiFi MiNiFi C++ > Issue Type: Bug >Reporter: marco polo >Assignee: marco polo >Priority: Blocker > Fix For: 0.3.0 > > > This causes a spin with no wait, spiking cpu usage to 100% if C2 is virtually > disabled. We should also explore better disabling these threading components > if there is no work to do. Thanks to [~achristianson] for identifying this > and triaging. -- This message was sent by Atlassian JIRA (v6.4.14#64029)
[jira] [Commented] (NIFI-4552) Add mime.type and record.count to @WritesAttributes doc for QueryRecord
[ https://issues.apache.org/jira/browse/NIFI-4552?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16234567#comment-16234567 ] ASF GitHub Bot commented on NIFI-4552: -- Github user asfgit closed the pull request at: https://github.com/apache/nifi/pull/2242 > Add mime.type and record.count to @WritesAttributes doc for QueryRecord > --- > > Key: NIFI-4552 > URL: https://issues.apache.org/jira/browse/NIFI-4552 > Project: Apache NiFi > Issue Type: Improvement > Components: Extensions >Reporter: Matt Burgess >Assignee: Andrew Lim >Priority: Major > Fix For: 1.5.0 > > > Currently (NiFi 1.4.0) QueryRecord will transfer any attributes from the > WriteResult to the outgoing FlowFile, but I don't think there are any (for a > query) at the time of this writing. However the QueryRecord processor also > updates the "mime.type" attribute based on the RecordSetWriter chosen, and > also sets the "record.count" attribute to the number of records that match > the query. > None of the aforementioned are in the processor documentation; if the > WriteResult ones are premature to mention that's ok, but we should add the > mime.type and record.count attributes to the documentation for QueryRecord. -- This message was sent by Atlassian JIRA (v6.4.14#64029)
[jira] [Commented] (NIFI-4552) Add mime.type and record.count to @WritesAttributes doc for QueryRecord
[ https://issues.apache.org/jira/browse/NIFI-4552?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16234565#comment-16234565 ] ASF subversion and git services commented on NIFI-4552: --- Commit 9a850c7ed221e99ab266e06810d068186b1d87d2 in nifi's branch refs/heads/master from [~andrewmlim] [ https://git-wip-us.apache.org/repos/asf?p=nifi.git;h=9a850c7 ] NIFI-4552 Add mime.type and record.count Write Attributes to QueryRecord doc NIFI-4552 minor checkstyle violation Signed-off-by: Matthew Burgess This closes #2242 > Add mime.type and record.count to @WritesAttributes doc for QueryRecord > --- > > Key: NIFI-4552 > URL: https://issues.apache.org/jira/browse/NIFI-4552 > Project: Apache NiFi > Issue Type: Improvement > Components: Extensions >Reporter: Matt Burgess >Assignee: Andrew Lim >Priority: Major > Fix For: 1.5.0 > > > Currently (NiFi 1.4.0) QueryRecord will transfer any attributes from the > WriteResult to the outgoing FlowFile, but I don't think there are any (for a > query) at the time of this writing. However the QueryRecord processor also > updates the "mime.type" attribute based on the RecordSetWriter chosen, and > also sets the "record.count" attribute to the number of records that match > the query. > None of the aforementioned are in the processor documentation; if the > WriteResult ones are premature to mention that's ok, but we should add the > mime.type and record.count attributes to the documentation for QueryRecord. -- This message was sent by Atlassian JIRA (v6.4.14#64029)
[GitHub] nifi pull request #2242: NIFI-4552 Add mime.type and record.count Write Attr...
Github user asfgit closed the pull request at: https://github.com/apache/nifi/pull/2242 ---
[jira] [Updated] (NIFI-4552) Add mime.type and record.count to @WritesAttributes doc for QueryRecord
[ https://issues.apache.org/jira/browse/NIFI-4552?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Matt Burgess updated NIFI-4552: --- Status: Patch Available (was: Open) > Add mime.type and record.count to @WritesAttributes doc for QueryRecord > --- > > Key: NIFI-4552 > URL: https://issues.apache.org/jira/browse/NIFI-4552 > Project: Apache NiFi > Issue Type: Improvement > Components: Extensions >Reporter: Matt Burgess >Assignee: Andrew Lim >Priority: Major > > Currently (NiFi 1.4.0) QueryRecord will transfer any attributes from the > WriteResult to the outgoing FlowFile, but I don't think there are any (for a > query) at the time of this writing. However the QueryRecord processor also > updates the "mime.type" attribute based on the RecordSetWriter chosen, and > also sets the "record.count" attribute to the number of records that match > the query. > None of the aforementioned are in the processor documentation; if the > WriteResult ones are premature to mention that's ok, but we should add the > mime.type and record.count attributes to the documentation for QueryRecord. -- This message was sent by Atlassian JIRA (v6.4.14#64029)
[jira] [Updated] (NIFI-4552) Add mime.type and record.count to @WritesAttributes doc for QueryRecord
[ https://issues.apache.org/jira/browse/NIFI-4552?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Matt Burgess updated NIFI-4552: --- Resolution: Fixed Fix Version/s: 1.5.0 Status: Resolved (was: Patch Available) > Add mime.type and record.count to @WritesAttributes doc for QueryRecord > --- > > Key: NIFI-4552 > URL: https://issues.apache.org/jira/browse/NIFI-4552 > Project: Apache NiFi > Issue Type: Improvement > Components: Extensions >Reporter: Matt Burgess >Assignee: Andrew Lim >Priority: Major > Fix For: 1.5.0 > > > Currently (NiFi 1.4.0) QueryRecord will transfer any attributes from the > WriteResult to the outgoing FlowFile, but I don't think there are any (for a > query) at the time of this writing. However the QueryRecord processor also > updates the "mime.type" attribute based on the RecordSetWriter chosen, and > also sets the "record.count" attribute to the number of records that match > the query. > None of the aforementioned are in the processor documentation; if the > WriteResult ones are premature to mention that's ok, but we should add the > mime.type and record.count attributes to the documentation for QueryRecord. -- This message was sent by Atlassian JIRA (v6.4.14#64029)
[jira] [Commented] (NIFI-4552) Add mime.type and record.count to @WritesAttributes doc for QueryRecord
[ https://issues.apache.org/jira/browse/NIFI-4552?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16234566#comment-16234566 ] ASF subversion and git services commented on NIFI-4552: --- Commit 9a850c7ed221e99ab266e06810d068186b1d87d2 in nifi's branch refs/heads/master from [~andrewmlim] [ https://git-wip-us.apache.org/repos/asf?p=nifi.git;h=9a850c7 ] NIFI-4552 Add mime.type and record.count Write Attributes to QueryRecord doc NIFI-4552 minor checkstyle violation Signed-off-by: Matthew Burgess This closes #2242 > Add mime.type and record.count to @WritesAttributes doc for QueryRecord > --- > > Key: NIFI-4552 > URL: https://issues.apache.org/jira/browse/NIFI-4552 > Project: Apache NiFi > Issue Type: Improvement > Components: Extensions >Reporter: Matt Burgess >Assignee: Andrew Lim >Priority: Major > Fix For: 1.5.0 > > > Currently (NiFi 1.4.0) QueryRecord will transfer any attributes from the > WriteResult to the outgoing FlowFile, but I don't think there are any (for a > query) at the time of this writing. However the QueryRecord processor also > updates the "mime.type" attribute based on the RecordSetWriter chosen, and > also sets the "record.count" attribute to the number of records that match > the query. > None of the aforementioned are in the processor documentation; if the > WriteResult ones are premature to mention that's ok, but we should add the > mime.type and record.count attributes to the documentation for QueryRecord. -- This message was sent by Atlassian JIRA (v6.4.14#64029)
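For readers unfamiliar with how such documentation is attached to a processor, the change amounts to annotating QueryRecord with NiFi's @WritesAttributes annotation along these lines; the description strings below are paraphrases for illustration, not the exact text committed in NIFI-4552.

```groovy
// Illustrative only -- paraphrased descriptions, annotating a placeholder class rather
// than the real QueryRecord processor.
import org.apache.nifi.annotation.behavior.WritesAttribute
import org.apache.nifi.annotation.behavior.WritesAttributes

@WritesAttributes([
    @WritesAttribute(attribute = 'mime.type',
            description = 'Set to the MIME type produced by the configured Record Writer'),
    @WritesAttribute(attribute = 'record.count',
            description = 'The number of records selected by the query for the outgoing FlowFile')
])
class QueryRecordDocSketch { }
```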
[jira] [Commented] (NIFI-4496) Improve performance of CSVReader
[ https://issues.apache.org/jira/browse/NIFI-4496?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16234547#comment-16234547 ] ASF GitHub Bot commented on NIFI-4496: -- Github user andrewmlim commented on a diff in the pull request: https://github.com/apache/nifi/pull/2245#discussion_r148347696 --- Diff: nifi-nar-bundles/nifi-standard-services/nifi-record-serialization-services-bundle/nifi-record-serialization-services/src/main/java/org/apache/nifi/csv/CSVReader.java --- @@ -54,6 +54,26 @@ "The first non-comment line of the CSV file is a header line that contains the names of the columns. The schema will be derived by using the " + "column names in the header and assuming that all columns are of type String."); +// CSV parsers +public static final AllowableValue APACHE_COMMONS_CSV = new AllowableValue("commons-csv", "Apache Commons CSV", +"The CSV parser implementation from the Apache Commons CSV library."); + +public static final AllowableValue JACKSON_CSV = new AllowableValue("jackson-csv", "Jackson CSV", +"The CSV parser implementation from the Jackson Dataformats library"); + + +public static final PropertyDescriptor CSV_PARSER = new PropertyDescriptor.Builder() +.name("csv-reader-csv-parser") +.displayName("CSV Parser") +.description("Specifies which parser to use to read CSV records. NOTE: Different parsers may support different subsets of functionality, " ++ "and/or exhibit different levels of performance.") --- End diff -- Suggest changing the NOTE to: Different parsers may support different subsets of functionality and may also exhibit different levels of performance. > Improve performance of CSVReader > > > Key: NIFI-4496 > URL: https://issues.apache.org/jira/browse/NIFI-4496 > Project: Apache NiFi > Issue Type: Improvement > Components: Extensions >Reporter: Matt Burgess >Assignee: Matt Burgess >Priority: Major > > During some throughput testing, it was noted that the CSVReader was not as > fast as desired, processing less than 50k records per second. A look at [this > benchmark|https://github.com/uniVocity/csv-parsers-comparison] implies that > the Apache Commons CSV parser (used by CSVReader) is quite slow compared to > others. > From that benchmark it appears that CSVReader could be enhanced by using a > different CSV parser under the hood. Perhaps Jackson is the best choice, as > it is fast when values are quoted, and is a mature and maintained codebase. -- This message was sent by Atlassian JIRA (v6.4.14#64029)
[GitHub] nifi pull request #2245: NIFI-4496: Added JacksonCSVRecordReader to allow ch...
Github user andrewmlim commented on a diff in the pull request: https://github.com/apache/nifi/pull/2245#discussion_r148347696 --- Diff: nifi-nar-bundles/nifi-standard-services/nifi-record-serialization-services-bundle/nifi-record-serialization-services/src/main/java/org/apache/nifi/csv/CSVReader.java --- @@ -54,6 +54,26 @@ "The first non-comment line of the CSV file is a header line that contains the names of the columns. The schema will be derived by using the " + "column names in the header and assuming that all columns are of type String."); +// CSV parsers +public static final AllowableValue APACHE_COMMONS_CSV = new AllowableValue("commons-csv", "Apache Commons CSV", +"The CSV parser implementation from the Apache Commons CSV library."); + +public static final AllowableValue JACKSON_CSV = new AllowableValue("jackson-csv", "Jackson CSV", +"The CSV parser implementation from the Jackson Dataformats library"); + + +public static final PropertyDescriptor CSV_PARSER = new PropertyDescriptor.Builder() +.name("csv-reader-csv-parser") +.displayName("CSV Parser") +.description("Specifies which parser to use to read CSV records. NOTE: Different parsers may support different subsets of functionality, " ++ "and/or exhibit different levels of performance.") --- End diff -- Suggest changing the NOTE to: Different parsers may support different subsets of functionality and may also exhibit different levels of performance. ---
[GitHub] nifi pull request #2245: NIFI-4496: Added JacksonCSVRecordReader to allow ch...
Github user andrewmlim commented on a diff in the pull request: https://github.com/apache/nifi/pull/2245#discussion_r148347427 --- Diff: nifi-nar-bundles/nifi-standard-services/nifi-record-serialization-services-bundle/nifi-record-serialization-services/src/main/java/org/apache/nifi/csv/CSVReader.java --- @@ -54,6 +54,26 @@ "The first non-comment line of the CSV file is a header line that contains the names of the columns. The schema will be derived by using the " + "column names in the header and assuming that all columns are of type String."); +// CSV parsers +public static final AllowableValue APACHE_COMMONS_CSV = new AllowableValue("commons-csv", "Apache Commons CSV", +"The CSV parser implementation from the Apache Commons CSV library."); + +public static final AllowableValue JACKSON_CSV = new AllowableValue("jackson-csv", "Jackson CSV", +"The CSV parser implementation from the Jackson Dataformats library"); --- End diff -- Need a period (.) after library to be consistent. ---
[jira] [Commented] (NIFI-4496) Improve performance of CSVReader
[ https://issues.apache.org/jira/browse/NIFI-4496?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16234544#comment-16234544 ] ASF GitHub Bot commented on NIFI-4496: -- Github user andrewmlim commented on a diff in the pull request: https://github.com/apache/nifi/pull/2245#discussion_r148347427 --- Diff: nifi-nar-bundles/nifi-standard-services/nifi-record-serialization-services-bundle/nifi-record-serialization-services/src/main/java/org/apache/nifi/csv/CSVReader.java --- @@ -54,6 +54,26 @@ "The first non-comment line of the CSV file is a header line that contains the names of the columns. The schema will be derived by using the " + "column names in the header and assuming that all columns are of type String."); +// CSV parsers +public static final AllowableValue APACHE_COMMONS_CSV = new AllowableValue("commons-csv", "Apache Commons CSV", +"The CSV parser implementation from the Apache Commons CSV library."); + +public static final AllowableValue JACKSON_CSV = new AllowableValue("jackson-csv", "Jackson CSV", +"The CSV parser implementation from the Jackson Dataformats library"); --- End diff -- Need a period (.) after library to be consistent. > Improve performance of CSVReader > > > Key: NIFI-4496 > URL: https://issues.apache.org/jira/browse/NIFI-4496 > Project: Apache NiFi > Issue Type: Improvement > Components: Extensions >Reporter: Matt Burgess >Assignee: Matt Burgess >Priority: Major > > During some throughput testing, it was noted that the CSVReader was not as > fast as desired, processing less than 50k records per second. A look at [this > benchmark|https://github.com/uniVocity/csv-parsers-comparison] implies that > the Apache Commons CSV parser (used by CSVReader) is quite slow compared to > others. > From that benchmark it appears that CSVReader could be enhanced by using a > different CSV parser under the hood. Perhaps Jackson is the best choice, as > it is fast when values are quoted, and is a mature and maintained codebase. -- This message was sent by Atlassian JIRA (v6.4.14#64029)
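For context, the review comments above concern a new "CSV Parser" property on CSVReader. The snippet below is a hedged sketch of how such an AllowableValue-backed property is typically completed in NiFi; the allowableValues/defaultValue/required calls are assumptions for illustration, not necessarily the code that was merged.

```
import org.apache.nifi.components.AllowableValue;
import org.apache.nifi.components.PropertyDescriptor;

public class CsvParserPropertySketch {

    public static final AllowableValue APACHE_COMMONS_CSV = new AllowableValue("commons-csv",
        "Apache Commons CSV", "The CSV parser implementation from the Apache Commons CSV library.");

    public static final AllowableValue JACKSON_CSV = new AllowableValue("jackson-csv",
        "Jackson CSV", "The CSV parser implementation from the Jackson Dataformats library.");

    // Sketch of the completed property: it uses the reviewer-suggested wording in the
    // description and constrains the choice to the two AllowableValues defined above.
    public static final PropertyDescriptor CSV_PARSER = new PropertyDescriptor.Builder()
        .name("csv-reader-csv-parser")
        .displayName("CSV Parser")
        .description("Specifies which parser to use to read CSV records. NOTE: Different parsers may support "
            + "different subsets of functionality and may also exhibit different levels of performance.")
        .allowableValues(APACHE_COMMONS_CSV, JACKSON_CSV)
        .defaultValue(APACHE_COMMONS_CSV.getValue())
        .required(true)
        .build();
}
```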
[jira] [Commented] (NIFIREG-38) Incorrect timestamps returned from REST API
[ https://issues.apache.org/jira/browse/NIFIREG-38?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16234536#comment-16234536 ] ASF GitHub Bot commented on NIFIREG-38: --- Github user scottyaslan commented on the issue: https://github.com/apache/nifi-registry/pull/25 Reviewing... > Incorrect timestamps returned from REST API > --- > > Key: NIFIREG-38 > URL: https://issues.apache.org/jira/browse/NIFIREG-38 > Project: NiFi Registry > Issue Type: Bug >Reporter: Bryan Bende >Assignee: Bryan Bende >Priority: Major > > It appears that the timestamps returned from the REST API for created and > modified are not correct and actually contain more digits than the standard > epoch. This is likely some issue with conversion to and from the database. -- This message was sent by Atlassian JIRA (v6.4.14#64029)
[GitHub] nifi-registry issue #25: NIFIREG-38 Converting milliseconds to seconds in fr...
Github user scottyaslan commented on the issue: https://github.com/apache/nifi-registry/pull/25 Reviewing... ---
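The PR above is titled "Converting milliseconds to seconds in fr..." (truncated in the archive). Purely as an illustration of that kind of normalization, and not the actual NiFi Registry change, a timestamp carrying too many digits can be reduced to a whole-seconds epoch value like this:

```
import java.util.concurrent.TimeUnit;

public class EpochConversionSketch {
    public static void main(String[] args) {
        // Illustration only (not the NiFi Registry code): trimming a milliseconds
        // value down to a standard seconds-since-epoch value.
        long millis = 1509552000123L;                           // 13 digits
        long seconds = TimeUnit.MILLISECONDS.toSeconds(millis); // 1509552000 (10 digits)
        System.out.println(millis + " ms -> " + seconds + " s");
    }
}
```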
[jira] [Created] (NIFI-4562) MergeContent errors with FlowFileHandlingException: transfer relationship not specified if IOException thrown
Mark Payne created NIFI-4562: Summary: MergeContent errors with FlowFileHandlingException: transfer relationship not specified if IOException thrown Key: NIFI-4562 URL: https://issues.apache.org/jira/browse/NIFI-4562 Project: Apache NiFi Issue Type: Bug Components: Extensions Reporter: Mark Payne Assignee: Mark Payne Priority: Major If an IOException is thrown while merging FlowFiles (for instance, due to running out of disk space or having too many open files), it will be converted to a ProcessException, and BinFiles will catch that. It will then transfer the FlowFiles to 'failure' and call session.commit(). However, if a 'merged' FlowFile was already created, it does not get removed. As a result, the call to session.commit() will throw a FlowFileHandlingException indicating that the FlowFile's transfer relationship is not set. We need to make sure that each of the 'mergers' catches this ProcessException and removes the created 'bundled' FlowFile before re-throwing it. -- This message was sent by Atlassian JIRA (v6.4.14#64029)
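A hedged sketch of the fix described above, assuming a simplified merge method; writeMergedContent below is a hypothetical stand-in for the actual merge logic, and the real BinFiles/MergeContent code is more involved.

```
import java.io.IOException;
import java.io.OutputStream;
import java.util.List;

import org.apache.nifi.flowfile.FlowFile;
import org.apache.nifi.processor.ProcessSession;
import org.apache.nifi.processor.exception.ProcessException;

class MergeSketch {

    FlowFile mergeBin(final ProcessSession session, final List<FlowFile> bin) {
        // The 'merged' FlowFile is created up front...
        FlowFile bundle = session.create(bin);
        try {
            bundle = session.write(bundle, out -> writeMergedContent(bin, out));
        } catch (final ProcessException pe) {
            // ...so if merging fails (e.g. an IOException wrapped in a ProcessException),
            // remove the partial bundle before re-throwing; otherwise session.commit()
            // later fails with "transfer relationship not specified" for the orphan.
            session.remove(bundle);
            throw pe;
        }
        return bundle;
    }

    // Hypothetical helper representing the actual content-merging logic
    void writeMergedContent(final List<FlowFile> bin, final OutputStream out) throws IOException {
        // write the concatenated/merged content of the bin's FlowFiles to 'out'
    }
}
```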
[jira] [Commented] (MINIFICPP-280) Various refactoring and improvements
[ https://issues.apache.org/jira/browse/MINIFICPP-280?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16234472#comment-16234472 ] ASF GitHub Bot commented on MINIFICPP-280: -- GitHub user calebj opened a pull request: https://github.com/apache/nifi-minifi-cpp/pull/168 MINIFICPP-280 Refactoring and various improvements - move extension tests into their respective folders - separate source and header files - remove unnecessary or nonexisting include directories - run linter on extension source files as part of linter target - clean up extensions according to linter - add ability to specify more than one include and source folder for linter - build catch main() and spdlib as shared objects for all tests (faster build!) - cmake doesn't know PRIVATE BEFORE, only BEFORE PRIVATE - borrow changes to tests from MINIFICPP-60 for parallel testing - enable parallel testing in travis config Thank you for submitting a contribution to Apache NiFi - MiNiFi C++. In order to streamline the review of the contribution we ask you to ensure the following steps have been taken: ### For all changes: - [x] Is there a JIRA ticket associated with this PR? Is it referenced in the commit message? - [x] Does your PR title start with MINIFI- where is the JIRA number you are trying to resolve? Pay particular attention to the hyphen "-" character. - [x] Has your PR been rebased against the latest commit within the target branch (typically master)? - [x] Is your initial contribution a single, squashed commit? ### For code changes: - [ ] If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under [ASF 2.0](http://www.apache.org/legal/resolved.html#category-a)? - [ ] If applicable, have you updated the LICENSE file? - [ ] If applicable, have you updated the NOTICE file? ### For documentation related changes: - [ ] Have you ensured that format looks appropriate for the output in which it is rendered? ### Note: Please ensure that once the PR is submitted, you check travis-ci for build issues and submit an update to your PR as soon as possible. You can merge this pull request into a Git repository by running: $ git pull https://github.com/NiFiLocal/nifi-minifi-cpp ExtensionLint Alternatively you can review and apply these changes as the patch at: https://github.com/apache/nifi-minifi-cpp/pull/168.patch To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message: This closes #168 commit 37d64e50a378a812970a7c468bf3cd4051ba6cf7 Author: Caleb Johnson Date: 2017-11-01T16:52:55Z MINIFICPP-280 Refactoring and various improvements - move extension tests into their respective folders - separate source and header files - remove unnecessary or nonexisting include directories - run linter on extension source files as part of linter target - clean up extensions according to linter - add ability to specify more than one include and source folder for linter - build catch main() and spdlib as shared objects for all tests (faster build!) 
- cmake doesn't know PRIVATE BEFORE, only BEFORE PRIVATE - borrow changes to tests from MINIFICPP-60 for parallel testing - enable parallel testing in travis config > Various refactoring and improvements > > > Key: MINIFICPP-280 > URL: https://issues.apache.org/jira/browse/MINIFICPP-280 > Project: NiFi MiNiFi C++ > Issue Type: Improvement >Reporter: Caleb Johnson >Priority: Minor > > * move extension tests into their respective folders > * separate source and header files > * remove unnecessary or nonexisting include directories > * run linter on extension source files as part of linter target > * clean up extensions according to linter > * add ability to specify more than one include and source folder for linter > * build catch main() and spdlib as shared objects for all tests (faster > build!) > * cmake doesn't know PRIVATE BEFORE, only BEFORE PRIVATE > * borrow port changes to tests from MINIFICPP-60 for parallel testing > * enable parallel testing in travis config -- This message was sent by Atlassian JIRA (v6.4.14#64029)
[GitHub] nifi-minifi-cpp pull request #168: MINIFICPP-280 Refactoring and various imp...
GitHub user calebj opened a pull request: https://github.com/apache/nifi-minifi-cpp/pull/168 MINIFICPP-280 Refactoring and various improvements - move extension tests into their respective folders - separate source and header files - remove unnecessary or nonexisting include directories - run linter on extension source files as part of linter target - clean up extensions according to linter - add ability to specify more than one include and source folder for linter - build catch main() and spdlib as shared objects for all tests (faster build!) - cmake doesn't know PRIVATE BEFORE, only BEFORE PRIVATE - borrow changes to tests from MINIFICPP-60 for parallel testing - enable parallel testing in travis config Thank you for submitting a contribution to Apache NiFi - MiNiFi C++. In order to streamline the review of the contribution we ask you to ensure the following steps have been taken: ### For all changes: - [x] Is there a JIRA ticket associated with this PR? Is it referenced in the commit message? - [x] Does your PR title start with MINIFI- where is the JIRA number you are trying to resolve? Pay particular attention to the hyphen "-" character. - [x] Has your PR been rebased against the latest commit within the target branch (typically master)? - [x] Is your initial contribution a single, squashed commit? ### For code changes: - [ ] If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under [ASF 2.0](http://www.apache.org/legal/resolved.html#category-a)? - [ ] If applicable, have you updated the LICENSE file? - [ ] If applicable, have you updated the NOTICE file? ### For documentation related changes: - [ ] Have you ensured that format looks appropriate for the output in which it is rendered? ### Note: Please ensure that once the PR is submitted, you check travis-ci for build issues and submit an update to your PR as soon as possible. You can merge this pull request into a Git repository by running: $ git pull https://github.com/NiFiLocal/nifi-minifi-cpp ExtensionLint Alternatively you can review and apply these changes as the patch at: https://github.com/apache/nifi-minifi-cpp/pull/168.patch To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message: This closes #168 commit 37d64e50a378a812970a7c468bf3cd4051ba6cf7 Author: Caleb Johnson Date: 2017-11-01T16:52:55Z MINIFICPP-280 Refactoring and various improvements - move extension tests into their respective folders - separate source and header files - remove unnecessary or nonexisting include directories - run linter on extension source files as part of linter target - clean up extensions according to linter - add ability to specify more than one include and source folder for linter - build catch main() and spdlib as shared objects for all tests (faster build!) - cmake doesn't know PRIVATE BEFORE, only BEFORE PRIVATE - borrow changes to tests from MINIFICPP-60 for parallel testing - enable parallel testing in travis config ---
[jira] [Created] (MINIFICPP-280) Various refactoring and improvements
Caleb Johnson created MINIFICPP-280: --- Summary: Various refactoring and improvements Key: MINIFICPP-280 URL: https://issues.apache.org/jira/browse/MINIFICPP-280 Project: NiFi MiNiFi C++ Issue Type: Improvement Reporter: Caleb Johnson Priority: Minor * move extension tests into their respective folders * separate source and header files * remove unnecessary or nonexisting include directories * run linter on extension source files as part of linter target * clean up extensions according to linter * add ability to specify more than one include and source folder for linter * build catch main() and spdlib as shared objects for all tests (faster build!) * cmake doesn't know PRIVATE BEFORE, only BEFORE PRIVATE * borrow port changes to tests from MINIFICPP-60 for parallel testing * enable parallel testing in travis config -- This message was sent by Atlassian JIRA (v6.4.14#64029)
[GitHub] nifi-minifi-cpp pull request #167: MINIFICPP-279 Including Boost includes fo...
Github user asfgit closed the pull request at: https://github.com/apache/nifi-minifi-cpp/pull/167 ---
[jira] [Commented] (MINIFICPP-279) PutFileTests can fail to build
[ https://issues.apache.org/jira/browse/MINIFICPP-279?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16234378#comment-16234378 ] ASF GitHub Bot commented on MINIFICPP-279: -- Github user asfgit closed the pull request at: https://github.com/apache/nifi-minifi-cpp/pull/167 > PutFileTests can fail to build > -- > > Key: MINIFICPP-279 > URL: https://issues.apache.org/jira/browse/MINIFICPP-279 > Project: NiFi MiNiFi C++ > Issue Type: Bug >Affects Versions: 0.2.0 >Reporter: Aldrin Piri >Assignee: Aldrin Piri > > Need to reference the boost includes in the BuildTests CMake file for usage > of filesystem components of Boost in PutFileTests. -- This message was sent by Atlassian JIRA (v6.4.14#64029)
[jira] [Commented] (MINIFICPP-279) PutFileTests can fail to build
[ https://issues.apache.org/jira/browse/MINIFICPP-279?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16234327#comment-16234327 ] ASF GitHub Bot commented on MINIFICPP-279: -- GitHub user apiri opened a pull request: https://github.com/apache/nifi-minifi-cpp/pull/167 MINIFICPP-279 Including Boost includes for BuildTests MINIFICPP-279 Including Boost includes for BuildTests to resolve build issue in PutFileTests You can merge this pull request into a Git repository by running: $ git pull https://github.com/apiri/nifi-minifi-cpp MINIFICPP-279 Alternatively you can review and apply these changes as the patch at: https://github.com/apache/nifi-minifi-cpp/pull/167.patch To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message: This closes #167 commit 70df87f94c78fc99463ebe1384b97180650beb93 Author: Aldrin Piri Date: 2017-11-01T16:17:22Z MINIFICPP-279 Including Boost includes for BuildTests to resolve build issue in PutFileTests > PutFileTests can fail to build > -- > > Key: MINIFICPP-279 > URL: https://issues.apache.org/jira/browse/MINIFICPP-279 > Project: NiFi MiNiFi C++ > Issue Type: Bug >Affects Versions: 0.2.0 >Reporter: Aldrin Piri >Assignee: Aldrin Piri > > Need to reference the boost includes in the BuildTests CMake file for usage > of filesystem components of Boost in PutFileTests. -- This message was sent by Atlassian JIRA (v6.4.14#64029)
[jira] [Updated] (MINIFICPP-279) PutFileTests can fail to build
[ https://issues.apache.org/jira/browse/MINIFICPP-279?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Aldrin Piri updated MINIFICPP-279: -- Status: Patch Available (was: Open) > PutFileTests can fail to build > -- > > Key: MINIFICPP-279 > URL: https://issues.apache.org/jira/browse/MINIFICPP-279 > Project: NiFi MiNiFi C++ > Issue Type: Bug >Affects Versions: 0.2.0 >Reporter: Aldrin Piri >Assignee: Aldrin Piri > > Need to reference the boost includes in the BuildTests CMake file for usage > of filesystem components of Boost in PutFileTests. -- This message was sent by Atlassian JIRA (v6.4.14#64029)
[GitHub] nifi-minifi-cpp pull request #167: MINIFICPP-279 Including Boost includes fo...
GitHub user apiri opened a pull request: https://github.com/apache/nifi-minifi-cpp/pull/167 MINIFICPP-279 Including Boost includes for BuildTests MINIFICPP-279 Including Boost includes for BuildTests to resolve build issue in PutFileTests You can merge this pull request into a Git repository by running: $ git pull https://github.com/apiri/nifi-minifi-cpp MINIFICPP-279 Alternatively you can review and apply these changes as the patch at: https://github.com/apache/nifi-minifi-cpp/pull/167.patch To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message: This closes #167 commit 70df87f94c78fc99463ebe1384b97180650beb93 Author: Aldrin Piri Date: 2017-11-01T16:17:22Z MINIFICPP-279 Including Boost includes for BuildTests to resolve build issue in PutFileTests ---
[jira] [Created] (MINIFICPP-279) PutFileTests can fail to build
Aldrin Piri created MINIFICPP-279: - Summary: PutFileTests can fail to build Key: MINIFICPP-279 URL: https://issues.apache.org/jira/browse/MINIFICPP-279 Project: NiFi MiNiFi C++ Issue Type: Bug Affects Versions: 0.2.0 Reporter: Aldrin Piri Assignee: Aldrin Piri Need to reference the boost includes in the BuildTests CMake file for usage of filesystem components of Boost in PutFileTests. -- This message was sent by Atlassian JIRA (v6.4.14#64029)
[jira] [Updated] (NIFI-4496) Improve performance of CSVReader
[ https://issues.apache.org/jira/browse/NIFI-4496?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Matt Burgess updated NIFI-4496: --- Status: Patch Available (was: In Progress) > Improve performance of CSVReader > > > Key: NIFI-4496 > URL: https://issues.apache.org/jira/browse/NIFI-4496 > Project: Apache NiFi > Issue Type: Improvement > Components: Extensions >Reporter: Matt Burgess >Assignee: Matt Burgess >Priority: Major > > During some throughput testing, it was noted that the CSVReader was not as > fast as desired, processing less than 50k records per second. A look at [this > benchmark|https://github.com/uniVocity/csv-parsers-comparison] implies that > the Apache Commons CSV parser (used by CSVReader) is quite slow compared to > others. > From that benchmark it appears that CSVReader could be enhanced by using a > different CSV parser under the hood. Perhaps Jackson is the best choice, as > it is fast when values are quoted, and is a mature and maintained codebase. -- This message was sent by Atlassian JIRA (v6.4.14#64029)
[jira] [Commented] (NIFI-4496) Improve performance of CSVReader
[ https://issues.apache.org/jira/browse/NIFI-4496?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16234252#comment-16234252 ] ASF GitHub Bot commented on NIFI-4496: -- GitHub user mattyb149 opened a pull request: https://github.com/apache/nifi/pull/2245 NIFI-4496: Added JacksonCSVRecordReader to allow choice of CSV parser Thank you for submitting a contribution to Apache NiFi. In order to streamline the review of the contribution we ask you to ensure the following steps have been taken: ### For all changes: - [x] Is there a JIRA ticket associated with this PR? Is it referenced in the commit message? - [x] Does your PR title start with NIFI- where is the JIRA number you are trying to resolve? Pay particular attention to the hyphen "-" character. - [x] Has your PR been rebased against the latest commit within the target branch (typically master)? - [x] Is your initial contribution a single, squashed commit? ### For code changes: - [x] Have you ensured that the full suite of tests is executed via mvn -Pcontrib-check clean install at the root nifi folder? - [x] Have you written or updated unit tests to verify your changes? - [ ] If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under [ASF 2.0](http://www.apache.org/legal/resolved.html#category-a)? - [ ] If applicable, have you updated the LICENSE file, including the main LICENSE file under nifi-assembly? - [ ] If applicable, have you updated the NOTICE file, including the main NOTICE file found under nifi-assembly? - [x] If adding new Properties, have you added .displayName in addition to .name (programmatic access) for each of the new properties? ### For documentation related changes: - [x] Have you ensured that format looks appropriate for the output in which it is rendered? ### Note: Please ensure that once the PR is submitted, you check travis-ci for build issues and submit an update to your PR as soon as possible. You can merge this pull request into a Git repository by running: $ git pull https://github.com/mattyb149/nifi NIFI-4496 Alternatively you can review and apply these changes as the patch at: https://github.com/apache/nifi/pull/2245.patch To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message: This closes #2245 commit 15040f4f67a785ab16894992ffeca7d7847f62f1 Author: Matthew Burgess Date: 2017-11-01T15:50:06Z NIFI-4496: Added JacksonCSVRecordReader to allow choice of CSV parser > Improve performance of CSVReader > > > Key: NIFI-4496 > URL: https://issues.apache.org/jira/browse/NIFI-4496 > Project: Apache NiFi > Issue Type: Improvement > Components: Extensions >Reporter: Matt Burgess >Assignee: Matt Burgess >Priority: Major > > During some throughput testing, it was noted that the CSVReader was not as > fast as desired, processing less than 50k records per second. A look at [this > benchmark|https://github.com/uniVocity/csv-parsers-comparison] implies that > the Apache Commons CSV parser (used by CSVReader) is quite slow compared to > others. > From that benchmark it appears that CSVReader could be enhanced by using a > different CSV parser under the hood. Perhaps Jackson is the best choice, as > it is fast when values are quoted, and is a mature and maintained codebase. -- This message was sent by Atlassian JIRA (v6.4.14#64029)
[GitHub] nifi pull request #2245: NIFI-4496: Added JacksonCSVRecordReader to allow ch...
GitHub user mattyb149 opened a pull request: https://github.com/apache/nifi/pull/2245 NIFI-4496: Added JacksonCSVRecordReader to allow choice of CSV parser Thank you for submitting a contribution to Apache NiFi. In order to streamline the review of the contribution we ask you to ensure the following steps have been taken: ### For all changes: - [x] Is there a JIRA ticket associated with this PR? Is it referenced in the commit message? - [x] Does your PR title start with NIFI- where is the JIRA number you are trying to resolve? Pay particular attention to the hyphen "-" character. - [x] Has your PR been rebased against the latest commit within the target branch (typically master)? - [x] Is your initial contribution a single, squashed commit? ### For code changes: - [x] Have you ensured that the full suite of tests is executed via mvn -Pcontrib-check clean install at the root nifi folder? - [x] Have you written or updated unit tests to verify your changes? - [ ] If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under [ASF 2.0](http://www.apache.org/legal/resolved.html#category-a)? - [ ] If applicable, have you updated the LICENSE file, including the main LICENSE file under nifi-assembly? - [ ] If applicable, have you updated the NOTICE file, including the main NOTICE file found under nifi-assembly? - [x] If adding new Properties, have you added .displayName in addition to .name (programmatic access) for each of the new properties? ### For documentation related changes: - [x] Have you ensured that format looks appropriate for the output in which it is rendered? ### Note: Please ensure that once the PR is submitted, you check travis-ci for build issues and submit an update to your PR as soon as possible. You can merge this pull request into a Git repository by running: $ git pull https://github.com/mattyb149/nifi NIFI-4496 Alternatively you can review and apply these changes as the patch at: https://github.com/apache/nifi/pull/2245.patch To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message: This closes #2245 commit 15040f4f67a785ab16894992ffeca7d7847f62f1 Author: Matthew Burgess Date: 2017-11-01T15:50:06Z NIFI-4496: Added JacksonCSVRecordReader to allow choice of CSV parser ---
[jira] [Commented] (MINIFICPP-277) Produce system packages in build process
[ https://issues.apache.org/jira/browse/MINIFICPP-277?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16234194#comment-16234194 ] Andrew Christianson commented on MINIFICPP-277: --- PostgreSQL is a great example of this being done on another FOSS project: https://www.postgresql.org/download/linux/redhat/. They provide repos for all the major architectures/distros. We need to scope out if this is under the Apache umbrella before anyone implements this ticket, however. > Produce system packages in build process > > > Key: MINIFICPP-277 > URL: https://issues.apache.org/jira/browse/MINIFICPP-277 > Project: NiFi MiNiFi C++ > Issue Type: Improvement >Reporter: Andrew Christianson >Priority: Major > > Users have reported issues with portability of built MiNiFi - C++ binaries. > While this issue is caused by multiple factors, one factor is the lack of > system packages built to be compatible with standard runtime > environments/OSes (e.g. CentOS 6). We should add build targets which produce > system packages, and ideally have repeatable builds & release artifacts for > major target OSes such that the deployment/installation practice is a simple > apt-get or yum install. -- This message was sent by Atlassian JIRA (v6.4.14#64029)
[jira] [Comment Edited] (MINIFICPP-277) Produce system packages in build process
[ https://issues.apache.org/jira/browse/MINIFICPP-277?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16234194#comment-16234194 ] Andrew Christianson edited comment on MINIFICPP-277 at 11/1/17 3:13 PM: PostgreSQL is a great example of this being done on another FOSS project: https://www.postgresql.org/download/linux/redhat/. They provide repos and packages for all the major architectures/distros. We need to scope out if this is under the Apache umbrella before anyone implements this ticket, however. was (Author: achristianson): PostgreSQL is a great example of this being done on another FOSS project: https://www.postgresql.org/download/linux/redhat/. They provide repos for all the major architectures/distros. We need to scope out if this is under the Apache umbrella before anyone implements this ticket, however. > Produce system packages in build process > > > Key: MINIFICPP-277 > URL: https://issues.apache.org/jira/browse/MINIFICPP-277 > Project: NiFi MiNiFi C++ > Issue Type: Improvement >Reporter: Andrew Christianson >Priority: Major > > Users have reported issues with portability of built MiNiFi - C++ binaries. > While this issue is caused by multiple factors, one factor is the lack of > system packages built to be compatible with standard runtime > environments/OSes (e.g. CentOS 6). We should add build targets which produce > system packages, and ideally have repeatable builds & release artifacts for > major target OSes such that the deployment/installation practice is a simple > apt-get or yum install. -- This message was sent by Atlassian JIRA (v6.4.14#64029)
[jira] [Resolved] (MINIFICPP-272) Incorrect boost dependency order in libarchive CMakeLists.txt
[ https://issues.apache.org/jira/browse/MINIFICPP-272?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dustin Rodrigues resolved MINIFICPP-272. Resolution: Fixed > Incorrect boost dependency order in libarchive CMakeLists.txt > - > > Key: MINIFICPP-272 > URL: https://issues.apache.org/jira/browse/MINIFICPP-272 > Project: NiFi MiNiFi C++ > Issue Type: Bug >Reporter: Dustin Rodrigues >Priority: Major > > find_package(Boost) comes before include_directories(${Boost_INCLUDE_DIRS}) > which causes compilation errors. -- This message was sent by Atlassian JIRA (v6.4.14#64029)
[jira] [Created] (MINIFICPP-278) Resolve REGISTER_RESOURCE so that we no longer need LoadProcessors.h in the base
marco polo created MINIFICPP-278: Summary: Resolve REGISTER_RESOURCE so that we no longer need LoadProcessors.h in the base Key: MINIFICPP-278 URL: https://issues.apache.org/jira/browse/MINIFICPP-278 Project: NiFi MiNiFi C++ Issue Type: Bug Reporter: marco polo -- This message was sent by Atlassian JIRA (v6.4.14#64029)
[jira] [Commented] (NIFI-4092) ClassCastException Warning during cluster sync
[ https://issues.apache.org/jira/browse/NIFI-4092?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16234044#comment-16234044 ] Ramon Havermans commented on NIFI-4092: --- Same here on 1.3 > ClassCastException Warning during cluster sync > -- > > Key: NIFI-4092 > URL: https://issues.apache.org/jira/browse/NIFI-4092 > Project: Apache NiFi > Issue Type: Bug >Affects Versions: 1.3.0 >Reporter: Joseph Gresock >Priority: Major > > This is the strack trace I receive, though I'm not sure it affects anything, > since the cluster is eventually able to connect. > 2017-06-20 13:46:44,680 WARN [Reconnect ip-172-31-55-36.ec2.internal:8443] > o.a.n.c.c.node.NodeClusterCoordinator Problem encountered issuing > reconnection request to node ip-172-31-55-36.ec2.internal:8443 > java.io.IOException: > org.apache.nifi.controller.serialization.FlowSerializationException: > java.lang.ClassCastException: > org.apache.nifi.web.api.dto.TemplateDTO$JaxbAccessorM_getDescription_setDescription_java_lang_String > cannot be cast to com.sun.xml.internal.bind.v2.runtime.reflect.Accessor > at > org.apache.nifi.persistence.StandardXMLFlowConfigurationDAO.save(StandardXMLFlowConfigurationDAO.java:143) > at > org.apache.nifi.controller.StandardFlowService.createDataFlowFromController(StandardFlowService.java:607) > at > org.apache.nifi.controller.StandardFlowService.createDataFlowFromController(StandardFlowService.java:100) > at > org.apache.nifi.cluster.coordination.node.NodeClusterCoordinator$2.run(NodeClusterCoordinator.java:706) > at java.lang.Thread.run(Thread.java:748) > Caused by: > org.apache.nifi.controller.serialization.FlowSerializationException: > java.lang.ClassCastException: > org.apache.nifi.web.api.dto.TemplateDTO$JaxbAccessorM_getDescription_setDescription_java_lang_String > cannot be cast to com.sun.xml.internal.bind.v2.runtime.reflect.Accessor > at > org.apache.nifi.controller.serialization.StandardFlowSerializer.addTemplate(StandardFlowSerializer.java:546) > at > org.apache.nifi.controller.serialization.StandardFlowSerializer.addProcessGroup(StandardFlowSerializer.java:203) > at > org.apache.nifi.controller.serialization.StandardFlowSerializer.addProcessGroup(StandardFlowSerializer.java:187) > at > org.apache.nifi.controller.serialization.StandardFlowSerializer.addProcessGroup(StandardFlowSerializer.java:187) > at > org.apache.nifi.controller.serialization.StandardFlowSerializer.serialize(StandardFlowSerializer.java:97) > at > org.apache.nifi.controller.FlowController.serialize(FlowController.java:1544) > at > org.apache.nifi.persistence.StandardXMLFlowConfigurationDAO.save(StandardXMLFlowConfigurationDAO.java:141) > ... 
4 common frames omitted > Caused by: java.lang.ClassCastException: > org.apache.nifi.web.api.dto.TemplateDTO$JaxbAccessorM_getDescription_setDescription_java_lang_String > cannot be cast to com.sun.xml.internal.bind.v2.runtime.reflect.Accessor > at > com.sun.xml.internal.bind.v2.runtime.reflect.opt.OptimizedAccessorFactory.instanciate(OptimizedAccessorFactory.java:190) > at > com.sun.xml.internal.bind.v2.runtime.reflect.opt.OptimizedAccessorFactory.get(OptimizedAccessorFactory.java:129) > at > com.sun.xml.internal.bind.v2.runtime.reflect.Accessor$GetterSetterReflection.optimize(Accessor.java:388) > at > com.sun.xml.internal.bind.v2.runtime.property.SingleElementLeafProperty.(SingleElementLeafProperty.java:77) > at sun.reflect.GeneratedConstructorAccessor435.newInstance(Unknown > Source) > at > sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) > at java.lang.reflect.Constructor.newInstance(Constructor.java:423) > at > com.sun.xml.internal.bind.v2.runtime.property.PropertyFactory.create(PropertyFactory.java:113) > at > com.sun.xml.internal.bind.v2.runtime.ClassBeanInfoImpl.(ClassBeanInfoImpl.java:166) > at > com.sun.xml.internal.bind.v2.runtime.JAXBContextImpl.getOrCreate(JAXBContextImpl.java:488) > at > com.sun.xml.internal.bind.v2.runtime.JAXBContextImpl.(JAXBContextImpl.java:305) -- This message was sent by Atlassian JIRA (v6.4.14#64029)
[jira] [Commented] (NIFI-4543) Improve HBase processors provenance transit URL
[ https://issues.apache.org/jira/browse/NIFI-4543?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16234010#comment-16234010 ] ASF GitHub Bot commented on NIFI-4543: -- Github user ijokarumawak commented on the issue: https://github.com/apache/nifi/pull/2237 @pvillard31 @MikeThomsen Thanks for the comments! `connection.getAdmin().getClusterStatus().getMaster()` looks promising. I will test that approach and update PR. > Improve HBase processors provenance transit URL > --- > > Key: NIFI-4543 > URL: https://issues.apache.org/jira/browse/NIFI-4543 > Project: Apache NiFi > Issue Type: Improvement > Components: Extensions >Reporter: Koji Kawamura >Assignee: Koji Kawamura >Priority: Major > > HBase-related processors report NiFi provenance events with transit URLs in a > format such as 'hbase://tablename/rowid'. However, that URL is not descriptive > enough if a NiFi instance interacts with multiple HBase clusters that have the same > table names. > The transit URL of the HBase processors should include the host information of the cluster > they operate against, so that the URL can identify a specific HBase cluster. > Target Processors: > * FetchHBaseRow > * GetHBase > * PutHBaseCell > * PutHBaseJSON > * PutHBaseRecord -- This message was sent by Atlassian JIRA (v6.4.14#64029)
[GitHub] nifi issue #2237: NIFI-4543: Improve HBase processors provenance transit URL
Github user ijokarumawak commented on the issue: https://github.com/apache/nifi/pull/2237 @pvillard31 @MikeThomsen Thanks for the comments! `connection.getAdmin().getClusterStatus().getMaster()` looks promising. I will test that approach and update PR. ---
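Based on the comment above, here is a hedged sketch of how the master address could be folded into the transit URL. The URL shape and method name are assumptions drawn from the ticket, not the merged implementation.

```
import java.io.IOException;

import org.apache.hadoop.hbase.ServerName;
import org.apache.hadoop.hbase.client.Admin;
import org.apache.hadoop.hbase.client.Connection;

class TransitUriSketch {

    // Builds something like hbase://master-host:16000/tableName/rowId so the
    // provenance event identifies which HBase cluster was involved.
    String buildTransitUri(final Connection connection, final String tableName, final String rowId)
            throws IOException {
        try (Admin admin = connection.getAdmin()) {
            final ServerName master = admin.getClusterStatus().getMaster();
            return "hbase://" + master.getHostname() + ":" + master.getPort()
                    + "/" + tableName + "/" + rowId;
        }
    }
}
```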
[jira] [Updated] (NIFI-4490) Repetitive events detected on changing the incorrect driver name in CaptureChangeMySQL
[ https://issues.apache.org/jira/browse/NIFI-4490?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Koji Kawamura updated NIFI-4490: Status: Patch Available (was: In Progress) > Repetitive events detected on changing the incorrect driver name in > CaptureChangeMySQL > -- > > Key: NIFI-4490 > URL: https://issues.apache.org/jira/browse/NIFI-4490 > Project: Apache NiFi > Issue Type: Bug >Reporter: Matt Burgess >Assignee: Koji Kawamura >Priority: Major > > Followed the below steps > 1. In CaptureChangeMySQL, set the driver name to "com.mysql.some.driver" > 2. Create a table in the test database and start the CaptureChangeMySQL > processor > 3. "Error creating binlog enrichment JDBC connection" error message is thrown > which is correct. > This completes the first test. Next do the following > 1. Correct the driver name > 2. Reset the binlog events by running "RESET MASTER" > 3. Clear the state of the CaptureChangeMySQL processor. > 4. Start the CaptureChangeMySQL processor. > Result : The Create statement which was triggered when the driver was invalid > is detected thrice in the tests that I carried out by CaptureChangeMySQL > processor > Expected : The event should not have been detected at all since the binlogs > have been deleted, state reset and the database pattern hardcoded to a single > db -- This message was sent by Atlassian JIRA (v6.4.14#64029)
[jira] [Commented] (NIFI-4490) Repetitive events detected on changing the incorrect driver name in CaptureChangeMySQL
[ https://issues.apache.org/jira/browse/NIFI-4490?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16234002#comment-16234002 ] ASF GitHub Bot commented on NIFI-4490: -- GitHub user ijokarumawak opened a pull request: https://github.com/apache/nifi/pull/2244 NIFI-4490: Ensure driver settings are correct before connecting binlog Thank you for submitting a contribution to Apache NiFi. In order to streamline the review of the contribution we ask you to ensure the following steps have been taken: ### For all changes: - [x] Is there a JIRA ticket associated with this PR? Is it referenced in the commit message? - [x] Does your PR title start with NIFI- where is the JIRA number you are trying to resolve? Pay particular attention to the hyphen "-" character. - [x] Has your PR been rebased against the latest commit within the target branch (typically master)? - [x] Is your initial contribution a single, squashed commit? ### For code changes: - [ ] Have you ensured that the full suite of tests is executed via mvn -Pcontrib-check clean install at the root nifi folder? - [x] Have you written or updated unit tests to verify your changes? - [ ] If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under [ASF 2.0](http://www.apache.org/legal/resolved.html#category-a)? - [ ] If applicable, have you updated the LICENSE file, including the main LICENSE file under nifi-assembly? - [ ] If applicable, have you updated the NOTICE file, including the main NOTICE file found under nifi-assembly? - [ ] If adding new Properties, have you added .displayName in addition to .name (programmatic access) for each of the new properties? ### For documentation related changes: - [ ] Have you ensured that format looks appropriate for the output in which it is rendered? ### Note: Please ensure that once the PR is submitted, you check travis-ci for build issues and submit an update to your PR as soon as possible. You can merge this pull request into a Git repository by running: $ git pull https://github.com/ijokarumawak/nifi nifi-4490 Alternatively you can review and apply these changes as the patch at: https://github.com/apache/nifi/pull/2244.patch To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message: This closes #2244 commit 6c929c9b56fd9c7821d7258d8de3da2495bdc11b Author: Koji Kawamura Date: 2017-11-01T12:28:04Z NIFI-4490: Ensure driver settings are correct before connecting binlog > Repetitive events detected on changing the incorrect driver name in > CaptureChangeMySQL > -- > > Key: NIFI-4490 > URL: https://issues.apache.org/jira/browse/NIFI-4490 > Project: Apache NiFi > Issue Type: Bug >Reporter: Matt Burgess >Assignee: Koji Kawamura >Priority: Major > > Followed the below steps > 1. In CaptureChangeMySQL, set the driver name to "com.mysql.some.driver" > 2. Create a table in the test database and start the CaptureChangeMySQL > processor > 3. "Error creating binlog enrichment JDBC connection" error message is thrown > which is correct. > This completes the first test. Next do the following > 1. Correct the driver name > 2. Reset the binlog events by running "RESET MASTER" > 3. Clear the state of the CaptureChangeMySQL processor. > 4. Start the CaptureChangeMySQL processor. 
> Result : The Create statement which was triggered when the driver was invalid > is detected thrice in the tests that I carried out by CaptureChangeMySQL > processor > Expected : The event should not have been detected at all since the binlogs > have been deleted, state reset and the database pattern hardcoded to a single > db -- This message was sent by Atlassian JIRA (v6.4.14#64029)
[GitHub] nifi pull request #2244: NIFI-4490: Ensure driver settings are correct befor...
GitHub user ijokarumawak opened a pull request: https://github.com/apache/nifi/pull/2244 NIFI-4490: Ensure driver settings are correct before connecting binlog Thank you for submitting a contribution to Apache NiFi. In order to streamline the review of the contribution we ask you to ensure the following steps have been taken: ### For all changes: - [x] Is there a JIRA ticket associated with this PR? Is it referenced in the commit message? - [x] Does your PR title start with NIFI- where is the JIRA number you are trying to resolve? Pay particular attention to the hyphen "-" character. - [x] Has your PR been rebased against the latest commit within the target branch (typically master)? - [x] Is your initial contribution a single, squashed commit? ### For code changes: - [ ] Have you ensured that the full suite of tests is executed via mvn -Pcontrib-check clean install at the root nifi folder? - [x] Have you written or updated unit tests to verify your changes? - [ ] If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under [ASF 2.0](http://www.apache.org/legal/resolved.html#category-a)? - [ ] If applicable, have you updated the LICENSE file, including the main LICENSE file under nifi-assembly? - [ ] If applicable, have you updated the NOTICE file, including the main NOTICE file found under nifi-assembly? - [ ] If adding new Properties, have you added .displayName in addition to .name (programmatic access) for each of the new properties? ### For documentation related changes: - [ ] Have you ensured that format looks appropriate for the output in which it is rendered? ### Note: Please ensure that once the PR is submitted, you check travis-ci for build issues and submit an update to your PR as soon as possible. You can merge this pull request into a Git repository by running: $ git pull https://github.com/ijokarumawak/nifi nifi-4490 Alternatively you can review and apply these changes as the patch at: https://github.com/apache/nifi/pull/2244.patch To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message: This closes #2244 commit 6c929c9b56fd9c7821d7258d8de3da2495bdc11b Author: Koji Kawamura Date: 2017-11-01T12:28:04Z NIFI-4490: Ensure driver settings are correct before connecting binlog ---
[jira] [Assigned] (NIFI-4490) Repetitive events detected on changing the incorrect driver name in CaptureChangeMySQL
[ https://issues.apache.org/jira/browse/NIFI-4490?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Koji Kawamura reassigned NIFI-4490: --- Assignee: Koji Kawamura > Repetitive events detected on changing the incorrect driver name in > CaptureChangeMySQL > -- > > Key: NIFI-4490 > URL: https://issues.apache.org/jira/browse/NIFI-4490 > Project: Apache NiFi > Issue Type: Bug >Reporter: Matt Burgess >Assignee: Koji Kawamura >Priority: Major > > Followed the below steps > 1. In CaptureChangeMySQL, set the driver name to "com.mysql.some.driver" > 2. Create a table in the test database and start the CaptureChangeMySQL > processor > 3. "Error creating binlog enrichment JDBC connection" error message is thrown > which is correct. > This completes the first test. Next do the following > 1. Correct the driver name > 2. Reset the binlog events by running "RESET MASTER" > 3. Clear the state of the CaptureChangeMySQL processor. > 4. Start the CaptureChangeMySQL processor. > Result : The Create statement which was triggered when the driver was invalid > is detected thrice in the tests that I carried out by CaptureChangeMySQL > processor > Expected : The event should not have been detected at all since the binlogs > have been deleted, state reset and the database pattern hardcoded to a single > db -- This message was sent by Atlassian JIRA (v6.4.14#64029)
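The commit above ("Ensure driver settings are correct before connecting binlog") suggests validating the JDBC side before the binlog client is started. Below is a generic sketch of that kind of up-front check, assuming plain DriverManager usage rather than NiFi's actual driver-loading code.

```
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

class JdbcSettingsCheckSketch {

    // Fail fast if the configured driver class cannot be loaded or a connection
    // cannot be opened, before any binlog client is created.
    void verifyJdbcSettings(final String driverClassName, final String jdbcUrl,
                            final String user, final String password) throws SQLException {
        try {
            Class.forName(driverClassName);
        } catch (final ClassNotFoundException e) {
            throw new SQLException("Unable to load JDBC driver class " + driverClassName, e);
        }
        try (Connection ignored = DriverManager.getConnection(jdbcUrl, user, password)) {
            // connection opened successfully; safe to proceed to the binlog connection
        }
    }
}
```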
[GitHub] nifi-minifi pull request #97: MINIFI-409: Skipping testMergeJournalsEmptyJou...
GitHub user jzonthemtn opened a pull request: https://github.com/apache/nifi-minifi/pull/97 MINIFI-409: Skipping testMergeJournalsEmptyJournal test on Windows. Thank you for submitting a contribution to Apache NiFi - MiNiFi. In order to streamline the review of the contribution we ask you to ensure the following steps have been taken: ### For all changes: - [X] Is there a JIRA ticket associated with this PR? Is it referenced in the commit message? - [X] Does your PR title start with MINIFI- where is the JIRA number you are trying to resolve? Pay particular attention to the hyphen "-" character. - [X] Has your PR been rebased against the latest commit within the target branch (typically master)? - [X] Is your initial contribution a single, squashed commit? ### For code changes: - [X] Have you ensured that the full suite of tests is executed via mvn -Pcontrib-check clean install at the root nifi-minifi folder? - [ ] Have you written or updated unit tests to verify your changes? - [ ] If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under [ASF 2.0](http://www.apache.org/legal/resolved.html#category-a)? - [ ] If applicable, have you updated the LICENSE file, including the main LICENSE file under minifi-assembly? - [ ] If applicable, have you updated the NOTICE file, including the main NOTICE file found under minifi-assembly? ### For documentation related changes: - [ ] Have you ensured that format looks appropriate for the output in which it is rendered? ### Note: Please ensure that once the PR is submitted, you check travis-ci for build issues and submit an update to your PR as soon as possible. You can merge this pull request into a Git repository by running: $ git pull https://github.com/jzonthemtn/nifi-minifi MINIFI-409 Alternatively you can review and apply these changes as the patch at: https://github.com/apache/nifi-minifi/pull/97.patch To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message: This closes #97 commit 1fc400bbfdaa388ac8b8708b8d3fc92a84388d9b Author: jzonthemtn Date: 2017-11-01T11:57:01Z MINIFI-409: Skipping testMergeJournalsEmptyJournal test on Windows. ---
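For reference, a minimal sketch of how a JUnit 4 test is skipped on Windows, as the PR title describes; the actual test body and the reason for the skip are omitted because they are not shown in the archive.

```
import org.junit.Assume;
import org.junit.Test;

public class WindowsSkipSketch {

    @Test
    public void testMergeJournalsEmptyJournal() {
        // When the assumption fails (i.e. the OS is Windows), JUnit marks the
        // test as skipped rather than failed.
        Assume.assumeFalse(System.getProperty("os.name").toLowerCase().startsWith("windows"));
        // ... the remainder of the test runs only on non-Windows platforms
    }
}
```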
[jira] [Updated] (NIFI-4505) MapCache/SimpleMapCache/PersistentMapCache: Add keyset method
[ https://issues.apache.org/jira/browse/NIFI-4505?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Brandon DeVries updated NIFI-4505: -- Summary: MapCache/SimpleMapCache/PersistentMapCache: Add keyset method (was: MapCache/SimpleMaCache/PersistentMapCache: Add keyset method) > MapCache/SimpleMapCache/PersistentMapCache: Add keyset method > - > > Key: NIFI-4505 > URL: https://issues.apache.org/jira/browse/NIFI-4505 > Project: Apache NiFi > Issue Type: Improvement >Affects Versions: 1.4.0 >Reporter: Jon Kessler >Priority: Minor > > Suggest adding a keyset method to MapCache and its implementations, as well as > to any client/interface that makes use of a MapCache. -- This message was sent by Atlassian JIRA (v6.4.14#64029)
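A hedged sketch of the suggested addition; the interface name, key type, and checked exception below are assumptions for illustration rather than a committed NiFi API.

```
import java.io.IOException;
import java.nio.ByteBuffer;
import java.util.Set;

// Illustrative only: a keySet-style accessor that a MapCache implementation
// (e.g. SimpleMapCache, PersistentMapCache) could expose to its clients.
interface MapCacheKeySetSketch {

    /**
     * @return the set of keys currently held by the cache
     * @throws IOException if the backing store cannot be read
     */
    Set<ByteBuffer> keySet() throws IOException;
}
```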