[GitHub] nifi pull request: NIFI-1613 Initial version, try to improve conve...

2016-03-21 Thread ToivoAdams
GitHub user ToivoAdams opened a pull request:

https://github.com/apache/nifi/pull/293

NIFI-1613 Initial version, try to improve conversion for different SQ…

Initial version.
1. New method createSqlStringValue(), a placeholder for future logic
2. New test testCreateSqlStringValue()
3. Refactored TestConvertJSONToSQL.
  Setting up connection pooling is an expensive operation,
  so let's do it only once and reuse the MockDBCPService in each test (see the sketch below).
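
A minimal JUnit 4 sketch of that reuse pattern. The MockDBCPService constructor, the "dbcp" identifier and the CONNECTION_POOL property name are assumptions for illustration, not the exact patch:

    import org.apache.nifi.util.TestRunner;
    import org.apache.nifi.util.TestRunners;
    import org.junit.BeforeClass;
    import org.junit.Test;

    // assumes this sketch lives next to the existing test (same package as ConvertJSONToSQL)
    public class TestConvertJSONToSQLSketch {

        // assumed helper: the mock DBCP controller service already used by the existing test
        private static MockDBCPService service;

        @BeforeClass
        public static void setupSharedService() throws Exception {
            // expensive part: build the pool/service a single time for the whole class
            service = new MockDBCPService("target/db");
        }

        @Test
        public void testInsert() throws Exception {
            final TestRunner runner = TestRunners.newTestRunner(ConvertJSONToSQL.class);
            runner.addControllerService("dbcp", service);
            runner.enableControllerService(service);
            runner.setProperty(ConvertJSONToSQL.CONNECTION_POOL, "dbcp"); // assumed property constant
            // the rest of the test reuses the shared service instead of creating a new one
        }
    }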


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/ToivoAdams/nifi nifi-1613

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/293.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #293


commit 7c2eea7902f81456a880853d782615c8874a38a8
Author: Toivo Adams 
Date:   2016-03-20T19:13:15Z

NIFI-1613 Initial version, try to improve conversion for different SQL 
types. New test and refactored existing test to reuse DBCP service.




---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[GitHub] nifi pull request: NIFI-1685 Optimize database Junit tests to redu...

2016-03-25 Thread ToivoAdams
GitHub user ToivoAdams opened a pull request:

https://github.com/apache/nifi/pull/304

NIFI-1685 Optimize database Junit tests to reduce execution time.

TestPutSQL and TestJdbcCommon are now optimized.
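
A common way to cut execution time in such database tests is to create the embedded Derby database once per test class instead of once per test. A rough JUnit 4 sketch of that pattern (database path and table are made up for illustration, not the actual patch):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    import org.junit.AfterClass;
    import org.junit.BeforeClass;
    import org.junit.Test;

    public class SharedDerbySketch {

        private static Connection conn;

        @BeforeClass
        public static void createDatabaseOnce() throws Exception {
            // creating the embedded Derby database is the slow part, so do it only once
            Class.forName("org.apache.derby.jdbc.EmbeddedDriver");
            conn = DriverManager.getConnection("jdbc:derby:target/testdb;create=true");
            try (Statement stmt = conn.createStatement()) {
                stmt.executeUpdate("CREATE TABLE PERSONS (ID INT PRIMARY KEY, NAME VARCHAR(100))");
            }
        }

        @AfterClass
        public static void closeConnection() throws Exception {
            conn.close();
        }

        @Test
        public void eachTestReusesTheDatabase() throws Exception {
            try (Statement stmt = conn.createStatement()) {
                stmt.executeUpdate("DELETE FROM PERSONS"); // reset rows instead of rebuilding the database
            }
        }
    }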


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/ToivoAdams/nifi NIFI-1685

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/304.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #304


commit a01ad7131e77e11490659ae62abced79980dc622
Author: Toivo Adams 
Date:   2016-03-25T16:51:00Z

NIFI-1685 Optimize database Junit tests to reduce execution time.




---


[GitHub] nifi pull request: NIFI-1685 Optimize database Junit tests to redu...

2016-03-25 Thread ToivoAdams
Github user ToivoAdams commented on a diff in the pull request:

https://github.com/apache/nifi/pull/304#discussion_r57464949
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/test/java/org/apache/nifi/processors/standard/TestPutSQL.java
 ---
@@ -144,6 +168,7 @@ public void testInsertWithGeneratedKeys() throws 
InitializationException, Proces
 @Test
 public void testFailInMiddleWithBadStatement() throws 
InitializationException, ProcessException, SQLException, IOException {
 final TestRunner runner = TestRunners.newTestRunner(PutSQL.class);
+/
--- End diff --

You are right, my bad, I forgot to remove it.


---


[GitHub] nifi pull request: nifi-1442 Strategy should not hold more than 5 ...

2016-03-27 Thread ToivoAdams
GitHub user ToivoAdams opened a pull request:

https://github.com/apache/nifi/pull/306

nifi-1442 Strategy should not hold more than 5 Bulletins in memory

NodeBulletinProcessingStrategy now keeps at most 5 bulletins; a test is included.
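
As a rough illustration of the capping behaviour (a generic bounded buffer, not the actual NodeBulletinProcessingStrategy code):

    import java.util.ArrayDeque;
    import java.util.ArrayList;
    import java.util.Deque;
    import java.util.List;

    class BoundedBulletinBuffer<T> {

        private static final int MAX_BULLETINS = 5;
        private final Deque<T> bulletins = new ArrayDeque<>();

        synchronized void update(T bulletin) {
            bulletins.addLast(bulletin);
            while (bulletins.size() > MAX_BULLETINS) {
                // drop the oldest entry so no more than 5 bulletins are ever held
                bulletins.removeFirst();
            }
        }

        synchronized List<T> getBulletins() {
            return new ArrayList<>(bulletins);
        }
    }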

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/ToivoAdams/nifi nifi-1442

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/306.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #306


commit 64f3ac705d10f730d2277d44ed8d3e3e4d4fb8b3
Author: Toivo Adams 
Date:   2016-03-27T18:35:40Z

nifi-1442 Strategy should not hold more than 5 Bulletins in memory




---


[GitHub] nifi pull request: Nifi 1214

2016-04-03 Thread ToivoAdams
GitHub user ToivoAdams opened a pull request:

https://github.com/apache/nifi/pull/321

Nifi 1214

NiFi-1214 Mock Framework should allow order-independent assumptions on 
FlowFiles, First version

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/ToivoAdams/nifi nifi-1214

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/321.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #321


commit 95525e12a6ada4e48f170a30aacf1cc2f3da6bd3
Author: Toivo Adams 
Date:   2016-04-02T17:58:43Z

nifi-1214 Initial version

commit 5e704af5763685fd5ad7577a6f69650bcf738e9f
Author: Toivo Adams 
Date:   2016-04-03T13:31:02Z

nifi-1214 Mock Framework should allow order-independent assumptions on 
FlowFiles, First version

commit f9265071883a03da42541eaae40a0730ec94f492
Author: Toivo Adams 
Date:   2016-04-03T13:44:09Z

nifi-1214 Mock Framework should allow order-independent assumptions on 
FlowFiles, First version




---


[GitHub] nifi pull request: Nifi 1214

2016-04-18 Thread ToivoAdams
Github user ToivoAdams commented on a diff in the pull request:

https://github.com/apache/nifi/pull/321#discussion_r60099883
  
--- Diff: 
nifi-mock/src/main/java/org/apache/nifi/util/verifier/ConditionsBuilder.java ---
@@ -0,0 +1,52 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.util.verifier;
+
+import java.util.ArrayList;
+
+public class ConditionsBuilder {
+
+protected final ArrayList conditions = new ArrayList<>();
+
+public ConditionsBuilder(Condition condition) {
+conditions.add(condition);
+}
+
+static public ConditionsBuilder attributeEqual(String name, String 
value) {
+return new ConditionsBuilder(new AttributeEqual(name,value));
+}
+
+static public ConditionsBuilder contentEqual(String content) {
+return new ConditionsBuilder(new ContentEqual(content));
+}
+
+public ConditionsBuilder andAttributeEqual(String name, String value) {
+conditions.add(new AttributeEqual(name,value));
+return this;
+}
+
+public ConditionsBuilder andContentEqual(String content) {
+conditions.add(new ContentEqual(content));
+return this;
+}
+
+public ArrayList getConditions() {
+return conditions;
+}
+
--- End diff --

You are right.
But we are moving to Java 8, and Predicate will be used instead of ConditionsBuilder,
so this code is no longer relevant.

Thanks
toivo
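
For illustration, the same conditions expressed as Java 8 predicates over MockFlowFile (a hedged sketch; the final NIFI-1214 API may differ):

    import java.nio.charset.StandardCharsets;
    import java.util.function.Predicate;

    import org.apache.nifi.util.MockFlowFile;

    class ConditionSketch {

        static Predicate<MockFlowFile> attributeEqual(String name, String value) {
            return ff -> value.equals(ff.getAttribute(name));
        }

        static Predicate<MockFlowFile> contentEqual(String content) {
            return ff -> new String(ff.toByteArray(), StandardCharsets.UTF_8).equals(content);
        }
    }

    // usage: attributeEqual("uuid", "123").and(contentEqual("Hello"))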


---


[GitHub] nifi pull request: Nifi 1214

2016-04-20 Thread ToivoAdams
Github user ToivoAdams commented on a diff in the pull request:

https://github.com/apache/nifi/pull/321#discussion_r60382818
  
--- Diff: nifi-mock/src/main/java/org/apache/nifi/util/TestRunner.java ---
@@ -865,4 +866,21 @@
  * @return the State Manager that is used to store and retrieve state 
for the given controller service
  */
 MockStateManager getStateManager(ControllerService service);
+
+/**
+ *  Maybe we should add possibility to be more precise:
+ *  "All FlowFiles must meet all conditions"
+ *  or
+ *  "At least one FlowFile must meet all conditions"
+ *  or
+ *  "Each FlowFile should meet at least one condition"
+ *
+ *  Current functionality is: "Each FlowFile should meet at least one 
condition"
+ *  So instead of assertAllConditionsMet we should use something like 
assertFlowFileMeetAnyCondition
+ *  Or add extra parameter which specifies how FlowFile must meet 
conditions.
+ *
+ */
+void assertAllConditionsMet(final String relationshipName, 
ConditionsBuilder... andContentEqual);
--- End diff --

Predicates are a good idea.

“a very valuable method/test would be to indicate that all conditions are met exactly once.”
You mean exactly one (only one) FlowFile meets all conditions?
runner.assertExactlyOnceAllConditionsMet(MY_RELATIONSHIP, ...
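
A hedged sketch of what such an "exactly once" check could look like on top of predicates (the method name and signature are hypothetical, not a committed API):

    import java.util.List;
    import java.util.function.Predicate;

    import org.apache.nifi.util.MockFlowFile;

    import static org.junit.Assert.assertEquals;

    class ExactlyOnceSketch {

        // passes only if exactly one FlowFile satisfies all of the given conditions
        @SafeVarargs
        static void assertExactlyOneMeetsAllConditions(List<MockFlowFile> flowFiles,
                                                       Predicate<MockFlowFile>... conditions) {
            long matching = flowFiles.stream()
                    .filter(ff -> {
                        for (Predicate<MockFlowFile> condition : conditions) {
                            if (!condition.test(ff)) {
                                return false;
                            }
                        }
                        return true;
                    })
                    .count();
            assertEquals("expected exactly one FlowFile to meet all conditions", 1, matching);
        }
    }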


---


[GitHub] nifi pull request: nifi-1214b Mock Framework should allow order-in...

2016-04-28 Thread ToivoAdams
GitHub user ToivoAdams opened a pull request:

https://github.com/apache/nifi/pull/388

nifi-1214b Mock Framework should allow order-independent assumptions …

nifi-1214b Mock Framework should allow order-independent assumptions on FlowFiles. This replaces the previous nifi-1214.

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/ToivoAdams/nifi nifi-1214b

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/388.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #388


commit 11059e3c5da101059498dccd95da58ffaecfdab6
Author: Toivo Adams 
Date:   2016-04-28T15:50:35Z

nifi-1214b Mock Framework should allow order-independent assumptions on 
FlowFiles. This replaces previous nifi-1214




---


[GitHub] nifi pull request: NIFI-1280 Create FilterCSVColumns Processor.

2016-05-07 Thread ToivoAdams
GitHub user ToivoAdams opened a pull request:

https://github.com/apache/nifi/pull/420

NIFI-1280 Create FilterCSVColumns Processor.

First version.
An SQL select statement is used to specify how the CSV data should be transformed.
A modified Calcite CSV adapter is used for SQL execution.
Some code is borrowed from HiveJdbcCommon (the convertToCsvStream() methods).
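
A hedged sketch of how the processor might be driven from a unit test, assuming the property name shown in the PropertyDescriptor ("SQL select statement") and the 'success' relationship named in the description; the query and CSV path come from the existing tests:

    import java.nio.file.Paths;

    import org.apache.nifi.processors.standard.FilterCSVColumns;
    import org.apache.nifi.util.TestRunner;
    import org.apache.nifi.util.TestRunners;

    public class FilterCSVColumnsUsageSketch {

        public static void main(String[] args) throws Exception {
            final TestRunner runner = TestRunners.newTestRunner(FilterCSVColumns.class);
            runner.setProperty("SQL select statement",
                    "select first_name, last_name, city from CSV.A where city='New York'");
            runner.enqueue(Paths.get("src/test/resources/TestFilterCSVColumns/US500.csv"));
            runner.run();
            // transformed CSV should end up on the 'success' relationship
            runner.assertAllFlowFilesTransferred("success", 1);
        }
    }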

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/ToivoAdams/nifi nifi-1280

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/420.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #420


commit 72cdeb684cde342d6b240d496c0553afa78a4a6f
Author: Toivo Adams 
Date:   2016-05-07T09:29:15Z

NIFI-1280 Create FilterCSVColumns Processor.




---


[GitHub] nifi pull request: NIFI-1280 Create FilterCSVColumns Processor.

2016-05-11 Thread ToivoAdams
Github user ToivoAdams commented on a diff in the pull request:

https://github.com/apache/nifi/pull/420#discussion_r62806092
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/test/java/org/apache/nifi/processors/standard/TestFilterCSVColumns.java
 ---
@@ -0,0 +1,117 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard;
+
+import static org.junit.Assert.assertEquals;
+
+import java.io.FileInputStream;
+import java.io.IOException;
+import java.io.InputStream;
+import java.nio.file.Path;
+import java.nio.file.Paths;
+import java.sql.ResultSet;
+import java.sql.SQLException;
+import java.util.List;
+
+import org.apache.nifi.reporting.InitializationException;
+import org.apache.nifi.util.MockFlowFile;
+import org.apache.nifi.util.TestRunner;
+import org.apache.nifi.util.TestRunners;
+import org.junit.Test;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+public class TestFilterCSVColumns {
+
+private static final Logger LOGGER;
+
+static {
+System.setProperty("org.slf4j.simpleLogger.defaultLogLevel", 
"info");
+System.setProperty("org.slf4j.simpleLogger.showDateTime", "true");
+System.setProperty("org.slf4j.simpleLogger.log.nifi.io.nio", 
"debug");
+
System.setProperty("org.slf4j.simpleLogger.log.nifi.processors.standard.FilterCSVColumns",
 "debug");
+
System.setProperty("org.slf4j.simpleLogger.log.nifi.processors.standard.TestFilterCSVColumns",
 "debug");
+LOGGER = LoggerFactory.getLogger(TestFilterCSVColumns.class);
+}
+
+@Test
+public void testTransformSimple() throws InitializationException, 
IOException, SQLException {
+String sql = "select first_name, last_name, company_name, address, 
city from CSV.A where city='New York'";
+
+Path inpath = 
Paths.get("src/test/resources/TestFilterCSVColumns/US500.csv");
+InputStream in = new FileInputStream(inpath.toFile());
+
+ResultSet resultSet = FilterCSVColumns.transform(in, sql);
+
+int nrofColumns = resultSet.getMetaData().getColumnCount();
+
+for (int i = 1; i <= nrofColumns; i++) {
+System.out.print(resultSet.getMetaData().getColumnLabel(i) + " 
 ");
+}
+System.out.println();
+
+while (resultSet.next()) {
+for (int i = 1; i <= nrofColumns; i++) {
+System.out.print(resultSet.getString(i)+ "  ");
+}
+System.out.println();
+}
+}
+
+@Test
+public void testTransformCalc() throws InitializationException, 
IOException, SQLException {
+String sql = "select ID, AMOUNT1+AMOUNT2+AMOUNT3 as TOTAL from 
CSV.A where ID=100";
+
+Path inpath = 
Paths.get("src/test/resources/TestFilterCSVColumns/Numeric.csv");
+InputStream in = new FileInputStream(inpath.toFile());
+
+ResultSet resultSet = FilterCSVColumns.transform(in, sql);
+
+int nrofColumns = resultSet.getMetaData().getColumnCount();
+
+for (int i = 1; i <= nrofColumns; i++) {
+System.out.print(resultSet.getMetaData().getColumnLabel(i) + " 
 ");
+}
+System.out.println();
+
+while (resultSet.next()) {
+for (int i = 1; i <= nrofColumns; i++) {
+System.out.print(resultSet.getString(i)+ "  ");
+}
+double total = resultSet.getDouble("TOTAL");
+System.out.println();
+assertEquals(90.75, total, 0.0001);
+}
+}
+
+@Test
+public void testSimpleTy

[GitHub] nifi pull request: NIFI-1280 Create FilterCSVColumns Processor.

2016-05-11 Thread ToivoAdams
Github user ToivoAdams commented on a diff in the pull request:

https://github.com/apache/nifi/pull/420#discussion_r62806105
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/test/java/org/apache/nifi/processors/standard/TestFilterCSVColumns.java
 ---
@@ -0,0 +1,117 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard;
+
+import static org.junit.Assert.assertEquals;
+
+import java.io.FileInputStream;
+import java.io.IOException;
+import java.io.InputStream;
+import java.nio.file.Path;
+import java.nio.file.Paths;
+import java.sql.ResultSet;
+import java.sql.SQLException;
+import java.util.List;
+
+import org.apache.nifi.reporting.InitializationException;
+import org.apache.nifi.util.MockFlowFile;
+import org.apache.nifi.util.TestRunner;
+import org.apache.nifi.util.TestRunners;
+import org.junit.Test;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+public class TestFilterCSVColumns {
+
+private static final Logger LOGGER;
+
+static {
+System.setProperty("org.slf4j.simpleLogger.defaultLogLevel", 
"info");
+System.setProperty("org.slf4j.simpleLogger.showDateTime", "true");
+System.setProperty("org.slf4j.simpleLogger.log.nifi.io.nio", 
"debug");
+
System.setProperty("org.slf4j.simpleLogger.log.nifi.processors.standard.FilterCSVColumns",
 "debug");
+
System.setProperty("org.slf4j.simpleLogger.log.nifi.processors.standard.TestFilterCSVColumns",
 "debug");
+LOGGER = LoggerFactory.getLogger(TestFilterCSVColumns.class);
+}
+
+@Test
+public void testTransformSimple() throws InitializationException, 
IOException, SQLException {
+String sql = "select first_name, last_name, company_name, address, 
city from CSV.A where city='New York'";
+
+Path inpath = 
Paths.get("src/test/resources/TestFilterCSVColumns/US500.csv");
+InputStream in = new FileInputStream(inpath.toFile());
+
+ResultSet resultSet = FilterCSVColumns.transform(in, sql);
+
+int nrofColumns = resultSet.getMetaData().getColumnCount();
+
+for (int i = 1; i <= nrofColumns; i++) {
+System.out.print(resultSet.getMetaData().getColumnLabel(i) + " 
 ");
+}
+System.out.println();
+
+while (resultSet.next()) {
+for (int i = 1; i <= nrofColumns; i++) {
+System.out.print(resultSet.getString(i)+ "  ");
+}
+System.out.println();
--- End diff --

Right, I will fix.


---


[GitHub] nifi pull request: NIFI-1280 Create FilterCSVColumns Processor.

2016-05-11 Thread ToivoAdams
Github user ToivoAdams commented on a diff in the pull request:

https://github.com/apache/nifi/pull/420#discussion_r62806135
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/test/java/org/apache/nifi/processors/standard/TestFilterCSVColumns.java
 ---
@@ -0,0 +1,117 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard;
+
+import static org.junit.Assert.assertEquals;
+
+import java.io.FileInputStream;
+import java.io.IOException;
+import java.io.InputStream;
+import java.nio.file.Path;
+import java.nio.file.Paths;
+import java.sql.ResultSet;
+import java.sql.SQLException;
+import java.util.List;
+
+import org.apache.nifi.reporting.InitializationException;
+import org.apache.nifi.util.MockFlowFile;
+import org.apache.nifi.util.TestRunner;
+import org.apache.nifi.util.TestRunners;
+import org.junit.Test;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+public class TestFilterCSVColumns {
+
+private static final Logger LOGGER;
+
+static {
+System.setProperty("org.slf4j.simpleLogger.defaultLogLevel", 
"info");
+System.setProperty("org.slf4j.simpleLogger.showDateTime", "true");
+System.setProperty("org.slf4j.simpleLogger.log.nifi.io.nio", 
"debug");
+
System.setProperty("org.slf4j.simpleLogger.log.nifi.processors.standard.FilterCSVColumns",
 "debug");
+
System.setProperty("org.slf4j.simpleLogger.log.nifi.processors.standard.TestFilterCSVColumns",
 "debug");
+LOGGER = LoggerFactory.getLogger(TestFilterCSVColumns.class);
+}
+
+@Test
+public void testTransformSimple() throws InitializationException, 
IOException, SQLException {
+String sql = "select first_name, last_name, company_name, address, 
city from CSV.A where city='New York'";
+
+Path inpath = 
Paths.get("src/test/resources/TestFilterCSVColumns/US500.csv");
+InputStream in = new FileInputStream(inpath.toFile());
+
+ResultSet resultSet = FilterCSVColumns.transform(in, sql);
+
+int nrofColumns = resultSet.getMetaData().getColumnCount();
+
+for (int i = 1; i <= nrofColumns; i++) {
+System.out.print(resultSet.getMetaData().getColumnLabel(i) + " 
 ");
+}
+System.out.println();
+
+while (resultSet.next()) {
+for (int i = 1; i <= nrofColumns; i++) {
+System.out.print(resultSet.getString(i)+ "  ");
+}
+System.out.println();
+}
+}
+
+@Test
+public void testTransformCalc() throws InitializationException, 
IOException, SQLException {
+String sql = "select ID, AMOUNT1+AMOUNT2+AMOUNT3 as TOTAL from 
CSV.A where ID=100";
+
+Path inpath = 
Paths.get("src/test/resources/TestFilterCSVColumns/Numeric.csv");
+InputStream in = new FileInputStream(inpath.toFile());
+
+ResultSet resultSet = FilterCSVColumns.transform(in, sql);
+
+int nrofColumns = resultSet.getMetaData().getColumnCount();
+
+for (int i = 1; i <= nrofColumns; i++) {
+System.out.print(resultSet.getMetaData().getColumnLabel(i) + " 
 ");
+}
+System.out.println();
+
+while (resultSet.next()) {
+for (int i = 1; i <= nrofColumns; i++) {
+System.out.print(resultSet.getString(i)+ "  ");
--- End diff --

Right, I will fix.


---


[GitHub] nifi pull request: NIFI-1280 Create FilterCSVColumns Processor.

2016-05-11 Thread ToivoAdams
Github user ToivoAdams commented on a diff in the pull request:

https://github.com/apache/nifi/pull/420#discussion_r62806202
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/FilterCSVColumns.java
 ---
@@ -0,0 +1,258 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard;
+
+import static java.sql.Types.CHAR;
+import static java.sql.Types.LONGNVARCHAR;
+import static java.sql.Types.LONGVARCHAR;
+import static java.sql.Types.NCHAR;
+import static java.sql.Types.NVARCHAR;
+import static java.sql.Types.VARCHAR;
+
+import java.io.IOException;
+import java.io.InputStream;
+import java.io.InputStreamReader;
+import java.io.OutputStream;
+import java.io.Reader;
+import java.nio.charset.StandardCharsets;
+import java.sql.Connection;
+import java.sql.DriverManager;
+import java.sql.ResultSet;
+import java.sql.ResultSetMetaData;
+import java.sql.SQLException;
+import java.sql.Statement;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Properties;
+import java.util.Set;
+import java.util.concurrent.TimeUnit;
+
+import org.apache.calcite.adapter.csv.CsvSchemaFactory2;
+import org.apache.calcite.jdbc.CalciteConnection;
+import org.apache.calcite.schema.Schema;
+import org.apache.calcite.schema.SchemaPlus;
+import org.apache.commons.lang3.StringEscapeUtils;
+import org.apache.commons.lang3.StringUtils;
+import org.apache.nifi.annotation.behavior.EventDriven;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.InputRequirement.Requirement;
+import org.apache.nifi.annotation.behavior.SideEffectFree;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.logging.ProcessorLog;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.ProcessorInitializationContext;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.io.StreamCallback;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.stream.io.BufferedInputStream;
+import org.apache.nifi.util.StopWatch;
+
+import com.google.common.collect.ImmutableMap;
+
+@EventDriven
+@SideEffectFree
+@SupportsBatching
+@Tags({"xml", "xslt", "transform"})
+@InputRequirement(Requirement.INPUT_REQUIRED)
+@CapabilityDescription("Filter out specific columns from CSV data. Some 
other transformations are also supported."
++ "Columns can be renamed, simple calculations performed, 
aggregations, etc."
++ "SQL select statement is used to specify how CSV data should be 
transformed."
++ "SQL statement follows standard SQL, some restrictions may 
apply."
++ "Successfully transformed CSV data is routed to the 'success' 
relationship."
++ "If transform fails, the original FlowFile is routed to the 
'failure' relationship")
+public class FilterCSVColumns  extends AbstractProcessor {
+
+public static final PropertyDescriptor SQL_SELECT = new 
PropertyDescriptor.Builder()
+.name("SQL select statement")
+.description("SQL select statement specifies how CSV data 
should be transfor

[GitHub] nifi pull request: NIFI-1280 Create FilterCSVColumns Processor.

2016-05-11 Thread ToivoAdams
Github user ToivoAdams commented on a diff in the pull request:

https://github.com/apache/nifi/pull/420#discussion_r62806150
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/FilterCSVColumns.java
 ---
@@ -0,0 +1,258 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard;
+
+import static java.sql.Types.CHAR;
+import static java.sql.Types.LONGNVARCHAR;
+import static java.sql.Types.LONGVARCHAR;
+import static java.sql.Types.NCHAR;
+import static java.sql.Types.NVARCHAR;
+import static java.sql.Types.VARCHAR;
+
+import java.io.IOException;
+import java.io.InputStream;
+import java.io.InputStreamReader;
+import java.io.OutputStream;
+import java.io.Reader;
+import java.nio.charset.StandardCharsets;
+import java.sql.Connection;
+import java.sql.DriverManager;
+import java.sql.ResultSet;
+import java.sql.ResultSetMetaData;
+import java.sql.SQLException;
+import java.sql.Statement;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Properties;
+import java.util.Set;
+import java.util.concurrent.TimeUnit;
+
+import org.apache.calcite.adapter.csv.CsvSchemaFactory2;
+import org.apache.calcite.jdbc.CalciteConnection;
+import org.apache.calcite.schema.Schema;
+import org.apache.calcite.schema.SchemaPlus;
+import org.apache.commons.lang3.StringEscapeUtils;
+import org.apache.commons.lang3.StringUtils;
+import org.apache.nifi.annotation.behavior.EventDriven;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.InputRequirement.Requirement;
+import org.apache.nifi.annotation.behavior.SideEffectFree;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.logging.ProcessorLog;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.ProcessorInitializationContext;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.io.StreamCallback;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.stream.io.BufferedInputStream;
+import org.apache.nifi.util.StopWatch;
+
+import com.google.common.collect.ImmutableMap;
+
+@EventDriven
+@SideEffectFree
+@SupportsBatching
+@Tags({"xml", "xslt", "transform"})
+@InputRequirement(Requirement.INPUT_REQUIRED)
+@CapabilityDescription("Filter out specific columns from CSV data. Some 
other transformations are also supported."
++ "Columns can be renamed, simple calculations performed, 
aggregations, etc."
++ "SQL select statement is used to specify how CSV data should be 
transformed."
++ "SQL statement follows standard SQL, some restrictions may 
apply."
++ "Successfully transformed CSV data is routed to the 'success' 
relationship."
++ "If transform fails, the original FlowFile is routed to the 
'failure' relationship")
+public class FilterCSVColumns  extends AbstractProcessor {
+
+public static final PropertyDescriptor SQL_SELECT = new 
PropertyDescriptor.Builder()
+.name("SQL select statement")
+.description("SQL select statement specifies how CSV data 
should be transfor

[GitHub] nifi pull request: NIFI-1280 Create FilterCSVColumns Processor.

2016-05-11 Thread ToivoAdams
Github user ToivoAdams commented on a diff in the pull request:

https://github.com/apache/nifi/pull/420#discussion_r62806395
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/FilterCSVColumns.java
 ---
@@ -0,0 +1,258 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard;
+
+import static java.sql.Types.CHAR;
+import static java.sql.Types.LONGNVARCHAR;
+import static java.sql.Types.LONGVARCHAR;
+import static java.sql.Types.NCHAR;
+import static java.sql.Types.NVARCHAR;
+import static java.sql.Types.VARCHAR;
+
+import java.io.IOException;
+import java.io.InputStream;
+import java.io.InputStreamReader;
+import java.io.OutputStream;
+import java.io.Reader;
+import java.nio.charset.StandardCharsets;
+import java.sql.Connection;
+import java.sql.DriverManager;
+import java.sql.ResultSet;
+import java.sql.ResultSetMetaData;
+import java.sql.SQLException;
+import java.sql.Statement;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Properties;
+import java.util.Set;
+import java.util.concurrent.TimeUnit;
+
+import org.apache.calcite.adapter.csv.CsvSchemaFactory2;
+import org.apache.calcite.jdbc.CalciteConnection;
+import org.apache.calcite.schema.Schema;
+import org.apache.calcite.schema.SchemaPlus;
+import org.apache.commons.lang3.StringEscapeUtils;
+import org.apache.commons.lang3.StringUtils;
+import org.apache.nifi.annotation.behavior.EventDriven;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.InputRequirement.Requirement;
+import org.apache.nifi.annotation.behavior.SideEffectFree;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.logging.ProcessorLog;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.ProcessorInitializationContext;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.io.StreamCallback;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.stream.io.BufferedInputStream;
+import org.apache.nifi.util.StopWatch;
+
+import com.google.common.collect.ImmutableMap;
+
+@EventDriven
+@SideEffectFree
+@SupportsBatching
+@Tags({"xml", "xslt", "transform"})
+@InputRequirement(Requirement.INPUT_REQUIRED)
+@CapabilityDescription("Filter out specific columns from CSV data. Some 
other transformations are also supported."
++ "Columns can be renamed, simple calculations performed, 
aggregations, etc."
++ "SQL select statement is used to specify how CSV data should be 
transformed."
++ "SQL statement follows standard SQL, some restrictions may 
apply."
++ "Successfully transformed CSV data is routed to the 'success' 
relationship."
++ "If transform fails, the original FlowFile is routed to the 
'failure' relationship")
+public class FilterCSVColumns  extends AbstractProcessor {
+
+public static final PropertyDescriptor SQL_SELECT = new 
PropertyDescriptor.Builder()
+.name("SQL select statement")
+.description("SQL select statement specifies how CSV data 
should be transfor

[GitHub] nifi pull request: NIFI-1280 Create FilterCSVColumns Processor.

2016-05-11 Thread ToivoAdams
Github user ToivoAdams commented on a diff in the pull request:

https://github.com/apache/nifi/pull/420#discussion_r62806416
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/FilterCSVColumns.java
 ---
@@ -0,0 +1,258 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard;
+
+import static java.sql.Types.CHAR;
+import static java.sql.Types.LONGNVARCHAR;
+import static java.sql.Types.LONGVARCHAR;
+import static java.sql.Types.NCHAR;
+import static java.sql.Types.NVARCHAR;
+import static java.sql.Types.VARCHAR;
+
+import java.io.IOException;
+import java.io.InputStream;
+import java.io.InputStreamReader;
+import java.io.OutputStream;
+import java.io.Reader;
+import java.nio.charset.StandardCharsets;
+import java.sql.Connection;
+import java.sql.DriverManager;
+import java.sql.ResultSet;
+import java.sql.ResultSetMetaData;
+import java.sql.SQLException;
+import java.sql.Statement;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Properties;
+import java.util.Set;
+import java.util.concurrent.TimeUnit;
+
+import org.apache.calcite.adapter.csv.CsvSchemaFactory2;
+import org.apache.calcite.jdbc.CalciteConnection;
+import org.apache.calcite.schema.Schema;
+import org.apache.calcite.schema.SchemaPlus;
+import org.apache.commons.lang3.StringEscapeUtils;
+import org.apache.commons.lang3.StringUtils;
+import org.apache.nifi.annotation.behavior.EventDriven;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.InputRequirement.Requirement;
+import org.apache.nifi.annotation.behavior.SideEffectFree;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.logging.ProcessorLog;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.ProcessorInitializationContext;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.io.StreamCallback;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.stream.io.BufferedInputStream;
+import org.apache.nifi.util.StopWatch;
+
+import com.google.common.collect.ImmutableMap;
+
+@EventDriven
+@SideEffectFree
+@SupportsBatching
+@Tags({"xml", "xslt", "transform"})
+@InputRequirement(Requirement.INPUT_REQUIRED)
+@CapabilityDescription("Filter out specific columns from CSV data. Some 
other transformations are also supported."
++ "Columns can be renamed, simple calculations performed, 
aggregations, etc."
++ "SQL select statement is used to specify how CSV data should be 
transformed."
++ "SQL statement follows standard SQL, some restrictions may 
apply."
++ "Successfully transformed CSV data is routed to the 'success' 
relationship."
++ "If transform fails, the original FlowFile is routed to the 
'failure' relationship")
+public class FilterCSVColumns  extends AbstractProcessor {
+
+public static final PropertyDescriptor SQL_SELECT = new 
PropertyDescriptor.Builder()
+.name("SQL select statement")
+.description("SQL select statement specifies how CSV data 
should be transfor

[GitHub] nifi pull request: NIFI-1280 Create FilterCSVColumns Processor.

2016-05-11 Thread ToivoAdams
Github user ToivoAdams commented on a diff in the pull request:

https://github.com/apache/nifi/pull/420#discussion_r62807232
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/calcite/adapter/csv/CsvEnumerator2.java
 ---
@@ -0,0 +1,303 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to you under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.calcite.adapter.csv;
+
+import java.io.IOException;
+import java.text.ParseException;
+import java.util.ArrayList;
+import java.util.Date;
+import java.util.List;
+import java.util.TimeZone;
+
+import org.apache.calcite.adapter.java.JavaTypeFactory;
+import org.apache.calcite.linq4j.Enumerator;
+import org.apache.calcite.rel.type.RelDataType;
+import org.apache.calcite.util.Pair;
+import org.apache.commons.lang3.time.FastDateFormat;
+
+import au.com.bytecode.opencsv.CSVReader;
+
+
+/** Enumerator that reads from a CSV stream.
+ *
+ * @param  Row type
+ */
+class CsvEnumerator2 implements Enumerator {
--- End diff --

Yes, it’s a modified Calcite CSV adapter.
The current Calcite adapter supports only reading from java.io.File,
so the CSV adapter is modified to support reading from java.io.InputStream.

The modified CSV adapter is currently intermediate code.
In the future Calcite should include a CSV adapter which is able to read from java.io.InputStream.
I (or someone else) will try to create a proper CSV adapter.

The name CsvEnumerator2 just means a second version of the original CsvEnumerator.
Please see also the Calcite thread "Reading CSV data from Stream":
https://mail-archives.apache.org/mod_mbox/calcite-dev/201605.mbox/browser
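
To make the InputStream idea concrete, a stripped-down sketch of reading rows from an InputStream with opencsv (the library CsvEnumerator2 already imports); this is illustrative, not the actual CsvEnumerator2 code:

    import java.io.IOException;
    import java.io.InputStream;
    import java.io.InputStreamReader;
    import java.nio.charset.StandardCharsets;

    import au.com.bytecode.opencsv.CSVReader;

    class InputStreamCsvSketch {

        // reads CSV records from an InputStream instead of a java.io.File
        static void readAll(InputStream in) throws IOException {
            CSVReader reader = new CSVReader(new InputStreamReader(in, StandardCharsets.UTF_8));
            try {
                String[] row;
                while ((row = reader.readNext()) != null) {
                    // each row is one CSV record; hand it to the enumerator/consumer here
                }
            } finally {
                reader.close();
            }
        }
    }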



---


[GitHub] nifi pull request: NIFI-1280 Create FilterCSVColumns Processor.

2016-05-11 Thread ToivoAdams
Github user ToivoAdams commented on a diff in the pull request:

https://github.com/apache/nifi/pull/420#discussion_r62807345
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/test/java/org/apache/nifi/processors/standard/TestFilterCSVColumns.java
 ---
@@ -0,0 +1,117 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard;
+
+import static org.junit.Assert.assertEquals;
+
+import java.io.FileInputStream;
+import java.io.IOException;
+import java.io.InputStream;
+import java.nio.file.Path;
+import java.nio.file.Paths;
+import java.sql.ResultSet;
+import java.sql.SQLException;
+import java.util.List;
+
+import org.apache.nifi.reporting.InitializationException;
+import org.apache.nifi.util.MockFlowFile;
+import org.apache.nifi.util.TestRunner;
+import org.apache.nifi.util.TestRunners;
+import org.junit.Test;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+public class TestFilterCSVColumns {
+
+private static final Logger LOGGER;
+
+static {
+System.setProperty("org.slf4j.simpleLogger.defaultLogLevel", 
"info");
+System.setProperty("org.slf4j.simpleLogger.showDateTime", "true");
+System.setProperty("org.slf4j.simpleLogger.log.nifi.io.nio", 
"debug");
+
System.setProperty("org.slf4j.simpleLogger.log.nifi.processors.standard.FilterCSVColumns",
 "debug");
+
System.setProperty("org.slf4j.simpleLogger.log.nifi.processors.standard.TestFilterCSVColumns",
 "debug");
+LOGGER = LoggerFactory.getLogger(TestFilterCSVColumns.class);
+}
+
+@Test
+public void testTransformSimple() throws InitializationException, 
IOException, SQLException {
+String sql = "select first_name, last_name, company_name, address, 
city from CSV.A where city='New York'";
+
+Path inpath = 
Paths.get("src/test/resources/TestFilterCSVColumns/US500.csv");
+InputStream in = new FileInputStream(inpath.toFile());
+
+ResultSet resultSet = FilterCSVColumns.transform(in, sql);
+
+int nrofColumns = resultSet.getMetaData().getColumnCount();
+
+for (int i = 1; i <= nrofColumns; i++) {
+System.out.print(resultSet.getMetaData().getColumnLabel(i) + " 
 ");
+}
+System.out.println();
+
+while (resultSet.next()) {
+for (int i = 1; i <= nrofColumns; i++) {
+System.out.print(resultSet.getString(i)+ "  ");
+}
+System.out.println();
+}
+}
+
+@Test
+public void testTransformCalc() throws InitializationException, 
IOException, SQLException {
+String sql = "select ID, AMOUNT1+AMOUNT2+AMOUNT3 as TOTAL from 
CSV.A where ID=100";
--- End diff --

Stream here means reading the CSV input from java.io.InputStream.
This is not directly related to Calcite streaming queries
(where the select statement includes the STREAM keyword):
https://calcite.apache.org/docs/stream.html

The current Calcite adapter supports only reading from java.io.File,
so the CSV adapter is modified to support reading from java.io.InputStream.



---


[GitHub] nifi pull request: NIFI-1280 Create FilterCSVColumns Processor.

2016-05-11 Thread ToivoAdams
Github user ToivoAdams commented on a diff in the pull request:

https://github.com/apache/nifi/pull/420#discussion_r62808179
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/test/resources/TestFilterCSVColumns/US500.csv
 ---
@@ -0,0 +1 @@

+FIRST_NAME:string,LAST_NAME,COMPANY_NAME,ADDRESS,CITY,COUNTY,STATE,zip,phone1,phone2,email,web
"James","Butt","Benton, John B Jr","6649 N Blue Gum St","New 
Orleans","Orleans","LA",70116,"504-621-8927","504-845-1427","jb...@gmail.com","http://www.bentonjohnbjr.com";
"Josephine","Darakjy","Chanay, Jeffrey A Esq","4 B Blue Ridge 
Blvd","Brighton","Livingston","MI",48116,"810-292-9388","810-374-9840","josephine_dara...@darakjy.org","http://www.chanayjeffreyaesq.com";
"Art","Venere","Chemel, James L Cpa","8 W Cerritos Ave 
#54","Bridgeport","Gloucester","NJ","08014","856-636-8749","856-264-4130","a...@venere.org","http://www.chemeljameslcpa.com";
"Lenna","Paprocki","Feltz Printing Service","639 Main 
St","Anchorage","Anchorage","AK",99501,"907-385-4412","907-921-2010","lpapro...@hotmail.com","http://www.feltzprintingservice.com";
"Donette","Foller","Printing Dimensions","34 Center 
St","Hamilton","Butler","OH",45011,"513-570-1893","513-549-4561","donette.fol...@cox.net","http://www.printingdi
 mensions.com"
"Simona","Morasca","Chapman, Ross E Esq","3 Mcauley 
Dr","Ashland","Ashland","OH",44805,"419-503-2484","419-800-6759","sim...@morasca.com","http://www.chapmanrosseesq.com";
"Mitsue","Tollner","Morlong Associates","7 Eads 
St","Chicago","Cook","IL",60632,"773-573-6914","773-924-8565","mitsue_toll...@yahoo.com","http://www.morlongassociates.com";
"Leota","Dilliard","Commercial Press","7 W Jackson Blvd","San Jose","Santa 
Clara","CA",95111,"408-752-3500","408-813-1105","le...@hotmail.com","http://www.commercialpress.com";
"Sage","Wieser","Truhlar And Truhlar Attys","5 Boston Ave #88","Sioux 
Falls","Minnehaha","SD",57105,"605-414-2147","605-794-4895","sage_wie...@cox.net","http://www.truhlarandtruhlarattys.com";
"Kris","Marrier","King, Christopher A Esq","228 Runamuck Pl 
#2808","Baltimore","Baltimore 
City","MD",21224,"410-655-8723","410-804-4694","k...@gmail.com","http://www.kingchristopheraesq.com";
"Minna","Amigon","Dorl, James J Esq","2371 Jerrold 
Ave","Kulpsville","Montgomery"
 
,"PA",19443,"215-874-1229","215-422-8694","minna_ami...@yahoo.com","http://www.dorljamesjesq.com";
"Abel","Maclead","Rangoni Of Florence","37275 St  Rt 17m M","Middle 
Island","Suffolk","NY",11953,"631-335-3414","631-677-3675","amacl...@gmail.com","http://www.rangoniofflorence.com";
"Kiley","Caldarera","Feiner Bros","25 E 75th St #69","Los Angeles","Los 
Angeles","CA",90034,"310-498-5651","310-254-3084","kiley.caldar...@aol.com","http://www.feinerbros.com";
"Graciela","Ruta","Buckley Miller & Wright","98 Connecticut Ave Nw","Chagrin 
Falls","Geauga","OH",44023,"440-780-8425","440-579-7763","gr...@cox.net","http://www.buckleymillerwright.com";
"Cammy","Albares","Rousseaux, Michael Esq","56 E Morehead 
St","Laredo","Webb","TX",78045,"956-537-6195","956-841-7216","calba...@gmail.com","http://www.rousseauxmichaelesq.com";
"Mattie","Poquette","Century Communications","73 State Road 434 
E","Phoenix","Maricopa","AZ",85013,"602-277-4385",

[GitHub] nifi pull request: NIFI-1280 Create FilterCSVColumns Processor.

2016-05-11 Thread ToivoAdams
Github user ToivoAdams commented on a diff in the pull request:

https://github.com/apache/nifi/pull/420#discussion_r62822639
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/test/java/org/apache/nifi/processors/standard/TestFilterCSVColumns.java
 ---
@@ -0,0 +1,117 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard;
+
+import static org.junit.Assert.assertEquals;
+
+import java.io.FileInputStream;
+import java.io.IOException;
+import java.io.InputStream;
+import java.nio.file.Path;
+import java.nio.file.Paths;
+import java.sql.ResultSet;
+import java.sql.SQLException;
+import java.util.List;
+
+import org.apache.nifi.reporting.InitializationException;
+import org.apache.nifi.util.MockFlowFile;
+import org.apache.nifi.util.TestRunner;
+import org.apache.nifi.util.TestRunners;
+import org.junit.Test;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+public class TestFilterCSVColumns {
+
+private static final Logger LOGGER;
+
+static {
+System.setProperty("org.slf4j.simpleLogger.defaultLogLevel", 
"info");
+System.setProperty("org.slf4j.simpleLogger.showDateTime", "true");
+System.setProperty("org.slf4j.simpleLogger.log.nifi.io.nio", 
"debug");
+
System.setProperty("org.slf4j.simpleLogger.log.nifi.processors.standard.FilterCSVColumns",
 "debug");
+
System.setProperty("org.slf4j.simpleLogger.log.nifi.processors.standard.TestFilterCSVColumns",
 "debug");
+LOGGER = LoggerFactory.getLogger(TestFilterCSVColumns.class);
+}
+
+@Test
+public void testTransformSimple() throws InitializationException, 
IOException, SQLException {
+String sql = "select first_name, last_name, company_name, address, 
city from CSV.A where city='New York'";
+
+Path inpath = 
Paths.get("src/test/resources/TestFilterCSVColumns/US500.csv");
+InputStream in = new FileInputStream(inpath.toFile());
+
+ResultSet resultSet = FilterCSVColumns.transform(in, sql);
+
+int nrofColumns = resultSet.getMetaData().getColumnCount();
+
+for (int i = 1; i <= nrofColumns; i++) {
+System.out.print(resultSet.getMetaData().getColumnLabel(i) + " 
 ");
+}
+System.out.println();
+
+while (resultSet.next()) {
+for (int i = 1; i <= nrofColumns; i++) {
+System.out.print(resultSet.getString(i)+ "  ");
+}
+System.out.println();
+}
+}
+
+@Test
+public void testTransformCalc() throws InitializationException, 
IOException, SQLException {
+String sql = "select ID, AMOUNT1+AMOUNT2+AMOUNT3 as TOTAL from 
CSV.A where ID=100";
--- End diff --

Yes, reading from File and InputStream could be unified,
but the current Calcite CSV adapter strictly supports only File.

Also, the CSV adapter supports many ‘tables’, so the SQL select also supports joins.
Each ‘table’ is a separate File; the CSV adapter is capable of discovering ‘tables’ from the files.
It will scan all files in the data directory.
Using an InputStream requires some kind of solution for multiple tables.

Some day we might have a Calcite CSV adapter which is capable of using both Files and InputStreams.


---


[GitHub] nifi pull request: NIFI-1280 Create FilterCSVColumns Processor.

2016-05-11 Thread ToivoAdams
Github user ToivoAdams commented on the pull request:

https://github.com/apache/nifi/pull/420#issuecomment-218562252
  
Go ahead.
Glad to hear you see possibilities.

My full-time job doesn't leave me much time during the next few days anyway,
so it's actually good to have an experienced developer to move things on.

By other types, do you mean Avro, JSON, etc. input and output formats?
Does this mean some sort of Calcite adapters must also be developed?

Thanks
toivo


---


[GitHub] nifi issue #321: Nifi 1214

2016-06-19 Thread ToivoAdams
Github user ToivoAdams commented on the issue:

https://github.com/apache/nifi/pull/321
  
Yes, this is replaced by #388.



---


[GitHub] nifi pull request #321: Nifi 1214

2016-06-19 Thread ToivoAdams
Github user ToivoAdams closed the pull request at:

https://github.com/apache/nifi/pull/321


---


[GitHub] nifi issue #321: Nifi 1214

2016-06-19 Thread ToivoAdams
Github user ToivoAdams commented on the issue:

https://github.com/apache/nifi/pull/321
  
This closes #321 


---