[GitHub] [flink] wuchong commented on a change in pull request #9099: [FLINK-13237][table-planner-blink] Add expression table api test to blink

2019-08-02 Thread GitBox
URL: https://github.com/apache/flink/pull/9099#discussion_r310131084
 
 

 ##
 File path: flink-table/flink-table-planner-blink/src/test/scala/org/apache/flink/table/planner/expressions/utils/ExpressionTestBase.scala
 ##
 @@ -162,31 +169,34 @@ abstract class ExpressionTestBase {
   }
 
   assertEquals(
-s"Wrong result for: [$originalExpr] optimized to: [$optimizedExpr]",
+s"Wrong result for: [$optimizedExpr]",
 
 Review comment:
  Why would the Table API expression be null? Couldn't we use `Expression.toString()`?


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [flink] wuchong commented on a change in pull request #9099: [FLINK-13237][table-planner-blink] Add expression table api test to blink

2019-08-01 Thread GitBox
URL: https://github.com/apache/flink/pull/9099#discussion_r309545022
 
 

 ##
 File path: flink-table/flink-table-planner-blink/src/test/scala/org/apache/flink/table/planner/expressions/validation/MapTypeValidationTest.scala
 ##
 @@ -27,16 +28,24 @@ class MapTypeValidationTest extends MapTypeTestBase {
 
   @Test(expected = classOf[ValidationException])
   def testWrongKeyType(): Unit = {
+testAllApis('f2.at(12), "f2.at(12)", "f2[12]", "FAIL")
 testSqlApi("f2[12]", "FAIL")
 
 Review comment:
  The exception test only covers the first expression, so I think we should remove `testSqlApi("f2[12]", "FAIL")` since it is already covered by `testAllApis('f2.at(12), "f2.at(12)", "f2[12]", "FAIL")`. The same applies to the following tests.
  
  Btw, `testAllApis('f2.at(12), "f2.at(12)", "f2[12]", "FAIL")` can't exercise all of its expressions; only the first, `'f2.at(12)`, is covered. We should fix that in the future.
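On the point that only the first expression is exercised: a minimal sketch (Python for brevity; the names are hypothetical stand-ins, not Flink API) of why a JUnit-style expected-exception test stops at the first throwing call, so any assertion written after it never runs:

```python
# Hypothetical sketch: in a test that expects an exception, execution stops
# at the first call that raises, so later calls are never evaluated.
def evaluated_calls(labels):
    """Return the labels that actually ran before the first failure."""
    reached = []
    try:
        for label in labels:
            reached.append(label)
            # Stand-in for a validating API call that always fails.
            raise ValueError(f"validation failed: {label}")
    except ValueError:
        pass  # the test framework would record this as the expected exception
    return reached

# Mirrors testAllApis(...) followed by testSqlApi(...): only the first runs.
print(evaluated_calls(["'f2.at(12)", "f2[12]"]))  # → ["'f2.at(12)"]
```

This is why a second `testSqlApi` call in the same `@Test(expected = ...)` method adds no coverage.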




[GitHub] [flink] wuchong commented on a change in pull request #9099: [FLINK-13237][table-planner-blink] Add expression table api test to blink

2019-08-01 Thread GitBox
URL: https://github.com/apache/flink/pull/9099#discussion_r309554284
 
 

 ##
 File path: flink-table/flink-table-planner-blink/src/test/scala/org/apache/flink/table/planner/expressions/validation/ScalarOperatorsValidationTest.scala
 ##
 @@ -0,0 +1,93 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.planner.expressions.validation
+
+import org.apache.flink.table.api.ValidationException
+import org.apache.flink.table.api.scala._
+import org.apache.flink.table.planner.expressions.utils.ScalarOperatorsTestBase
+
+import org.junit.Test
+
+class ScalarOperatorsValidationTest extends ScalarOperatorsTestBase {
+
+  @Test(expected = classOf[ValidationException])
+  def testIfInvalidTypesScala(): Unit = {
+testTableApi(('f6 && true).?(5, "false"), "FAIL", "FAIL")
+  }
+
+  @Test(expected = classOf[ValidationException])
+  def testIfInvalidTypesJava(): Unit = {
+testTableApi("FAIL", "(f8 && true).?(5, 'false')", "FAIL")
+  }
+
+  @Test(expected = classOf[ValidationException])
+  def testInvalidStringComparison1(): Unit = {
+testTableApi("w" === 4, "FAIL", "FAIL")
+  }
+
+  @Test(expected = classOf[ValidationException])
+  def testInvalidStringComparison2(): Unit = {
+testTableApi("w" > 4.toExpr, "FAIL", "FAIL")
+  }
+
+  // ----------------------------------------------------------------------
+  // Sub-query functions
+  // ----------------------------------------------------------------------
+
+  @Test(expected = classOf[ValidationException])
+  def testInMoreThanOneTypes(): Unit = {
+testTableApi(
+  'f2.in('f3, 'f10, 4),
+  "FAIL",
+  "FAIL"
+)
+testTableApi(
+  'f2.in('f3, 'f4, 4),  // OK if all numeric
 
 Review comment:
  This should be removed; it will fail in the blink planner.




[GitHub] [flink] wuchong commented on a change in pull request #9099: [FLINK-13237][table-planner-blink] Add expression table api test to blink

2019-08-01 Thread GitBox
URL: https://github.com/apache/flink/pull/9099#discussion_r309594125
 
 

 ##
 File path: flink-table/flink-table-planner-blink/src/test/scala/org/apache/flink/table/planner/expressions/ScalarFunctionsTest.scala
 ##
 @@ -1198,583 +1498,992 @@ class ScalarFunctionsTest extends ScalarTypesTestBase {
   @Test
   def testDivide(): Unit = {
 
-testSqlApi(
+testAllApis(
+  151435632L / 6.0, // the `/` here is the Scala operator, not the Flink Table API operator
+  "151435632L / 6",
   "151435632 / 6",
   "2.5239272E7")
 
-// DIV return decimal
-testSqlApi(
-  "DIV(151435632, 6)",
-  "25239272")
-
-testSqlApi(
+testAllApis(
+  'f7 / 2,
+  "f7 / 2",
   "f7 / 2",
   "1.5")
 
 // f34 => Decimal(19,0)
 // 6 => Integer => Decimal(10,0)
 // Decimal(19,0) / Decimal(10,0) => Decimal(30,11)
-testSqlApi(
+testAllApis(
+  'f34 / 6,
+  "f34 / 6",
   "f34 / 6",
   "25239272.000")
 
 // Decimal(19,0) / Decimal(19,0) => Decimal(39,20) => Decimal(38,19)
-testSqlApi(
+testAllApis(
+  'f34 / 'f34,
+  "f34 / f34",
   "f34 / f34",
   "1.000")
   }
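The precision/scale comments in the hunk above are consistent with the widely used SQL-Server-style decimal division rule. A sketch under that assumption (Python for brevity; `divide_type` is a hypothetical helper, not Flink API):

```python
MAX_PRECISION = 38  # decimal precision cap mentioned in the comments

def divide_type(p1, s1, p2, s2):
    """Result (precision, scale) for Decimal(p1,s1) / Decimal(p2,s2),
    assuming the SQL-Server-style derivation rule."""
    scale = max(6, s1 + p2 + 1)
    precision = p1 - s1 + s2 + scale
    if precision > MAX_PRECISION:
        # Cap precision and shrink the scale by the overflow amount.
        scale = max(0, scale - (precision - MAX_PRECISION))
        precision = MAX_PRECISION
    return precision, scale

# Decimal(19,0) / Decimal(10,0) => Decimal(30,11), as in the comment above.
print(divide_type(19, 0, 10, 0))  # → (30, 11)
# Decimal(19,0) / Decimal(19,0) => uncapped (39,20), capped to Decimal(38,19).
print(divide_type(19, 0, 19, 0))  # → (38, 19)
```

Both derivations reproduce the expected types stated in the diff's comments.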
 
   @Test
   def testMod(): Unit = {
 
-testSqlApi(
+testAllApis(
+  151435632L % 6,
+  "151435632L % 6",
   "mod(151435632,6)",
   "0")
 
-testSqlApi(
+testAllApis(
+  'f34.mod('f34),
+  "f34.mod(f34)",
   "mod(f34,f34)",
   "0")
 
-testSqlApi(
+testAllApis(
+  'f34.mod(6),
+  "f34.mod(6)",
   "mod(f34,6)",
   "0")
 
-testSqlApi(
+testAllApis(
+  'f4.mod('f7),
+  "f4.mod(f7)",
   "MOD(f4, f7)",
   "2")
 
-testSqlApi(
+testAllApis(
+  'f4.mod(3),
+  "mod(f4, 3)",
   "MOD(f4, 3)",
   "2")
 
-testSqlApi(
+testAllApis(
+  'f4 % 3,
+  "mod(44, 3)",
   "MOD(44, 3)",
   "2")
   }
 
   @Test
   def testExp(): Unit = {
-testSqlApi(
+testAllApis(
+  'f2.exp(),
+  "f2.exp()",
   "EXP(f2)",
   math.exp(42.toByte).toString)
 
-testSqlApi(
+testAllApis(
+  'f3.exp(),
+  "f3.exp()",
   "EXP(f3)",
   math.exp(43.toShort).toString)
 
-testSqlApi(
+testAllApis(
+  'f4.exp(),
+  "f4.exp()",
   "EXP(f4)",
   math.exp(44.toLong).toString)
 
-testSqlApi(
+testAllApis(
+  'f5.exp(),
+  "f5.exp()",
   "EXP(f5)",
   math.exp(4.5.toFloat).toString)
 
-testSqlApi(
+testAllApis(
+  'f6.exp(),
+  "f6.exp()",
   "EXP(f6)",
   math.exp(4.6).toString)
 
-testSqlApi(
+testAllApis(
+  'f7.exp(),
+  "exp(f7)",
   "EXP(f7)",
   math.exp(3).toString)
 
-testSqlApi(
+testAllApis(
+  3.exp(),
+  "exp(3)",
   "EXP(3)",
   math.exp(3).toString)
   }
 
   @Test
   def testLog10(): Unit = {
-testSqlApi(
+testAllApis(
+  'f2.log10(),
+  "f2.log10()",
   "LOG10(f2)",
   math.log10(42.toByte).toString)
 
-testSqlApi(
+testAllApis(
+  'f3.log10(),
+  "f3.log10()",
   "LOG10(f3)",
   math.log10(43.toShort).toString)
 
-testSqlApi(
+testAllApis(
+  'f4.log10(),
+  "f4.log10()",
   "LOG10(f4)",
   math.log10(44.toLong).toString)
 
-testSqlApi(
+testAllApis(
+  'f5.log10(),
+  "f5.log10()",
   "LOG10(f5)",
   math.log10(4.5.toFloat).toString)
 
-testSqlApi(
+testAllApis(
+  'f6.log10(),
+  "f6.log10()",
   "LOG10(f6)",
   math.log10(4.6).toString)
   }
 
   @Test
   def testPower(): Unit = {
 // f7: int , f4: long, f6: double
-testSqlApi(
+testAllApis(
+  'f2.power('f7),
+  "f2.power(f7)",
   "POWER(f2, f7)",
   math.pow(42.toByte, 3).toString)
 
-testSqlApi(
+testAllApis(
+  'f3.power('f6),
+  "f3.power(f6)",
   "POWER(f3, f6)",
   math.pow(43.toShort, 4.6D).toString)
 
-testSqlApi(
+testAllApis(
+  'f4.power('f5),
+  "f4.power(f5)",
   "POWER(f4, f5)",
   math.pow(44.toLong, 4.5.toFloat).toString)
 
-testSqlApi(
+testAllApis(
+  'f4.power('f5),
+  "f4.power(f5)",
   "POWER(f4, f5)",
   math.pow(44.toLong, 4.5.toFloat).toString)
 
 // f5: float
-testSqlApi(
+testAllApis('f5.power('f5),
+  "f5.power(f5)",
   "power(f5, f5)",
   math.pow(4.5F, 4.5F).toString)
 
-testSqlApi(
+testAllApis('f5.power('f6),
+  "f5.power(f6)",
   "power(f5, f6)",
   math.pow(4.5F, 4.6D).toString)
 
-testSqlApi(
+testAllApis('f5.power('f7),
+  "f5.power(f7)",
   "power(f5, f7)",
   math.pow(4.5F, 3).toString)
 
-testSqlApi(
+testAllApis('f5.power('f4),
+  "f5.power(f4)",
   "power(f5, f4)",
   math.pow(4.5F, 44L).toStri
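For the record, the inline note in the `testDivide` hunk above marks `151435632L / 6.0` as plain Scala division evaluated before the expression ever reaches the Table API. The same arithmetic, checked here (Python for brevity):

```python
# Plain floating-point division, as the Scala `/` operator would perform it.
result = 151435632 / 6.0
print(result)  # → 25239272.0
# Scala/Java render this Double as "2.5239272E7", the string the test expects.
```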

[GitHub] [flink] wuchong commented on a change in pull request #9099: [FLINK-13237][table-planner-blink] Add expression table api test to blink

2019-08-01 Thread GitBox
URL: https://github.com/apache/flink/pull/9099#discussion_r309560829
 
 

 ##
 File path: flink-table/flink-table-planner-blink/src/test/scala/org/apache/flink/table/planner/expressions/MathFunctionsTest.scala
 ##
 @@ -29,598 +31,836 @@ class MathFunctionsTest extends ScalarTypesTestBase {
 
   @Test
   def testMod(): Unit = {
-testSqlApi(
+testAllApis(
 
 Review comment:
   Remove this file? Or remove the same tests in `ScalarFunctionsTest`.




[GitHub] [flink] wuchong commented on a change in pull request #9099: [FLINK-13237][table-planner-blink] Add expression table api test to blink

2019-08-01 Thread GitBox
URL: https://github.com/apache/flink/pull/9099#discussion_r309537284
 
 

 ##
 File path: flink-table/flink-table-planner-blink/src/test/scala/org/apache/flink/table/planner/expressions/utils/ExpressionTestBase.scala
 ##
 @@ -162,31 +169,34 @@ abstract class ExpressionTestBase {
   }
 
   assertEquals(
-s"Wrong result for: [$originalExpr] optimized to: [$optimizedExpr]",
+s"Wrong result for: [$optimizedExpr]",
 
 Review comment:
  I would like to keep the original expression string so the exception message stays readable.




[GitHub] [flink] wuchong commented on a change in pull request #9099: [FLINK-13237][table-planner-blink] Add expression table api test to blink

2019-08-01 Thread GitBox
URL: https://github.com/apache/flink/pull/9099#discussion_r309564946
 
 

 ##
 File path: flink-table/flink-table-planner-blink/src/test/scala/org/apache/flink/table/planner/expressions/UserDefinedScalarFunctionTest.scala
 ##
 @@ -0,0 +1,468 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.planner.expressions
+
+import org.apache.flink.api.common.typeinfo.{BasicArrayTypeInfo, TypeInformation}
+import org.apache.flink.api.java.typeutils.RowTypeInfo
+import org.apache.flink.table.api.scala._
+import org.apache.flink.table.api.{DataTypes, Types, ValidationException}
+import org.apache.flink.table.functions.ScalarFunction
+import org.apache.flink.table.planner.expressions.utils.{ExpressionTestBase, _}
+import org.apache.flink.table.planner.runtime.utils.JavaUserDefinedScalarFunctions._
+import org.apache.flink.table.planner.utils.DateTimeTestUtil
+import org.apache.flink.types.Row
+
+import org.junit.Test
+
+import java.lang.{Boolean => JBoolean}
+
+class UserDefinedScalarFunctionTest extends ExpressionTestBase {
+
+  @Test
+  def testParameters(): Unit = {
+testAllApis(
+  Func0('f0),
+  "Func0(f0)",
+  "Func0(f0)",
+  "42")
+
+testAllApis(
+  Func1('f0),
+  "Func1(f0)",
+  "Func1(f0)",
+  "43")
+
+testAllApis(
+  Func1('f11),
+  "Func1(f11)",
+  "Func1(f11)",
+  "4")
+
+testAllApis(
+  Func1('f12),
+  "Func1(f12)",
+  "Func1(f12)",
+  "4")
+
+testAllApis(
+  Func1('f13),
+  "Func1(f13)",
+  "Func1(f13)",
+  "4.0")
+
+testAllApis(
+  Func2('f0, 'f1, 'f3),
+  "Func2(f0, f1, f3)",
+  "Func2(f0, f1, f3)",
+  "42 and Test and SimplePojo(Bob,36)")
+
+testAllApis(
+  Func0(123),
+  "Func0(123)",
+  "Func0(123)",
+  "123")
+
+// TODO: GenericType with Date/Time/Timestamp -> String would call toString implicitly
+testAllApis(
+  Func6('f4, 'f5, 'f6),
+  "Func6(f4, f5, f6)",
+  "Func6(f4, f5, f6)",
+  "(1990-10-14,12:10:10,1990-10-14 12:10:10.0)")
+
+// function names containing keywords
+testAllApis(
+  Func0('f0),
+  "getFunc0(f0)",
+  "getFunc0(f0)",
+  "42")
+
+testAllApis(
+  Func0('f0),
+  "asAlways(f0)",
+  "asAlways(f0)",
+  "42")
+
+testAllApis(
+  Func0('f0),
+  "toWhatever(f0)",
+  "toWhatever(f0)",
+  "42")
+
+testAllApis(
+  Func0('f0),
+  "Nullable(f0)",
+  "Nullable(f0)",
+  "42")
+
+// test row type input
+testAllApis(
+  Func20('f14),
+  "Func20(f14)",
+  "Func20(f14)",
+  "(12,true,(1,2,3))"
+)
+  }
+
+  @Test
+  def testNullableParameters(): Unit = {
+testAllApis(
+  Func3(nullOf(DataTypes.INT), nullOf(DataTypes.STRING)),
+  "Func3(Null(INT), Null(STRING))",
+  "Func3(NULL, NULL)",
+  "null and null")
+
+testAllApis(
+  Func3(nullOf(DataTypes.INT), "Test"),
+  "Func3(Null(INT), 'Test')",
+  "Func3(NULL, 'Test')",
+  "null and Test")
+
+testAllApis(
+  Func3(42, nullOf(DataTypes.STRING)),
+  "Func3(42, Null(STRING))",
+  "Func3(42, NULL)",
+  "42 and null")
+
+testAllApis(
+  Func0(nullOf(DataTypes.INT)),
+  "Func0(Null(INT))",
+  "Func0(NULL)",
+  "-1")
+  }
+
+  @Test
+  def testDoubleQuoteParameters(): Unit = {
+val hello = "\"\""
+testAllApis(
+  Func3(42, hello),
+  s"Func3(42, '$hello')",
+  s"Func3(42, '$hello')",
+  s"42 and $hello")
+  }
+
+  @Test
+  def testResults(): Unit = {
+testAllApis(
+  Func4(),
+  "Func4()",
+  "Func4()",
+  "null")
+
+testAllApis(
+  Func5(),
+  "Func5()",
+  "Func5()",
+  "-1")
+  }
+
+  @Test
+  def testNesting(): Unit = {
+testAllApis(
+  Func0(Func0('f0)),
+  "Func0(Func0(f0))",
+  "Func0(Func0(f0))",
+  "42")
+
+testAllApis(
+  Func0(Func0('f0)),
+  "Func0(Func0(f0))",
+  "Func0(Func0(f0))",
+  "42")
+
+testAllApis(
+  Func7(Func7(Func7(1, 1), Func7(1, 1)), Fun

[GitHub] [flink] wuchong commented on a change in pull request #9099: [FLINK-13237][table-planner-blink] Add expression table api test to blink

2019-08-01 Thread GitBox
URL: https://github.com/apache/flink/pull/9099#discussion_r309561355
 
 

 ##
 File path: flink-table/flink-table-planner-blink/src/test/scala/org/apache/flink/table/planner/expressions/NonDeterministicTests.scala
 ##
 @@ -27,47 +28,64 @@ import org.junit.{Ignore, Test}
 /**
   * Tests that can only be checked manually as they are non-deterministic.
   */
-@Ignore
 
 Review comment:
  nit: I would like to keep the class-level `@Ignore`. This makes it easier to run the tests locally.




[GitHub] [flink] wuchong commented on a change in pull request #9099: [FLINK-13237][table-planner-blink] Add expression table api test to blink

2019-08-01 Thread GitBox
URL: https://github.com/apache/flink/pull/9099#discussion_r309591683
 
 

 ##
 File path: flink-table/flink-table-planner-blink/src/test/scala/org/apache/flink/table/planner/expressions/ScalarFunctionsTest.scala
 ##
 @@ -869,6 +978,17 @@ class ScalarFunctionsTest extends ScalarTypesTestBase {
 }
   }
 
+  @Test
+  def testPosition2(): Unit = {
+
+// NOTE: Spark names this function instr(), i.e. "in string"
+testSqlApi("position('aa' in 'aaads')", "1")
 
 Review comment:
   Move this to `testPosition()`? 




[GitHub] [flink] wuchong commented on a change in pull request #9099: [FLINK-13237][table-planner-blink] Add expression table api test to blink

2019-08-01 Thread GitBox
URL: https://github.com/apache/flink/pull/9099#discussion_r309556473
 
 

 ##
 File path: flink-table/flink-table-planner-blink/src/test/scala/org/apache/flink/table/planner/expressions/BuiltinScalarFunctionTest.scala
 ##
 @@ -0,0 +1,311 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.flink.table.planner.expressions
+
+import org.apache.flink.api.java.typeutils.RowTypeInfo
+import org.apache.flink.table.api.{Types, ValidationException}
+import org.apache.flink.table.planner.expressions.utils.ExpressionTestBase
+import org.apache.flink.table.planner.utils.DateTimeTestUtil
+import org.apache.flink.table.runtime.typeutils.BigDecimalTypeInfo
+import org.apache.flink.types.Row
+
+import org.junit.rules.ExpectedException
+import org.junit.{Ignore, Rule, Test}
+
+import java.math.{BigDecimal => JBigDecimal}
+
+import scala.annotation.meta.getter
+
+class BuiltinScalarFunctionTest extends ExpressionTestBase {
 
 Review comment:
  Most of these tests have already been moved to `ScalarFunctionsTest`. Please check and move the missing ones there.




[GitHub] [flink] wuchong commented on a change in pull request #9099: [FLINK-13237][table-planner-blink] Add expression table api test to blink

2019-08-01 Thread GitBox
URL: https://github.com/apache/flink/pull/9099#discussion_r309557480
 
 

 ##
 File path: flink-table/flink-table-planner-blink/src/test/scala/org/apache/flink/table/planner/expressions/DecimalTypeTest.scala
 ##
 @@ -202,22 +305,22 @@ class DecimalTypeTest extends ExpressionTestBase {
 
   override def testData: Row = {
 val testData = new Row(6)
-testData.setField(0, Decimal.castFrom("123456789.123456789123456789", 30, 18))
-testData.setField(1, Decimal.castFrom("123456789123456789123456789", 30, 0))
+testData.setField(0, BigDecimal("123456789.123456789123456789").bigDecimal)
+testData.setField(1, BigDecimal("123456789123456789123456789").bigDecimal)
 testData.setField(2, 42)
 testData.setField(3, 4.2)
-testData.setField(4, Decimal.castFrom("123456789", 10, 0))
-testData.setField(5, Decimal.castFrom("0.000", 10, 3))
+testData.setField(4, BigDecimal("123456789").bigDecimal)
+testData.setField(5, BigDecimal("0.000").bigDecimal)
 testData
   }
 
   override def typeInfo: RowTypeInfo = {
 new RowTypeInfo(
-  /* 0 */ DecimalTypeInfo.of(30, 18),
-  /* 1 */ DecimalTypeInfo.of(30, 0),
-  /* 2 */ Types.INT,
-  /* 3 */ Types.DOUBLE,
-  /* 4 */ DecimalTypeInfo.of(10, 0),
-  /* 5 */ DecimalTypeInfo.of(10, 3))
 
 Review comment:
  Please keep the field-index comments (`/* 0 */`, `/* 1 */`, ...).




[GitHub] [flink] wuchong commented on a change in pull request #9099: [FLINK-13237][table-planner-blink] Add expression table api test to blink

2019-08-01 Thread GitBox
URL: https://github.com/apache/flink/pull/9099#discussion_r309554158
 
 

 ##
 File path: flink-table/flink-table-planner-blink/src/test/scala/org/apache/flink/table/planner/expressions/validation/ScalarOperatorsValidationTest.scala
 ##
 @@ -0,0 +1,93 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.planner.expressions.validation
+
+import org.apache.flink.table.api.ValidationException
+import org.apache.flink.table.api.scala._
+import org.apache.flink.table.planner.expressions.utils.ScalarOperatorsTestBase
+
+import org.junit.Test
+
+class ScalarOperatorsValidationTest extends ScalarOperatorsTestBase {
+
+  @Test(expected = classOf[ValidationException])
+  def testIfInvalidTypesScala(): Unit = {
+testTableApi(('f6 && true).?(5, "false"), "FAIL", "FAIL")
+  }
+
+  @Test(expected = classOf[ValidationException])
+  def testIfInvalidTypesJava(): Unit = {
+testTableApi("FAIL", "(f8 && true).?(5, 'false')", "FAIL")
+  }
+
+  @Test(expected = classOf[ValidationException])
+  def testInvalidStringComparison1(): Unit = {
+testTableApi("w" === 4, "FAIL", "FAIL")
+  }
+
+  @Test(expected = classOf[ValidationException])
+  def testInvalidStringComparison2(): Unit = {
+testTableApi("w" > 4.toExpr, "FAIL", "FAIL")
+  }
+
+  // ----------------------------------------------------------------------
+  // Sub-query functions
+  // ----------------------------------------------------------------------
+
+  @Test(expected = classOf[ValidationException])
+  def testInMoreThanOneTypes(): Unit = {
+testTableApi(
+  'f2.in('f3, 'f10, 4),
 
 Review comment:
   `'f2.in('f3, 'f10, 4)` -> `'f2.in('f3, 'f4, 4)` ? 




[GitHub] [flink] wuchong commented on a change in pull request #9099: [FLINK-13237][table-planner-blink] Add expression table api test to blink

2019-08-01 Thread GitBox
URL: https://github.com/apache/flink/pull/9099#discussion_r309558289
 
 

 ##
 File path: flink-table/flink-table-planner-blink/src/test/scala/org/apache/flink/table/planner/expressions/LiteralPrefixTest.scala
 ##
 @@ -0,0 +1,80 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.planner.expressions
+
+import org.apache.flink.api.common.typeinfo.TypeInformation
+import org.apache.flink.api.java.typeutils.RowTypeInfo
+import org.apache.flink.table.api.Types
+import org.apache.flink.table.api.scala._
+import org.apache.flink.table.planner.expressions.utils.ExpressionTestBase
+import org.apache.flink.types.Row
+
+import org.junit.Test
+
+class LiteralPrefixTest extends ExpressionTestBase {
 
 Review comment:
  Can we move this test to `LiteralTest`? It looks duplicated.




[GitHub] [flink] wuchong commented on a change in pull request #9099: [FLINK-13237][table-planner-blink] Add expression table api test to blink

2019-08-01 Thread GitBox
URL: https://github.com/apache/flink/pull/9099#discussion_r309590465
 
 

 ##
 File path: flink-table/flink-table-planner-blink/src/test/scala/org/apache/flink/table/planner/expressions/ScalarFunctionsTest.scala
 ##
 @@ -655,124 +749,168 @@ class ScalarFunctionsTest extends ScalarTypesTestBase {
 
   @Test
   def testConcatWs(): Unit = {
-testSqlApi(
+testAllApis(
+  concat_ws('f33, "AA"),
+  "concat_ws(f33, 'AA')",
   "CONCAT_WS(f33, 'AA')",
   "AA")
-testSqlApi(
+testAllApis(
+  concat_ws("", "AA"),
+  "concat_ws('','AA')",
   "concat_ws('','AA')",
   "AA")
-testSqlApi(
+testAllApis(
+  concat_ws("~", "AA", "BB"),
+  "concat_ws('~','AA','BB')",
   "concat_ws('~','AA','BB')",
   "AA~BB")
-testSqlApi(
+testAllApis(
+  concat_ws("~", 'f33, "AA", "BB", "", 'f33, "CC"),
+  "concat_ws('~',f33, 'AA','BB','',f33, 'CC')",
   "concat_ws('~',f33, 'AA','BB','',f33, 'CC')",
   "AA~BB~~CC")
-testSqlApi(
+testAllApis(
+  concat_ws("", "Flink", 'f33, "xx", 'f33, 'f33),
+  "concat_ws('','Flink', f33, 'xx', f33, f33)",
   "CONCAT_WS('','Flink', f33, 'xx', f33, f33)",
   "Flinkxx")
-
-testSqlApi("concat_ws('||', f35, f36, f33)", "a||b")
   }
 
+  @Test
   def testRegexpReplace(): Unit = {
 
-testSqlApi(
+testAllApis(
+  "foobar".regexpReplace("oo|ar", "abc"),
+  "'foobar'.regexpReplace('oo|ar', 'abc')",
   "regexp_replace('foobar', 'oo|ar', 'abc')",
   "fabcbabc")
 
-testSqlApi(
+testAllApis(
+  "foofar".regexpReplace("^f", ""),
+  "'foofar'.regexpReplace('^f', '')",
   "regexp_replace('foofar', '^f', '')",
   "oofar")
 
-testSqlApi(
+testAllApis(
+  "foobar".regexpReplace("^f*.*r$", ""),
+  "'foobar'.regexpReplace('^f*.*r$', '')",
   "regexp_replace('foobar', '^f*.*r$', '')",
   "")
 
-testSqlApi(
+testAllApis(
+  "foo1bar2".regexpReplace("\\d", ""),
+  "'foo1bar2'.regexpReplace('\\d', '')",
   "regexp_replace('foobar', '\\d', '')",
   "foobar")
 
-testSqlApi(
+testAllApis(
+  "foobar".regexpReplace("\\w", ""),
+  "'foobar'.regexpReplace('\\w', '')",
   "regexp_replace('foobar', '\\w', '')",
   "")
 
-testSqlApi(
+testAllApis(
+  "fooobar".regexpReplace("oo", "$"),
+  "'fooobar'.regexpReplace('oo', '$')",
   "regexp_replace('fooobar', 'oo', '$')",
   "f$obar")
 
-testSqlApi(
+testAllApis(
+  "foobar".regexpReplace("oo", "\\"),
+  "'foobar'.regexpReplace('oo', '\\')",
   "regexp_replace('foobar', 'oo', '\\')",
   "f\\bar")
 
-testSqlApi(
+testAllApis(
+  'f33.regexpReplace("oo|ar", ""),
+  "f33.regexpReplace('oo|ar', '')",
   "REGEXP_REPLACE(f33, 'oo|ar', '')",
   "null")
 
-testSqlApi(
+testAllApis(
+  "foobar".regexpReplace('f33, ""),
+  "'foobar'.regexpReplace(f33, '')",
   "REGEXP_REPLACE('foobar', f33, '')",
   "null")
 
-testSqlApi(
+testAllApis(
+  "foobar".regexpReplace("oo|ar", 'f33),
+  "'foobar'.regexpReplace('oo|ar', f33)",
   "REGEXP_REPLACE('foobar', 'oo|ar', f33)",
   "null")
 
 // This test was added for the null literal problem in string expression parsing (FLINK-10463).
-testSqlApi(
+testAllApis(
+  nullOf(Types.STRING).regexpReplace("oo|ar", 'f33),
+  "nullOf(STRING).regexpReplace('oo|ar', f33)",
   "REGEXP_REPLACE(CAST(NULL AS VARCHAR), 'oo|ar', f33)",
   "null")
-
-testSqlApi("regexp_replace('100-200', '(\\d+)', 'num')", "num-num")
-testSqlApi("regexp_replace('100-200', '(\\d+)-(\\d+)', '400')", "400")
-testSqlApi("regexp_replace('100-200', '(\\d+)', '400')", "400-400")
-testSqlApi("regexp_replace('100-200', '', '400')", "100-200")
-testSqlApi("regexp_replace(f40, '(\\d+)', '400')", "null")
-testSqlApi("regexp_replace(CAST(null as VARCHAR), '(\\d+)', 'num')", "null")
-testSqlApi("regexp_replace('100-200', CAST(null as VARCHAR), '400')", "null")
-testSqlApi("regexp_replace('100-200', '(\\d+)', CAST(null as VARCHAR))", "null")
   }
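The `'$'` and `'\\'` replacement cases above assume the replacement string is taken literally rather than as Java regex group-reference syntax, and that a null argument nulls the result. A standalone Scala sketch of that assumed behavior (an illustration, not Flink's actual REGEXP_REPLACE implementation):

```scala
import java.util.regex.Matcher

// Sketch of the replacement semantics the tests above expect: '$' and '\'
// in the replacement are literal (as Matcher.quoteReplacement guarantees),
// and a null argument makes the whole result null.
// Illustration only; not Flink's runtime code.
def regexpReplace(input: String, pattern: String, replacement: String): String =
  if (input == null || pattern == null || replacement == null) null
  else input.replaceAll(pattern, Matcher.quoteReplacement(replacement))

assert(regexpReplace("fooobar", "oo", "$") == "f$obar")
assert(regexpReplace("foobar", "oo", "\\") == "f\\bar")
assert(regexpReplace("foobar", "oo|ar", null) == null)
```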
 
   @Test
   def testRegexpExtract(): Unit = {
-testSqlApi(
+testAllApis(
+  "foothebar".regexpExtract("foo(.*?)(bar)", 2),
+  "'foothebar'.regexpExtract('foo(.*?)(bar)', 2)",
   "REGEXP_EXTRACT('foothebar', 'foo(.*?)(bar)', 2)",
   "bar")
 
-testSqlApi(
+testAllApis(
+  "foothebar".regexpExtract("foo(.*?)(bar)", 0),
+  "'foothebar'.regexpExtract('foo(.*?)(bar)', 0)",
   "REGEXP_EXTRACT('foothebar', 'foo(.*?)(bar)', 0)",
   "foothebar")
 
-testSqlApi(
+testAllApis(
+  "foothebar".regexpExtract("foo(.*?)(bar)", 1),
+  "'foothebar'.regexpExtract('foo(.*?)(bar)', 1)",
   "REGEXP_EXTRACT('foothebar', 'foo(.*?)(bar)', 1)",
 

[GitHub] [flink] wuchong commented on a change in pull request #9099: [FLINK-13237][table-planner-blink] Add expression table api test to blink

2019-08-01 Thread GitBox
wuchong commented on a change in pull request #9099: 
[FLINK-13237][table-planner-blink] Add expression table api test to blink
URL: https://github.com/apache/flink/pull/9099#discussion_r309590101
 
 

 ##
 File path: 
flink-table/flink-table-planner-blink/src/test/scala/org/apache/flink/table/planner/expressions/ScalarFunctionsTest.scala
 ##
 @@ -3178,4 +4161,10 @@ class ScalarFunctionsTest extends ScalarTypesTestBase {
   "2")
   }
 
+  @Ignore // Implicit type conversion
+  @Test
+  def testEmptyStringToTime(): Unit = {
 
 Review comment:
   Remove this test if we don't support implicit type conversion yet?
   I think we should add a set of thorough tests once implicit type conversion is supported, rather than just enabling the ignored tests. What do you think?


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [flink] wuchong commented on a change in pull request #9099: [FLINK-13237][table-planner-blink] Add expression table api test to blink

2019-08-01 Thread GitBox
wuchong commented on a change in pull request #9099: 
[FLINK-13237][table-planner-blink] Add expression table api test to blink
URL: https://github.com/apache/flink/pull/9099#discussion_r309590685
 
 

 ##
 File path: 
flink-table/flink-table-planner-blink/src/test/scala/org/apache/flink/table/planner/expressions/ScalarFunctionsTest.scala
 ##
 @@ -655,124 +749,168 @@ class ScalarFunctionsTest extends ScalarTypesTestBase {
 
   @Test
   def testConcatWs(): Unit = {
-testSqlApi(
+testAllApis(
+  concat_ws('f33, "AA"),
+  "concat_ws(f33, 'AA')",
   "CONCAT_WS(f33, 'AA')",
   "AA")
-testSqlApi(
+testAllApis(
+  concat_ws("", "AA"),
+  "concat_ws('','AA')",
   "concat_ws('','AA')",
   "AA")
-testSqlApi(
+testAllApis(
+  concat_ws("~", "AA", "BB"),
+  "concat_ws('~','AA','BB')",
   "concat_ws('~','AA','BB')",
   "AA~BB")
-testSqlApi(
+testAllApis(
+  concat_ws("~", 'f33, "AA", "BB", "", 'f33, "CC"),
+  "concat_ws('~',f33, 'AA','BB','',f33, 'CC')",
   "concat_ws('~',f33, 'AA','BB','',f33, 'CC')",
   "AA~BB~~CC")
-testSqlApi(
+testAllApis(
+  concat_ws("", "Flink", 'f33, "xx", 'f33, 'f33),
+  "concat_ws('','Flink', f33, 'xx', f33, f33)",
   "CONCAT_WS('','Flink', f33, 'xx', f33, f33)",
   "Flinkxx")
-
-testSqlApi("concat_ws('||', f35, f36, f33)", "a||b")
   }
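The null handling these cases exercise can be reduced to a small standalone sketch (an assumption about the semantics, not Flink's implementation): null arguments are skipped, empty strings are kept, which is why a null `f33` between `'BB'` and `''` produces a doubled separator.

```scala
// Sketch of the CONCAT_WS null handling exercised above: null arguments are
// dropped, empty strings are kept. Illustration only; not Flink's
// implementation, and the null-separator case is deliberately not modeled.
def concatWs(sep: String, args: String*): String =
  args.filter(_ != null).mkString(sep)

assert(concatWs("~", "AA", "BB") == "AA~BB")
assert(concatWs("~", null, "AA", "BB", "", null, "CC") == "AA~BB~~CC")
assert(concatWs("", "Flink", null, "xx", null, null) == "Flinkxx")
```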
 
+  @Test
   def testRegexpReplace(): Unit = {
 
-testSqlApi(
+testAllApis(
+  "foobar".regexpReplace("oo|ar", "abc"),
+  "'foobar'.regexpReplace('oo|ar', 'abc')",
   "regexp_replace('foobar', 'oo|ar', 'abc')",
   "fabcbabc")
 
-testSqlApi(
+testAllApis(
+  "foofar".regexpReplace("^f", ""),
+  "'foofar'.regexpReplace('^f', '')",
   "regexp_replace('foofar', '^f', '')",
   "oofar")
 
-testSqlApi(
+testAllApis(
+  "foobar".regexpReplace("^f*.*r$", ""),
+  "'foobar'.regexpReplace('^f*.*r$', '')",
   "regexp_replace('foobar', '^f*.*r$', '')",
   "")
 
-testSqlApi(
+testAllApis(
+  "foo1bar2".regexpReplace("\\d", ""),
+  "'foo1bar2'.regexpReplace('\\d', '')",
   "regexp_replace('foobar', '\\d', '')",
   "foobar")
 
-testSqlApi(
+testAllApis(
+  "foobar".regexpReplace("\\w", ""),
+  "'foobar'.regexpReplace('\\w', '')",
   "regexp_replace('foobar', '\\w', '')",
   "")
 
-testSqlApi(
+testAllApis(
+  "fooobar".regexpReplace("oo", "$"),
+  "'fooobar'.regexpReplace('oo', '$')",
   "regexp_replace('fooobar', 'oo', '$')",
   "f$obar")
 
-testSqlApi(
+testAllApis(
+  "foobar".regexpReplace("oo", "\\"),
+  "'foobar'.regexpReplace('oo', '\\')",
   "regexp_replace('foobar', 'oo', '\\')",
   "f\\bar")
 
-testSqlApi(
+testAllApis(
+  'f33.regexpReplace("oo|ar", ""),
+  "f33.regexpReplace('oo|ar', '')",
   "REGEXP_REPLACE(f33, 'oo|ar', '')",
   "null")
 
-testSqlApi(
+testAllApis(
+  "foobar".regexpReplace('f33, ""),
+  "'foobar'.regexpReplace(f33, '')",
   "REGEXP_REPLACE('foobar', f33, '')",
   "null")
 
-testSqlApi(
+testAllApis(
+  "foobar".regexpReplace("oo|ar", 'f33),
+  "'foobar'.regexpReplace('oo|ar', f33)",
   "REGEXP_REPLACE('foobar', 'oo|ar', f33)",
   "null")
 
 // This test was added for the null literal problem in string expression parsing (FLINK-10463).
-testSqlApi(
+testAllApis(
+  nullOf(Types.STRING).regexpReplace("oo|ar", 'f33),
+  "nullOf(STRING).regexpReplace('oo|ar', f33)",
   "REGEXP_REPLACE(CAST(NULL AS VARCHAR), 'oo|ar', f33)",
   "null")
-
-testSqlApi("regexp_replace('100-200', '(\\d+)', 'num')", "num-num")
-testSqlApi("regexp_replace('100-200', '(\\d+)-(\\d+)', '400')", "400")
-testSqlApi("regexp_replace('100-200', '(\\d+)', '400')", "400-400")
-testSqlApi("regexp_replace('100-200', '', '400')", "100-200")
-testSqlApi("regexp_replace(f40, '(\\d+)', '400')", "null")
-testSqlApi("regexp_replace(CAST(null as VARCHAR), '(\\d+)', 'num')", "null")
-testSqlApi("regexp_replace('100-200', CAST(null as VARCHAR), '400')", "null")
-testSqlApi("regexp_replace('100-200', '(\\d+)', CAST(null as VARCHAR))", "null")
   }
 
   @Test
   def testRegexpExtract(): Unit = {
-testSqlApi(
+testAllApis(
+  "foothebar".regexpExtract("foo(.*?)(bar)", 2),
+  "'foothebar'.regexpExtract('foo(.*?)(bar)', 2)",
   "REGEXP_EXTRACT('foothebar', 'foo(.*?)(bar)', 2)",
   "bar")
 
-testSqlApi(
+testAllApis(
+  "foothebar".regexpExtract("foo(.*?)(bar)", 0),
+  "'foothebar'.regexpExtract('foo(.*?)(bar)', 0)",
   "REGEXP_EXTRACT('foothebar', 'foo(.*?)(bar)', 0)",
   "foothebar")
 
-testSqlApi(
+testAllApis(
+  "foothebar".regexpExtract("foo(.*?)(bar)", 1),
+  "'foothebar'.regexpExtract('foo(.*?)(bar)', 1)",
   "REGEXP_EXTRACT('foothebar', 'foo(.*?)(bar)', 1)",
 

[GitHub] [flink] wuchong commented on a change in pull request #9099: [FLINK-13237][table-planner-blink] Add expression table api test to blink

2019-08-01 Thread GitBox
wuchong commented on a change in pull request #9099: 
[FLINK-13237][table-planner-blink] Add expression table api test to blink
URL: https://github.com/apache/flink/pull/9099#discussion_r309538399
 
 

 ##
 File path: 
flink-table/flink-table-planner-blink/src/test/scala/org/apache/flink/table/planner/expressions/utils/ExpressionTestBase.scala
 ##
 @@ -162,31 +169,34 @@ abstract class ExpressionTestBase {
   }
 
   assertEquals(
-s"Wrong result for: [$originalExpr] optimized to: [$optimizedExpr]",
+s"Wrong result for: [$optimizedExpr]",
 expected,
 if (actual == null) "null" else actual)
   }
 
   }
 
-  private def addSqlTestExpr(sqlExpr: String, expected: String): Unit = {
+  def addSqlTestExpr(sqlExpr: String, expected: String): Unit = {
 
 Review comment:
   This method should be `private`. The only test that calls this method can be removed because we don't support implicit type conversion yet.
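The comparison convention visible in the diff above is worth spelling out: a null evaluation result is rendered as the string "null" before being compared with the expected value. A two-line sketch:

```scala
// Normalization used before assertEquals in the quoted diff: a null result
// is rendered as the string "null" so it can be compared with `expected`.
def normalize(actual: String): String =
  if (actual == null) "null" else actual

assert(normalize(null) == "null")
assert(normalize("oofar") == "oofar")
```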




[GitHub] [flink] wuchong commented on a change in pull request #9099: [FLINK-13237][table-planner-blink] Add expression table api test to blink

2019-08-01 Thread GitBox
wuchong commented on a change in pull request #9099: 
[FLINK-13237][table-planner-blink] Add expression table api test to blink
URL: https://github.com/apache/flink/pull/9099#discussion_r309595542
 
 

 ##
 File path: 
flink-table/flink-table-planner-blink/src/test/scala/org/apache/flink/table/planner/expressions/ScalarFunctionsTest.scala
 ##
 @@ -2531,27 +3423,60 @@ class ScalarFunctionsTest extends ScalarTypesTestBase {
   "SQL_TSI_SECOND" -> SECOND
 )
 
-for ((interval, result) <- intervalMapResults) {
-  testSqlApi(
-s"TIMESTAMPADD($interval, ${data.head._1}, ${data.head._2})", result.head)
-  testSqlApi(
-s"TIMESTAMPADD($interval, ${data(1)._1}, ${data(1)._2})", result(1))
-  testSqlApi(
-s"TIMESTAMPADD($interval, ${data(2)._1}, ${data(2)._2})", result(2))
-  testSqlApi(
-s"TIMESTAMPADD($interval, ${data(3)._1}, ${data(3)._2})", result(3))
-  testSqlApi(
-s"TIMESTAMPADD($interval, ${data(4)._1}, ${data(4)._2})", result(4))
+def intervalCount(interval: String, count: Int): (Expression, String) = interval match {
+  case "YEAR" => (count.years, s"$count.years")
+  case "SQL_TSI_YEAR" => (count.years, s"$count.years")
+  case "QUARTER" => (count.quarters, s"$count.quarters")
+  case "SQL_TSI_QUARTER" => (count.quarters, s"$count.quarters")
+  case "MONTH" => (count.months, s"$count.months")
+  case "SQL_TSI_MONTH" => (count.months, s"$count.months")
+  case "WEEK" => (count.weeks, s"$count.weeks")
+  case "SQL_TSI_WEEK" => (count.weeks, s"$count.weeks")
+  case "DAY" => (count.days, s"$count.days")
+  case "SQL_TSI_DAY" => (count.days, s"$count.days")
+  case "HOUR" => (count.hours, s"$count.hours")
+  case "SQL_TSI_HOUR" => (count.hours, s"$count.hours")
+  case "MINUTE" => (count.minutes, s"$count.minutes")
+  case "SQL_TSI_MINUTE" => (count.minutes, s"$count.minutes")
+  case "SECOND" => (count.seconds, s"$count.seconds")
+  case "SQL_TSI_SECOND" => (count.seconds, s"$count.seconds")
 }
 
-testSqlApi("TIMESTAMPADD(HOUR, CAST(NULL AS INTEGER), TIMESTAMP '2016-02-24 12:42:25')", "null")
+for ((interval, result) <- intervalMapResults) {
+  for (i <- 0 to 4) {
+val (offset, ts) = data(i)
+val timeInterval = intervalCount(interval, offset)
+testAllApis(
+  timeInterval._1 + ts.toTimestamp,
+  s"${timeInterval._2} + '$ts'.toTimestamp",
+  s"TIMESTAMPADD($interval, $offset, TIMESTAMP '$ts')",
+  result(i))
+  }
+}
 
-testSqlApi("TIMESTAMPADD(HOUR, -200, CAST(NULL AS TIMESTAMP))", "null")
+testAllApis(
+  "2016-02-24 12:42:25".toTimestamp + nullOf(DataTypes.INTERVAL(DataTypes.MINUTE())),
+  "'2016-02-24 12:42:25'.toTimestamp + Null(INTERVAL_MILLIS)",
+  "TIMESTAMPADD(HOUR, CAST(NULL AS INTEGER), TIMESTAMP '2016-02-24 12:42:25')",
+  "null")
 
-testSqlApi("TIMESTAMPADD(DAY, 1, DATE '2016-06-15')", "2016-06-16")
+testAllApis(
+  nullOf(DataTypes.TIMESTAMP(3)) + -200.hours,
+  "Null(SQL_TIMESTAMP) + -200.hours",
+  "TIMESTAMPADD(HOUR, -200, CAST(NULL AS TIMESTAMP))",
+  "null")
 
-testSqlApi("TIMESTAMPADD(MONTH, 3, CAST(NULL AS TIMESTAMP))", "null")
+testAllApis(
+  "2016-06-15".toDate + 1.day,
+  "'2016-06-15'.toDate + 1.day",
+  "TIMESTAMPADD(DAY, 1, DATE '2016-06-15')",
+  "2016-06-16")
 
+testAllApis(
+  nullOf(DataTypes.TIMESTAMP(3)) + 3.months,
+  "Null(SQL_TIMESTAMP) + 3.months",
+  "TIMESTAMPADD(MONTH, 3, CAST(NULL AS TIMESTAMP))",
+  "null")
 
 Review comment:
   It seems that we are missing some tests here?
   
   These are from the flink planner expression tests:
   ```scala
   testAllApis(
 "2016-02-24 12:42:25".toTimestamp + nullOf(Types.INTERVAL_MILLIS),
 "'2016-02-24 12:42:25'.toTimestamp + nullOf(INTERVAL_MILLIS)",
  "TIMESTAMPADD(HOUR, CAST(NULL AS INTEGER), TIMESTAMP '2016-02-24 12:42:25')",
 "null")
   
   testAllApis(
 nullOf(Types.SQL_TIMESTAMP) + -200.hours,
 "nullOf(SQL_TIMESTAMP) + -200.hours",
 "TIMESTAMPADD(HOUR, -200, CAST(NULL AS TIMESTAMP))",
 "null")
   
   testAllApis(
 nullOf(Types.SQL_TIMESTAMP) + 3.months,
 "nullOf(SQL_TIMESTAMP) + 3.months",
 "TIMESTAMPADD(MONTH, 3, CAST(NULL AS TIMESTAMP))",
 "null")
   
    // TIMESTAMPADD with DATE returns a TIMESTAMP value for sub-day intervals.
   testAllApis("2016-06-15".toDate + 1.month,
 "'2016-06-15'.toDate + 1.month",
 "timestampadd(MONTH, 1, date '2016-06-15')",
 "2016-07-15")
   
   testAllApis("2016-06-15".toDate + 1.day,
 "'2016-06-15'.toDate + 1.day",
 "timestampadd(DAY, 1, date '2016-06-15')",
 "2016-06-16")
   
   testAllApis("2016-06-15".toTimestamp - 1.hour,
 "'2016-06-15'.toTimestamp - 1.hour",
 "timestampadd(HOUR, -1, date
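```

The quoted planner expectations can be sanity-checked with plain `java.time` arithmetic (a cross-check only; Flink's own evaluation differs in type handling, e.g. a DATE plus a sub-day interval yields a TIMESTAMP):

```scala
import java.time.LocalDate

// Cross-check of the quoted date expectations with java.time;
// illustration only, not Flink code.
val d = LocalDate.parse("2016-06-15")
assert(d.plusMonths(1) == LocalDate.parse("2016-07-15"))
assert(d.plusDays(1) == LocalDate.parse("2016-06-16"))
```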

[GitHub] [flink] wuchong commented on a change in pull request #9099: [FLINK-13237][table-planner-blink] Add expression table api test to blink

2019-08-01 Thread GitBox
wuchong commented on a change in pull request #9099: 
[FLINK-13237][table-planner-blink] Add expression table api test to blink
URL: https://github.com/apache/flink/pull/9099#discussion_r309592331
 
 

 ##
 File path: 
flink-table/flink-table-planner-blink/src/test/scala/org/apache/flink/table/planner/expressions/ScalarFunctionsTest.scala
 ##
 @@ -1042,6 +1181,13 @@ class ScalarFunctionsTest extends ScalarTypesTestBase {
 testSqlApi("split_index('test', 'e', -1)", "null")
   }
 
+  @Test
+  def testInitCap2(): Unit = {
 
 Review comment:
   Remove this. It duplicates `testInitCap()`.




[GitHub] [flink] wuchong commented on a change in pull request #9099: [FLINK-13237][table-planner-blink] Add expression table api test to blink

2019-08-01 Thread GitBox
wuchong commented on a change in pull request #9099: 
[FLINK-13237][table-planner-blink] Add expression table api test to blink
URL: https://github.com/apache/flink/pull/9099#discussion_r309596603
 
 

 ##
 File path: 
flink-table/flink-table-planner-blink/src/test/scala/org/apache/flink/table/planner/expressions/ScalarFunctionsTest.scala
 ##
 @@ -2785,7 +3766,9 @@ class ScalarFunctionsTest extends ScalarTypesTestBase {
 
   @Test
   def testNullBigDecimal(): Unit = {
-testSqlApi(
+testAllApis(
+  ExpressionParser.parseExpression("f41.sign()"),
+  "f41.sign()",
   "SIGN(f41)",
   "null")
 testSqlApi("SIGN(f41)", "null")
 
 Review comment:
   Remove this?




[GitHub] [flink] wuchong commented on a change in pull request #9099: [FLINK-13237][table-planner-blink] Add expression table api test to blink

2019-08-01 Thread GitBox
wuchong commented on a change in pull request #9099: 
[FLINK-13237][table-planner-blink] Add expression table api test to blink
URL: https://github.com/apache/flink/pull/9099#discussion_r309537175
 
 

 ##
 File path: 
flink-table/flink-table-planner-blink/src/test/scala/org/apache/flink/table/planner/expressions/utils/ExpressionTestBase.scala
 ##
 @@ -196,10 +206,62 @@ abstract class ExpressionTestBase {
 calcProgram.expandLocalRef(calcProgram.getProjectList.get(0))
   }
 
+  def testAllApis(
+  expr: Expression,
+  exprString: String,
+  sqlExpr: String,
+  expected: String): Unit = {
+addTableApiTestExpr(expr, expected)
+addTableApiTestExpr(exprString, expected)
+addSqlTestExpr(sqlExpr, expected)
+if (expected == nullable) {
+  testTableNullable(expr, exprString)
+  testSqlNullable(sqlExpr)
+}
+  }
+
+  def /**/testTableApi(
 
 Review comment:
   remove `/**/`
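For readers outside the Flink codebase, the fan-out that `testAllApis` performs can be sketched without any planner machinery; the three eval functions below are hypothetical stand-ins for the Table API, expression-string, and SQL evaluation paths:

```scala
// Standalone sketch of the testAllApis fan-out: one expected value is
// checked against several evaluation paths. The eval functions are
// hypothetical stand-ins, not Flink APIs.
def viaTableApi(s: String): String   = s.replaceAll("oo|ar", "abc")
def viaExprString(s: String): String = s.replaceAll("oo|ar", "abc")
def viaSql(s: String): String        = s.replaceAll("oo|ar", "abc")

def testAllApis(input: String, expected: String): Unit =
  Seq(viaTableApi _, viaExprString _, viaSql _).foreach { eval =>
    assert(eval(input) == expected, s"Wrong result for: [$input]")
  }

testAllApis("foobar", "fabcbabc")
```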




[GitHub] [flink] wuchong commented on a change in pull request #9099: [FLINK-13237][table-planner-blink] Add expression table api test to blink

2019-08-01 Thread GitBox
wuchong commented on a change in pull request #9099: 
[FLINK-13237][table-planner-blink] Add expression table api test to blink
URL: https://github.com/apache/flink/pull/9099#discussion_r309596546
 
 

 ##
 File path: 
flink-table/flink-table-planner-blink/src/test/scala/org/apache/flink/table/planner/expressions/ScalarFunctionsTest.scala
 ##
 @@ -2785,7 +3766,9 @@ class ScalarFunctionsTest extends ScalarTypesTestBase {
 
   @Test
   def testNullBigDecimal(): Unit = {
-testSqlApi(
+testAllApis(
+  ExpressionParser.parseExpression("f41.sign()"),
 
 Review comment:
   `'f41.sign()` ?

