[GitHub] [flink] wuchong commented on a change in pull request #10039: [FLINK-14262][table] Support referencing function with fully/partially qualified names in SQL

2019-11-07 Thread GitBox
wuchong commented on a change in pull request #10039: [FLINK-14262][table]  
Support referencing function with fully/partially qualified names in SQL
URL: https://github.com/apache/flink/pull/10039#discussion_r343620400
 
 

 ##
 File path: 
flink-table/flink-table-planner-blink/src/main/java/org/apache/flink/table/planner/catalog/FunctionCatalogOperatorTable.java
 ##
 @@ -84,12 +89,12 @@ public void lookupOperatorOverloads(
return;
}
 
-   String name = opName.getSimple();
-   Optional candidateFunction = functionCatalog.lookupFunction(
-   FunctionIdentifier.of(name));
+   FunctionIdentifier identifier = toFunctionIdentifier(opName.names.toArray(new String[0]), catalogManager);
 
 Review comment:
   I agree with @dawidwys. What `toFunctionIdentifier` does is just qualify a 2-part name to a 3-part one, and this can happen in `FunctionCatalog`, because it already holds the `CatalogManager`.
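
   For illustration, a minimal sketch (not code from this PR) of what the qualification could look like if it moved into `FunctionCatalog`, which already holds the `CatalogManager`. It mirrors the `toFunctionIdentifier` shown in the diff above and is written in Scala for brevity, although `FunctionCatalog` itself is Java:

   ```scala
   // Sketch only: assumes this lives in FunctionCatalog, next to its catalogManager field.
   private def toFunctionIdentifier(names: Array[String]): FunctionIdentifier = {
     if (names.length == 1) {
       // simple name: a built-in/system function or an unqualified reference
       FunctionIdentifier.of(names(0))
     } else {
       // 2-part (database.function) or 3-part name: let the CatalogManager fill in
       // the current catalog/database to produce a fully qualified identifier
       FunctionIdentifier.of(
         catalogManager.qualifyIdentifier(UnresolvedIdentifier.of(names: _*)))
     }
   }
   ```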




[GitHub] [flink] wuchong commented on a change in pull request #10039: [FLINK-14262][table] Support referencing function with fully/partially qualified names in SQL

2019-11-05 Thread GitBox
wuchong commented on a change in pull request #10039: [FLINK-14262][table]  
Support referencing function with fully/partially qualified names in SQL
URL: https://github.com/apache/flink/pull/10039#discussion_r342524935
 
 

 ##
 File path: 
flink-table/flink-table-planner-blink/src/test/scala/org/apache/flink/table/planner/catalog/CatalogTableITCase.scala
 ##
 @@ -84,6 +94,41 @@ class CatalogTableITCase(isStreamingMode: Boolean) {
 tableEnv.execute(name)
   }
 
+  private def testUdf(sql: String): Unit = {
+    val sinkDDL =
+      """
+        |create table sinkT(
+        |  a bigint
+        |) with (
+        |  'connector' = 'COLLECTION'
+        |)
+      """.stripMargin
+    tableEnv.sqlUpdate(sinkDDL)
+    tableEnv.sqlUpdate(sql)
+    tableEnv.execute("")
+    assertEquals(Seq(toRow(2L)), TestCollectionTableFactory.RESULT.sorted)
 
 Review comment:
   I agree they are all the same. But judging by the method name, we could pass in any SQL string. I just thought the implementation is not clean; this is not a major problem, though.
   
   I think one way to make it cleaner is to rename it to `testMyFunc(functionIdentifier: String)` and change the calls to
   
   ```java
   testMyFunc("default_catalog.default_database.myfunc");
   testMyFunc("default_database.myfunc");
   testMyFunc("myfunc");
   ```
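
   For illustration, one possible shape of the renamed helper (a sketch based on the `testUdf` body in the diff above, not code from the PR); the caller passes only the function identifier and the helper builds the INSERT statement itself:

   ```scala
   // Sketch only: tableEnv, toRow and TestCollectionTableFactory come from the surrounding test class.
   private def testMyFunc(functionIdentifier: String): Unit = {
     val sinkDDL =
       """
         |create table sinkT(
         |  a bigint
         |) with (
         |  'connector' = 'COLLECTION'
         |)
       """.stripMargin
     tableEnv.sqlUpdate(sinkDDL)
     tableEnv.sqlUpdate(s"insert into sinkT select $functionIdentifier(cast(1 as bigint))")
     tableEnv.execute("")
     assertEquals(Seq(toRow(2L)), TestCollectionTableFactory.RESULT.sorted)
   }
   ```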




[GitHub] [flink] wuchong commented on a change in pull request #10039: [FLINK-14262][table] Support referencing function with fully/partially qualified names in SQL

2019-11-05 Thread GitBox
wuchong commented on a change in pull request #10039: [FLINK-14262][table]  
Support referencing function with fully/partially qualified names in SQL
URL: https://github.com/apache/flink/pull/10039#discussion_r342524935
 
 

 ##
 File path: 
flink-table/flink-table-planner-blink/src/test/scala/org/apache/flink/table/planner/catalog/CatalogTableITCase.scala
 ##
 @@ -84,6 +94,41 @@ class CatalogTableITCase(isStreamingMode: Boolean) {
 tableEnv.execute(name)
   }
 
+  private def testUdf(sql: String): Unit = {
+    val sinkDDL =
+      """
+        |create table sinkT(
+        |  a bigint
+        |) with (
+        |  'connector' = 'COLLECTION'
+        |)
+      """.stripMargin
+    tableEnv.sqlUpdate(sinkDDL)
+    tableEnv.sqlUpdate(sql)
+    tableEnv.execute("")
+    assertEquals(Seq(toRow(2L)), TestCollectionTableFactory.RESULT.sorted)
 
 Review comment:
   I agree they are all the same. But judging by the method name, we could pass in any SQL string. I just thought the implementation is not clean; this is not a major problem, though.
   
   I think one way to make it cleaner is to rename it to `testMyUdf(functionIdentifier: String)` and change the calls to
   
   ```java
   testMyUdf("default_catalog.default_database.myfunc");
   testMyUdf("default_database.myfunc");
   testMyUdf("myfunc");
   ```




[GitHub] [flink] wuchong commented on a change in pull request #10039: [FLINK-14262][table] Support referencing function with fully/partially qualified names in SQL

2019-11-05 Thread GitBox
wuchong commented on a change in pull request #10039: [FLINK-14262][table]  
Support referencing function with fully/partially qualified names in SQL
URL: https://github.com/apache/flink/pull/10039#discussion_r342522749
 
 

 ##
 File path: 
flink-table/flink-table-planner-blink/src/test/scala/org/apache/flink/table/planner/catalog/CatalogTableITCase.scala
 ##
 @@ -84,6 +94,41 @@ class CatalogTableITCase(isStreamingMode: Boolean) {
 tableEnv.execute(name)
   }
 
+  private def testUdf(sql: String): Unit = {
+    val sinkDDL =
+      """
+        |create table sinkT(
+        |  a bigint
+        |) with (
+        |  'connector' = 'COLLECTION'
+        |)
+      """.stripMargin
+    tableEnv.sqlUpdate(sinkDDL)
+    tableEnv.sqlUpdate(sql)
+    tableEnv.execute("")
+    assertEquals(Seq(toRow(2L)), TestCollectionTableFactory.RESULT.sorted)
+  }
+
+  @Test
+  def testUdfWithFullIdentifier(): Unit = {
+    testUdf("insert into sinkT select default_catalog.default_database.myfunc(cast(1 as bigint))")
 
 Review comment:
   Sorry, I missed the `default_database.myfunc` one, but is `myfunc` covered? 
   




[GitHub] [flink] wuchong commented on a change in pull request #10039: [FLINK-14262][table] Support referencing function with fully/partially qualified names in SQL

2019-11-05 Thread GitBox
wuchong commented on a change in pull request #10039: [FLINK-14262][table]  
Support referencing function with fully/partially qualified names in SQL
URL: https://github.com/apache/flink/pull/10039#discussion_r342517824
 
 

 ##
 File path: 
flink-table/flink-table-planner-blink/src/main/java/org/apache/flink/table/planner/catalog/FunctionCatalogOperatorTable.java
 ##
 @@ -74,7 +84,7 @@ public void lookupOperatorOverloads(
SqlSyntax syntax,
List operatorList,
SqlNameMatcher nameMatcher) {
-   if (!opName.isSimple()) {
+   if (opName.isStar()) {
 
 Review comment:
   Then we should only allow `FUNCTION` and `FUNCTION_ID`? 




[GitHub] [flink] wuchong commented on a change in pull request #10039: [FLINK-14262][table] Support referencing function with fully/partially qualified names in SQL

2019-11-04 Thread GitBox
wuchong commented on a change in pull request #10039: [FLINK-14262][table]  
Support referencing function with fully/partially qualified names in SQL
URL: https://github.com/apache/flink/pull/10039#discussion_r342376155
 
 

 ##
 File path: 
flink-table/flink-table-planner-blink/src/test/scala/org/apache/flink/table/planner/catalog/CatalogTableITCase.scala
 ##
 @@ -84,6 +94,41 @@ class CatalogTableITCase(isStreamingMode: Boolean) {
 tableEnv.execute(name)
   }
 
+  private def testUdf(sql: String): Unit = {
+    val sinkDDL =
+      """
+        |create table sinkT(
+        |  a bigint
+        |) with (
+        |  'connector' = 'COLLECTION'
+        |)
+      """.stripMargin
+    tableEnv.sqlUpdate(sinkDDL)
+    tableEnv.sqlUpdate(sql)
+    tableEnv.execute("")
+    assertEquals(Seq(toRow(2L)), TestCollectionTableFactory.RESULT.sorted)
+  }
+
+  @Test
+  def testUdfWithFullIdentifier(): Unit = {
+    testUdf("insert into sinkT select default_catalog.default_database.myfunc(cast(1 as bigint))")
 
 Review comment:
   I think we should also test `default_database.myfunc` and `myfunc`.
   Could you also verify that it works for temporary (system/catalog) functions as well?
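
   For illustration, the extra cases could look like the following sketch, reusing the `testUdf` helper from the diff above (the test method names are made up):

   ```scala
   // Sketch only: exercises the partially qualified and the simple function name.
   @Test
   def testUdfWithPartialIdentifier(): Unit = {
     testUdf("insert into sinkT select default_database.myfunc(cast(1 as bigint))")
   }

   @Test
   def testUdfWithSimpleName(): Unit = {
     testUdf("insert into sinkT select myfunc(cast(1 as bigint))")
   }
   ```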




[GitHub] [flink] wuchong commented on a change in pull request #10039: [FLINK-14262][table] Support referencing function with fully/partially qualified names in SQL

2019-11-04 Thread GitBox
wuchong commented on a change in pull request #10039: [FLINK-14262][table]  
Support referencing function with fully/partially qualified names in SQL
URL: https://github.com/apache/flink/pull/10039#discussion_r342376155
 
 

 ##
 File path: 
flink-table/flink-table-planner-blink/src/test/scala/org/apache/flink/table/planner/catalog/CatalogTableITCase.scala
 ##
 @@ -84,6 +94,41 @@ class CatalogTableITCase(isStreamingMode: Boolean) {
 tableEnv.execute(name)
   }
 
+  private def testUdf(sql: String): Unit = {
+    val sinkDDL =
+      """
+        |create table sinkT(
+        |  a bigint
+        |) with (
+        |  'connector' = 'COLLECTION'
+        |)
+      """.stripMargin
+    tableEnv.sqlUpdate(sinkDDL)
+    tableEnv.sqlUpdate(sql)
+    tableEnv.execute("")
+    assertEquals(Seq(toRow(2L)), TestCollectionTableFactory.RESULT.sorted)
+  }
+
+  @Test
+  def testUdfWithFullIdentifier(): Unit = {
+    testUdf("insert into sinkT select default_catalog.default_database.myfunc(cast(1 as bigint))")
 
 Review comment:
   I think we should also test `default_database.myfunc` and `myfunc`.




[GitHub] [flink] wuchong commented on a change in pull request #10039: [FLINK-14262][table] Support referencing function with fully/partially qualified names in SQL

2019-11-04 Thread GitBox
wuchong commented on a change in pull request #10039: [FLINK-14262][table]  
Support referencing function with fully/partially qualified names in SQL
URL: https://github.com/apache/flink/pull/10039#discussion_r342375742
 
 

 ##
 File path: 
flink-table/flink-table-planner-blink/src/test/scala/org/apache/flink/table/planner/catalog/CatalogTableITCase.scala
 ##
 @@ -84,6 +94,41 @@ class CatalogTableITCase(isStreamingMode: Boolean) {
 tableEnv.execute(name)
   }
 
+  private def testUdf(sql: String): Unit = {
+    val sinkDDL =
+      """
+        |create table sinkT(
+        |  a bigint
+        |) with (
+        |  'connector' = 'COLLECTION'
+        |)
+      """.stripMargin
+    tableEnv.sqlUpdate(sinkDDL)
+    tableEnv.sqlUpdate(sql)
+    tableEnv.execute("")
+    assertEquals(Seq(toRow(2L)), TestCollectionTableFactory.RESULT.sorted)
 
 Review comment:
   The expected result is hard-coded here, which is not clean. It would be better to pass the expected result in as a parameter.
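
   A sketch of that suggestion (only the signature and the assertion change; `sinkDDL`, `tableEnv`, `toRow` and `TestCollectionTableFactory` are the pieces already shown in the diff above, and `Row` is the row type produced by `toRow`):

   ```scala
   // Sketch only: pass the expected rows in instead of hard-coding Seq(toRow(2L)).
   private def testUdf(sql: String, expected: Seq[Row]): Unit = {
     tableEnv.sqlUpdate(sinkDDL)  // the same COLLECTION sink DDL as in the current helper
     tableEnv.sqlUpdate(sql)
     tableEnv.execute("")
     assertEquals(expected, TestCollectionTableFactory.RESULT.sorted)
   }

   // e.g. testUdf("insert into sinkT select myfunc(cast(1 as bigint))", Seq(toRow(2L)))
   ```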




[GitHub] [flink] wuchong commented on a change in pull request #10039: [FLINK-14262][table] Support referencing function with fully/partially qualified names in SQL

2019-11-04 Thread GitBox
wuchong commented on a change in pull request #10039: [FLINK-14262][table]  
Support referencing function with fully/partially qualified names in SQL
URL: https://github.com/apache/flink/pull/10039#discussion_r342388213
 
 

 ##
 File path: 
flink-table/flink-table-planner-blink/src/main/scala/org/apache/flink/table/planner/functions/utils/ScalarSqlFunction.scala
 ##
 @@ -45,13 +44,13 @@ import scala.collection.JavaConverters._
   * @param typeFactory  type factory for converting between Flink's and Calcite's types
   */
 class ScalarSqlFunction(
-    name: String,
+    name: SqlIdentifier,
 
 Review comment:
   I would suggest using `FunctionIdentifier` as the name parameter.
   1) `FunctionIdentifier` is a stricter interface: either a 3-part full path (UDF) or a 1-part name (built-in).
   2) `FunctionIdentifier` has better construction utilities, so we don't need `SqlParserPos.ZERO` here and there.
   3) `ScalarSqlFunction` is constructed not only from the SQL path but also from the Table API path, and it is weird to construct a `SqlIdentifier` in the Table API path.
   
   The same applies to `TableSqlFunction`, `AggregateSqlFunction`, etc.
   
   What do you think?
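
   For illustration, a sketch of the two construction paths this contrasts (the names are made up; `FunctionIdentifier.of` and `ObjectIdentifier.of` are the existing factory methods):

   ```scala
   // Sketch only: a 3-part identifier for a catalog UDF vs. a 1-part name for a built-in
   // function, neither of which needs a SqlParserPos the way new SqlIdentifier(..., SqlParserPos.ZERO) does.
   val udfName: FunctionIdentifier =
     FunctionIdentifier.of(ObjectIdentifier.of("default_catalog", "default_database", "myfunc"))
   val builtInName: FunctionIdentifier = FunctionIdentifier.of("concat")
   ```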




[GitHub] [flink] wuchong commented on a change in pull request #10039: [FLINK-14262][table] Support referencing function with fully/partially qualified names in SQL

2019-11-04 Thread GitBox
wuchong commented on a change in pull request #10039: [FLINK-14262][table]  
Support referencing function with fully/partially qualified names in SQL
URL: https://github.com/apache/flink/pull/10039#discussion_r342393287
 
 

 ##
 File path: 
flink-table/flink-table-planner-blink/src/main/java/org/apache/flink/table/planner/catalog/FunctionCatalogOperatorTable.java
 ##
 @@ -74,7 +84,7 @@ public void lookupOperatorOverloads(
SqlSyntax syntax,
List operatorList,
SqlNameMatcher nameMatcher) {
-   if (!opName.isSimple()) {
+   if (opName.isStar()) {
 
 Review comment:
   nit: `opName.isStar() && syntax == SqlSyntax.FUNCTION`.
   
   I just added a test locally: `f0 is not null` will also go into `lookupFunction`, which should never happen. `FunctionCatalog.lookupFunction` should only handle functions, not prefix or postfix syntax.
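
   For illustration, a sketch of the early return this nit implies (the logic only, written in Scala for consistency with the other sketches; the actual class is Java):

   ```scala
   // Sketch only: skip the lookup for anything that is not plain function syntax, so that
   // prefix/postfix operators such as `f0 is not null` never reach FunctionCatalog.lookupFunction.
   // (Another comment in this thread suggests also allowing SqlSyntax.FUNCTION_ID.)
   if (opName.isStar || syntax != SqlSyntax.FUNCTION) {
     return
   }
   ```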




[GitHub] [flink] wuchong commented on a change in pull request #10039: [FLINK-14262][table] Support referencing function with fully/partially qualified names in SQL

2019-11-04 Thread GitBox
wuchong commented on a change in pull request #10039: [FLINK-14262][table]  
Support referencing function with fully/partially qualified names in SQL
URL: https://github.com/apache/flink/pull/10039#discussion_r342380567
 
 

 ##
 File path: 
flink-table/flink-table-planner-blink/src/main/java/org/apache/flink/table/planner/catalog/FunctionCatalogOperatorTable.java
 ##
 @@ -184,4 +188,21 @@ private boolean isNotUserFunction(SqlFunctionCategory category) {
public List getOperatorList() {
throw new UnsupportedOperationException("This should never be called");
}
+
+   public static FunctionIdentifier toFunctionIdentifier(String[] names, CatalogManager catalogManager) {
+   return names.length == 1 ?
+   FunctionIdentifier.of(names[0]) :
+   FunctionIdentifier.of(
+   catalogManager.qualifyIdentifier(UnresolvedIdentifier.of(names)));
+   }
+
+   public static SqlIdentifier createSqlIdentifier(CallExpression call, UserDefinedFunction function) {
 
 Review comment:
   The `createSqlIdentifier` and `toFunctionIdentifier` utilities feel out of place here, since this class is not a utility class. Maybe `UserDefinedFunctionUtils` is a better place for them.

