[GitHub] [flink] lirui-apache commented on a change in pull request #10625: [FLINK-15259][hive] HiveInspector.toInspectors() should convert Flink…
lirui-apache commented on a change in pull request #10625: [FLINK-15259][hive] HiveInspector.toInspectors() should convert Flink…
URL: https://github.com/apache/flink/pull/10625#discussion_r362194203

## File path: flink-table/flink-table-planner-blink/src/main/scala/org/apache/flink/table/planner/codegen/GenerateUtils.scala

```diff
@@ -375,7 +375,7 @@ object GenerateUtils {
          |  $SQL_TIMESTAMP.fromEpochMillis(${ts.getMillisecond}L, ${ts.getNanoOfMillisecond});
          """.stripMargin
       ctx.addReusableMember(fieldTimestamp)
-      generateNonNullLiteral(literalType, fieldTerm, literalType)
+      generateNonNullLiteral(literalType, fieldTerm, ts)
```

Review comment: Now this change has been moved to b5f0206.

This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org

With regards,
Apache Git Services
lirui-apache commented on a change in pull request #10625: [FLINK-15259][hive] HiveInspector.toInspectors() should convert Flink…
URL: https://github.com/apache/flink/pull/10625#discussion_r362193585

## File path: flink-connectors/flink-connector-hive/src/main/java/org/apache/flink/table/catalog/hive/client/HiveShim.java

```diff
@@ -232,4 +233,10 @@ SimpleGenericUDAFParameterInfo createUDAFParameterInfo(ObjectInspector[] params,
  	 * Converts a hive date instance to LocalDate which is expected by DataFormatConverter.
 	 */
 	LocalDate toFlinkDate(Object hiveDate);
+
+	/**
+	 * Converts a Hive primitive java object to the corresponding Writable object. Throws CatalogException
+	 * if the conversion is not supported.
+	 */
+	Writable hivePrimitiveToWritable(Object value) throws CatalogException;
```

Review comment: I'll change it to throw `FlinkHiveException`. It's also a runtime exception, but more specific to the Hive connector.
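To illustrate the exception choice discussed above, here is a minimal, self-contained sketch of the `hivePrimitiveToWritable` contract: supported primitives map to their Writable wrappers, and anything else fails with an unchecked, connector-specific exception. Hadoop's real `Writable` classes and Flink's actual `FlinkHiveException` are stood in for by plain placeholder types, so all names here are illustrative, not the real API.

```java
public class HiveShimSketch {

    // Stand-in for org.apache.hadoop.io.Writable implementations (hypothetical).
    record Writable(String type, Object value) {}

    // Stand-in for FlinkHiveException: unchecked, specific to the Hive connector.
    static class FlinkHiveException extends RuntimeException {
        FlinkHiveException(String msg) { super(msg); }
    }

    static Writable hivePrimitiveToWritable(Object value) {
        if (value instanceof Integer i) {
            return new Writable("IntWritable", i);
        } else if (value instanceof Long l) {
            return new Writable("LongWritable", l);
        } else if (value instanceof String s) {
            return new Writable("Text", s);
        }
        // Unsupported conversion: fail with an unchecked exception so callers
        // are not forced into try/catch boilerplate for a programming error.
        throw new FlinkHiveException("Unsupported primitive: " + value.getClass());
    }

    public static void main(String[] args) {
        System.out.println(hivePrimitiveToWritable(42).type());  // IntWritable
        try {
            hivePrimitiveToWritable(new Object());
        } catch (FlinkHiveException e) {
            System.out.println("rejected: " + e.getMessage());
        }
    }
}
```

Using a runtime exception keeps the interface signature clean while still letting shim subclasses intercept unsupported cases.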
lirui-apache commented on a change in pull request #10625: [FLINK-15259][hive] HiveInspector.toInspectors() should convert Flink…
URL: https://github.com/apache/flink/pull/10625#discussion_r362191208

## File path: flink-table/flink-table-planner-blink/src/main/scala/org/apache/flink/table/planner/codegen/GenerateUtils.scala

```diff
@@ -375,7 +375,7 @@ object GenerateUtils {
          |  $SQL_TIMESTAMP.fromEpochMillis(${ts.getMillisecond}L, ${ts.getNanoOfMillisecond});
          """.stripMargin
       ctx.addReusableMember(fieldTimestamp)
-      generateNonNullLiteral(literalType, fieldTerm, literalType)
+      generateNonNullLiteral(literalType, fieldTerm, ts)
```

Review comment: Right, `ts` is a `SqlTimestamp`. I'll squash and make a separate commit for this.
lirui-apache commented on a change in pull request #10625: [FLINK-15259][hive] HiveInspector.toInspectors() should convert Flink…
URL: https://github.com/apache/flink/pull/10625#discussion_r362173968

## File path: flink-table/flink-table-planner-blink/src/main/scala/org/apache/flink/table/planner/codegen/GenerateUtils.scala

```diff
@@ -375,7 +375,7 @@ object GenerateUtils {
          |  $SQL_TIMESTAMP.fromEpochMillis(${ts.getMillisecond}L, ${ts.getNanoOfMillisecond});
          """.stripMargin
       ctx.addReusableMember(fieldTimestamp)
-      generateNonNullLiteral(literalType, fieldTerm, literalType)
+      generateNonNullLiteral(literalType, fieldTerm, ts)
```

Review comment: @xuefuz @JingsongLi This is a bug in the planner when generating timestamp literals: we pass the type instead of the value as the literal. The added test exposes the issue. Do you think this change needs a dedicated PR?
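The bug described above is a plain argument mix-up: the literal generator takes the logical type, the generated field term, and the literal value, and the type object was passed where the value belongs. The following self-contained sketch (names are illustrative stand-ins, not the actual planner API, which is Scala) shows how the two calls differ in what ends up in the generated code:

```java
public class LiteralGenSketch {

    // Hypothetical stand-in for generateNonNullLiteral(literalType, fieldTerm, value):
    // renders a generated-code fragment for a non-null literal.
    static String generateNonNullLiteral(String literalType, String fieldTerm, Object value) {
        return "/* " + literalType + " */ " + fieldTerm + " = " + value;
    }

    public static void main(String[] args) {
        String literalType = "TIMESTAMP(3)";
        String fieldTerm = "timestamp$0";
        // Stand-in for the SqlTimestamp value `ts` from the surrounding method.
        Object ts = "SqlTimestamp(millis=1577836800000, nanos=0)";

        // Buggy call: the type is passed twice, so the literal value is lost.
        System.out.println(generateNonNullLiteral(literalType, fieldTerm, literalType));
        // Fixed call: the timestamp value is passed as the literal.
        System.out.println(generateNonNullLiteral(literalType, fieldTerm, ts));
    }
}
```

The Java compiler cannot catch this because both arguments satisfy the parameter type; only a test that exercises the generated literal, like the one added in the PR, exposes it.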
lirui-apache commented on a change in pull request #10625: [FLINK-15259][hive] HiveInspector.toInspectors() should convert Flink…
URL: https://github.com/apache/flink/pull/10625#discussion_r361938203

## File path: flink-connectors/flink-connector-hive/src/main/java/org/apache/flink/table/catalog/hive/client/HiveShimV310.java

```diff
@@ -233,4 +240,23 @@ public LocalDate toFlinkDate(Object hiveDate) {
 			throw new FlinkHiveException("Failed to convert to Flink date", e);
 		}
 	}
+
+	@Override
+	public Writable hivePrimitiveToWritable(Object value) throws CatalogException {
+		try {
+			return super.hivePrimitiveToWritable(value);
+		} catch (CatalogException e) {
```

Review comment: Maybe I can use `Optional`. Let me have a try.
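The `Optional` idea floated above replaces the catch-and-fall-through pattern: instead of the base shim throwing and the versioned subclass catching, the base returns `Optional.empty()` for types it cannot convert and the subclass chains its own handling with `Optional.or`. This is a self-contained sketch of that pattern under stand-in names, not the real `HiveShim` API:

```java
import java.util.Optional;

public class ShimFallbackSketch {

    static class BaseShim {
        // Returns empty instead of throwing when the type is unsupported.
        Optional<String> tryConvert(Object value) {
            return value instanceof Integer
                    ? Optional.of("IntWritable")
                    : Optional.empty();
        }
    }

    // Hypothetical versioned shim: defers to the base, then handles types
    // that only this Hive version supports.
    static class ShimV310 extends BaseShim {
        @Override
        Optional<String> tryConvert(Object value) {
            return super.tryConvert(value)
                    .or(() -> value instanceof java.time.LocalDate
                            ? Optional.of("DateWritableV2")
                            : Optional.empty());
        }
    }

    public static void main(String[] args) {
        ShimV310 shim = new ShimV310();
        System.out.println(shim.tryConvert(1).orElse("unsupported"));
        System.out.println(shim.tryConvert(java.time.LocalDate.now()).orElse("unsupported"));
        System.out.println(shim.tryConvert(new Object()).orElse("unsupported"));
    }
}
```

Compared with catching an exception for control flow, this keeps the "not supported here, try the next handler" path cheap and explicit; the final caller can still turn a remaining `empty()` into an exception.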
lirui-apache commented on a change in pull request #10625: [FLINK-15259][hive] HiveInspector.toInspectors() should convert Flink…
URL: https://github.com/apache/flink/pull/10625#discussion_r361935078

## File path: flink-connectors/flink-connector-hive/src/test/java/org/apache/flink/table/module/hive/HiveModuleTest.java

```diff
@@ -88,4 +94,24 @@ public void testHiveBuiltInFunction() {
 	public void testNonExistFunction() {
 		assertFalse(new HiveModule(HiveShimLoader.getHiveVersion()).getFunctionDefinition("nonexist").isPresent());
 	}
+
+	@Test
+	public void testConstantArguments() throws Exception {
+		TableEnvironment tEnv = HiveTestUtils.createTableEnvWithBlinkPlannerBatchMode();
+
+		tEnv.unloadModule("core");
+		tEnv.loadModule("hive", new HiveModule(HiveShimLoader.getHiveVersion()));
+
+		List results = TableUtils.collectToList(tEnv.sqlQuery("select concat('an', 'bn')"));
+		assertEquals("[anbn]", results.toString());
+
+		results = TableUtils.collectToList(tEnv.sqlQuery("select concat('ab', cast('cdefghi' as varchar(5)))"));
```

Review comment: Unfortunately we cannot test `null` literals, because Hive functions consider nulls to be non-constant arguments: https://github.com/apache/flink/blob/master/flink-connectors/flink-connector-hive/src/main/java/org/apache/flink/table/functions/hive/HiveFunction.java#L36

Maybe we should consider all the passed arguments constant, and simply not pass non-constant arguments in the first place. But that would require changing the interface.
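The limitation described above comes from the calling convention: the constant-arguments array uses `null` at a position to mean "this argument is not a constant", so a genuine `null` literal is indistinguishable from a non-constant column reference. A minimal sketch of that convention (the method name and array shape here are illustrative, not the exact `HiveFunction` signature):

```java
public class ConstantArgsSketch {

    // Mirrors the convention linked above: a null entry marks a position
    // whose value is not known at planning time.
    static boolean isConstant(Object[] constantArguments, int pos) {
        return constantArguments[pos] != null;
    }

    public static void main(String[] args) {
        // concat('an', 'bn'): both positions carry constant values.
        Object[] allConstant = {"an", "bn"};
        // concat(someColumn, null): a column reference AND a null literal
        // both show up as null, so they cannot be told apart.
        Object[] mixed = {null, null};

        System.out.println(isConstant(allConstant, 0)); // true
        System.out.println(isConstant(mixed, 1));       // false, though it is a literal
    }
}
```

Treating every passed argument as constant, as the comment suggests, would remove the ambiguity, but only by changing the interface so that non-constant arguments are never passed in the first place.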