[ https://issues.apache.org/jira/browse/FLINK-8215?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16296576#comment-16296576 ]
ASF GitHub Bot commented on FLINK-8215:
---------------------------------------

Github user twalthr commented on the issue:

    https://github.com/apache/flink/pull/5148

    Thanks for this fix @walterddr. The code looks good. I will merge this...

> Collections codegen exception when constructing Array or Map via SQL API
> ------------------------------------------------------------------------
>
>                 Key: FLINK-8215
>                 URL: https://issues.apache.org/jira/browse/FLINK-8215
>             Project: Flink
>          Issue Type: Bug
>          Components: Table API & SQL
>            Reporter: Rong Rong
>            Assignee: Rong Rong
>
> The Table API goes through `LogicalNode.validate()`, which performs the
> collection validation and rejects inconsistent element types, so an expression
> such as `array(1.0, 2.0f)` throws a `ValidationException`.
> The SQL API uses `FlinkPlannerImpl.validate(SqlNode)`, which relies on Calcite's
> SqlNode validation and supports resolving the least restrictive type, so
> `ARRAY[CAST(1 AS DOUBLE), CAST(2 AS FLOAT)]` passes validation but then throws
> a codegen exception.
> The root cause is that code generation for these collection value constructors
> does not cast to or resolve the least restrictive type correctly. I see two
> options:
> 1. Strengthen the validation so that the SQL path does not resolve the least
>    restrictive type.
> 2. Make codegen support the least restrictive type cast, e.g. by using
>    `generateCast` instead of a direct cast like `(ClassType) element`.
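For readers who want to see the difference between the two paths described above, below is a minimal sketch against the pre-1.9 batch Scala Table API. The environment setup, the registered table name `T` and its columns, and the use of `sqlQuery` (the SQL entry point available in the Flink versions around this issue) are assumptions made only for illustration; they are not part of the fix in the linked PR.

import org.apache.flink.api.scala._
import org.apache.flink.table.api.TableEnvironment
import org.apache.flink.table.api.scala._
import org.apache.flink.types.Row

object Flink8215Repro {
  def main(args: Array[String]): Unit = {
    val env = ExecutionEnvironment.getExecutionEnvironment
    val tEnv = TableEnvironment.getTableEnvironment(env)

    // Hypothetical single-row input table, used only to anchor the queries.
    val t = env.fromElements((1, "a")).toTable(tEnv, 'id, 'name)
    tEnv.registerTable("T", t)

    // Table API path: LogicalNode.validate() rejects the mixed element types
    // up front, so this line throws a ValidationException before any codegen.
    // t.select(array(1.0, 2.0f))

    // SQL path: Calcite's validator resolves DOUBLE as the least restrictive
    // element type, so validation passes, but code generation of the array
    // value constructor fails without the fix from this issue.
    val result = tEnv.sqlQuery(
      "SELECT ARRAY[CAST(1 AS DOUBLE), CAST(2 AS FLOAT)] FROM T")

    result.toDataSet[Row].print()
  }
}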