[ https://issues.apache.org/jira/browse/FLINK-25459?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17465916#comment-17465916 ]

Wenlong Lyu commented on FLINK-25459:
-------------------------------------

[~qyw919867774] this is by design. You can use INSERT INTO with an explicit 
column list if you want a different order: INSERT INTO neworders (order_no, 
product, qty) SELECT order_no, product, qty FROM orders
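For the nested ROW case from the report, an explicit column list alone does not reorder fields inside the row; the row has to be rebuilt in the sink's field order. A minimal sketch, assuming hypothetical table names kafka_source and kafka_target with the schemas quoted below (the CAST names the fields to match the sink's ROW type):

```sql
-- Sink expects: ceshi ROW<`id` INT, `name` STRING, `age` INT, `test` ROW<`c` STRING>>
-- Query yields: ceshi ROW<`name` STRING, `id` INT, `age` INT, `test` ROW<`c` STRING>>
-- Rebuild the nested row explicitly so the field order matches the sink:
INSERT INTO kafka_target
SELECT
  CAST(
    ROW(ceshi.id, ceshi.name, ceshi.age, ceshi.test)
    AS ROW<`id` INT, `name` STRING, `age` INT, `test` ROW<`c` STRING>>
  ) AS ceshi
FROM kafka_source;
```

The ROW(...) constructor selects the fields in the desired order, and the CAST assigns the sink's field names so the planner's position-and-type check passes.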

> When inserting row type fields into sink, the order needs to be maintained
> --------------------------------------------------------------------------
>
>                 Key: FLINK-25459
>                 URL: https://issues.apache.org/jira/browse/FLINK-25459
>             Project: Flink
>          Issue Type: Improvement
>          Components: Table SQL / Planner
>    Affects Versions: 1.14.2
>            Reporter: qyw
>            Priority: Major
>
> When I insert a row type value into a sink, why do I need to maintain the 
> field order within the row?
> This is the comparison between my query schema and sink schema:
> Query schema: [ceshi: ROW<`name` STRING, `id` INT, `age` INT, `test` ROW<`c` 
> STRING>>]
> Sink schema:  [ceshi: ROW<`id` INT, `name` STRING, `age` INT, `test` ROW<`c` 
> STRING>>] 
> An error will be thrown:
> Exception in thread "main" org.apache.flink.table.api.ValidationException: 
> Column types of query result and sink for registered table 
> 'default_catalog.default_database.kafka_target' do not match.
> Cause: Incompatible types for sink column 'ceshi' at position 0.
>  
>  
> Is this phenomenon reasonable?



--
This message was sent by Atlassian Jira
(v8.20.1#820001)
