[ https://issues.apache.org/jira/browse/SPARK-48817?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
XiDuo You resolved SPARK-48817.
-------------------------------
    Fix Version/s: 4.0.0
       Resolution: Fixed

> MultiInsert is split into multiple SQL executions, resulting in no exchange
> reuse
> -------------------------------------------------------------------------------
>
>                 Key: SPARK-48817
>                 URL: https://issues.apache.org/jira/browse/SPARK-48817
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 4.0.0, 3.5.1
>            Reporter: Zhen Wang
>            Assignee: Zhen Wang
>            Priority: Major
>              Labels: pull-request-available
>             Fix For: 4.0.0
>
>         Attachments: image-2024-07-05-14-59-35-340.png,
> image-2024-07-05-14-59-55-291.png, image-2024-07-05-15-00-01-805.png,
> image-2024-07-05-15-00-09-181.png, image-2024-07-05-15-00-17-693.png,
> image-2024-07-05-16-42-01-973.png, image-2024-07-05-16-42-17-817.png,
> image-2024-07-05-16-42-27-033.png, image-2024-07-05-16-42-34-738.png,
> image-2024-07-05-16-42-46-500.png
>
>
> A MultiInsert statement is split into multiple SQL executions, so the shared
> exchange is no longer reused.
>
> SQL to reproduce:
> {code:java}
> create table wangzhen_t1(c1 int);
> create table wangzhen_t2(c1 int);
> create table wangzhen_t3(c1 int);
> insert into wangzhen_t1 values (1), (2), (3);
>
> from (select /*+ REPARTITION(3) */ c1 from wangzhen_t1)
> insert overwrite table wangzhen_t2 select c1
> insert overwrite table wangzhen_t3 select c1;
> {code}
>
> In Spark 3.1 there is a single SQL execution and the exchange is reused:
> !image-2024-07-05-14-59-35-340.png!
>
> In Spark 3.5, however, the plan is split into multiple executions and there
> is no ReuseExchange:
> !image-2024-07-05-16-42-01-973.png!
> !image-2024-07-05-16-42-17-817.png!
> !image-2024-07-05-16-42-34-738.png!
> !image-2024-07-05-16-42-46-500.png!

--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org