[jira] [Commented] (SPARK-42650) link issue SPARK-42550
[ https://issues.apache.org/jira/browse/SPARK-42650?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17697348#comment-17697348 ] Hyukjin Kwon commented on SPARK-42650: -- [~kevinshin] would you mind providing a self-contained reproducer without Kyuubi? Also, is this a regression from SPARK-40588? Please fix the JIRA title to summarize the issue.
> link issue SPARK-42550
> --
>
> Key: SPARK-42650
> URL: https://issues.apache.org/jira/browse/SPARK-42650
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 3.2.3
> Reporter: kevinshin
> Priority: Major
>
> When using [KyuubiSparkSQLExtension|https://kyuubi.readthedocs.io/en/v1.6.1-incubating/extensions/engines/spark/], if an `insert overwrite` statement hits an exception, a non-partitioned table loses its home directory and a partitioned table loses its partition directories.
>
> My spark-defaults.conf config:
> spark.sql.extensions org.apache.kyuubi.sql.KyuubiSparkSQLExtension
>
> Because I can't reopen SPARK-42550, please see that issue for details and reproduction steps:
> https://issues.apache.org/jira/browse/SPARK-42550
--
This message was sent by Atlassian Jira (v8.20.10#820010)
-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
[ https://issues.apache.org/jira/browse/SPARK-42650?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17696810#comment-17696810 ] kevinshin commented on SPARK-42650: --- Thank you [~yumwang]
[ https://issues.apache.org/jira/browse/SPARK-42650?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17696665#comment-17696665 ] Yuming Wang commented on SPARK-42650: - It seems to be caused by https://github.com/apache/spark/pull/38358.
[ https://issues.apache.org/jira/browse/SPARK-42650?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17696645#comment-17696645 ] kevinshin commented on SPARK-42650: --- Thanks [~ulysses], I can reproduce it without any Spark extension.
[ https://issues.apache.org/jira/browse/SPARK-42650?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17696070#comment-17696070 ] XiDuo You commented on SPARK-42650: --- To be clear, this is an issue in Spark 3.2.3 only; Spark 3.2.1, 3.3.x, and master are fine. It can be reproduced by:
{code:sql}
CREATE TABLE IF NOT EXISTS spark32_overwrite(amt1 int) STORED AS ORC;

CREATE TABLE IF NOT EXISTS spark32_overwrite2(amt1 long) STORED AS ORC;

INSERT OVERWRITE TABLE spark32_overwrite2 select 644164;

set spark.sql.ansi.enabled=true;

INSERT OVERWRITE TABLE spark32_overwrite
select amt1 from (select cast(amt1 as int) as amt1 from spark32_overwrite2 distribute by amt1);
{code}
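A quick way to observe the data loss after running the reproducer above (a sketch, assuming an affected Spark 3.2.3 build and the tables created as shown; the warehouse path is illustrative, not taken from the report):

{code:sql}
-- After the last INSERT OVERWRITE above fails under ANSI mode, the target
-- table's directory should normally still exist. On an affected Spark 3.2.3
-- build it has been deleted, so this scan is expected to error out with a
-- missing-path / file-not-found exception instead of returning rows.
SELECT * FROM spark32_overwrite;

-- The warehouse location can also be inspected directly (hypothetical path):
-- hdfs dfs -ls /user/hive/warehouse/spark32_overwrite
{code}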
[ https://issues.apache.org/jira/browse/SPARK-42650?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17696037#comment-17696037 ] kevinshin commented on SPARK-42650: --- Spark and Kyuubi are both Apache projects. Could the Apache community help figure out the details of this issue? Will it persist in the next releases?
[ https://issues.apache.org/jira/browse/SPARK-42650?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17695989#comment-17695989 ] Yuming Wang commented on SPARK-42650: - It seems like a Kyuubi bug?