[ https://issues.apache.org/jira/browse/SPARK-34011?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Maxim Gekk updated SPARK-34011:
-------------------------------
    Description: 
Here is an example that reproduces the issue:
{code:sql}
spark-sql> CREATE TABLE tbl1 (col0 int, part0 int) USING parquet PARTITIONED BY (part0);
spark-sql> INSERT INTO tbl1 PARTITION (part0=0) SELECT 0;
spark-sql> INSERT INTO tbl1 PARTITION (part0=1) SELECT 1;
spark-sql> CACHE TABLE tbl1;
spark-sql> SELECT * FROM tbl1;
0 0
1 1
spark-sql> ALTER TABLE tbl1 PARTITION (part0 = 0) RENAME TO PARTITION (part0 = 2);
spark-sql> SELECT * FROM tbl1;
0 0
1 1
{code}

  was:
Here is an example that reproduces the issue:
{code:sql}
spark-sql> CREATE TABLE tbl1 (col0 int, part0 int) USING parquet PARTITIONED BY (part0);
spark-sql> INSERT INTO tbl1 PARTITION (part0=0) SELECT 0;
spark-sql> INSERT INTO tbl1 PARTITION (part0=1) SELECT 1;
spark-sql> CACHE TABLE tbl1;
spark-sql> SELECT * FROM tbl1;
0 0
1 1
spark-sql> ALTER TABLE tbl1 DROP PARTITION (part0=0);
spark-sql> SELECT * FROM tbl1;
0 0
1 1
{code}


> ALTER TABLE .. RENAME TO PARTITION doesn't refresh cache
> --------------------------------------------------------
>
>                 Key: SPARK-34011
>                 URL: https://issues.apache.org/jira/browse/SPARK-34011
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.0.1, 3.1.0, 3.2.0
>            Reporter: Maxim Gekk
>            Assignee: Maxim Gekk
>            Priority: Major
>              Labels: correctness
>             Fix For: 3.0.2, 3.1.0, 3.2.0
>
> Here is an example that reproduces the issue:
> {code:sql}
> spark-sql> CREATE TABLE tbl1 (col0 int, part0 int) USING parquet PARTITIONED BY (part0);
> spark-sql> INSERT INTO tbl1 PARTITION (part0=0) SELECT 0;
> spark-sql> INSERT INTO tbl1 PARTITION (part0=1) SELECT 1;
> spark-sql> CACHE TABLE tbl1;
> spark-sql> SELECT * FROM tbl1;
> 0 0
> 1 1
> spark-sql> ALTER TABLE tbl1 PARTITION (part0 = 0) RENAME TO PARTITION (part0 = 2);
> spark-sql> SELECT * FROM tbl1;
> 0 0
> 1 1
> {code}
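Until the fix is available in 3.0.2/3.1.0/3.2.0, a minimal workaround sketch (not part of the original report) is to refresh the cached table explicitly after the rename; this assumes Spark's REFRESH TABLE command recaches the table's data, and the output shown is illustrative of the expected correct result:
{code:sql}
-- Workaround sketch: explicitly refresh the cached table after renaming the
-- partition so the subsequent scan does not serve stale cached rows.
spark-sql> ALTER TABLE tbl1 PARTITION (part0 = 0) RENAME TO PARTITION (part0 = 2);
spark-sql> REFRESH TABLE tbl1;
spark-sql> SELECT * FROM tbl1;
0 2
1 1
{code}
An UNCACHE TABLE tbl1 followed by CACHE TABLE tbl1 should have the same effect, at the cost of rebuilding the cache eagerly.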