[ https://issues.apache.org/jira/browse/SPARK-19128?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Xiao Li updated SPARK-19128:
----------------------------
    Description: 
{{ALTER TABLE SET LOCATION}} does not refresh the table cache.
{noformat}
val catalog = spark.sessionState.catalog

sql("CREATE TABLE tab1 using parquet AS SELECT 1 as a")
sql("CREATE TABLE tab2 using parquet AS SELECT 2 as a")
checkAnswer(spark.table("tab1"), Seq(Row(1)))
checkAnswer(spark.table("tab2"), Seq(Row(2)))

val metadataTab1 = catalog.getTableMetadata(TableIdentifier("tab1"))
val locTab1 = metadataTab1.storage.locationUri
sql(s"ALTER TABLE tab2 SET LOCATION '${locTab1.get}'")

spark.table("tab2").show()
{noformat}
The code above still outputs the contents of the previous location after the location has been changed.

  was:
The following DDL commands miss refreshing the metadata cache:
- ALTER TABLE SET LOCATION
- RENAME TABLE
- RENAME PARTITION


> Refresh Metadata Cache After ALTER TABLE SET LOCATION
> -----------------------------------------------------
>
>                 Key: SPARK-19128
>                 URL: https://issues.apache.org/jira/browse/SPARK-19128
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.1.0
>            Reporter: Xiao Li
>            Assignee: Xiao Li
>              Labels: correctness
>
> {{ALTER TABLE SET LOCATION}} does not refresh the table cache.
> {noformat}
> val catalog = spark.sessionState.catalog
>
> sql("CREATE TABLE tab1 using parquet AS SELECT 1 as a")
> sql("CREATE TABLE tab2 using parquet AS SELECT 2 as a")
> checkAnswer(spark.table("tab1"), Seq(Row(1)))
> checkAnswer(spark.table("tab2"), Seq(Row(2)))
>
> val metadataTab1 = catalog.getTableMetadata(TableIdentifier("tab1"))
> val locTab1 = metadataTab1.storage.locationUri
> sql(s"ALTER TABLE tab2 SET LOCATION '${locTab1.get}'")
>
> spark.table("tab2").show()
> {noformat}
> The code above still outputs the contents of the previous location after
> the location has been changed.
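Until a fix lands, one possible workaround (a sketch, not part of this issue's resolution) is to invalidate the stale cache entry manually after changing the location, using the public {{spark.catalog.refreshTable}} API. This assumes a live {{SparkSession}} bound to {{spark}} and the {{tab1}}/{{tab2}} tables from the repro above:

{noformat}
// Workaround sketch: after ALTER TABLE ... SET LOCATION, explicitly
// drop the cached metadata/data for tab2 so the next scan re-reads
// the files at the new location. Assumes the repro setup above.
sql(s"ALTER TABLE tab2 SET LOCATION '${locTab1.get}'")

// Invalidate cached entries for tab2 (equivalent to REFRESH TABLE tab2).
spark.catalog.refreshTable("tab2")

// tab2 should now be scanned from tab1's location rather than the
// stale cached files.
spark.table("tab2").show()
{noformat}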
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org