This is an automated email from the ASF dual-hosted git repository.

wenchen pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/master by this push:
     new 378ac78  [SPARK-35318][SQL][FOLLOWUP] Hide the internal view properties for `show tblproperties`

378ac78 is described below

commit 378ac78bdfd197eb921c56b0a6f22cccd7cd42a1
Author: RoryQi <1242949...@qq.com>
AuthorDate: Mon Jun 28 07:05:29 2021 +0000

    [SPARK-35318][SQL][FOLLOWUP] Hide the internal view properties for `show tblproperties`

    ### What changes were proposed in this pull request?
    PR #32441 hid the internal view properties for the DESCRIBE TABLE command, but the `show tblproperties view` case was not covered.

    ### Why are the changes needed?
    To avoid internal properties confusing users.

    ### Does this PR introduce _any_ user-facing change?
    Yes. Before this change, the user would see the output below for `show tblproperties test_view`:
    ```
    ....
    p1	v1
    p2	v2
    view.catalogAndNamespace.numParts	2
    view.catalogAndNamespace.part.0	spark_catalog
    view.catalogAndNamespace.part.1	default
    view.query.out.col.0	c1
    view.query.out.numCols	1
    view.referredTempFunctionsNames	[]
    view.referredTempViewNames	[]
    ...
    ```
    After this change, the internal properties are hidden:
    ```
    ....
    p1	v1
    p2	v2
    ...
    ```

    ### How was this patch tested?
    Existing UTs.

    Closes #33016 from jerqi/hide_show_tblproperties.
Authored-by: RoryQi <1242949...@qq.com>
Signed-off-by: Wenchen Fan <wenc...@databricks.com>
---
 .../main/scala/org/apache/spark/sql/execution/command/tables.scala | 3 ++-
 .../test/resources/sql-tests/results/show-tblproperties.sql.out    | 7 -------
 2 files changed, 2 insertions(+), 8 deletions(-)

diff --git a/sql/core/src/main/scala/org/apache/spark/sql/execution/command/tables.scala b/sql/core/src/main/scala/org/apache/spark/sql/execution/command/tables.scala
index 72168f2..7d4d227 100644
--- a/sql/core/src/main/scala/org/apache/spark/sql/execution/command/tables.scala
+++ b/sql/core/src/main/scala/org/apache/spark/sql/execution/command/tables.scala
@@ -907,7 +907,8 @@ case class ShowTablePropertiesCommand(
           Seq(Row(p, propValue))
         }
       case None =>
-        catalogTable.properties.map(p => Row(p._1, p._2)).toSeq
+        catalogTable.properties.filterKeys(!_.startsWith(CatalogTable.VIEW_PREFIX))
+          .map(p => Row(p._1, p._2)).toSeq
     }
   }
 }

diff --git a/sql/core/src/test/resources/sql-tests/results/show-tblproperties.sql.out b/sql/core/src/test/resources/sql-tests/results/show-tblproperties.sql.out
index 1008f9a..e8a9b85 100644
--- a/sql/core/src/test/resources/sql-tests/results/show-tblproperties.sql.out
+++ b/sql/core/src/test/resources/sql-tests/results/show-tblproperties.sql.out
@@ -59,13 +59,6 @@ struct<key:string,value:string>
 -- !query output
 p1	v1
 p2	v2
-view.catalogAndNamespace.numParts	2
-view.catalogAndNamespace.part.0	spark_catalog
-view.catalogAndNamespace.part.1	default
-view.query.out.col.0	c1
-view.query.out.numCols	1
-view.referredTempFunctionsNames	[]
-view.referredTempViewNames	[]


 -- !query
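The core of the patch is a prefix filter over the catalog table's property map: any key starting with the internal view prefix is dropped before rows are returned. A minimal standalone sketch of that idea (assuming the prefix is the string `"view."`, which matches every internal property shown in the example output; the object and method names here are illustrative, not Spark's):

```scala
object ShowTblPropertiesFilter {
  // Assumed value of the internal prefix (CatalogTable.VIEW_PREFIX in Spark);
  // all internal view properties in the example output start with it.
  val ViewPrefix = "view."

  // Drop internal view properties, keeping only user-set ones,
  // mirroring the filterKeys call in the diff above.
  def visibleProperties(props: Map[String, String]): Seq[(String, String)] =
    props.filterKeys(!_.startsWith(ViewPrefix)).toSeq

  def main(args: Array[String]): Unit = {
    val props = Map(
      "p1" -> "v1",
      "p2" -> "v2",
      "view.catalogAndNamespace.numParts" -> "2",
      "view.query.out.numCols" -> "1")
    // Sort for deterministic output, then print in SHOW TBLPROPERTIES style.
    visibleProperties(props).sorted.foreach { case (k, v) => println(s"$k\t$v") }
  }
}
```

Note that in the real command this filter only applies to the `case None` branch (no property key was requested); asking for a specific internal property by name still goes through the other branch.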