Repository: spark
Updated Branches:
  refs/heads/master a10b328db -> 475a29f11
[SPARK-22637][SQL] Only refresh a logical plan once.

## What changes were proposed in this pull request?
`CatalogImpl.refreshTable` uses `foreach(..)` to refresh all tables in a view. This traverses all nodes in the subtree and calls `LogicalPlan.refresh()` on each of them. However, `LogicalPlan.refresh()` already refreshes its children recursively; as a result, refreshing a large view can be quite expensive. This PR calls `LogicalPlan.refresh()` on the top node only.

## How was this patch tested?
Existing tests.

Author: Herman van Hovell <hvanhov...@databricks.com>

Closes #19837 from hvanhovell/SPARK-22637.

Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/475a29f1
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/475a29f1
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/475a29f1

Branch: refs/heads/master
Commit: 475a29f11ef488e7cb19bf7e0696d9d099d77c92
Parents: a10b328
Author: Herman van Hovell <hvanhov...@databricks.com>
Authored: Tue Nov 28 16:03:47 2017 -0800
Committer: gatorsmile <gatorsm...@gmail.com>
Committed: Tue Nov 28 16:03:47 2017 -0800

----------------------------------------------------------------------
 .../org/apache/spark/sql/internal/CatalogImpl.scala | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
----------------------------------------------------------------------

http://git-wip-us.apache.org/repos/asf/spark/blob/475a29f1/sql/core/src/main/scala/org/apache/spark/sql/internal/CatalogImpl.scala
----------------------------------------------------------------------
diff --git a/sql/core/src/main/scala/org/apache/spark/sql/internal/CatalogImpl.scala b/sql/core/src/main/scala/org/apache/spark/sql/internal/CatalogImpl.scala
index fdd2533..6ae307b 100644
--- a/sql/core/src/main/scala/org/apache/spark/sql/internal/CatalogImpl.scala
+++ b/sql/core/src/main/scala/org/apache/spark/sql/internal/CatalogImpl.scala
@@ -480,7 +480,7 @@ class CatalogImpl(sparkSession: SparkSession) extends Catalog {
     if (tableMetadata.tableType == CatalogTableType.VIEW) {
       // Temp or persistent views: refresh (or invalidate) any metadata/data cached
       // in the plan recursively.
-      table.queryExecution.analyzed.foreach(_.refresh())
+      table.queryExecution.analyzed.refresh()
     } else {
       // Non-temp tables: refresh the metadata cache.
       sessionCatalog.refreshTable(tableIdent)
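The cost difference can be illustrated with a toy tree. The `Node` class below is a stand-in (not Spark's actual `LogicalPlan`/`TreeNode` classes) whose `refresh()` recurses into its children, as described above; counting refreshes on a linear chain shows why `foreach(_.refresh())` does quadratic work while a single top-level `refresh()` is linear:

```scala
// Toy model of the change: refresh() already recurses into children,
// so calling it from foreach (which visits every node) refreshes deep
// nodes many times over.
case class Node(children: Seq[Node]) {
  var refreshCount = 0

  // Like LogicalPlan.refresh(): refreshes this node, then recurses.
  def refresh(): Unit = {
    refreshCount += 1
    children.foreach(_.refresh())
  }

  // Like TreeNode.foreach: applies f to every node in the subtree.
  def foreachNode(f: Node => Unit): Unit = {
    f(this)
    children.foreach(_.foreachNode(f))
  }

  def totalRefreshes: Int = refreshCount + children.map(_.totalRefreshes).sum
}

// A linear chain of n nodes (worst case for the old code).
def chain(depth: Int): Node =
  if (depth == 1) Node(Nil) else Node(Seq(chain(depth - 1)))

val before = chain(4)
before.foreachNode(_.refresh())     // old behaviour
println(before.totalRefreshes)      // 10 = 4 + 3 + 2 + 1, i.e. O(n^2)

val after = chain(4)
after.refresh()                     // new behaviour: top node only
println(after.totalRefreshes)       // 4, i.e. each node refreshed once
```

On a chain of depth n the old pattern performs n + (n-1) + ... + 1 refreshes, which is why the PR's one-line change matters for large views.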