Repository: spark
Updated Branches:
  refs/heads/master 5320adc86 -> 5344bade8


[SPARK-15820][PYSPARK][SQL] Add Catalog.refreshTable into python API

## What changes were proposed in this pull request?

Add the `Catalog.refreshTable` API to the Python interface for Spark SQL.
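
A minimal usage sketch of the new Python method (the `SparkSession` name and table name below are hypothetical, not part of this change):

```python
from pyspark.sql import SparkSession

# Requires Spark 2.0+, where the Catalog API is exposed as spark.catalog.
spark = SparkSession.builder.appName("refresh-table-example").getOrCreate()

# Hypothetical table name; any table known to the catalog works.
spark.catalog.refreshTable("my_table")
```

As the diff below shows, the Python method simply delegates to the JVM catalog's `refreshTable`.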

## How was this patch tested?

Existing test.

Author: WeichenXu <weichenxu...@outlook.com>

Closes #13558 from WeichenXu123/update_python_sql_interface_refreshTable.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/5344bade
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/5344bade
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/5344bade

Branch: refs/heads/master
Commit: 5344bade8efb6f12aa43fbfbbbc2e3c0c7d16d98
Parents: 5320adc
Author: WeichenXu <weichenxu...@outlook.com>
Authored: Thu Jun 30 23:00:39 2016 +0800
Committer: Cheng Lian <l...@databricks.com>
Committed: Thu Jun 30 23:00:39 2016 +0800

----------------------------------------------------------------------
 python/pyspark/sql/catalog.py                                   | 5 +++++
 .../src/main/scala/org/apache/spark/sql/catalog/Catalog.scala   | 2 +-
 2 files changed, 6 insertions(+), 1 deletion(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/5344bade/python/pyspark/sql/catalog.py
----------------------------------------------------------------------
diff --git a/python/pyspark/sql/catalog.py b/python/pyspark/sql/catalog.py
index 3033f14..4af930a 100644
--- a/python/pyspark/sql/catalog.py
+++ b/python/pyspark/sql/catalog.py
@@ -232,6 +232,11 @@ class Catalog(object):
         """Removes all cached tables from the in-memory cache."""
         self._jcatalog.clearCache()
 
+    @since(2.0)
+    def refreshTable(self, tableName):
+        """Invalidate and refresh all the cached metadata of the given 
table."""
+        self._jcatalog.refreshTable(tableName)
+
     def _reset(self):
         """(Internal use only) Drop all existing databases (except "default"), 
tables,
         partitions and functions, and set the current database to "default".

http://git-wip-us.apache.org/repos/asf/spark/blob/5344bade/sql/core/src/main/scala/org/apache/spark/sql/catalog/Catalog.scala
----------------------------------------------------------------------
diff --git a/sql/core/src/main/scala/org/apache/spark/sql/catalog/Catalog.scala b/sql/core/src/main/scala/org/apache/spark/sql/catalog/Catalog.scala
index 083a63c..91ed9b3 100644
--- a/sql/core/src/main/scala/org/apache/spark/sql/catalog/Catalog.scala
+++ b/sql/core/src/main/scala/org/apache/spark/sql/catalog/Catalog.scala
@@ -214,7 +214,7 @@ abstract class Catalog {
   def clearCache(): Unit
 
   /**
-   * Invalidate and refresh all the cached the metadata of the given table. For performance reasons,
+   * Invalidate and refresh all the cached metadata of the given table. For performance reasons,
    * Spark SQL or the external data source library it uses might cache certain metadata about a
    * table, such as the location of blocks. When those change outside of Spark SQL, users should
    * call this function to invalidate the cache.
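
As a sketch of the workflow this comment describes (the table name is made up and not part of the commit), the cached metadata is refreshed after the underlying data changes outside Spark SQL:

```python
# Assumes an active SparkSession `spark` and a hypothetical table "events"
# whose underlying files were rewritten by a process outside Spark SQL.
spark.catalog.refreshTable("events")

# Queries issued after the refresh see the updated metadata
# (e.g., new file locations) rather than the stale cache.
spark.sql("SELECT COUNT(*) FROM events").show()
```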

