Repository: spark
Updated Branches:
  refs/heads/master 004e29cba -> 9674af6f6


[SPARK-16568][SQL][DOCUMENTATION] Update SQL programming guide refreshTable API in Python code

## What changes were proposed in this pull request?

Update the `refreshTable` API example in the Python code of the sql-programming-guide.

This API was added in SPARK-15820.
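
For context, a minimal sketch (not part of this patch) of how the updated call reads in a PySpark application; the table name `my_table` and the app name are placeholders:

```python
from pyspark.sql import SparkSession

# Build a SparkSession with Hive support; the app name is illustrative only.
spark = SparkSession.builder \
    .appName("refresh-table-example") \
    .enableHiveSupport() \
    .getOrCreate()

# The guide previously showed the old HiveContext-style call:
#   spark.refreshTable("my_table")
# With SparkSession, the Catalog API added in SPARK-15820 is used instead.
# refreshTable invalidates cached metadata for the table so subsequent reads
# pick up changes made to the underlying data outside Spark.
spark.catalog.refreshTable("my_table")
```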

## How was this patch tested?

N/A

Author: WeichenXu <weichenxu...@outlook.com>

Closes #14220 from WeichenXu123/update_sql_doc_catalog.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/9674af6f
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/9674af6f
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/9674af6f

Branch: refs/heads/master
Commit: 9674af6f6f81066139ea675de724f951bd0d49c9
Parents: 004e29c
Author: WeichenXu <weichenxu...@outlook.com>
Authored: Tue Jul 19 18:48:41 2016 -0700
Committer: Reynold Xin <r...@databricks.com>
Committed: Tue Jul 19 18:48:41 2016 -0700

----------------------------------------------------------------------
 docs/sql-programming-guide.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/9674af6f/docs/sql-programming-guide.md
----------------------------------------------------------------------
diff --git a/docs/sql-programming-guide.md b/docs/sql-programming-guide.md
index 71f3ee4..3af935a 100644
--- a/docs/sql-programming-guide.md
+++ b/docs/sql-programming-guide.md
@@ -869,8 +869,8 @@ spark.catalog().refreshTable("my_table");
 <div data-lang="python"  markdown="1">
 
 {% highlight python %}
-# spark is an existing HiveContext
-spark.refreshTable("my_table")
+# spark is an existing SparkSession
+spark.catalog.refreshTable("my_table")
 {% endhighlight %}
 
 </div>

