This is an automated email from the ASF dual-hosted git repository.

ruifengz pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/master by this push:
     new 49d2214a7660 [SPARK-47816][CONNECT][DOCS] Document the lazy evaluation of views in `spark.{sql, table}`
49d2214a7660 is described below

commit 49d2214a76603a273542b30b87f8c2fe342f67f4
Author: Ruifeng Zheng <ruife...@apache.org>
AuthorDate: Fri Apr 12 17:00:26 2024 +0800

    [SPARK-47816][CONNECT][DOCS] Document the lazy evaluation of views in `spark.{sql, table}`

    ### What changes were proposed in this pull request?
    Document the lazy evaluation of views in `spark.{sql, table}`.

    ### Why are the changes needed?
    This behavior is by design in Spark Connect, so we need to document it.

    ### Does this PR introduce _any_ user-facing change?
    Doc change only.

    ### How was this patch tested?
    CI.

    ### Was this patch authored or co-authored using generative AI tooling?
    No.

    Closes #46007 from zhengruifeng/doc_connect_sql_table.

    Authored-by: Ruifeng Zheng <ruife...@apache.org>
    Signed-off-by: Ruifeng Zheng <ruife...@apache.org>
---
 python/pyspark/sql/session.py | 14 ++++++++++++++
 1 file changed, 14 insertions(+)

diff --git a/python/pyspark/sql/session.py b/python/pyspark/sql/session.py
index f1666a9f575c..1098c41a3f4c 100644
--- a/python/pyspark/sql/session.py
+++ b/python/pyspark/sql/session.py
@@ -1630,6 +1630,13 @@ class SparkSession(SparkConversionMixin):
         -------
         :class:`DataFrame`

+        Notes
+        -----
+        In Spark Classic, a temporary view referenced in `spark.sql` is resolved immediately,
+        while in Spark Connect it is lazily evaluated.
+        So in Spark Connect if a view is dropped, modified or replaced after `spark.sql`, the
+        execution may fail or generate different results.
+
         Examples
         --------
         Executing a SQL query.
@@ -1756,6 +1763,13 @@ class SparkSession(SparkConversionMixin):
         -------
         :class:`DataFrame`

+        Notes
+        -----
+        In Spark Classic, a temporary view referenced in `spark.table` is resolved immediately,
+        while in Spark Connect it is lazily evaluated.
+        So in Spark Connect if a view is dropped, modified or replaced after `spark.table`, the
+        execution may fail or generate different results.
+
         Examples
         --------
         >>> spark.range(5).createOrReplaceTempView("table1")

---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org