HyukjinKwon commented on code in PR #39225:
URL: https://github.com/apache/spark/pull/39225#discussion_r1057953902


##########
python/pyspark/sql/connect/window.py:
##########
@@ -242,3 +243,46 @@ def rangeBetween(start: int, end: int) -> "WindowSpec":
 
 
 Window.__doc__ = PySparkWindow.__doc__
+
+
+def _test() -> None:
+    import os
+    import sys
+    import doctest
+    from pyspark.sql import SparkSession as PySparkSession
+    from pyspark import SparkContext, SparkConf
+    from pyspark.testing.connectutils import should_test_connect, connect_requirement_message
+
+    os.chdir(os.environ["SPARK_HOME"])
+
+    if should_test_connect:
+        import pyspark.sql.connect.window
+
+        globs = pyspark.sql.connect.window.__dict__.copy()
+        # Workaround to create a regular Spark session.
+        sc = SparkContext("local[4]", "sql.connect.window tests", conf=SparkConf())
+        globs["_spark"] = PySparkSession(sc, options={"spark.app.name": "sql.connect.window tests"})
+
+        # Creates a remote Spark session.
+        globs["spark"] = PySparkSession.builder.remote("sc://localhost").getOrCreate()
+
+        (failure_count, test_count) = doctest.testmod(
+            pyspark.sql.connect.window,
+            globs=globs,
+            optionflags=doctest.ELLIPSIS
+            | doctest.NORMALIZE_WHITESPACE
+            | doctest.IGNORE_EXCEPTION_DETAIL,
+        )
+        # TODO(SPARK-41529): Implement stop in RemoteSparkSession.
+        #   Stop the regular Spark session (server) too.

Review Comment:
   ```suggestion
           globs["spark"].stop()
   ```
   
   It's implemented now.
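   For context, the `_test()` harness in the diff follows the standard `doctest.testmod` pattern: run the module's doctests against a prepared globals dict, tear down the sessions, and exit non-zero on failure. A minimal self-contained sketch of that pattern, using a dummy module rather than Spark (names here are illustrative only):
   
   ```python
   import doctest
   import sys
   import types

   # Stand-in module with a single passing doctest, to illustrate the
   # testmod() harness shape used in _test() above (not Spark code).
   demo = types.ModuleType("demo")
   demo.__doc__ = """
   >>> 1 + 1
   2
   """

   failure_count, test_count = doctest.testmod(
       demo,
       optionflags=doctest.ELLIPSIS | doctest.NORMALIZE_WHITESPACE,
   )

   # Mirror the usual harness behavior: session teardown would go here
   # (e.g. stopping the remote and regular sessions), then exit non-zero
   # on any doctest failure.
   if failure_count:
       sys.exit(-1)
   ```
   
   In the real harness, the teardown step is where the suggested `globs["spark"].stop()` call belongs, now that `stop` exists on the remote session.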



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

