dongjoon-hyun commented on code in PR #47402:
URL: https://github.com/apache/spark/pull/47402#discussion_r1683164754


##########
bin/spark-shell:
##########
@@ -34,7 +34,7 @@ fi
 
 export _SPARK_CMD_USAGE="Usage: ./bin/spark-shell [options]
 
-Scala REPL options:
+Scala REPL options (Spark Classic only):

Review Comment:
   This seems to be the first use of the `Spark Classic` wording outside of `python`.
   
   ```
   $ git grep 'Spark Classic'
   python/pyspark/errors/error-conditions.json:      "Calling property or member '<member>' is not supported in PySpark Classic, please use Spark Connect instead."
   python/pyspark/sql/classic/__init__.py:"""Spark Classic specific"""
   python/pyspark/sql/column.py:    # Spark Classic Column by default. This is NOT an API, and NOT supposed to
   python/pyspark/sql/connect/expressions.py:        #   Column<'CAST(a AS BIGINT)'>     <- Spark Classic
   python/pyspark/sql/dataframe.py:    # Spark Classic DataFrame by default. This is NOT an API, and NOT supposed to
   python/pyspark/sql/session.py:        In Spark Classic, a temporary view referenced in `spark.sql` is resolved immediately,
   python/pyspark/sql/session.py:        In Spark Classic, a temporary view referenced in `spark.table` is resolved immediately,
   python/pyspark/sql/tests/connect/test_connect_dataframe_property.py:            # Using this temp env to properly invoke mapInPandas in PySpark Classic.
   python/pyspark/sql/tests/connect/test_parity_types.py:    @unittest.skip("This test is dedicated for PySpark Classic.")
   python/pyspark/sql/tests/connect/test_parity_types.py:    @unittest.skip("This test is dedicated for PySpark Classic.")
   ```



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

