Hyukjin Kwon created SPARK-44926:
------------------------------------

             Summary: Use the module alias 'sf' instead of 'F' when importing 
pyspark.sql.functions
                 Key: SPARK-44926
                 URL: https://issues.apache.org/jira/browse/SPARK-44926
             Project: Spark
          Issue Type: Documentation
          Components: PySpark
    Affects Versions: 3.5.0
            Reporter: Hyukjin Kwon


{code}
from pyspark.sql import functions as F
{code}

is not very Pythonic: it does not follow PEP 8's guidance on Package and Module Names 
(https://peps.python.org/pep-0008/#package-and-module-names), which states:

{quote}
Modules should have short, all-lowercase names. Underscores can be used in the module name if it improves readability. Python packages should also have short, all-lowercase names, although the use of underscores is discouraged.
{quote}

Therefore, the module alias should follow the same convention. In my experience, uppercase names are used only for module- or package-level constants; see also Constants 
(https://peps.python.org/pep-0008/#constants).

See also this Stack Overflow comment 
(https://stackoverflow.com/questions/70458086/how-to-correctly-import-pyspark-sql-functions#comment129714058_70458115).
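
For illustration, a minimal sketch of the proposed 'sf' alias in use; it assumes an active SparkSession bound to the name 'spark', and the column names are only examples:

{code}
# Proposed convention: lowercase alias 'sf' instead of 'F'.
from pyspark.sql import functions as sf

df = spark.range(10)  # example DataFrame with a single "id" column
df.select(sf.col("id"), sf.sqrt("id").alias("sqrt_id")).show()
{code}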


