[ https://issues.apache.org/jira/browse/SPARK-44928?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17757894#comment-17757894 ]

ASF GitHub Bot commented on SPARK-44928:
----------------------------------------

User 'HyukjinKwon' has created a pull request for this issue:
https://github.com/apache/spark/pull/42628

> Use the module alias 'sf' instead of 'F' for pyspark.sql functions imports
> ---------------------------------------------------------------------------
>
>                 Key: SPARK-44928
>                 URL: https://issues.apache.org/jira/browse/SPARK-44928
>             Project: Spark
>          Issue Type: Documentation
>          Components: PySpark
>    Affects Versions: 3.5.0
>            Reporter: Hyukjin Kwon
>            Priority: Major
>
> {code}
> from pyspark.sql import functions as F
> {code}
> isn’t very Pythonic: the uppercase alias does not follow PEP 8; see Package 
> and Module Names (https://peps.python.org/pep-0008/#package-and-module-names).
> {quote}
> Modules should have short, all-lowercase names. Underscores can be used in 
> the module name if it improves
> readability. Python packages should also have short, all-lowercase names, 
> although the use of underscores
> is discouraged.
> {quote}
> Therefore, the module alias should follow this convention as well. In 
> practice, uppercase names are conventionally reserved for module/package-level 
> constants; see also Constants 
> (https://peps.python.org/pep-0008/#constants).
> See also this Stack Overflow comment 
> (https://stackoverflow.com/questions/70458086/how-to-correctly-import-pyspark-sql-functions#comment129714058_70458115).
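
The naming rule behind the proposal can be sketched with a small, self-contained check. The helper `is_pep8_module_alias` below is hypothetical (not part of PySpark or PEP 8 tooling); it only illustrates why 'sf' conforms to PEP 8's module-naming rule while 'F' reads like a constant:

```python
import re

# PEP 8: module (and package) names should be short and all-lowercase;
# an import alias effectively becomes a module name in the importing
# scope, so the same rule is applied to it here. Uppercase identifiers
# are conventionally reserved for constants.
def is_pep8_module_alias(alias: str) -> bool:
    # Hypothetical illustration: lowercase letter followed by
    # lowercase letters, digits, or underscores.
    return re.fullmatch(r"[a-z][a-z0-9_]*", alias) is not None

# Recommended style from this issue:
#   from pyspark.sql import functions as sf
# rather than:
#   from pyspark.sql import functions as F
print(is_pep8_module_alias("sf"))  # True  - lowercase alias conforms
print(is_pep8_module_alias("F"))   # False - uppercase reads as a constant
```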



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
