[ https://issues.apache.org/jira/browse/SPARK-36554?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17432455#comment-17432455 ]
Nicolas Azrak commented on SPARK-36554:
---------------------------------------

Hi [~lekshmiii] [~hyukjin.kwon] I've created [https://github.com/apache/spark/pull/34356] to fix this. It's my first PR; could I get a review?

> Error message while trying to use Spark SQL functions directly on dataframe columns without using a select expression
> ----------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-36554
>                 URL: https://issues.apache.org/jira/browse/SPARK-36554
>             Project: Spark
>          Issue Type: Bug
>          Components: Documentation, Examples, PySpark
>    Affects Versions: 3.1.1
>            Reporter: Lekshmi Ramachandran
>            Priority: Minor
>              Labels: documentation, features, functions, spark-sql
>         Attachments: Screen Shot .png
>
>   Original Estimate: 24h
>  Remaining Estimate: 24h
>
> The code below builds a dataframe successfully. Here the make_date function is used inside a select expression:
>
> from pyspark.sql.functions import expr
> df = spark.createDataFrame([(2020, 6, 26), (1000, 2, 29), (-44, 1, 1)], ['Y', 'M', 'D'])
> df.select("*", expr("make_date(Y, M, D) as lk")).show()
>
> The code below fails with "cannot import name 'make_date' from 'pyspark.sql.functions'". Here make_date is called directly on dataframe columns, without a select expression:
>
> from pyspark.sql.functions import make_date
> df = spark.createDataFrame([(2020, 6, 26), (1000, 2, 29), (-44, 1, 1)], ['Y', 'M', 'D'])
> df.select(make_date(df.Y, df.M, df.D).alias("datefield")).show()
>
> The error message generated is misleading when it says "cannot import make_date from pyspark.sql.functions".
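For anyone hitting this on Spark 3.1.x in the meantime: make_date is not yet exposed in pyspark.sql.functions there (that is what the linked PR adds), so the underlying SQL function has to be invoked through expr instead. A minimal sketch of that workaround, assuming an active SparkSession bound to the name spark:

{code:python}
# Workaround for Spark releases where make_date is missing from
# pyspark.sql.functions: call the SQL make_date function via expr().
from pyspark.sql.functions import expr

# Assumes `spark` is an existing SparkSession.
df = spark.createDataFrame(
    [(2020, 6, 26), (1000, 2, 29), (-44, 1, 1)], ['Y', 'M', 'D'])

# expr() parses the SQL expression, so make_date resolves on the SQL
# side even though no Python wrapper for it exists in this version.
df.select(expr("make_date(Y, M, D)").alias("datefield")).show()
{code}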