[ https://issues.apache.org/jira/browse/SPARK-39433?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean R. Owen updated SPARK-39433:
---------------------------------
    Priority: Minor  (was: Major)

Not sure; "w" (week-of-year) is not a supported pattern, it seems: 
https://spark.apache.org/docs/latest/sql-ref-datetime-pattern.html

Yet it sort of works. The cases that return null are the years whose first 
week starts before January 1, so the resolved date falls in the prior year 
(at least, that's what Java does with it).
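For illustration, a minimal Python sketch of that resolution, assuming Java's 
default (US-locale) week fields, where weeks start on Sunday and week 1 is the 
week containing January 1; the helper name is hypothetical:

```
from datetime import date, timedelta

def first_day_of_week1(year: int) -> date:
    # Hypothetical helper: step back from Jan 1 to the Sunday on or
    # before it (Python's weekday(): Mon=0 .. Sun=6).
    jan1 = date(year, 1, 1)
    return jan1 - timedelta(days=(jan1.weekday() + 1) % 7)

for y in (2013, 2017, 2018):
    start = first_day_of_week1(y)
    print(y, start, "-> null" if start.year < y else "-> ok")
```

That lines up with the observed output: week 1 of 2013 starts 2012-12-30 and 
week 1 of 2018 starts 2017-12-31 (both in the prior year, returning null), 
while week 1 of 2017 starts exactly on 2017-01-01.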

[~maxgekk] do you happen to know?

> to_date function returns a null for the first week of the year
> --------------------------------------------------------------
>
>                 Key: SPARK-39433
>                 URL: https://issues.apache.org/jira/browse/SPARK-39433
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 3.1.2
>            Reporter: CHARLES HOGG
>            Priority: Minor
>
> When I use week-of-year in the to_date function, the first week of the year 
> returns null for many years.
> ```
> from pyspark.sql import functions as func
>
> # assumes a standard SparkSession bound to the name `spark`
> df = spark.createDataFrame([["2013-01"], ["2013-02"], ["2017-01"], ["2018-01"]], ["input"])
> df.select(func.col("input"), func.to_date(func.col("input"), "yyyy-ww").alias("date")) \
>   .show()
> ```
> ```
> +-------+----------+
> |  input|      date|
> +-------+----------+
> |2013-01|      null|
> |2013-02|2013-01-06|
> |2017-01|2017-01-01|
> |2018-01|      null|
> +-------+----------+
> ```
> Why is this? Is it a bug in the to_date function?


