Daeho Ro created SPARK-32683:
--------------------------------

             Summary: Datetime pattern F works incorrectly.
                 Key: SPARK-32683
                 URL: https://issues.apache.org/jira/browse/SPARK-32683
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 3.0.0
         Environment: Windows 10 Pro with the Jupyter Lab Docker image for 
Spark 3.0.0 and Python 3.8.5.


REPOSITORY: jupyter/all-spark-notebook
TAG: f1811928b3dd
            Reporter: Daeho Ro


From the docs, the pattern F should give the week of the month.
|*Symbol*|*Meaning*|*Presentation*|*Example*|
|F|week-of-month|number(1)|3|

I have tested this in both Scala Spark 3.0.0 and PySpark 3.0.0:
{code:python}
from pyspark.sql.functions import to_timestamp, month, date_format

df.withColumn('date', to_timestamp('date', 'yyyy-MM-dd')) \
  .withColumn('month', month('date')) \
  .withColumn('week', date_format('date', 'F')) \
  .show(10, False)

+-------------------+-----+----+
|date               |month|week|
+-------------------+-----+----+
|2020-08-01 00:00:00|8    |1   |
|2020-08-02 00:00:00|8    |2   |
|2020-08-03 00:00:00|8    |3   |
|2020-08-04 00:00:00|8    |4   |
|2020-08-05 00:00:00|8    |5   |
|2020-08-06 00:00:00|8    |6   |
|2020-08-07 00:00:00|8    |7   |
|2020-08-08 00:00:00|8    |1   |
|2020-08-09 00:00:00|8    |2   |
|2020-08-10 00:00:00|8    |3   |
+-------------------+-----+----+ {code}
The `week` column is not the week of the month. The values restart at 1 every 
7 days counted from the 1st of the month, like a day-of-week number.
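For comparison, the observed column can be reproduced in plain Python (a minimal sketch; it assumes 'F' is being rendered as the aligned day-of-week-in-month, i.e. `((day - 1) % 7) + 1`, which is my reading of the output, not something the docs state):

{code:python}
from datetime import date, timedelta

# Reproduce the 'week' column above for 2020-08-01 .. 2020-08-10.
# Aligned day-of-week-in-month restarts at 1 every 7 days from the 1st.
start = date(2020, 8, 1)
observed = [((start + timedelta(days=i)).day - 1) % 7 + 1 for i in range(10)]
print(observed)  # [1, 2, 3, 4, 5, 6, 7, 1, 2, 3] -- matches the table
{code}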

!image-2020-08-21-21-31-32-297.png!

From my calendar, the first day of August should have 1 for the week-of-month, 
the 2nd to the 8th should have 2, and so on.
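The expected week-of-month can be sketched in plain Python (assuming calendar weeks that start on Sunday, as in my calendar; `week_of_month` is an illustrative helper, not a Spark API):

{code:python}
from datetime import date

def week_of_month(d: date) -> int:
    """Calendar week-of-month, with weeks starting on Sunday (illustrative)."""
    first = d.replace(day=1)
    # Number of days in the first calendar week that fall before the 1st
    # (weekday() is Monday=0, so shift to Sunday-indexed).
    offset = (first.weekday() + 1) % 7
    return (d.day - 1 + offset) // 7 + 1

print([week_of_month(date(2020, 8, day)) for day in (1, 2, 8, 9)])
# [1, 2, 2, 3] -- Aug 1 is week 1, Aug 2-8 week 2, Aug 9 starts week 3
{code}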



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
