[ https://issues.apache.org/jira/browse/SPARK-20771?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Xiangrui Meng updated SPARK-20771:
----------------------------------
    Description: 
The weekofyear() implementation follows the Hive / ISO 8601 week number. However, it is of limited use on its own because it does not also return the year that the week belongs to. For example:

weekofyear("2017-01-01") returns 52

Anyone using this with groupBy('week) might get the aggregation or ordering wrong, because weeks from different years collide under the same week number. A better implementation should return the year of the week as well. MySQL's yearweek() is much better in this sense: https://dev.mysql.com/doc/refman/5.5/en/date-and-time-functions.html#function_yearweek. Maybe we should implement that in Spark.

  was:
The weekofyear() implementation follows HIVE / ISO 8601 week number. However it is not useful because it doesn't return the year of the week start. For example, weekofyear("2017-01-01") returns 52. Anyone using this with groupBy('week) might do the aggregation wrong. A better implementation should return the year number of the week as well. MySQL's yearweek() is much better in this sense: https://dev.mysql.com/doc/refman/5.5/en/date-and-time-functions.html#function_yearweek. Maybe we should implement that in Spark.

> Usability issues with weekofyear()
> ----------------------------------
>
>                 Key: SPARK-20771
>                 URL: https://issues.apache.org/jira/browse/SPARK-20771
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 2.2.0
>            Reporter: Xiangrui Meng
>            Priority: Minor
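The pitfall can be reproduced in plain Python, whose datetime.isocalendar() follows the same ISO 8601 week-numbering rules that Hive's weekofyear() uses. The iso_yearweek helper below is only a hypothetical sketch of a yearweek()-style combined value (modeled loosely on MySQL's YEARWEEK in ISO mode), not an existing Spark API:

```python
from datetime import date

# 2017-01-01 is a Sunday, so under ISO 8601 rules it belongs to
# week 52 of ISO year 2016 -- the behavior the issue describes.
iso_year, iso_week, iso_weekday = date(2017, 1, 1).isocalendar()
print(iso_year, iso_week)  # 2016 52

# Hypothetical yearweek()-style helper: encode year and week together
# so that grouping or ordering by week never mixes different years.
def iso_yearweek(d: date) -> int:
    y, w, _ = d.isocalendar()
    return y * 100 + w

print(iso_yearweek(date(2017, 1, 1)))  # 201652
print(iso_yearweek(date(2017, 1, 2)))  # 201701 (Monday starts ISO week 1 of 2017)
```

Grouping on a combined value like 201652 sorts and aggregates correctly across year boundaries, whereas grouping on the bare week number 52 would merge late-2016 and late-2017 rows.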
-- This message was sent by Atlassian JIRA (v6.3.15#6346) --------------------------------------------------------------------- To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org