Github user tarekauel commented on the pull request:

    https://github.com/apache/spark/pull/6981#issuecomment-116502627
  
    The tests fail because Hive calculates the week number of May 6, 2011 as 18, while Java with `Locale.US` returns 19.
    
    The reason is a differing definition of week 1 (see [The definition of Week of Year is locale dependent](http://stackoverflow.com/a/4608695/3532525)).
    
    
    ```
    scala> new java.text.SimpleDateFormat("w", java.util.Locale.US).format(new 
java.text.SimpleDateFormat("dd-MM-yyyy", 
java.util.Locale.US).parse("06-05-2011"))
    res16: String = 19
    ```
    
    ```
    scala> new java.text.SimpleDateFormat("w", 
java.util.Locale.GERMANY).format(new java.text.SimpleDateFormat("dd-MM-yyyy", 
java.util.Locale.GERMANY).parse("06-05-2011"))
    res15: String = 18
    ```
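    
    As a rough sketch only (not necessarily what this PR should do), Hive's result of 18 can be reproduced from `Locale.US` by configuring a `java.util.Calendar` with the ISO-8601 week rules, i.e. Monday as the first day of the week and at least four days in week 1:
    
    ```
    scala> val cal = java.util.Calendar.getInstance(java.util.Locale.US)
    scala> cal.setFirstDayOfWeek(java.util.Calendar.MONDAY)   // ISO-8601: weeks start on Monday
    scala> cal.setMinimalDaysInFirstWeek(4)                   // ISO-8601: week 1 contains at least 4 days
    scala> cal.setTime(new java.text.SimpleDateFormat("dd-MM-yyyy", java.util.Locale.US).parse("06-05-2011"))
    scala> cal.get(java.util.Calendar.WEEK_OF_YEAR)
    res0: Int = 18
    ```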
    
    @Davies It seems that `SimpleDateFormat` ignores the default time zone that was set in `beforeAll` of `HiveCompatibilitySuite`. Shall I add `WeekOfYear` to the whitelist?

