[ https://issues.apache.org/jira/browse/SPARK-11413?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14982662#comment-14982662 ]
Yongjia Wang commented on SPARK-11413:
--------------------------------------

Yes, the fix is to update the joda.version number from 2.5 to 2.8.1 (or the newest 2.9) in Spark's pom.xml, so the PR would be just a one-line change. The issue only appears when compiling with Java 1.8u60 or later, due to a change in time-zone formatting, and S3 requests rely in some way on the formatted time in the request header. The links in my comment should explain this better. It introduces no new dependency, but can be considered a Java 8 + S3 combination bug. So this only affects the combination of Java 8u60 and using s3a, which should become quite common and hit more people. If I understand correctly, building with Java 7, or with Java 8 before u60, should be fine, even when running on a JRE at or after Java 8u60.

> Java 8 build has problem with joda-time and s3 request, should bump joda-time
> version
> -----------------------------------------------------------------------------
>
>                 Key: SPARK-11413
>                 URL: https://issues.apache.org/jira/browse/SPARK-11413
>             Project: Spark
>          Issue Type: Improvement
>          Components: Build
>            Reporter: Yongjia Wang
>            Priority: Minor
>
> Joda-time has problems with formatting time zones starting with Java 1.8u60,
> and this causes S3 requests to fail. It is said to have been fixed in
> joda-time 2.8.1.
> Spark still uses joda-time 2.5 by default; if Java 8 is used to build Spark,
> one should set -Djoda.version=2.8.1 or above.
> I was hit by this problem, and -Djoda.version=2.9 worked.
> I don't see any reason not to bump the joda-time version in pom.xml.
> Should I create a pull request for this? It is trivial.
> https://github.com/aws/aws-sdk-java/issues/484
> https://github.com/aws/aws-sdk-java/issues/444
> http://stackoverflow.com/questions/32058431/aws-java-sdk-aws-authentication-requires-a-valid-date-or-x-amz-date-header

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
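For reference, the one-line change described above would be an edit to the `joda.version` property in Spark's root pom.xml. This is a sketch based on the property name mentioned in the comment; the surrounding context in the actual pom.xml may differ:

```xml
<!-- Sketch of the proposed fix in Spark's root pom.xml.
     Only the joda.version value changes (2.5 -> 2.8.1 or newer);
     the rest of the <properties> block is omitted here. -->
<properties>
  <joda.version>2.8.1</joda.version>
</properties>
```

As the issue notes, the same effect can be had without editing the file by overriding the property on the Maven command line, e.g. passing `-Djoda.version=2.9` to the build.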