[ https://issues.apache.org/jira/browse/SPARK-4169?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Niklas Wilcke updated SPARK-4169:
---------------------------------
    Description: 
With a non-English locale the method isBindCollision in
core/src/main/scala/org/apache/spark/util/Utils.scala
doesn't work, because it checks the exception message, which is locale dependent.

The test suite
core/src/test/scala/org/apache/spark/util/UtilsSuite.scala
also contains a locale-dependent test, "string formatting of time durations", which relies on a decimal separator that varies by locale.

I created a pull request on GitHub to solve this issue.

  was:
With a non-English locale the method isBindCollision in
core/src/main/scala/org/apache/spark/util/Utils.scala
doesn't work, because it checks the exception message, which is locale dependent.

The test suite
core/src/test/scala/org/apache/spark/util/UtilsSuite.scala
also contains a locale-dependent test, "string formatting of time durations", which relies on a decimal separator that varies by locale.

> [Core] Locale dependent code
> ----------------------------
>
>                 Key: SPARK-4169
>                 URL: https://issues.apache.org/jira/browse/SPARK-4169
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.1.0
>         Environment: Debian, Locale: de_DE
>            Reporter: Niklas Wilcke
>             Fix For: 1.2.0
>
>   Original Estimate: 0.25h
>  Remaining Estimate: 0.25h
>
> With a non-English locale the method isBindCollision in
> core/src/main/scala/org/apache/spark/util/Utils.scala
> doesn't work, because it checks the exception message, which is locale
> dependent.
> The test suite
> core/src/test/scala/org/apache/spark/util/UtilsSuite.scala
> also contains a locale-dependent test, "string formatting of time durations",
> which relies on a decimal separator that varies by locale.
> I created a pull request on GitHub to solve this issue.
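The two locale problems described above can be sketched in a few lines of Scala. This is a hypothetical illustration, not the code from the pull request: matching on the exception *type* (java.net.BindException) instead of its localized message, and pinning the formatting locale (here Locale.US) so the decimal separator is stable regardless of the JVM's default locale such as de_DE.

```scala
import java.net.BindException
import java.util.Locale

// Locale-independent bind-collision check: match on the exception type,
// not on getMessage, whose text is translated under non-English locales.
// (Hypothetical sketch; walks the cause chain for wrapped exceptions.)
def isBindCollision(e: Throwable): Boolean = e match {
  case _: BindException                => true
  case t if t.getCause != null         => isBindCollision(t.getCause)
  case _                               => false
}

// Locale-pinned formatting: under de_DE, "%.1f".format(1.5) would yield
// "1,5"; passing an explicit locale keeps the "." separator everywhere.
def formatSeconds(ms: Long): String =
  "%.1f s".formatLocal(Locale.US, ms / 1000.0)
```

Checking the type also survives JVM version changes that reword exception messages, which a string comparison would not.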
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org