[ https://issues.apache.org/jira/browse/SPARK-3794?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Michael Armbrust resolved SPARK-3794.
-------------------------------------
    Resolution: Fixed
 Fix Version/s: 1.2.0

Issue resolved by pull request 2662
[https://github.com/apache/spark/pull/2662]

> Building spark core fails due to inadvertent dependency on Commons IO
> ---------------------------------------------------------------------
>
>                 Key: SPARK-3794
>                 URL: https://issues.apache.org/jira/browse/SPARK-3794
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.2.0
>         Environment: Mac OS X 10.9.5
>            Reporter: cocoatomo
>              Labels: spark
>             Fix For: 1.2.0
>
>
> As of commit cf1d32e3e1071829b152d4b597bf0a0d7a5629a2, building Spark core fails with a compilation error when certain Hadoop versions are specified.
> To reproduce this issue, execute the following command with <hadoop.version> set to 1.1.0, 1.1.1, 1.1.2, 1.2.0, 1.2.1, or 2.2.0:
> {noformat}
> $ cd ./core
> $ mvn -Dhadoop.version=<hadoop.version> -DskipTests clean compile
> ...
> [ERROR] /Users/tomohiko/MyRepos/Scala/spark/core/src/main/scala/org/apache/spark/util/Utils.scala:720: value listFilesAndDirs is not a member of object org.apache.commons.io.FileUtils
> [ERROR]     val files = FileUtils.listFilesAndDirs(dir, TrueFileFilter.TRUE, TrueFileFilter.TRUE)
> [ERROR]                           ^
> {noformat}
> The compilation uses commons-io 2.1, but the FileUtils#listFilesAndDirs method was only added in commons-io 2.2, so this compilation always fails.
> FileUtils#listFilesAndDirs → http://commons.apache.org/proper/commons-io/apidocs/org/apache/commons/io/FileUtils.html#listFilesAndDirs%28java.io.File,%20org.apache.commons.io.filefilter.IOFileFilter,%20org.apache.commons.io.filefilter.IOFileFilter%29
> Because hadoop-client in those problematic versions depends on commons-io 2.1, not 2.4, we should assume that commons-io is at version 2.1.
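The fix itself landed via pull request 2662 above. As a rough sketch of one commons-io-free alternative (not the actual patch), the same "directory plus all nested files and subdirectories" listing that `FileUtils.listFilesAndDirs(dir, TrueFileFilter.TRUE, TrueFileFilter.TRUE)` produces can be written with the JDK alone, so the result no longer depends on which commons-io version hadoop-client pulls in; the helper name `listFilesAndDirs` here is just illustrative:

```scala
import java.io.File

// Hypothetical JDK-only replacement for commons-io's
// FileUtils.listFilesAndDirs(dir, TrueFileFilter.TRUE, TrueFileFilter.TRUE):
// returns the directory itself plus every file and subdirectory beneath it.
def listFilesAndDirs(dir: File): Seq[File] = {
  // listFiles() returns null for non-directories or on I/O error
  val children = Option(dir.listFiles()).map(_.toSeq).getOrElse(Seq.empty[File])
  dir +: children.flatMap { f =>
    if (f.isDirectory) listFilesAndDirs(f) else Seq(f)
  }
}
```

Since this uses only `java.io.File`, it compiles against every commons-io version on the classpath, including the 2.1 that the affected hadoop-client versions bring in.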
-- This message was sent by Atlassian JIRA (v6.3.4#6332) --------------------------------------------------------------------- To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org