Spark's Hadoop Dependency

2014-06-25 Thread Robert James
To add Spark to an SBT project, I do:
  libraryDependencies += "org.apache.spark" %% "spark-core" % "1.0.0"
% "provided"

How do I make sure that the Spark version which will be downloaded
will depend on, and use, Hadoop 2, and not Hadoop 1?

Even with a line:
   libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.4.0"

I still see SBT downloading Hadoop 1:

[debug] == resolving dependencies
org.apache.spark#spark-core_2.10;1.0.0-org.apache.hadoop#hadoop-client;1.0.4
[compile-master(*)]
[debug] dependency descriptor has been mediated: dependency:
org.apache.hadoop#hadoop-client;2.4.0 {compile=[default(compile)]} =
dependency: org.apache.hadoop#hadoop-client;1.0.4
{compile=[default(compile)]}
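(One SBT-level workaround, not mentioned in the thread and an assumption on my part: sbt's `dependencyOverrides` setting can force the mediated version of a transitive dependency without adding it as a direct dependency. A minimal sketch:

```scala
// build.sbt -- sketch only; pins the mediated hadoop-client version to 2.4.0.
// dependencyOverrides does not add hadoop-client to the classpath by itself;
// it only overrides the version when something (here, spark-core) pulls it in.
dependencyOverrides += "org.apache.hadoop" % "hadoop-client" % "2.4.0"
```
)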


Re: Spark's Hadoop Dependency

2014-06-25 Thread Koert Kuipers
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % versionSpark % "provided"
    exclude("org.apache.hadoop", "hadoop-client"),
  "org.apache.hadoop" % "hadoop-client" % versionHadoop % "provided"
)
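
In a complete build file, the exclude-and-pin approach might look like the following sketch. The `versionSpark`/`versionHadoop` values and the `scalaVersion` are assumptions matching the Spark 1.0.0-era setup discussed in the thread:

```scala
// build.sbt -- sketch, not from the thread; versions are assumptions
scalaVersion := "2.10.4"

val versionSpark  = "1.0.0"
val versionHadoop = "2.4.0"

libraryDependencies ++= Seq(
  // Drop Spark's transitive hadoop-client (which resolves to 1.0.4) ...
  "org.apache.spark" %% "spark-core" % versionSpark % "provided"
    exclude("org.apache.hadoop", "hadoop-client"),
  // ... and declare the Hadoop 2 client explicitly instead
  "org.apache.hadoop" % "hadoop-client" % versionHadoop % "provided"
)
```

With the exclusion in place, Ivy never sees hadoop-client 1.0.4, so there is no conflict for it to mediate, and the explicitly declared 2.4.0 artifact is the one resolved.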


On Wed, Jun 25, 2014 at 11:26 AM, Robert James srobertja...@gmail.com
wrote:

 To add Spark to an SBT project, I do:
   libraryDependencies += "org.apache.spark" %% "spark-core" % "1.0.0"
 % "provided"

 How do I make sure that the Spark version which will be downloaded
 will depend on, and use, Hadoop 2, and not Hadoop 1?

 Even with a line:
    libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.4.0"

 I still see SBT downloading Hadoop 1:

 [debug] == resolving dependencies

 org.apache.spark#spark-core_2.10;1.0.0-org.apache.hadoop#hadoop-client;1.0.4
 [compile-master(*)]
 [debug] dependency descriptor has been mediated: dependency:
 org.apache.hadoop#hadoop-client;2.4.0 {compile=[default(compile)]} =
 dependency: org.apache.hadoop#hadoop-client;1.0.4
 {compile=[default(compile)]}