Robert,

You can build a Spark application with Maven for Hadoop 2 by adding a
dependency on a Hadoop 2.x version of the hadoop-client artifact. If you
implement any of your own Hadoop Input/Output formats, you may also need
to depend on the hadoop-mapreduce client artifacts.
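
As a rough sketch, the relevant pom.xml dependencies could look like the
following. The 2.2.0 Hadoop version below is only an example; substitute
whichever Hadoop 2.x release your cluster runs, and double-check the
artifact names against Maven Central:

  <dependencies>
    <!-- Spark core for Scala 2.10; 1.0.0 matches the release you mentioned -->
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.10</artifactId>
      <version>1.0.0</version>
    </dependency>
    <!-- A Hadoop 2.x client; 2.2.0 is an example version, not a requirement -->
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-client</artifactId>
      <version>2.2.0</version>
    </dependency>
    <!-- Only needed if you implement your own Input/Output formats -->
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-mapreduce-client-core</artifactId>
      <version>2.2.0</version>
    </dependency>
  </dependencies>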

Regards,

Frank Austin Nothaft
fnoth...@berkeley.edu
fnoth...@eecs.berkeley.edu
202-340-0466


On Sun, Jun 29, 2014 at 12:20 PM, Robert James <srobertja...@gmail.com>
wrote:

> Although Spark's home page offers binaries for Spark 1.0.0 with Hadoop
> 2, the Maven repository only seems to have one version, which uses
> Hadoop 1.
>
> Is it possible to use a Maven dependency and Hadoop 2? What is the
> artifact id?
>
> If not: How can I use the prebuilt binaries to use Hadoop 2? Do I just
> copy the lib/ dir into my classpath?
>