I think SPARK_CLASSPATH is deprecated.
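
For what it's worth, on the releases I've used, extra jars are supposed to go
through spark-submit instead (the paths below are illustrative):

spark-submit --class org.apache.spark.examples.SparkPi \
  --conf spark.driver.extraClassPath=/path/to/extra.jar \
  --conf spark.executor.extraClassPath=/path/to/extra.jar \
  examples/target/spark-examples.jar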

Can you show the command line that launches your Spark job?
Which Spark release are you using?

Thanks



On Thu, Feb 11, 2016 at 5:38 PM, Charlie Wright <charliewri...@live.ca>
wrote:

> built and installed hadoop with:
> mvn package -Pdist -DskipTests -Dtar
> mvn install -DskipTests
>
> built spark with:
> mvn -Pyarn -Phadoop-2.6 -Dhadoop.version=2.8.0-SNAPSHOT -DskipTests clean
> package
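>
> (If I understand Maven's snapshot handling correctly, the mvn install step
> is what publishes the 2.8.0-SNAPSHOT artifacts to the local ~/.m2
> repository, so the Spark build can resolve -Dhadoop.version=2.8.0-SNAPSHOT.)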
>
> Where would I check the classpath? Is it the environment variable
> SPARK_CLASSPATH?
>
> Charles
>
> ------------------------------
> Date: Thu, 11 Feb 2016 17:29:00 -0800
> Subject: Re: Building Spark with a Custom Version of Hadoop: HDFS
> ClassNotFoundException
> From: yuzhih...@gmail.com
> To: charliewri...@live.ca
> CC: d...@spark.apache.org
>
> The Hdfs class is in hadoop-hdfs-XX.jar.
>
> Can you check the classpath to see if the above jar is there?
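>
> For example (the jar name is illustrative for your build):
>
> jar tf hadoop-hdfs-2.8.0-SNAPSHOT.jar | grep 'org/apache/hadoop/fs/Hdfs'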
>
> Please describe the command lines you used to build Hadoop and Spark.
>
> Cheers
>
> On Thu, Feb 11, 2016 at 5:15 PM, Charlie Wright <charliewri...@live.ca>
> wrote:
>
> I am having issues running a test job on a Spark build that uses a custom
> Hadoop JAR.
> My custom Hadoop build runs without issues, and I can run jobs on a
> precompiled version of Spark (bundled with Hadoop) with no problem.
>
> However, whenever I try to run the same Spark example on the Spark build
> that uses my custom Hadoop JAR, I get this error:
> "Exception in thread "main" java.lang.RuntimeException:
> java.lang.ClassNotFoundException: Class org.apache.hadoop.fs.Hdfs not found"
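>
> (For context, if I'm reading the Hadoop defaults right:
> org.apache.hadoop.fs.Hdfs is the AbstractFileSystem implementation that
> core-default.xml registers for hdfs: URIs via
> fs.AbstractFileSystem.hdfs.impl = org.apache.hadoop.fs.Hdfs
> so the error suggests the jar shipping that class isn't on the classpath.)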
>
> Does anybody know why this is happening?
>
> Thanks,
> Charles.
>
>
>
