I solved this issue by putting the hbase-protocol jar on the Hadoop
classpath rather than on the Spark classpath:

export HADOOP_CLASSPATH="/path/to/jar/hbase-protocol-0.98.1-cdh5.1.0.jar"
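
For context, a minimal sketch of the full launch, assuming the job is
submitted with spark-submit on YARN (the class name and application jar
below are placeholders):

# hbase-protocol goes on the Hadoop classpath, not the Spark one
export HADOOP_CLASSPATH="/path/to/jar/hbase-protocol-0.98.1-cdh5.1.0.jar"
spark-submit --master yarn-client --class com.example.MyHBaseJob my-application.jar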



On Tue, Aug 26, 2014 at 5:42 PM, Ashish Jain <ashish....@gmail.com> wrote:

> Hello,
>
> I'm using the following version of Spark - 1.0.0+cdh5.1.0+41
> (1.cdh5.1.0.p0.27).
>
> I've tried to specify the libraries Spark uses in the following ways:
>
> 1) Adding them to the Spark context
> 2) Specifying the jar path in
>   a) spark.executor.extraClassPath
>   b) spark.executor.extraLibraryPath
> 3) Copying the libraries to spark/lib
> 4) Specifying the path in SPARK_CLASSPATH, and even SPARK_LIBRARY_PATH
> 5) Passing as --jars argument
>
> but the Spark application is not able to pick up the libraries, although I
> can see the message "SparkContext: Added JAR file". I get a
> NoClassDefFoundError.
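>
> For reference, a rough sketch of what the attempts looked like (the class
> name and application jar are placeholders):
>
> # attempt with --jars:
> spark-submit \
>   --class com.example.MyJob \
>   --jars /path/to/jar/hbase-protocol-0.98.1-cdh5.1.0.jar \
>   my-application.jar
>
> # attempt with the environment variable:
> # (spark.executor.extraClassPath was set in conf/spark-defaults.conf)
> export SPARK_CLASSPATH=/path/to/jar/hbase-protocol-0.98.1-cdh5.1.0.jar
> spark-submit --class com.example.MyJob my-application.jar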
>
> The only way I've been able to make it work so far is by merging my
> application jar with all the library jars into a single fat jar.
>
> What might be going on?
>
> Right now I need this in order to put hbase-protocol-0.98.1-cdh5.1.0.jar on
> SPARK_CLASSPATH, as described in
> https://issues.apache.org/jira/browse/HBASE-10877. I'm using spark-submit
> to submit the job.
>
> Thanks
> Ashish
>
