Robert:

When using `spark-submit`, the application jar along with any jars included
with the `--jars` option will be automatically transferred to the cluster.
URLs supplied after `--jars` must be separated by commas. That list is
included on both the driver and executor classpaths. Directory expansion
does not work with `--jars`.
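
For example, something like this should put hbase-common on both the
driver and executor classpaths (the jar path and version here are
illustrative, not from your setup):

    spark-shell --jars /opt/hbase/lib/hbase-common-1.2.0.jar

    # multiple jars are comma-separated, not colon-separated:
    spark-submit --class com.example.Main \
      --jars /opt/hbase/lib/hbase-common-1.2.0.jar,/path/to/other.jar \
      app.jar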

For `spark.driver.extraClassPath`:

Extra classpath entries to prepend to the classpath of the driver.
*Note:* In client mode, this config must not be set through the SparkConf
directly in your application, because the driver JVM has already started
at that point. Instead, please set this through the `--driver-class-path`
command line option or in your default properties file.
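
So in client mode (which includes spark-shell), either of these should
work; setting it via SparkConf in the application will not (paths are
illustrative):

    # in conf/spark-defaults.conf
    spark.driver.extraClassPath /opt/hbase/lib/hbase-common-1.2.0.jar

    # or equivalently on the command line
    spark-shell --driver-class-path /opt/hbase/lib/hbase-common-1.2.0.jar

Note the key difference from `--jars`: `spark.driver.extraClassPath` only
prepends entries to the driver's local classpath; it does not copy the
jar to the cluster or to executors, so the path must already exist on the
driver machine.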

FYI

On Tue, Jul 5, 2016 at 3:39 PM, Robert James <srobertja...@gmail.com> wrote:

> I'm using spark-shell.  The perplexing thing is that if I load it via
> spark-shell --jars, it seems to work.  However, if I load it via
> spark.driver.extraClassPath in the config file, it seems to fail.
> What is the difference between --jars (command line) and
> spark.driver.extraClassPath (config)?
>
> On 7/5/16, Dima Spivak <dspi...@cloudera.com> wrote:
> > Hey Robert,
> >
> > HBaseConfiguration is part of the hbase-common module of the HBase
> project.
> > Are you using Maven to provide dependencies or just running java -cp?
> >
> > -Dima
> >
> > On Monday, July 4, 2016, Robert James <srobertja...@gmail.com> wrote:
> >
> >> When trying to load HBase via Spark, I get NoClassDefFoundError
> >> org/apache/hadoop/hbase/HBaseConfiguration errors.
> >>
> >> How do I provide that class to Spark?
> >>
> >
>
