Although this has been discussed a number of times here, I am still unclear
how to add user jars to the spark-shell:

a) for importing classes for use directly within the shell interpreter

b) for invoking SparkContext commands with closures that reference user-supplied
classes contained in those jars.

Like earlier posters, I have gone through:

 - updating conf/spark-env.sh
 - setting SPARK_CLASSPATH
 - setting SPARK_SUBMIT_OPTS
 - creating conf/spark-defaults.conf and adding spark.executor.extraClassPath
 - passing --driver-class-path
 - etc. (rough sketches below)
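
To be concrete, here is roughly where I have been putting these settings (the
jar path is a placeholder, and these are sketches from memory rather than my
exact configs):

   # conf/spark-env.sh
   export SPARK_CLASSPATH=/path/to/my-user-code.jar:$SPARK_CLASSPATH

   # conf/spark-defaults.conf
   spark.executor.extraClassPath   /path/to/my-user-code.jar

   # when launching the shell
   ./bin/spark-shell --driver-class-path /path/to/my-user-code.jar

but I have not managed to get both (a) and (b) working this way.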

Hopefully there is something along the lines of a single entry added to some
classpath somewhere, like:

   SPARK_CLASSPATH / --driver-class-path / spark.executor.extraClassPath (or
   whatever the correct option is) =
   $HBASE_HOME/*:$HBASE_HOME/lib/*:$SPARK_CLASSPATH
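
For example, in my case (HBase), what I am hoping for is a single line in one
place, along the lines of the sketches below (I just don't know which of these,
if any, is the supported route):

   # conf/spark-env.sh
   export SPARK_CLASSPATH="$HBASE_HOME/*:$HBASE_HOME/lib/*:$SPARK_CLASSPATH"

   # or when launching the shell
   ./bin/spark-shell --driver-class-path "$HBASE_HOME/*:$HBASE_HOME/lib/*"

The end goal is for the HBase classes to be visible both in the shell
interpreter and inside closures shipped to the executors.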

Any ideas here?

thanks
