As mentioned, SPARK_CLASSPATH is deprecated in Spark 1.0+.
Try using --driver-class-path instead:
./bin/spark-shell --driver-class-path yourlib.jar:abc.jar:xyz.jar
Don't use a glob (*); list the JARs one by one, separated by colons.
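If the list is long, you can let the shell build the colon-separated string for you; a minimal sketch, assuming the JARs live in a hypothetical /path/to/lib directory and that no file name contains spaces:

JARS=$(echo /path/to/lib/*.jar | tr ' ' ':')
./bin/spark-shell --driver-class-path "$JARS"

The glob is expanded by the shell before Spark sees it, so spark-shell still receives an explicit colon-separated list.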
Date: Wed, 9 Jul 2014 13:45:07 -0700
From: kat...@cs.pitt.edu
Subject: SPARK_CLASSPATH Warning
Hello,
I have installed Apache Spark v1.0.0 on a machine with a proprietary Hadoop
distribution (v2.2.0, without YARN). Because the Hadoop distribution I am
using relies on a list of JARs, I made the following changes to
conf/spark-env.sh:
#!/usr/bin/env bash
export HADO
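The quoted script is cut off at this point. For context, a spark-env.sh of that era typically exported the Hadoop settings and appended the distribution's JARs through SPARK_CLASSPATH, which is exactly what triggers the deprecation warning in Spark 1.0+. A hypothetical sketch (paths and file names are illustrative, not from the original post):

#!/usr/bin/env bash
# Hypothetical reconstruction; the original message is truncated above.
export HADOOP_CONF_DIR=/etc/hadoop/conf
# Setting SPARK_CLASSPATH is what produces the deprecation warning:
export SPARK_CLASSPATH=/opt/hadoop/lib/first.jar:/opt/hadoop/lib/second.jar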