Hi all,

I have a use case where some jars are on HDFS, and I want to include these
jars in my driver classpath.

If I pass them with --jars it works fine, but if I pass them using
spark.driver.extraClassPath it fails:

spark-sql --master yarn --jars hdfs://hacluster/tmp/testjar/*    // Jars are loaded to the classpath


spark-sql --master yarn --conf spark.driver.extraClassPath=hdfs://hacluster/tmp/testjar/*    // Jars are not loaded to the classpath

Is this a limitation or a bug?

Even in the Spark documentation, no such limitation is specified for
spark.driver.extraClassPath.
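
For context, the fallback I can use for now is to stage the jars on the
driver's local filesystem and point spark.driver.extraClassPath at a local
wildcard instead; my understanding is that extraClassPath entries are passed
as-is to the driver JVM's classpath, so they probably need to be local paths.
The local directory below is just illustrative:

hdfs dfs -get hdfs://hacluster/tmp/testjar/*.jar /opt/spark/extra-jars/    // copy the jars to the driver node
spark-sql --master yarn --conf spark.driver.extraClassPath=/opt/spark/extra-jars/*    // local wildcard, resolvable when the driver JVM starts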
