Hi,

I ran into an interesting problem with the --jars option.
I use a third-party dependency, elasticsearch-spark, and pass the jar
with the following command:
./bin/spark-submit --jars path-to-dependencies ...
It works well.
However, once I call HiveContext.sql, Spark loses the dependencies I
passed. It seems that initializing HiveContext overrides the classpath
configuration (although if we check sparkContext._conf, the configuration
appears unchanged).

However, if I pass the dependencies with --driver-class-path
and spark.executor.extraClassPath instead, the problem disappears.
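For reference, a minimal sketch of the two submission styles I compared (the jar path and application file are placeholders, not my actual paths):

```shell
# Style 1: pass the jar via --jars
# (dependencies seem to be lost after HiveContext.sql runs)
./bin/spark-submit \
  --jars /path/to/elasticsearch-spark.jar \
  my_app.py

# Style 2: put the jar on the driver and executor classpaths explicitly
# (this works even with HiveContext.sql)
./bin/spark-submit \
  --driver-class-path /path/to/elasticsearch-spark.jar \
  --conf spark.executor.extraClassPath=/path/to/elasticsearch-spark.jar \
  my_app.py
```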

Does anyone know why this interesting problem happens?

Thanks a lot for your help in advance.

Cheers
Gen
