Here's the PR
https://github.com/apache/zeppelin/pull/3308

Y. Ethan Guo <guoyi...@uber.com> wrote on Thu, Feb 28, 2019 at 2:50 AM:

> Hi All,
>
> I'm trying to use the new feature of yarn cluster mode to run Spark 2.4.0
> jobs on Zeppelin 0.8.1. I've set the SPARK_HOME, SPARK_SUBMIT_OPTIONS, and
> HADOOP_CONF_DIR env variables in zeppelin-env.sh so that the Spark
> interpreter can be started in the cluster. I used `--jars` in
> SPARK_SUBMIT_OPTIONS to add local jars. However, when I tried to import a
> class from those jars in a Spark paragraph, the interpreter complained that
> it could not find the package and class ("<console>:23: error: object ... is
> not a member of package ..."). It looks like the jars are not properly
> loaded.
>
> I followed the instruction here
> <https://zeppelin.apache.org/docs/0.8.1/interpreter/spark.html#2-loading-spark-properties>
> to add the jars, but it seems that it's not working in yarn cluster mode.
> And this issue seems to be related to this bug:
> https://jira.apache.org/jira/browse/ZEPPELIN-3986.  Is there any update
> on fixing it? What is the right way to add local jars in yarn cluster mode?
> Any help and update are much appreciated.
>
>
> Here's the SPARK_SUBMIT_OPTIONS I used (packages and jars paths omitted):
>
> export SPARK_SUBMIT_OPTIONS="--driver-memory 12G --packages ... --jars ...
> --repositories
> https://repository.cloudera.com/artifactory/public/,https://repository.cloudera.com/content/repositories/releases/,http://repo.spring.io/plugins-release/
> "
>
> Thanks,
> - Ethan
>
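
The zeppelin-env.sh setup Ethan describes can be sketched as follows. This is a minimal, hedged example: the paths and jar names are hypothetical placeholders, not values from this thread (the original paths were omitted).

```shell
# zeppelin-env.sh — sketch of the yarn-cluster setup described above.
# All paths below are hypothetical examples, not values from this thread.
export SPARK_HOME=/opt/spark              # local Spark 2.4.0 installation
export HADOOP_CONF_DIR=/etc/hadoop/conf   # YARN/HDFS client configuration
# --jars takes a comma-separated list of local jar paths
export SPARK_SUBMIT_OPTIONS="--driver-memory 12G --jars /path/to/dep1.jar,/path/to/dep2.jar"
```

Note that in yarn-cluster mode the Spark interpreter (and hence the driver) runs inside the YARN application master rather than on the Zeppelin host, so any jar listed with `--jars` must be shipped with the application; a workaround sometimes suggested while the linked bug is open is to set the `spark.jars` property in the Spark interpreter settings instead of relying on SPARK_SUBMIT_OPTIONS.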


-- 
Best Regards

Jeff Zhang
