Hi,
Has anyone here already built a Docker image for Spark?
I want to build a Docker image with Spark inside, configured against a remote
YARN cluster.
I have already created an image with Spark 1.6.2 inside.
But when I run

spark-shell --master yarn --deploy-mode client --driver-memory 32G \
  --executor-memory 32G --executor-cores 8

inside Docker, I get the following exception:
Diagnostics: java.io.FileNotFoundException: File
file:/usr/local/spark/lib/spark-assembly-1.6.2-hadoop2.2.0.jar does not
exist

Any suggestions?
Do I need to upload the Spark assembly to HDFS and set
spark.yarn.jar=hdfs://spark-assembly-1.6.2-hadoop2.2.0.jar ?

Here is my Dockerfile
https://gist.github.com/ponkin/cac0a071e7fe75ca7c390b7388cf4f91
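If I understand the docs correctly, the workaround I am considering looks
roughly like this (the namenode host/port and the HDFS path are placeholders
I made up, not verified against my cluster):

```shell
# Upload the Spark assembly jar to HDFS once
# (namenode:8020 and /user/spark/lib are placeholder values):
hdfs dfs -mkdir -p hdfs://namenode:8020/user/spark/lib
hdfs dfs -put /usr/local/spark/lib/spark-assembly-1.6.2-hadoop2.2.0.jar \
    hdfs://namenode:8020/user/spark/lib/

# Then point Spark at it, e.g. in conf/spark-defaults.conf:
# spark.yarn.jar hdfs://namenode:8020/user/spark/lib/spark-assembly-1.6.2-hadoop2.2.0.jar
```

Is that the right approach, or is something else missing from my image?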



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Run-spark-shell-inside-Docker-container-against-remote-YARN-cluster-tp27967.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
