Also, it looks like you are mixing configuration properties from different
versions of Spark on Kubernetes.
"spark.kubernetes.{driver|executor}.docker.image" is only available in the
apache-spark-on-k8s fork, whereas "spark.kubernetes.container.image" is new
in Spark 2.3.0. Please make sure you use the properties that match the
version of Spark you are running.
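For reference, the two property styles look roughly like this (a sketch: the fork image names are the ones from this thread, while the 2.3.0 image name is purely illustrative):

```shell
# apache-spark-on-k8s fork (e.g. v2.2.0-kubernetes-0.5.0): separate driver/executor images
--conf spark.kubernetes.driver.docker.image=kubespark/spark-driver:v2.2.0-kubernetes-0.5.0
--conf spark.kubernetes.executor.docker.image=kubespark/spark-executor:v2.2.0-kubernetes-0.5.0

# Spark 2.3.0 and later: a single container image property for both roles
--conf spark.kubernetes.container.image=my-registry/spark:2.3.0  # image name is illustrative
```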
Thanks so much! I'll take a look at the guide right now. The versions
should all be Spark 2.2. In my configuration, I'm using
--conf spark.kubernetes.driver.docker.image=kubespark/spark-driver:v2.2.0-kubernetes-0.5.0 \
--conf
Which version of Spark are you using to run spark-submit, and which
version of Spark is your container image based on? This looks to be caused
by mismatched versions of Spark used for spark-submit and for the
driver/executor at runtime.
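One quick way to compare the two (a sketch, assuming Docker is available locally and that the image keeps Spark under /opt/spark, as the kubespark images do):

```shell
# Version of the spark-submit on your local PATH
spark-submit --version

# Version of Spark baked into the driver image
# (entrypoint path assumed from the kubespark image layout)
docker run --rm --entrypoint /opt/spark/bin/spark-submit \
  kubespark/spark-driver:v2.2.0-kubernetes-0.5.0 --version
```

If the two banners disagree, that mismatch would line up with the error you are seeing.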
On Mon, Apr 30, 2018 at 12:00 PM, Holden Karau
So, while it's not perfect, I have a guide focused on running custom Spark
on GKE:
https://cloud.google.com/blog/big-data/2018/03/testing-future-apache-spark-releases-and-changes-on-google-kubernetes-engine-and-cloud-dataproc
and if you want to run pre-built Spark on GKE there is a solutions article
Hello all,
I've been trying to spark-submit a job to Google Kubernetes Engine, but
I keep encountering the following error:

Exception in thread "main" java.lang.IllegalArgumentException:
Server properties file given at /opt/spark/work-dir/driver does not exist
or is not a file.

I'm unsure of how to even