Hi team,
I'm working on the integration between Mesos and Spark. For now, I can start the SlaveMesosDispatcher in a Docker container, and I'd like to also run the Spark executor in a Mesos Docker container. I used the following configuration for it, but I got an error; any suggestions?

Configuration:

Spark:
Hi Klaus,
Sorry, I'm not next to a computer, but it could possibly be a bug that it doesn't
take SPARK_HOME as the base path. Currently the Spark image seems to set the
working directory so that it works.
I'll look at the code to verify, but it seems like it could be the case. If it's
true, feel free
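As a possible workaround while that's being verified, here is a minimal sketch of a submit command using the Mesos Docker executor settings. The property names `spark.mesos.executor.docker.image` and `spark.mesos.executor.home` come from the Spark-on-Mesos documentation; the image name, dispatcher host, and install path below are placeholders, not values from this thread:

```shell
# Hypothetical submit command; <your-spark-image>, <dispatcher-host>, and
# /opt/spark are placeholders you must adapt to your setup.
spark-submit \
  --master mesos://<dispatcher-host>:7077 \
  --deploy-mode cluster \
  --conf spark.mesos.executor.docker.image=<your-spark-image> \
  --conf spark.mesos.executor.home=/opt/spark \
  examples/src/main/python/pi.py
```

Setting `spark.mesos.executor.home` explicitly sidesteps any reliance on SPARK_HOME resolution inside the container, which is the suspected bug above.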
6-10-8245 4084 | mad...@cn.ibm.com | http://www.cguru.net
Subject: Re: How to enable Spark mesos docker executor?
From: t...@mesosphere.io
Date: Fri, 16 Oct 2015 10:11:36 +0800
CC: user@spark.apache.org
To: kl...@cguru.net