Hi Eran,

I need to investigate, but that may well be it; we're using SPARK_JAVA_OPTS
to pass all of the options rather than --conf.

I'll take a look at the bug. In the meantime, could you try the workaround
and see if it fixes your problem?
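
To be concrete, here's roughly what I mean by the workaround (just a sketch;
I'm guessing at your setup, and the image name below is simply the one from
your --conf):

  # Option A: export the property via SPARK_JAVA_OPTS on the host you
  # submit from, before calling spark-submit (this is how we pass our options)
  export SPARK_JAVA_OPTS="-Dspark.mesos.executor.docker.image=echinthaka/mesos-spark:0.23.1-1.6.0-2.6"

  # Option B: set it in $SPARK_HOME/conf/spark-defaults.conf instead of --conf
  spark.mesos.executor.docker.image  echinthaka/mesos-spark:0.23.1-1.6.0-2.6

Either way, drop that --conf flag from your spark-submit command and check
whether the executors come up in the right container.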

Tim

On Thu, Mar 10, 2016 at 10:08 AM, Eran Chinthaka Withana <
eran.chinth...@gmail.com> wrote:

> Hi Timothy
>
>> What version of Spark are you guys running?
>>
>
> I'm using Spark 1.6.0. You can see the Dockerfile I used here:
> https://github.com/echinthaka/spark-mesos-docker/blob/master/docker/mesos-spark/Dockerfile
>
>
>
>> And did you also set the working dir in your image to be the Spark home?
>>
>
> Yes, I did. You can see it here: https://goo.gl/8PxtV8
>
> Could it be because of https://issues.apache.org/jira/browse/SPARK-13258,
> as Guillaume pointed out above? As you can see, I'm passing the Docker
> image URI through spark-submit
> (--conf spark.mesos.executor.docker.image=echinthaka/mesos-spark:0.23.1-1.6.0-2.6)
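>
> To make that concrete, my submit command looks roughly like the following
> (the master URL, main class, and application jar below are placeholders,
> not my actual values):
>
>   spark-submit \
>     --master mesos://<mesos-master>:5050 \
>     --conf spark.mesos.executor.docker.image=echinthaka/mesos-spark:0.23.1-1.6.0-2.6 \
>     --class <main-class> \
>     <application.jar>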
>
> Thanks,
> Eran
>
>
>
