Re: spark driver in docker

2016-03-05 Thread Timothy Chen
Will need more information to help you: what commands did you use to launch the 
slave/master, and what error message did you see in the driver logs?

Tim


> On Mar 5, 2016, at 4:34 AM, Mailing List  wrote:
> 
> I am trying to do the same, but so far no luck...
> I have everything running inside Docker containers, including the Mesos master, 
> Mesos slave, Marathon, and the Spark Mesos cluster dispatcher.
> 
> But when I try to submit the job by running spark-submit in a Docker container, 
> it fails...
> 
> By the way, this setup is on a single CentOS 7 machine.
> 
> Let me know if you have any insights on how to make the submit work.
> 
> Ashish
> 
> Sent from my iPad
> 
>> On Mar 5, 2016, at 3:33 AM, Tamas Szuromi  
>> wrote:
>> 
>> Hi, have a look at http://spark.apache.org/docs/latest/configuration.html to 
>> see which ports need to be exposed. With Mesos we had a lot of problems with 
>> container networking, but yes, --net=host is a shortcut.
>> 
>> Tamas
>> 
>> 
>> 
>>> On 4 March 2016 at 22:37, yanlin wang  wrote:
>>> We would like to run multiple Spark drivers in Docker containers. Any 
>>> suggestions for the port exposure and network settings for Docker so that the 
>>> driver is reachable by the worker nodes? --net=host is the last thing we want to do.
>>> 
>>> Thx
>>> Yanlin
>> 


Re: spark driver in docker

2016-03-05 Thread Mailing List
I am trying to do the same, but so far no luck...
I have everything running inside Docker containers, including the Mesos master, 
Mesos slave, Marathon, and the Spark Mesos cluster dispatcher.

But when I try to submit the job by running spark-submit in a Docker container, 
it fails...

By the way, this setup is on a single CentOS 7 machine.

Let me know if you have any insights on how to make the submit work.
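
For reference, the submit I'm running looks roughly like this (the client image 
name, host IP, and jar URL are placeholders, and the dispatcher is assumed to be 
listening on its default port 7077):

  # Placeholder sketch of the failing submit: image name, host IP, and jar URL
  # are examples only; the MesosClusterDispatcher default port is assumed.
  docker run --rm my-spark-client \
    /opt/spark/bin/spark-submit \
      --master mesos://172.17.0.1:7077 \
      --deploy-mode cluster \
      --class org.apache.spark.examples.SparkPi \
      http://172.17.0.1:8000/spark-examples.jar 100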

Ashish

Sent from my iPad

> On Mar 5, 2016, at 3:33 AM, Tamas Szuromi  
> wrote:
> 
> Hi, have a look at http://spark.apache.org/docs/latest/configuration.html to 
> see which ports need to be exposed. With Mesos we had a lot of problems with 
> container networking, but yes, --net=host is a shortcut.
> 
> Tamas
> 
> 
> 
>> On 4 March 2016 at 22:37, yanlin wang  wrote:
>> We would like to run multiple Spark drivers in Docker containers. Any 
>> suggestions for the port exposure and network settings for Docker so that the 
>> driver is reachable by the worker nodes? --net=host is the last thing we want to do.
>> 
>> Thx
>> Yanlin
>> 
> 


Re: spark driver in docker

2016-03-05 Thread Tamas Szuromi
Hi, have a look at http://spark.apache.org/docs/latest/configuration.html to see
which ports need to be exposed. With Mesos we had a lot of problems with
container networking, but yes, --net=host is a shortcut.
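
As a rough sketch (the port numbers, hostnames, image name, and paths below are 
just placeholders), you can pin the driver-side ports in the Spark config and 
publish the same ports on the container:

  # Sketch: pin the driver ports in the Spark config and publish the same ports
  # on the container; all hostnames, ports, and paths here are examples only.
  docker run -d \
    -p 4040:4040 -p 7001:7001 -p 7002:7002 -p 7003:7003 \
    -e SPARK_PUBLIC_DNS=driver-host.example.com \
    my-spark-driver-image \
    /opt/spark/bin/spark-submit \
      --master mesos://mesos-master.example.com:5050 \
      --conf spark.driver.host=driver-host.example.com \
      --conf spark.driver.port=7001 \
      --conf spark.blockManager.port=7002 \
      --conf spark.fileserver.port=7003 \
      --conf spark.ui.port=4040 \
      --class org.apache.spark.examples.SparkPi \
      /path/to/spark-examples.jar

With the ports fixed and published like this, the workers and executors can 
reach the driver without resorting to --net=host.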

Tamas



On 4 March 2016 at 22:37, yanlin wang  wrote:

> We would like to run multiple Spark drivers in Docker containers. Any
> suggestions for the port exposure and network settings for Docker so that the
> driver is reachable by the worker nodes? --net=host is the last thing we want to do.
>
> Thx
> Yanlin