You can find several presentations on this on the Spark Summit web page.

Generally, in the container context you also have to decide whether to run one 
cluster for all applications or one cluster per application.

I'm not sure, though, why you want to run on just one node. If you have only one 
node, you may not want to use Spark at all.
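
If it really is a single node, a local-mode SparkSession inside the container may 
be all you need, so you would not have to run standalone master/worker processes at 
all. A minimal sketch, assuming pyspark is installed in the image ("single-node-test" 
is just a placeholder app name):

    # Local-mode Spark job; "local[*]" uses all cores of the single node,
    # so no separate master/worker processes are needed inside the container.
    from pyspark.sql import SparkSession

    spark = (SparkSession.builder
             .master("local[*]")
             .appName("single-node-test")
             .getOrCreate())

    # Quick sanity check that the session works.
    df = spark.range(1000)
    print(df.selectExpr("sum(id)").collect())

    spark.stop()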

> On 25. Dec 2017, at 09:54, sujeet jog <sujeet....@gmail.com> wrote:
> 
> Folks, 
> 
> Can you share your experience of running Spark under Docker on a single 
> local / standalone node?
> Is anybody using it in production environments? We have an existing Docker 
> Swarm deployment, and I want to run Spark in a separate FAT VM hooked into / 
> controlled by Docker Swarm.
> 
> I know there is no official clustering support for running Spark under Docker 
> Swarm, but can it be used to run on a single FAT VM controlled by Swarm?
> 
> Any insights on this would be appreciated / production mode experiences etc.
> 
> Thanks, 
> Sujeet

