I'm assuming by spark-client you mean the Spark driver program. In that
case you can pick any machine (say Node 7), create your driver program on
it, and use spark-submit to submit it to the cluster. Alternatively, if you
create the SparkContext within your driver program (specifying all the
properties yourself), you can simply run it with sbt run.
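
As a rough illustration of the second option, here is a minimal sketch of
such a self-contained driver (assuming Spark 1.x on the classpath and
HADOOP_CONF_DIR/YARN_CONF_DIR pointing at your cluster configuration; the
object name and app name are just placeholders):

    import org.apache.spark.{SparkConf, SparkContext}

    // Minimal self-contained driver: all cluster properties are set on the
    // SparkConf itself, so the app can be started with plain `sbt run`
    // instead of spark-submit.
    object MyDriver {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("my-yarn-app")   // placeholder app name
          .setMaster("yarn-client")    // driver runs locally, executors on YARN
        val sc = new SparkContext(conf)

        // trivial sanity job to confirm the executors are reachable
        val evens = sc.parallelize(1 to 1000).filter(_ % 2 == 0).count()
        println(s"even numbers: $evens")

        sc.stop()
      }
    }

If you go the spark-submit route instead, leave the master out of the code
and pass it on the command line with --master yarn-cluster (or yarn-client).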

Thanks
Best Regards

On Sun, Jun 14, 2015 at 6:17 AM, MrAsanjar . <afsan...@gmail.com> wrote:

> I have the following Hadoop & Spark cluster node configuration:
> Nodes 1 & 2 are the ResourceManager and NameNode respectively
> Nodes 3, 4, and 5 each include a NodeManager & DataNode
> Node 7 is the Spark master, configured to run in yarn-client or yarn-cluster mode
> I have tested it and it works fine.
> Are there any instructions on how to set up a Spark client in cluster mode?
> I am not sure if I am doing it right.
> Thanks in advance
>
