Assuming this is running on YARN, there is really no standalone Spark master:
every job creates its own "master" (the YARN ApplicationMaster) within its own
YARN application.
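In practice that means setMaster() takes the string "yarn" rather than a
spark://host:port URL (older releases used "yarn-client"/"yarn-cluster"), and
it is more commonly supplied via spark-submit --master yarn than hard-coded.
A minimal sketch against the Scala SparkConf API, with a made-up app name:

    import org.apache.spark.{SparkConf, SparkContext}

    // "yarn" tells Spark to use YARN as the cluster manager; the
    // ResourceManager is located via HADOOP_CONF_DIR, not a URL.
    val conf = new SparkConf()
      .setAppName("example-app")   // hypothetical name
      .setMaster("yarn")
    val sc = new SparkContext(conf)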

On Tue, Mar 7, 2017 at 6:27 PM, Adaryl Wakefield <
adaryl.wakefi...@hotmail.com> wrote:

> I’m running a three-node cluster with Spark and Hadoop as part of an HDP
> stack. How do I find my Spark master? I’m only seeing the clients. I’m
> trying to figure out what goes in setMaster() aside from local[*].
>
>
>
> Adaryl "Bob" Wakefield, MBA
> Principal
> Mass Street Analytics, LLC
> 913.938.6685
>
> www.massstreet.net
>
> www.linkedin.com/in/bobwakefieldmba
> Twitter: @BobLovesData
>
>
>
