Ah so I see setMaster("yarn-client"). Hmm.

What I was ultimately trying to do was develop with Eclipse on my Windows box 
and have the code point to my cluster so it executes there instead of on my 
local Windows machine. Perhaps I’m going about this wrong.
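
A minimal sketch of that setup, assuming Spark 1.x on an HDP cluster (the
object name is a placeholder, and the cluster's core-site.xml/yarn-site.xml
are assumed to be on the Eclipse classpath, e.g. via HADOOP_CONF_DIR):

    import org.apache.spark.{SparkConf, SparkContext}

    object RemoteYarnSketch {
      def main(args: Array[String]): Unit = {
        // yarn-client mode: the driver runs here (in Eclipse) and must be
        // able to reach the ResourceManager, which it discovers from the
        // Hadoop config files on the classpath.
        val conf = new SparkConf()
          .setAppName("RemoteYarnSketch")
          .setMaster("yarn-client") // Spark 1.x string; Spark 2.x uses "yarn"
        val sc = new SparkContext(conf)
        println(sc.parallelize(1 to 100).sum()) // runs on the cluster's executors
        sc.stop()
      }
    }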

Adaryl "Bob" Wakefield, MBA
Principal
Mass Street Analytics, LLC
913.938.6685
www.massstreet.net
www.linkedin.com/in/bobwakefieldmba
Twitter: @BobLovesData

From: Koert Kuipers [mailto:ko...@tresata.com]
Sent: Tuesday, March 7, 2017 7:47 PM
To: Adaryl Wakefield <adaryl.wakefi...@hotmail.com>
Cc: user@spark.apache.org
Subject: Re: finding Spark Master

assuming this is running on yarn, there is really no spark-master. every job 
creates its own "master" within a yarn application.
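
In other words, on YARN the master string only names the cluster manager;
YARN's ResourceManager launches a per-application ApplicationMaster, so there
is no host:port to look up. A small illustration, assuming Spark 1.x:

    import org.apache.spark.{SparkConf, SparkContext}

    // There is no spark-master daemon to discover on YARN: the master is
    // just the mode name "yarn-client" or "yarn-cluster" (Spark 1.x), or
    // "yarn" in Spark 2.x.
    val sc = new SparkContext(
      new SparkConf().setAppName("yarn-sketch").setMaster("yarn-client"))
    println(sc.master) // prints "yarn-client", not a spark://host:port URL
    sc.stop()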

On Tue, Mar 7, 2017 at 6:27 PM, Adaryl Wakefield 
<adaryl.wakefi...@hotmail.com> wrote:
I’m running a three-node cluster with Spark and Hadoop as part of an HDP 
stack. How do I find my Spark Master? I’m just seeing the clients. I’m 
trying to figure out what goes in setMaster() aside from local[*].
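
For reference, the usual setMaster() values besides local[*]; the spark://
and mesos:// host/port pairs below are placeholders:

    import org.apache.spark.SparkConf

    // Master URLs Spark accepts; hostnames and ports are placeholders.
    val masters = Seq(
      "local[*]",           // all cores on the local machine
      "local[4]",           // four local worker threads
      "spark://host:7077",  // standalone Spark master daemon (not used on HDP)
      "mesos://host:5050",  // Apache Mesos master
      "yarn-client",        // YARN, driver on the submitting machine (Spark 1.x)
      "yarn-cluster"        // YARN, driver inside the cluster (Spark 1.x)
    )
    // On an HDP stack Spark runs on YARN, so one of the yarn modes applies:
    val conf = new SparkConf().setAppName("sketch").setMaster("yarn-client")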

Adaryl "Bob" Wakefield, MBA
Principal
Mass Street Analytics, LLC
913.938.6685
www.massstreet.net
www.linkedin.com/in/bobwakefieldmba
Twitter: @BobLovesData
