To: Koert Kuipers
Cc: user@spark.apache.org
Subject: RE: finding Spark Master
Ah, so I see setMaster('yarn-client'). Hmm.
What I was ultimately trying to do was develop with Eclipse on my Windows box
and have the code point to my cluster, so it executes there instead of on my
local Windows machine.
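One common way to do this (a sketch, not something confirmed in this thread) is to avoid hardcoding setMaster in the application at all, and instead supply the master at launch time with spark-submit from a machine that has the cluster's Hadoop client configs. The jar and class names below are placeholders:

```shell
# Sketch only: assumes spark-submit is on the PATH and that
# HADOOP_CONF_DIR points at copies of the cluster's
# core-site.xml / yarn-site.xml (path below is a typical HDP default).
export HADOOP_CONF_DIR=/etc/hadoop/conf

# Placeholder class and jar names -- substitute your own application.
spark-submit \
  --master yarn \
  --deploy-mode client \
  --class com.example.MyApp \
  myapp-assembly.jar
```

With the master left out of the code, the same jar can run locally or on the cluster without a rebuild.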
To: Adaryl Wakefield <adaryl.wakefi...@hotmail.com>
Cc: user@spark.apache.org
Subject: Re: finding Spark Master
assuming this is running on yarn there is really no spark-master. every job
creates its own "master" within a yarn application.
On Tue, Mar 7, 2017 at 6:27 PM, Adaryl Wakefield <
adaryl.wakefi...@hotmail.com> wrote:
> I'm running a three-node cluster with Spark and Hadoop as
> part of
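To make the point above concrete: on YARN there is no standing Spark master daemon to "find" -- each Spark job runs as its own YARN application with its own ApplicationMaster. A sketch of how to see this on a cluster node (requires a live YARN cluster, so this is illustrative only):

```shell
# List the currently running YARN applications; each Spark job appears
# as a separate application with its own ApplicationMaster and a
# tracking URL that points at that job's Spark UI.
yarn application -list -appStates RUNNING
```

The tracking URL in the output is where the per-job Spark UI lives, which plays the role people often expect a "Spark Master" UI to play on a standalone cluster.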
Twitter: @BobLovesData
From: ayan guha [mailto:guha.a...@gmail.com]
Sent: Tuesday, March 7, 2017 5:59 PM
To: Adaryl Wakefield <adaryl.wakefi...@hotmail.com>; user@spark.apache.org
Subject: Re: finding Spark Master
yarn-client or yarn-cluster
On Wed, 8 Mar 2017 at 10:28 am, Adaryl Wakefield <
adaryl.wakefi...@hotmail.com> wrote:
> I'm running a three-node cluster with Spark and Hadoop as part of an
> HDP stack. How do I find my Spark Master? I'm just seeing the
> clients. I'm trying to figure
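For reference, "yarn-client" and "yarn-cluster" are the pre-Spark-2.0 master strings; since Spark 2.0 the same choice is expressed as --master yarn plus a --deploy-mode flag. A hedged sketch of the two launch forms (app.jar is a placeholder):

```shell
# Driver runs on the submitting machine; handy for interactive work
# and for seeing driver logs locally:
spark-submit --master yarn --deploy-mode client app.jar

# Driver runs inside the YARN ApplicationMaster on the cluster;
# typical for production jobs:
spark-submit --master yarn --deploy-mode cluster app.jar
```

In client mode the driver (and thus the Spark UI for the job) is on your machine; in cluster mode both live on the cluster, reachable via the YARN application's tracking URL.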