No, the name originates from the standard standalone mode, with a yarn prefix added 
to distinguish it, I think. But it does run on the YARN cluster.

As for how they run and the difference between yarn-standalone mode and yarn-client 
mode, the doc has the details. In short, in yarn-standalone mode the 
SparkContext (and thus the Driver) runs together with the AM on YARN, while in 
yarn-client mode the SparkContext runs on the local machine where you launch the 
cmd. In either mode, the executors all run on the YARN cluster.
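
To make it concrete, here's a minimal sketch of the two launch styles (the jar 
paths are placeholders for your assembly and app jars):

# yarn-standalone: SparkContext/Driver runs inside the AM on the cluster
SPARK_JAR=<assembly.jar> ./spark-class org.apache.spark.deploy.yarn.Client \
  --jar <app.jar> --class org.apache.spark.examples.SparkPi --args yarn-standalone

# yarn-client: SparkContext/Driver runs on the machine where you launch the cmd
SPARK_JAR=<assembly.jar> SPARK_YARN_APP_JAR=<app.jar> \
  ./run-example org.apache.spark.examples.SparkPi yarn-client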



Best Regards,
Raymond Liu

From: Azuryy Yu [mailto:azury...@gmail.com]
Sent: Tuesday, December 17, 2013 1:19 PM
To: user@spark.incubator.apache.org
Subject: Re: About spark.driver.host

Hi Raymond,

I specified Master and Slaves in the conf.

As for yarn-standalone and yarn-client, I have some confusion:

If I use yarn-standalone, does that mean it's not run on the YARN cluster, only 
pseudo-distributed?


On Tue, Dec 17, 2013 at 1:03 PM, Liu, Raymond 
<raymond....@intel.com> wrote:
Hmm, I don't see which mode you are trying to use. Did you specify the MASTER in 
the conf file?

I think in the running-on-yarn doc, the example for yarn-standalone mode mentions 
that you also need to pass in --args yarn-standalone for Client etc.
And if using yarn-client mode, you don't need to invoke Client yourself; 
instead use something like:



SPARK_JAR=xxx SPARK_YARN_APP_JAR=xxx ./run-example 
org.apache.spark.examples.SparkPi yarn-client
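
For instance, with the jars from a 0.8.1 build like the one below (the 
run-example path is assumed relative to your working directory):

SPARK_JAR=spark-0.8.1/lib/spark-assembly_2.9.3-0.8.1-incubating-hadoop1.2.1.jar \
SPARK_YARN_APP_JAR=spark-0.8.1/spark-examples_2.9.3-0.8.1-incubating.jar \
./spark-0.8.1/run-example org.apache.spark.examples.SparkPi yarn-client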

Best Regards,
Raymond Liu

From: Azuryy Yu [mailto:azury...@gmail.com]
Sent: Tuesday, December 17, 2013 12:43 PM
To: user@spark.incubator.apache.org
Subject: Re: About spark.driver.host

Raymond:
One additional note: yes, I built Spark-0.8.1 with -Pnew-yarn, and I followed 
the running-on-yarn doc strictly.

Everything looks good in the Spark web UI.

On Tue, Dec 17, 2013 at 12:36 PM, Azuryy Yu 
<azury...@gmail.com> wrote:
Thanks, Raymond!
My command for YARN mode:
SPARK_JAR=spark-0.8.1/lib/spark-assembly_2.9.3-0.8.1-incubating-hadoop1.2.1.jar 
./spark-0.8.1/bin/spark-class org.apache.spark.deploy.yarn.Client --jar 
spark-0.8.1/spark-examples_2.9.3-0.8.1-incubating.jar --class 
org.apache.spark.examples.SparkPi


Please ignore the hadoop version; it's our customized build, which is actually hadoop-2.x.

But if I don't set spark.driver.*, the App Master cannot start. Here is the log:
13/12/17 11:07:13 INFO yarn.ApplicationMaster: Starting the user JAR in a 
separate Thread
13/12/17 11:07:13 INFO yarn.ApplicationMaster: Waiting for Spark driver to be 
reachable.
13/12/17 11:07:13 WARN yarn.ApplicationMaster: Failed to connect to driver at 
null:null, retrying ...
Usage: SparkPi <master> [<slices>]
13/12/17 11:07:13 WARN yarn.ApplicationMaster: Failed to connect to driver at 
null:null, retrying ...
13/12/17 11:07:13 INFO yarn.ApplicationMaster: AppMaster received a signal.
13/12/17 11:07:13 WARN yarn.ApplicationMaster: Failed to connect to driver at 
null:null, retrying ...


After retrying 'spark.yarn.applicationMaster.waitTries' times (default 10), the job failed.
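
(The 'Usage: SparkPi <master> [<slices>]' line in the log above suggests SparkPi 
never received its master argument, so no SparkContext, and hence no driver, was 
ever started. Per Raymond's reply above, a likely fix is adding --args 
yarn-standalone to the Client invocation, roughly:

SPARK_JAR=spark-0.8.1/lib/spark-assembly_2.9.3-0.8.1-incubating-hadoop1.2.1.jar \
./spark-0.8.1/bin/spark-class org.apache.spark.deploy.yarn.Client \
  --jar spark-0.8.1/spark-examples_2.9.3-0.8.1-incubating.jar \
  --class org.apache.spark.examples.SparkPi \
  --args yarn-standalone)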




On Tue, Dec 17, 2013 at 12:07 PM, Liu, Raymond 
<raymond....@intel.com> wrote:
It's what the document says. For yarn-standalone mode, it will be the host where 
the Spark AM runs, while for yarn-client mode, it will be the local host where 
you run the cmd.

And what's the cmd you run SparkPi with? I think you actually don't need to set 
spark.driver.host manually for YARN mode; SparkContext will handle it for you 
automatically and pass it to the AM and Executors to use to connect to the Driver.

Did you follow the guide in docs/running-on-yarn.md?


Best Regards,
Raymond Liu

From: Azuryy Yu [mailto:azury...@gmail.com]
Sent: Tuesday, December 17, 2013 11:16 AM
To: user@spark.incubator.apache.org
Subject: About spark.driver.host

Hi,

I am using spark-0.8.1, and what's the meaning of spark.driver.host? I ran 
SparkPi and it failed (in either yarn-standalone or yarn-client mode).

The document describes it as 'Hostname or IP address for the driver to listen on.', 
but which host will the Driver listen on? The RM on YARN? Assuming that, I 
configured spark.driver.host in spark-env.sh as the resource manager host and 
port:
export SPARK_DAEMON_JAVA_OPTS="-Dspark.driver.host=10.2.8.1 
-Dspark.driver.port=8032"

But it doesn't work. I found this in the log:
WARN yarn.ApplicationMaster: Failed to connect to driver at null:null, retrying 
...

Even when I added these two system properties to the JAVA_OPTS in 
bin/spark-class, it still doesn't work. Please help.

Any inputs are appreciated.


