Filed https://issues.apache.org/jira/browse/SPARK-6653
On Sun, Mar 29, 2015 at 8:18 PM, Shixiong Zhu zsxw...@gmail.com wrote:
LGTM. Could you open a JIRA and send a PR? Thanks.
Best Regards,
Shixiong Zhu
2015-03-28 7:14 GMT+08:00 Manoj Samel manojsamelt...@gmail.com:
I looked at the 1.3.0 code and figured out where this can be added.
In org.apache.spark.deploy.yarn, ApplicationMaster.scala line 282 is:
actorSystem = AkkaUtils.createActorSystem("sparkYarnAM",
  Utils.localHostName, 0,
  conf = sparkConf, securityManager = securityMgr)._1
If I change it to below,
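For context on the line quoted above: the third argument to createActorSystem is the port, and passing 0 asks the OS for an ephemeral port, which is why the AM comes up on a different port (such as 44071) on each run. Below is a minimal, Spark-free sketch of that difference using plain java.net.ServerSocket; the fixed port number is arbitrary, chosen only for illustration. A configurable AM port, as proposed in this thread, would amount to threading a value from sparkConf into that third argument instead of the hard-coded 0.

```java
import java.io.IOException;
import java.net.ServerSocket;

public class PortDemo {
    // Binds a listening socket and reports the port actually obtained.
    // requestedPort == 0 lets the OS pick a free ephemeral port (the current
    // ApplicationMaster behavior); a nonzero value binds exactly that port,
    // or fails if it is already taken.
    static int bindAndReport(int requestedPort) throws IOException {
        try (ServerSocket socket = new ServerSocket(requestedPort)) {
            return socket.getLocalPort();
        }
    }

    public static void main(String[] args) throws IOException {
        System.out.println("ephemeral: " + bindAndReport(0));     // random, e.g. 44071
        System.out.println("fixed:     " + bindAndReport(38117)); // arbitrary demo port
    }
}
```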
There is no configuration for it now.
Best Regards,
Shixiong Zhu
2015-03-26 7:13 GMT+08:00 Manoj Samel manojsamelt...@gmail.com:
There may be firewall rules limiting the ports between the host running Spark and the Hadoop cluster. In that case, not all ports are allowed. Can it be a range of ports that can be specified?
On Wed, Mar 25, 2015 at 4:06 PM, Shixiong Zhu zsxw...@gmail.com wrote:
It's a random port to avoid port conflicts, since multiple AMs can run on the same machine. Why do you need a fixed port?
Best Regards,
Shixiong Zhu
2015-03-26 6:49 GMT+08:00 Manoj Samel manojsamelt...@gmail.com:
Spark 1.3, Hadoop 2.5, Kerberos
When running spark-shell in yarn client mode, it shows the following message with a random port every time (44071 in the example below). Is there a way to set that port to a specific port? It does not seem to be part of the ports specified in
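On the range question raised in this thread: a common pattern (similar in spirit to how Spark handles bind conflicts elsewhere via spark.port.maxRetries) is to start at a base port and retry successive ports until one binds. Here is a hedged, standalone sketch of that loop in plain Java; the port numbers and retry count are illustrative, not Spark's actual defaults or API.

```java
import java.io.IOException;
import java.net.ServerSocket;

public class PortRange {
    // Tries each port in [basePort, basePort + maxRetries] and returns the
    // first ServerSocket that binds; rethrows the last failure if none do.
    static ServerSocket bindInRange(int basePort, int maxRetries) throws IOException {
        IOException last = null;
        for (int port = basePort; port <= basePort + maxRetries; port++) {
            try {
                return new ServerSocket(port);
            } catch (IOException e) {
                last = e; // port in use or not permitted; try the next one
            }
        }
        throw last;
    }

    // Occupies basePort, then binds the next free port in the range,
    // demonstrating the retry; returns the port that was finally bound.
    static int demoRetry(int basePort) throws IOException {
        try (ServerSocket blocker = new ServerSocket(basePort);
             ServerSocket next = bindInRange(basePort, 10)) {
            return next.getLocalPort();
        }
    }

    public static void main(String[] args) throws IOException {
        System.out.println("bound " + demoRetry(38200)); // next port after the blocked base
    }
}
```

demoRetry occupies the base port first, so the loop demonstrably falls through to the next port in the range, which is the behavior a firewall-constrained deployment would rely on.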