Re: Worker Spark Port

2015-05-15 Thread James King
So I'm using code like this to use specific ports:

val conf = new SparkConf()
  .setMaster(master)
  .setAppName("namexxx")
  .set("spark.driver.port", "51810")
  .set("spark.fileserver.port", "51811")
  .set("spark.broadcast.port", "51812")
  .set("spark.replClassServer.port", "51813")
  .set("spark.blockManager.port", "51814")
  .set("spark.executor.port", "51815")

My question now is: will the master forward the spark.executor.port
value to the worker when it hands it a task to run?

Also, the property spark.executor.port is different from the Worker's
Spark port; how can I make the Worker run on a specific port?

Regards

jk


On Wed, May 13, 2015 at 7:51 PM, James King jakwebin...@gmail.com wrote:

 Indeed, many thanks.


 On Wednesday, 13 May 2015, Cody Koeninger c...@koeninger.org wrote:

 I believe most ports are configurable at this point; look at

 http://spark.apache.org/docs/latest/configuration.html

 search for .port

 On Wed, May 13, 2015 at 9:38 AM, James King jakwebin...@gmail.com
 wrote:

 I understood that this port value is randomly selected.

 Is there a way to enforce which Spark port a Worker should use?





Re: Worker Spark Port

2015-05-15 Thread ayan guha
Hi

I think you are mixing things a bit.

Worker is part of the cluster, so it is governed by the cluster manager. If
you are running a standalone cluster, you can modify spark-env.sh and
configure SPARK_WORKER_PORT.
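For example, a minimal spark-env.sh fragment might look like this (the port
numbers are arbitrary placeholders, not values from this thread):

```shell
# spark-env.sh on each standalone-cluster node:
# pin the Worker's RPC port and, optionally, its web UI port.
export SPARK_WORKER_PORT=51800
export SPARK_WORKER_WEBUI_PORT=51801
```

Each Worker reads this file at startup, so every node can be given its own
fixed port if needed.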

Executors, on the other hand, are bound to an application, i.e. a
SparkContext. Thus you modify executor properties through the context.

So, master != driver and executor != worker.
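The same application-side ports can also be pinned at submit time rather than
in code; a sketch of the equivalent spark-submit invocation (the master host,
jar name, and port values are placeholders):

```shell
# Each SparkConf .set(key, value) pair has an equivalent --conf flag;
# values are always passed as strings.
spark-submit \
  --master spark://master-host:7077 \
  --conf spark.driver.port=51810 \
  --conf spark.blockManager.port=51814 \
  my-app.jar
```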

Best
Ayan

On Fri, May 15, 2015 at 7:52 PM, James King jakwebin...@gmail.com wrote:

 So I'm using code like this to use specific ports:

 val conf = new SparkConf()
   .setMaster(master)
   .setAppName("namexxx")
   .set("spark.driver.port", "51810")
   .set("spark.fileserver.port", "51811")
   .set("spark.broadcast.port", "51812")
   .set("spark.replClassServer.port", "51813")
   .set("spark.blockManager.port", "51814")
   .set("spark.executor.port", "51815")

 My question now is: will the master forward the spark.executor.port value
 to the worker when it hands it a task to run?

 Also, the property spark.executor.port is different from the Worker's
 Spark port; how can I make the Worker run on a specific port?

 Regards

 jk




-- 
Best Regards,
Ayan Guha


Re: Worker Spark Port

2015-05-15 Thread James King
I think this answers my question:

Executors, on the other hand, are bound to an application, i.e. a
SparkContext. Thus you modify executor properties through the context.

Many Thanks.

jk

On Fri, May 15, 2015 at 3:23 PM, ayan guha guha.a...@gmail.com wrote:

 Hi

 I think you are mixing things a bit.

 Worker is part of the cluster, so it is governed by the cluster manager.
 If you are running a standalone cluster, you can modify spark-env.sh and
 configure SPARK_WORKER_PORT.

 Executors, on the other hand, are bound to an application, i.e. a
 SparkContext. Thus you modify executor properties through the context.

 So, master != driver and executor != worker.

 Best
 Ayan




Worker Spark Port

2015-05-13 Thread James King
I understood that this port value is randomly selected.

Is there a way to enforce which Spark port a Worker should use?


Re: Worker Spark Port

2015-05-13 Thread Cody Koeninger
I believe most ports are configurable at this point; look at

http://spark.apache.org/docs/latest/configuration.html

search for .port

On Wed, May 13, 2015 at 9:38 AM, James King jakwebin...@gmail.com wrote:

 I understood that this port value is randomly selected.

 Is there a way to enforce which Spark port a Worker should use?



Re: Worker Spark Port

2015-05-13 Thread James King
Indeed, many thanks.

On Wednesday, 13 May 2015, Cody Koeninger c...@koeninger.org wrote:

 I believe most ports are configurable at this point, look at

 http://spark.apache.org/docs/latest/configuration.html

 search for .port

 On Wed, May 13, 2015 at 9:38 AM, James King jakwebin...@gmail.com
 wrote:

 I understood that this port value is randomly selected.

 Is there a way to enforce which Spark port a Worker should use?