Re: Running Spark in local mode

2016-06-19 Thread Ashok Kumar
Thank you all, sirs. Appreciated, Mich, your clarification. On Sunday, 19 June 2016, 19:31, Mich Talebzadeh wrote: Thanks, Jonathan, for your points. I am aware of the fact that yarn-client and yarn-cluster are both deprecated (they still work in 1.6.1), hence the new

Re: Running Spark in local mode

2016-06-19 Thread Mich Talebzadeh
Thanks, Jonathan, for your points. I am aware of the fact that yarn-client and yarn-cluster are both deprecated (they still work in 1.6.1), hence the new nomenclature. Bear in mind this is what I stated in my notes: "In YARN Cluster Mode, the Spark driver runs inside an application master process which is

Re: Running Spark in local mode

2016-06-19 Thread Jonathan Kelly
Mich, what Jacek is saying is not that you implied that YARN relies on two masters. He's just clarifying that yarn-client and yarn-cluster modes are really both using the same (type of) master (simply "yarn"). In fact, if you specify "--master yarn-client" or "--master yarn-cluster", spark-submit
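The point Jonathan makes — that "--master yarn-client" and "--master yarn-cluster" both resolve to the single "yarn" master plus a deploy mode — can be sketched as a small resolver. This is a simplified illustration of the convention only, not spark-submit's actual resolution code, and `normalize_master` is a hypothetical name:

```python
def normalize_master(master, deploy_mode=None):
    """Map the deprecated yarn-client / yarn-cluster master strings onto the
    newer (master, deploy-mode) pair. A sketch of the convention, not
    Spark's own logic."""
    if master == "yarn-client":
        return ("yarn", "client")
    if master == "yarn-cluster":
        return ("yarn", "cluster")
    # Any other master string is passed through; deploy mode defaults to client.
    return (master, deploy_mode or "client")

print(normalize_master("yarn-cluster"))  # ('yarn', 'cluster')
print(normalize_master("yarn-client"))   # ('yarn', 'client')
```

Either spelling therefore lands on the same (type of) master, which is Jacek's clarification.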

Re: Running Spark in local mode

2016-06-19 Thread Mich Talebzadeh
Good points, but I am an experimentalist. In local mode I have this: with --master local, Spark will start with one thread, equivalent to --master local[1]. You can also start with more than one thread by specifying the number of threads *k* in --master local[k]. You can also start
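The local / local[1] / local[k] convention described above can be sketched as a parser. This is a simplified illustration of the master-URL format, not Spark's actual parsing code; `parse_local_master` is a hypothetical name:

```python
import os
import re

def parse_local_master(master):
    """Return the worker-thread count implied by a local master URL:
    "local" -> 1 thread, "local[k]" -> k threads, "local[*]" -> one per core.
    A sketch of the convention, not Spark's own parser."""
    if master == "local":
        return 1
    m = re.fullmatch(r"local\[(\*|\d+)\]", master)
    if m is None:
        raise ValueError("not a local master URL: %r" % master)
    token = m.group(1)
    return os.cpu_count() if token == "*" else int(token)

print(parse_local_master("local"))     # 1
print(parse_local_master("local[4]"))  # 4
```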

Re: Running Spark in local mode

2016-06-19 Thread Jacek Laskowski
On Sun, Jun 19, 2016 at 12:30 PM, Mich Talebzadeh wrote: > Spark Local - Spark runs on the local host. This is the simplest set up and > best suited for learners who want to understand different concepts of Spark > and those performing unit testing. There are also the

Re: Running Spark in local mode

2016-06-19 Thread Mich Talebzadeh
Spark works in different modes: local (neither Spark nor anything else manages resources) and standalone (Spark itself manages resources), plus others (see below). These are from my notes, excluding Mesos, which I have not used. - Spark Local - Spark runs on the local host. This is the

Re: Running Spark in local mode

2016-06-19 Thread Takeshi Yamamuro
There are many technical differences internally, though how you use them is almost the same. Yes, in standalone mode, Spark runs as a cluster: see http://spark.apache.org/docs/1.6.1/cluster-overview.html // maropu On Sun, Jun 19, 2016 at 6:14 PM, Ashok Kumar

Re: Running Spark in local mode

2016-06-19 Thread Ashok Kumar
Thank you. What are the main differences between local mode and standalone mode? I understand local mode does not support a cluster. Is that the only difference? On Sunday, 19 June 2016, 9:52, Takeshi Yamamuro wrote: Hi, In a local mode, spark runs in a single

Re: Running Spark in local mode

2016-06-19 Thread Takeshi Yamamuro
Hi, In local mode, Spark runs in a single JVM that has a master and one executor with `k` threads. https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/scheduler/local/LocalSchedulerBackend.scala#L94 // maropu On Sun, Jun 19, 2016 at 5:39 PM, Ashok Kumar
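The single-JVM picture above — one process whose lone executor owns k task slots — can be loosely mirrored with a thread pool. This is a toy analogy for how local[k] parallelism feels, not Spark's LocalSchedulerBackend; `run_tasks_locally` is a hypothetical name:

```python
from concurrent.futures import ThreadPoolExecutor

def run_tasks_locally(tasks, k):
    """Run zero-argument task callables on a pool of k threads in the
    current process, loosely mirroring local[k] mode where one in-process
    executor owns k task slots. Results keep task order."""
    with ThreadPoolExecutor(max_workers=k) as pool:
        return list(pool.map(lambda task: task(), tasks))

tasks = [lambda i=i: i * i for i in range(8)]
print(run_tasks_locally(tasks, 4))  # [0, 1, 4, 9, 16, 25, 36, 49]
```

There is no separate worker process to start: driver, "master", and executor all live in the one JVM.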

Running Spark in local mode

2016-06-19 Thread Ashok Kumar
Hi, I have been told Spark in local mode is simplest for testing. The Spark documentation covers little on local mode except the cores used in --master local[k]. Where are the driver program, executor and resources? Do I need to start worker threads, and how many apps can I run safely without

Re: Running Spark in Local Mode

2015-06-11 Thread mrm
Hi, Did you resolve this? I have the same questions. -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Running-Spark-in-Local-Mode-tp22279p23278.html Sent from the Apache Spark User List mailing list archive at Nabble.com

Re: Running Spark in local mode seems to ignore local[N]

2015-05-11 Thread Dmitry Goldenberg
[32] did not yield 32 worker threads. Any recommendations? Thanks. -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Running-Spark-in-local-mode-seems-to-ignore-local-N-tp22851.html

Re: Running Spark in local mode seems to ignore local[N]

2015-05-11 Thread Sean Owen
recommendations? Thanks. -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Running-Spark-in-local-mode-seems-to-ignore-local-N-tp22851.html

Re: Running Spark in local mode seems to ignore local[N]

2015-05-11 Thread Dmitry Goldenberg
. Running with local[32] did not yield 32 worker threads. Any recommendations? Thanks. -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Running-Spark-in-local-mode-seems-to-ignore-local-N-tp22851.html

Re: Running Spark in local mode seems to ignore local[N]

2015-05-11 Thread Sean Owen
? Thanks. -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Running-Spark-in-local-mode-seems-to-ignore-local-N-tp22851.html

Re: Running Spark in local mode seems to ignore local[N]

2015-05-11 Thread Marcelo Vanzin
-spark-user-list.1001560.n3.nabble.com/Running-Spark-in-local-mode-seems-to-ignore-local-N-tp22851.html

Re: Running Spark in local mode seems to ignore local[N]

2015-05-11 Thread Dmitry Goldenberg
threads. Any recommendations? Thanks. -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Running-Spark-in-local-mode-seems-to-ignore-local-N-tp22851.html

Re: Running Spark in local mode seems to ignore local[N]

2015-05-11 Thread Dmitry Goldenberg
recommendations? Thanks. -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Running-Spark-in-local-mode-seems-to-ignore-local-N-tp22851.html

Re: Running Spark in Local Mode

2015-03-29 Thread Saisai Shao
. Is there any way to set the max memory used by each worker thread/node? I only find that we can set the memory for each executor (spark.executor.memory). Thank you! -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Running-Spark-in-Local-Mode-tp22279.html

Running Spark in Local Mode

2015-03-29 Thread FreePeter
the memory for each executor? (spark.executor.memory) Thank you! -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Running-Spark-in-Local-Mode-tp22279.html
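Memory settings such as spark.executor.memory (and, relevant in local mode where driver and executor share one JVM, the driver memory setting) take JVM-style size strings like "512m" or "4g". The format can be sketched as below; this is a simplified illustration, not Spark's own size parser, and `parse_memory_string` is a hypothetical name:

```python
def parse_memory_string(s):
    """Convert a JVM-style memory size string ("1024k", "512m", "4g")
    to a byte count. A sketch of the format such settings accept,
    not Spark's actual parser; a bare number is taken as bytes."""
    units = {"k": 1024, "m": 1024 ** 2, "g": 1024 ** 3, "t": 1024 ** 4}
    s = s.strip().lower()
    if s and s[-1] in units:
        return int(s[:-1]) * units[s[-1]]
    return int(s)

print(parse_memory_string("512m"))  # 536870912
print(parse_memory_string("4g"))    # 4294967296
```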

Running Spark in Local Mode vs. Single Node Cluster

2014-09-22 Thread kriskalish
Spark 1.0.2 to 1.1.0 in the next day or so to see if that helps. Does anyone have any experience on the matter? -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Running-Spark-in-Local-Mode-vs-Single-Node-Cluster-tp14834.html