Thank you all, sirs.
Mich, your clarification is appreciated.
On Sunday, 19 June 2016, 19:31, Mich Talebzadeh
wrote:
Thanks Jonathan for your points.
Thanks Jonathan for your points
I am aware of the fact that yarn-client and yarn-cluster are both deprecated (they still work in 1.6.1), hence the new nomenclature.
Bear in mind this is what I stated in my notes:
"YARN Cluster Mode: the Spark driver runs inside an application master process which is managed by YARN on the cluster."
Mich, what Jacek is saying is not that you implied that YARN relies on two masters. He's just clarifying that yarn-client and yarn-cluster modes really both use the same (type of) master (simply "yarn"). In fact, if you specify "--master yarn-client" or "--master yarn-cluster", spark-submit translates it into "--master yarn" with the corresponding deploy mode.
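That equivalence can be sketched with spark-submit itself; the class name and jar below are placeholders, not part of the original thread:

```shell
# Deprecated 1.x form: the deploy mode is baked into the master URL
spark-submit --master yarn-cluster --class com.example.MyApp myapp.jar

# Equivalent current form: a single "yarn" master plus an explicit deploy mode
spark-submit --master yarn --deploy-mode cluster --class com.example.MyApp myapp.jar

# Client mode: the driver runs inside the submitting process
spark-submit --master yarn --deploy-mode client --class com.example.MyApp myapp.jar
```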
Good points, but I am an experimentalist.
In local mode with:
--master local
Spark will start with one thread, equivalent to --master local[1]. You can also start with more than one thread by specifying the number of threads *k* in --master local[k].
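As a sketch, those master URL variants look like this on the command line (the jar name is a placeholder):

```shell
spark-submit --master local      myapp.jar   # one worker thread, same as local[1]
spark-submit --master local[4]   myapp.jar   # four worker threads
spark-submit --master "local[*]" myapp.jar   # one thread per logical core (quote the * for the shell)
```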
On Sun, Jun 19, 2016 at 12:30 PM, Mich Talebzadeh
wrote:
> Spark Local - Spark runs on the local host. This is the simplest set up and
> best suited for learners who want to understand different concepts of Spark
> and those performing unit testing.
Spark works in different modes: local (resources are not managed by Spark or any external manager) and standalone (Spark itself manages resources), plus others (see below).
These are from my notes, excluding Mesos, which I have not used:
- Spark Local - Spark runs on the local host. This is the simplest setup, best suited for learners who want to understand different concepts of Spark and for unit testing.
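For standalone mode (Spark's own cluster manager), a minimal bring-up sketch using the 1.6-era sbin scripts; `master-host` and `myapp.jar` are placeholders:

```shell
# Start the standalone master (its log prints the spark:// URL)
$SPARK_HOME/sbin/start-master.sh

# Start a worker and register it with that master
$SPARK_HOME/sbin/start-slave.sh spark://master-host:7077

# Submit an application against the standalone master
spark-submit --master spark://master-host:7077 myapp.jar
```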
There are many technical differences internally, though usage is almost the same for each.
Yeah, in standalone mode Spark runs as a cluster: see
http://spark.apache.org/docs/1.6.1/cluster-overview.html
// maropu
On Sun, Jun 19, 2016 at 6:14 PM, Ashok Kumar
thank you
What are the main differences between local mode and standalone mode? I understand local mode does not support a cluster. Is that the only difference?
On Sunday, 19 June 2016, 9:52, Takeshi Yamamuro
wrote:
Hi,
In local mode, Spark runs in a single JVM that has a master and one executor with `k` threads.
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/scheduler/local/LocalSchedulerBackend.scala#L94
// maropu
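A quick way to see that single-JVM behaviour for yourself, assuming a 1.6-era spark-shell is on the PATH:

```shell
# Launch an interactive shell backed by a local[4] master
spark-shell --master "local[4]"
# Inside the shell:
#   scala> sc.master               // local[4]
#   scala> sc.defaultParallelism   // 4 -- the scheduler sees four threads as "cores"
```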
On Sun, Jun 19, 2016 at 5:39 PM, Ashok Kumar
Hi,
I have been told Spark in local mode is simplest for testing. The Spark documentation covers little on local mode except the cores used in --master local[k].
Where are the driver program, executor and resources? Do I need to start worker threads, and how many apps can I run safely without
Hi,
Did you resolve this? I have the same questions.
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Running-Spark-in-Local-Mode-tp22279p23278.html
Sent from the Apache Spark User List mailing list archive at Nabble.com
Running with local[32] did not yield 32 worker threads.
Any recommendations? Thanks.
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Running-Spark-in-local-mode-seems-to-ignore-local-N-tp22851.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
Is there any way to set the max memory used by each worker thread/node?
I can only find how to set the memory for each executor (spark.executor.memory).
Thank you!
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Running-Spark-in-Local-Mode-tp22279.html
Sent from the Apache Spark User List mailing list archive at Nabble.com
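On the memory question: in local mode the driver, master and executor all share one JVM, so there is no separate executor process to size; the single JVM's heap is what matters. A hedged sketch (jar name is a placeholder):

```shell
# Give the one local-mode JVM 4 GB of heap; per-executor memory settings
# do not carve out a separate process in local mode
spark-submit --master local[4] --driver-memory 4g myapp.jar
```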
I will upgrade from Spark 1.0.2 to 1.1.0 in the next day or so to see if that helps.
Does anyone have any experience on the matter?
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Running-Spark-in-Local-Mode-vs-Single-Node-Cluster-tp14834.html
Sent from the Apache Spark User List mailing list archive at Nabble.com