Subject: Re: How does one decide no of executors/cores/memory allocation?
I realize that there are a lot of ways to configure my application in Spark.
The part that is not clear is how I decide, for example, how many partitions
I should divide my data into, how much RAM I should have, and how many
workers to start by initializing SPARK_EXECUTOR_INSTANCES in the
spark_home/conf/spark-env.sh file.
Thanks
Himanshu
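For what it's worth, a minimal spark-env.sh sketch along the lines described above (the variable names are the standard ones Spark reads in YARN mode; the actual values here are purely illustrative, not recommendations):

```shell
# Illustrative spark-env.sh fragment (values are placeholders).
# When launching with spark-submit, the equivalent flags are
# --num-executors, --executor-cores and --executor-memory.
export SPARK_EXECUTOR_INSTANCES=4   # number of executors to start
export SPARK_EXECUTOR_CORES=2       # cores per executor
export SPARK_EXECUTOR_MEMORY=4G     # heap size per executor
```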
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/How-does-one-decide-no-of-executors-cores-memory-allocation-tp23326p23330
How do I decide how many partitions to break my data into, and how many
executors I should have? I guess memory and cores will be allocated based on
the number of executors I have.
Thanks
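As a rough starting point for the partition question: the Spark tuning guide suggests around 2-3 tasks per CPU core in the cluster. A tiny sketch of that heuristic (the function name and the sample numbers are hypothetical, not part of any Spark API):

```python
# Hypothetical rule-of-thumb helper, based on the common guideline of
# roughly 2-3 partitions (tasks) per executor core. Not an official formula.
def suggested_partitions(num_executors, cores_per_executor, tasks_per_core=3):
    """Return a starting point for the number of RDD partitions."""
    return num_executors * cores_per_executor * tasks_per_core

# e.g. 5 executors with 4 cores each, 3 tasks per core:
print(suggested_partitions(5, 4))  # -> 60
```

You would still tune from there based on data size and task skew; this only gives a first guess.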
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/How-does-one-decide-no-of-executors-cores-memory-allocation-tp23326.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.