Re: How does one decide no of executors/cores/memory allocation?

2015-06-17 Thread nsalian

Re: How does one decide no of executors/cores/memory allocation?

2015-06-16 Thread shreesh

RE: How does one decide no of executors/cores/memory allocation?

2015-06-16 Thread Evo Eftimov
Subject: Re: How does one decide no of executors/cores/memory allocation? > I realize that there are many ways to configure my application in Spark. What is not clear is how I decide, for example, into how many partitions I should divide my data, how much RAM I should have, or how …
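
For the RAM and core part of the quoted question, a minimal sketch of the relevant per-executor settings follows; the 4g / 2-core values are placeholder assumptions, not recommendations from this thread, and the same properties map to the --executor-memory and --executor-cores flags of spark-submit.

    import org.apache.spark.{SparkConf, SparkContext}

    // Sketch only: placeholder values; the master URL is expected to come from spark-submit.
    // spark.executor.memory -> heap given to each executor JVM
    // spark.executor.cores  -> concurrent task slots per executor
    val conf = new SparkConf()
      .setAppName("executor-sizing-sketch")
      .set("spark.executor.memory", "4g")
      .set("spark.executor.cores", "2")
    val sc = new SparkContext(conf)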

Re: How does one decide no of executors/cores/memory allocation?

2015-06-16 Thread Himanshu Mehra
… and the number of workers to start can be set by initializing SPARK_EXECUTOR_INSTANCES in the spark_home/conf/spark-env.sh file. Thanks, Himanshu
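
As a rough illustration of the suggestion above: the variable itself lives in spark_home/conf/spark-env.sh, and the same executor count can also be expressed through the spark.executor.instances property, shown here with a placeholder count of 4.

    import org.apache.spark.{SparkConf, SparkContext}

    // Programmatic counterpart of `export SPARK_EXECUTOR_INSTANCES=4`
    // in spark_home/conf/spark-env.sh (placeholder value, sketch only;
    // the master URL is expected to come from spark-submit).
    val conf = new SparkConf()
      .setAppName("executor-count-sketch")
      .set("spark.executor.instances", "4")
    val sc = new SparkContext(conf)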

How does one decide no of executors/cores/memory allocation?

2015-06-15 Thread shreesh
How do I decide into how many partitions I should break up my data, and how many executors I should have? I guess memory and cores will be allocated based on the number of executors I have. Thanks
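
For the partition half of the question, the mechanics look roughly like the sketch below; the HDFS path and the target of 8 partitions are assumptions for illustration only.

    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf().setAppName("partitioning-sketch")
    val sc = new SparkContext(conf)

    // Hypothetical input path; the second argument asks for at least 8 input splits.
    val lines = sc.textFile("hdfs:///data/input.txt", 8)
    println(s"partitions after load: ${lines.partitions.length}")

    // Explicitly redistribute to a chosen partition count (this triggers a shuffle).
    val repartitioned = lines.repartition(8)
    println(s"partitions after repartition: ${repartitioned.partitions.length}")

Whether a given count is sensible depends on the total cores across executors; the Spark tuning guide suggests on the order of two to three tasks per CPU core, though the thread itself does not settle on a number.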

Re: How does one decide no of executors/cores/memory allocation?

2015-06-15 Thread gaurav sharma