Hi,
You can also set the cores in the Spark application itself.
http://spark.apache.org/docs/1.0.1/spark-standalone.html
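A minimal sketch of the application-side approach from that page: in standalone mode the `spark.cores.max` property caps the total cores an application takes across the cluster. The master URL, class name, and jar here are placeholders, not from the thread:

```shell
# Cap the total cores this application may claim across a standalone
# cluster. Master URL, class, and jar are hypothetical placeholders.
spark-submit \
  --master spark://master:7077 \
  --conf spark.cores.max=4 \
  --class com.example.MyApp \
  myapp.jar
```

The same property can equally be set on a `SparkConf` inside the application before the `SparkContext` is created.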
On Wed, Nov 19, 2014 at 6:11 AM, Pat Ferrel-2 [via Apache Spark User List]
ml-node+s1001560n19238...@n3.nabble.com wrote:
OK hacking the start-slave.sh did it

On Nov 18, 2014, at 4:12 PM, Pat Ferrel p...@occamsmachete.com wrote:

This seems to work only on a ‘worker’ not the master? So I’m back to having no
way to control cores on the master?

On Nov 18, 2014, at 3:24 PM, Pat Ferrel p...@occamsmachete.com wrote:

Looks like I can do this by not using start-all.sh but starting each worker
separately passing in a '--cores n' to the master? No config/env way?

On Nov 18, 2014, at 3:14 PM, Pat Ferrel p...@occamsmachete.com wrote:

I see the default and max cores settings but these seem to control total cores
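For the "No config/env way?" question above: standalone mode does document an environment variable for this, so hacking start-slave.sh or passing --cores by hand should not be necessary. A sketch of the config-file route (the core count here is just an example value):

```shell
# conf/spark-env.sh on each node of a standalone cluster.
# Limits how many cores the worker on this machine advertises
# to the master; picked up by the start-all.sh / start-slaves.sh scripts.
export SPARK_WORKER_CORES=4
```

This is per-machine, so a node that also runs the master can be given a lower worker core count there.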