Hi All,

I am running Spark and MapReduce on Mesos. Is there a configuration setting
for Spark that defines the minimum required slots, similar to MapReduce's
mapred.mesos.total.reduce.slots.minimum and
mapred.mesos.total.map.slots.minimum? The closest property I have found is
spark.scheduler.minRegisteredResourcesRatio, documented here:
http://spark.apache.org/docs/1.2.1/configuration.html#spark-properties
What I want is a fixed number of slots reserved for Spark so that I can
share the cluster with MapReduce. Any suggestions on which parts of the code
or documentation I should check for a full list of available configurations?
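
For reference, this is the kind of spark-defaults.conf fragment I have been
experimenting with so far (the values are placeholders, and I am only
guessing that spark.cores.max is the right way to cap Spark's share of the
cluster on Mesos):

```
# Don't start scheduling tasks until 80% of requested resources
# have registered with the driver
spark.scheduler.minRegisteredResourcesRatio        0.8

# Stop waiting after 30s (in ms) even if the ratio isn't reached
spark.scheduler.maxRegisteredResourcesWaitingTime  30000

# Cap the total cores Spark may acquire, leaving the rest for MapReduce
spark.cores.max                                    16
```

As far as I can tell, though, minRegisteredResourcesRatio only delays
scheduling until resources arrive; it doesn't reserve a fixed share the way
the MapReduce-on-Mesos slot minimums do, which is why I am asking.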

thanks,
Stratos