I haven't been able to set the cores with that option in Spark 1.0.0 either.
As a workaround, setting the environment variable
SPARK_JAVA_OPTS="-Dspark.cores.max=<numCores>" before launching spark-shell seems to do the trick.
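For example, from a shell (a minimal sketch of that workaround; the 12 cores and the ./spark-shell path just mirror the command below, adjust for your setup):

  export SPARK_JAVA_OPTS="-Dspark.cores.max=12"   # cap the total cores spark-shell claims on the cluster
  ./spark-shell                                   # the JVM option is picked up when the shell starts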

Matt Kielo
Data Scientist
Oculus Info Inc.

On Tue, Jun 3, 2014 at 11:15 AM, Marek Wiewiorka <marek.wiewio...@gmail.com>
wrote:

> Hi All,
> the Spark 1.0.0 documentation states that there is an option "--cores"
> that one can use to set the number of cores that spark-shell uses on the
> cluster:
>
> You can also pass an option --cores <numCores> to control the number of
> cores that spark-shell uses on the cluster.
>
> This option does not seem to work for me.
> If I run the following command:
> ./spark-shell --cores 12
> I keep getting an error:
> bad option: '--cores'
>
> Is there any other way of controlling the total number of cores used by
> spark-shell?
>
> Thanks,
> Marek
>
>
