That used to work in version 0.9.1 and earlier, but it does not seem to work
with 1.0.0.
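
Would capping the cores via the spark.cores.max property in
conf/spark-defaults.conf be an equivalent workaround on a standalone cluster?
A sketch of what I mean (untested; the master URL and core count below are
placeholders):

# conf/spark-defaults.conf
spark.cores.max    12

# then start the shell against the cluster master as usual
bin/spark-shell --master spark://master:7077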
M.




2014-06-03 17:53 GMT+02:00 Mikhail Strebkov <streb...@gmail.com>:

> Try -c <numCores> instead, works for me, e.g.
>
> bin/spark-shell -c 88
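>
> (Untested on my side: since spark-shell goes through spark-submit in 1.0.0,
> its --total-executor-cores <numCores> option might also work on a standalone
> cluster, e.g. bin/spark-shell --total-executor-cores 88)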
>
>
>
> On Tue, Jun 3, 2014 at 8:15 AM, Marek Wiewiorka <marek.wiewio...@gmail.com> wrote:
>
>> Hi All,
>> the Spark 1.0.0 documentation says that there is a "--cores" option one can
>> use to set the number of cores that spark-shell uses on the cluster:
>>
>> You can also pass an option --cores <numCores> to control the number of
>> cores that spark-shell uses on the cluster.
>>
>> This option does not seem to work for me.
>> If I run the following command:
>> ./spark-shell --cores 12
>> I keep getting an error:
>> bad option: '--cores'
>>
>> Is there any other way of controlling the total number of cores used by
>> spark-shell?
>>
>> Thanks,
>> Marek
>>
>>
>
