What is your cluster manager? For example, on YARN you would specify
--executor-cores. Read:
http://spark.apache.org/docs/latest/running-on-yarn.html
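For a standalone master, a minimal sketch would look like the following, assuming
the master URL you posted is correct and the worker actually has 4 cores free; the
--class value and jar name are placeholders:

    # cap this application at 4 cores across the standalone cluster
    # (class and jar below are placeholders)
    spark-submit \
      --master spark://10.125.21.15:7070 \
      --total-executor-cores 4 \
      --class com.example.MyApp \
      my-app.jar

Note that by default a standalone application takes all cores the workers offer, so
what usually limits parallelism is the number of partitions in the RDD rather than
the core count; you can check rdd.partitions.length in the driver and call
repartition() if it is too low.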

On Thu, Jan 15, 2015 at 8:54 PM, Wang, Ningjun (LNG-NPV)
<ningjun.w...@lexisnexis.com> wrote:
> I have a standalone Spark cluster with only one node with 4 CPU cores. How
> can I force Spark to do parallel processing of my RDD using multiple
> threads? For example, I can do the following:
>
> spark-submit --master local[4]
>
> However, I really want to use the cluster as follows:
>
> spark-submit --master spark://10.125.21.15:7070
>
> In that case, how can I make sure the RDD is processed with multiple
> threads/cores?
>
> Thanks
>
> Ningjun
