Number of parallel tasks

2015-02-25 Thread Akshat Aranya
I have Spark running in standalone mode with 4 executors, each with 5 cores
(spark.executor.cores=5).  However, when I'm processing an RDD with ~90,000
partitions, I only get 4 parallel tasks.  Shouldn't I be getting 4x5=20
parallel task executions?
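
For context, a minimal sketch of the setup being described, assuming a Scala
standalone application; the master URL, app name, and data are placeholders
rather than details from the original message:

    import org.apache.spark.{SparkConf, SparkContext}

    object ParallelismCheck {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("parallelism-check")      // placeholder app name
          .setMaster("spark://master:7077")     // placeholder standalone master URL
          .set("spark.executor.cores", "5")     // 5 cores per executor, as described above
        val sc = new SparkContext(conf)

        // Each RDD partition becomes one task, so with 4 executors x 5 cores
        // the expectation is up to 20 tasks running concurrently.
        val rdd = sc.parallelize(1 to 1000000, 90000)   // ~90,000 partitions
        println(rdd.map(_ * 2).count())

        sc.stop()
      }
    }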


Re: Number of parallel tasks

2015-02-25 Thread Akhil Das
Did you try setting .set("spark.cores.max", "20")?
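
In the SparkConf from the earlier sketch, that would look roughly like this;
spark.cores.max caps the total number of cores the standalone scheduler
grants to the application:

    // Same placeholder configuration as before, with the suggested cap added.
    val conf = new SparkConf()
      .setAppName("parallelism-check")
      .setMaster("spark://master:7077")
      .set("spark.executor.cores", "5")
      .set("spark.cores.max", "20")   // request up to 20 cores in total across the cluster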

Thanks
Best Regards

On Wed, Feb 25, 2015 at 10:21 PM, Akshat Aranya aara...@gmail.com wrote:

 I have Spark running in standalone mode with 4 executors, each with 5 cores
 (spark.executor.cores=5).  However, when I'm processing an RDD with ~90,000
 partitions, I only get 4 parallel tasks.  Shouldn't I be getting 4x5=20
 parallel task executions?