Keep in mind that your executors can run only a fixed number of tasks
in parallel, determined by your configuration (roughly, the number of
executors times the cores per executor). You should not expect
arbitrarily many RDDs and tasks to be scheduled simultaneously.
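
As a sketch of what that limit looks like in practice (this assumes
Spark is on the classpath; the RDD contents and the `local[4]` master
are illustrative, not from the original thread): each `collect()`
submitted from its own thread becomes a separate Spark job, but all
jobs still compete for the same fixed pool of task slots.

```scala
import org.apache.spark.{SparkConf, SparkContext}
import scala.concurrent.{Await, Future}
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration.Duration

object ConcurrentJobs {
  def main(args: Array[String]): Unit = {
    // local[4]: at most 4 tasks execute at any instant, no matter
    // how many jobs are submitted concurrently.
    val sc = new SparkContext(
      new SparkConf().setAppName("concurrent-jobs").setMaster("local[4]"))

    val rdds = List(sc.parallelize(1 to 10), sc.parallelize(11 to 20))

    // Submit each collect() from a separate thread; the scheduler
    // interleaves the resulting jobs across the fixed task slots.
    val jobs = rdds.map(rdd => Future { rdd.collect() })

    Await.result(Future.sequence(jobs), Duration.Inf)
      .foreach(result => println(result.mkString(",")))

    sc.stop()
  }
}
```

Whether the concurrent jobs appear to run "4 at a time" or "1 at a
time" then depends on how many slots each job's stages happen to
occupy, which is why the observed parallelism can look inconsistent.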

On Mon, Jan 19, 2015 at 5:34 PM, critikaled <isasmani....@gmail.com> wrote:
> Hi John and David,
> I tried this to run them concurrently:
>
>     List(RDD1, RDD2, .....).par.foreach { rdd =>
>       rdd.collect().foreach(println)
>     }
>
> This successfully registered the jobs, but the parallelism of the
> stages was limited: sometimes four of them ran at once, and sometimes
> only one, which was not consistent.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org