Can two stages of a single Spark job run in parallel? For example, one stage is a map transformation and the next stage is a repartition of the mapped RDD:

rdd.repartition(100).map(function).repartition(30)

(Note: RDD.map does not take a partition count, so I repartition to 100 first to get a map stage with 100 tasks.) Suppose the map stage is running its 100 tasks. After a few of them (say 10) have finished, can Spark start the next stage, so that the repartition begins shuffling data from the nodes of the map stage in parallel with the remaining map tasks? Thanks