Hi All,

Maybe I just need to set some property, or this is a known issue. My Spark
application hangs in the test environment whenever I see the following message:

16/06/26 11:13:34 INFO DAGScheduler: Submitting 2 missing tasks from
ShuffleMapStage 145 (MapPartitionsRDD[590] at rdd at
WriteDataFramesDecorator.scala:61)
16/06/26 11:13:34 INFO TaskSchedulerImpl: Adding task set 145.0 with 2 tasks
16/06/26 11:13:34 INFO TaskSetManager: Starting task 0.0 in stage 145.0
(TID 186, localhost, PROCESS_LOCAL, 2389 bytes)
16/06/26 11:13:34 INFO Executor: Running task 0.0 in stage 145.0 (TID 186)
16/06/26 11:13:34 INFO BlockManager: Found block rdd_575_0 locally
16/06/26 11:13:34 INFO GenerateMutableProjection: Code generated in 3.796 ms
16/06/26 11:13:34 INFO Executor: Finished task 0.0 in stage 145.0 (TID
186). 2578 bytes result sent to driver
16/06/26 11:13:34 INFO TaskSetManager: Finished task 0.0 in stage 145.0
(TID 186) in 24 ms on localhost (1/2)

It happens with any action. The application works fine whenever I notice
"Submitting 1 missing tasks from ShuffleMapStage" instead. To get that, I have
to tweak the plan, e.g. with repartition or coalesce (see the sketch below),
but even this doesn't always help.
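
For reference, this is roughly the kind of plan tweak I mean (just a
sketch; "df" stands in for the actual DataFrame in my application):

import org.apache.spark.sql.DataFrame

// Force the shuffle output down to a single partition so the stage is
// submitted with only 1 task. coalesce avoids a full shuffle;
// df.repartition(1) would add one.
def forceSinglePartition(df: DataFrame): DataFrame =
  df.coalesce(1)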

Some of the Spark properties are given below:

Name                                 Value
spark.app.id                         local-1466914377931
spark.app.name                       SparkTest
spark.cores.max                      3
spark.default.parallelism            1
spark.driver.allowMultipleContexts   true
spark.executor.id                    driver
spark.externalBlockStore.folderName  spark-050049bd-c058-4035-bc3d-2e73a08e8d0c
spark.master                         local[2]
spark.scheduler.mode                 FIFO
spark.ui.enabled                     true
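
For context, the SparkContext in the test environment is created roughly
like this (a sketch; the exact setup code may differ):

import org.apache.spark.{SparkConf, SparkContext}

// Rough configuration behind the properties listed above.
val conf = new SparkConf()
  .setMaster("local[2]")
  .setAppName("SparkTest")
  .set("spark.cores.max", "3")
  .set("spark.default.parallelism", "1")
  .set("spark.driver.allowMultipleContexts", "true")
val sc = new SparkContext(conf)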


Thanks,
Ravi.
