Re: Spark Task is not created

2016-06-26 Thread Ravindra
I have a lot of Spark tests, and the failure is not deterministic: it can
happen at any action I run, but the logs below are common to all of them. I
work around it with repartitioning, coalescing, etc., so that I don't get
"Submitting 2 missing tasks from ShuffleMapStage", basically ensuring that
there is only one task.
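
Roughly, the workaround looks like this (a sketch only, not the actual test
code; df stands in for whatever DataFrame the failing test builds):

    // Collapse the data into a single partition before the action so the
    // shuffle map stage only ever submits one task.
    val single = df.coalesce(1)   // or df.repartition(1) to force a shuffle
    single.count()                // any action; with one task the hang does not appear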

I doubt this has anything to do with any particular property.
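
Still, for reference, the test context is built more or less like this (a
sketch; the exact setup in our code differs, and the values are the ones from
the properties table further down):

    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf()
      .setMaster("local[2]")
      .setAppName("SparkTest")
      .set("spark.cores.max", "3")
      .set("spark.default.parallelism", "1")
      .set("spark.driver.allowMultipleContexts", "true")
    val sc = new SparkContext(conf)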

In the UI I don't see any failure, just that a few tasks have completed and
the last one is never created.
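
To confirm from the driver side which task never starts, a listener along
these lines can be registered on the test context (again a sketch; sc is the
SparkContext from the setup above):

    import org.apache.spark.scheduler.{SparkListener, SparkListenerTaskStart, SparkListenerTaskEnd}

    // Log every task start and end; in the hanging runs, task 0 of the
    // stage starts and finishes but task 1 never starts.
    sc.addSparkListener(new SparkListener {
      override def onTaskStart(taskStart: SparkListenerTaskStart): Unit =
        println(s"started stage ${taskStart.stageId} task ${taskStart.taskInfo.index}")
      override def onTaskEnd(taskEnd: SparkListenerTaskEnd): Unit =
        println(s"ended   stage ${taskEnd.stageId} task ${taskEnd.taskInfo.index}")
    })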

Thanks,

Ravi


On Sun, Jun 26, 2016, 11:33 Akhil Das  wrote:

> Would be good if you can paste the piece of code that you are executing.
>
> On Sun, Jun 26, 2016 at 11:21 AM, Ravindra 
> wrote:
>
>> Hi All,
>>
>> Maybe I just need to set some property, or it's a known issue. My Spark
>> application hangs in the test environment whenever I see the following message:
>>
>> 16/06/26 11:13:34 INFO DAGScheduler: Submitting 2 missing tasks from
>> ShuffleMapStage 145 (MapPartitionsRDD[590] at rdd at
>> WriteDataFramesDecorator.scala:61)
>> 16/06/26 11:13:34 INFO TaskSchedulerImpl: Adding task set 145.0 with 2
>> tasks
>> 16/06/26 11:13:34 INFO TaskSetManager: Starting task 0.0 in stage 145.0
>> (TID 186, localhost, PROCESS_LOCAL, 2389 bytes)
>> 16/06/26 11:13:34 INFO Executor: Running task 0.0 in stage 145.0 (TID 186)
>> 16/06/26 11:13:34 INFO BlockManager: Found block rdd_575_0 locally
>> 16/06/26 11:13:34 INFO GenerateMutableProjection: Code generated in 3.796
>> ms
>> 16/06/26 11:13:34 INFO Executor: Finished task 0.0 in stage 145.0 (TID
>> 186). 2578 bytes result sent to driver
>> 16/06/26 11:13:34 INFO TaskSetManager: Finished task 0.0 in stage 145.0
>> (TID 186) in 24 ms on localhost (1/2)
>>
>> It happens with any action. The application works fine whenever I see
>> "Submitting 1 missing tasks from ShuffleMapStage". To get that, I have to
>> tweak the plan, e.g. with repartition or coalesce, but even that doesn't
>> always help.
>>
>> Some of the Spark properties are given below:
>>
>> Name                                 Value
>> spark.app.id                         local-1466914377931
>> spark.app.name                       SparkTest
>> spark.cores.max                      3
>> spark.default.parallelism            1
>> spark.driver.allowMultipleContexts   true
>> spark.executor.id                    driver
>> spark.externalBlockStore.folderName  spark-050049bd-c058-4035-bc3d-2e73a08e8d0c
>> spark.master                         local[2]
>> spark.scheduler.mode                 FIFO
>> spark.ui.enabled                     true
>>
>> Thanks,
>> Ravi.
>>
>>
>
>
> --
> Cheers!
>
>


Re: Spark Task is not created

2016-06-25 Thread Akhil Das
Would be good if you can paste the piece of code that you are executing.



-- 
Cheers!