Re: Negative Number of Active Tasks in Spark UI

2016-01-05 Thread Shixiong(Ryan) Zhu
Did you enable "spark.speculation"?
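For context, speculative execution is controlled by the `spark.speculation` setting (default `false` per the Spark configuration docs). A minimal sketch of setting it explicitly at submit time; the application file name is a placeholder:

```shell
# Sketch: toggling speculative execution at submit time.
# my_app.py is a placeholder for the actual application.
spark-submit \
  --conf spark.speculation=true \
  my_app.py
```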

On Tue, Jan 5, 2016 at 9:14 AM, Prasad Ravilla  wrote:

> I am using Spark 1.5.2.
>
> I am not using Dynamic allocation.
>
> Thanks,
> Prasad.
>
>
>
>
> On 1/5/16, 3:24 AM, "Ted Yu"  wrote:
>
> >Which version of Spark do you use ?
> >
> >This might be related:
> >
> >https://issues.apache.org/jira/browse/SPARK-8560
> >
> >Do you use dynamic allocation ?
> >
> >Cheers
> >
> >> On Jan 4, 2016, at 10:05 PM, Prasad Ravilla  wrote:
> >>
> >> I am seeing negative active tasks in the Spark UI.
> >>
> >> Is anyone seeing this?
> >> How is this possible?
> >>
> >> Thanks,
> >> Prasad.
> >> 
> >> 
> >>
> >> -
> >> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> >> For additional commands, e-mail: user-h...@spark.apache.org
>


Re: Negative Number of Active Tasks in Spark UI

2016-01-05 Thread Prasad Ravilla
I am using Spark 1.5.2.

I am not using Dynamic allocation.

Thanks,
Prasad.




On 1/5/16, 3:24 AM, "Ted Yu"  wrote:

>Which version of Spark do you use ?
>
>This might be related:
>https://issues.apache.org/jira/browse/SPARK-8560
>
>
>Do you use dynamic allocation ?
>
>Cheers
>
>> On Jan 4, 2016, at 10:05 PM, Prasad Ravilla  wrote:
>> 
>> I am seeing negative active tasks in the Spark UI.
>> 
>> Is anyone seeing this?
>> How is this possible?
>> 
>> Thanks,
>> Prasad.
>> 
>> 
>> 


Re: Negative Number of Active Tasks in Spark UI

2016-01-05 Thread Ted Yu
Which version of Spark do you use ?

This might be related:
https://issues.apache.org/jira/browse/SPARK-8560

Do you use dynamic allocation ?

Cheers

> On Jan 4, 2016, at 10:05 PM, Prasad Ravilla  wrote:
> 
> I am seeing negative active tasks in the Spark UI.
> 
> Is anyone seeing this?
> How is this possible?
> 
> Thanks,
> Prasad.
> 
> 
> 
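Dynamic allocation, asked about above, is toggled by `spark.dynamicAllocation.enabled`. A hedged sketch of checking whether either setting discussed in this thread is configured cluster-wide; the path assumes a standard Spark layout with `SPARK_HOME` set:

```shell
# Sketch: inspecting spark-defaults.conf for the two settings raised in this thread.
# Assumes SPARK_HOME points at the Spark installation; adjust the path if not.
grep -E "spark\.(speculation|dynamicAllocation\.enabled)" \
  "$SPARK_HOME/conf/spark-defaults.conf"
```

Settings passed via `--conf` at submit time override `spark-defaults.conf`, so an empty grep result does not rule out per-job configuration.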
