I am using Spark 1.5.2.

I am not using Dynamic allocation.
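(For reference, dynamic allocation is off by default and is controlled by `spark.dynamicAllocation.enabled`; a minimal spark-defaults.conf fragment to state that explicitly might look like the sketch below.)

```
# spark-defaults.conf: dynamic allocation is disabled unless explicitly enabled
spark.dynamicAllocation.enabled  false
```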

Thanks,
Prasad.

On 1/5/16, 3:24 AM, "Ted Yu" <yuzhih...@gmail.com> wrote:

>Which version of Spark do you use?
>
>This might be related:
>https://issues.apache.org/jira/browse/SPARK-8560
>
>Do you use dynamic allocation?
>
>Cheers
>
>> On Jan 4, 2016, at 10:05 PM, Prasad Ravilla <pras...@slalom.com> wrote:
>> 
>> I am seeing negative active tasks in the Spark UI.
>> 
>> Is anyone seeing this?
>> How is this possible?
>> 
>> Thanks,
>> Prasad.
>> <Negative_Active_Tasks.png>
>> 
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>> For additional commands, e-mail: user-h...@spark.apache.org
