Hi,

Looks interesting.

It would be quite interesting to know the reason for not showing these
stats in the UI.

As per Patrick W's description in
https://spark-project.atlassian.net/browse/SPARK-999, there is no mention of
any exception w.r.t. failed tasks/executors.

Can somebody please comment on whether this is a bug or intended behaviour,
e.g. for performance reasons or some other bottleneck?
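
In the meantime, a custom SparkListener can capture at least part of this
information without UI support. A rough sketch below (the class name and log
format are just illustrative, and it assumes a Spark version that delivers
executor-removed events to listeners, 1.3+ if I remember correctly):

import org.apache.spark.scheduler.{SparkListener,
  SparkListenerExecutorRemoved, SparkListenerTaskEnd}

// Illustrative listener: logs why executors are removed and keeps the
// per-task metrics that would otherwise only be visible in the UI.
class KilledExecutorLogger extends SparkListener {

  override def onExecutorRemoved(removed: SparkListenerExecutorRemoved): Unit = {
    // `reason` carries the cluster manager's message (exit code, lost worker, ...)
    println(s"Executor ${removed.executorId} removed: ${removed.reason}")
  }

  override def onTaskEnd(taskEnd: SparkListenerTaskEnd): Unit = {
    // taskMetrics can be null for failed tasks, hence the Option wrapper.
    Option(taskEnd.taskMetrics).foreach { m =>
      println(s"Task on executor ${taskEnd.taskInfo.executorId}: " +
        s"runTime=${m.executorRunTime} ms, spilled=${m.memoryBytesSpilled} bytes")
    }
  }
}

// Register it on the driver: sc.addSparkListener(new KilledExecutorLogger)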

--Twinkle

On Mon, Apr 20, 2015 at 2:47 PM, Archit Thakur <archit279tha...@gmail.com>
wrote:

> Hi Twinkle,
>
> We have a use case where we want to debug how and why an executor got
> killed. It could be because of a stack overflow, GC, or some other
> unexpected scenario.
> If I look at the driver UI, there is no information about killed
> executors, so I was just curious how people usually debug these things
> apart from scanning and interpreting logs. The metrics we are planning to
> add are similar to those we have for non-killed executors [data per stage
> specifically]: numFailedTasks, executorRunTime, inputBytes,
> memoryBytesSpilled, etc.
>
> Apart from that, we also intend to add all the information present in the
> executors tab for running executors.
>
> Thanks,
> Archit Thakur.
>
> On Mon, Apr 20, 2015 at 1:31 PM, twinkle sachdeva <
> twinkle.sachd...@gmail.com> wrote:
>
>> Hi Archit,
>>
>> What is your use case and what kind of metrics are you planning to add?
>>
>> Thanks,
>> Twinkle
>>
>> On Fri, Apr 17, 2015 at 4:07 PM, Archit Thakur <archit279tha...@gmail.com
>> > wrote:
>>
>>> Hi,
>>>
>>> We are planning to add new metrics in Spark for executors that get
>>> killed during execution. I was just curious why this info is not already
>>> present. Is there some reason for not adding it?
>>> Any ideas around are welcome.
>>>
>>> Thanks and Regards,
>>> Archit Thakur.
>>>
>>
>>
>
