> can see which job/stage is running and determine
> what percentage of your job is complete.
>
> Even when getting the stage info, you only get the number of "tasks"
> complete vs. the percentage complete.
>
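The task counts mentioned above can be turned into a rough percentage by hand. A minimal sketch, assuming the counts were already read from `SparkStatusTracker.getStageInfo()` (or the `/api/v1/applications/<app-id>/stages` REST endpoint) — the sample numbers are hypothetical:

```python
# Sketch: deriving "percent complete" from the task counts Spark exposes.
# Assumption: num_completed_tasks and num_tasks come from
# SparkStatusTracker.getStageInfo() or the stages REST endpoint.
def stage_percent_complete(num_completed_tasks, num_tasks):
    """Spark reports task counts, not a percentage, so derive one."""
    if num_tasks <= 0:
        return 0.0
    return 100.0 * num_completed_tasks / num_tasks

# e.g. a stage with 200 tasks, 50 of them finished:
print(stage_percent_complete(50, 200))  # 25.0
```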
From: kant kodali
Date: Thursday, December 6, 2018 at 4:40 AM
To: Mark Hamstra
Cc: , "user @spark"
Subject: Re: How to track batch jobs in spark ?
Thanks for all the responses.
1) I am not using YARN. I am using Spark Standalone.
2) Yes, I want to be able to kill the whole application.
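For Spark Standalone, `spark-submit` has a `--kill SUBMISSION_ID` flag that applies when the application was submitted in cluster mode (where the master's REST endpoint prints a driver/submission ID at submit time). A sketch that only composes the command — the master URL and submission ID below are hypothetical placeholders, not values from this thread:

```python
# Sketch: killing a whole Spark Standalone application.
# Assumes cluster-mode submission; both values below are hypothetical.
master = "spark://master-host:6066"           # standalone master REST URL (default REST port 6066)
submission_id = "driver-20181206044000-0001"  # ID printed by spark-submit at submit time

# spark-submit --kill only works in standalone/Mesos cluster mode.
cmd = ["spark-submit", "--master", master, "--kill", submission_id]
print(" ".join(cmd))
```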
> yarn application --list
> 3. Kill the application using the application_id (of the form
> application_x_) from the output of the list command:
> yarn application --kill
>
> On Wed, Dec 5, 2018 at 1:42 PM kant kodali wrote:
>
>> Hi All,
>>
>> How to track batch jobs in spark? For example, is there some id or token
>> i can get after I spawn a batch job and use it to track the progress or to
>> kill the batch job itself?
Hi All,

How to track batch jobs in Spark? For example, is there some id or token I
can get after I spawn a batch job and use it to track its progress or to
kill the batch job itself?

For Streaming, we have StreamingQuery.id().

Thanks!
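One batch-side analogue of StreamingQuery.id() is the application ID plus Spark's monitoring REST API: a running driver serves JSON at `http://<driver-host>:4040/api/v1/applications/<app-id>/jobs` (4040 being the default UI port). A sketch that parses a hand-made sample payload, since no live driver is assumed here — the job values are hypothetical, but the field names follow that endpoint:

```python
# Sketch: tracking batch jobs by application ID via Spark's monitoring
# REST API (/api/v1/applications/<app-id>/jobs on the driver UI port).
# No live driver is assumed, so we parse a hand-made sample response.
import json

sample_response = """[
  {"jobId": 3, "status": "RUNNING", "numTasks": 200, "numCompletedTasks": 50}
]"""

def job_statuses(jobs_json):
    """Return (jobId, status) pairs from a /jobs endpoint payload."""
    return [(job["jobId"], job["status"]) for job in json.loads(jobs_json)]

print(job_statuses(sample_response))  # [(3, 'RUNNING')]
```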