That will kill the entire Spark application, not an individual batch job within it.
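
If the goal is to track or cancel an individual batch job (or a set of related
jobs) inside a running application, the closest analogue to StreamingQuery.id()
is SparkContext's job-group API. A minimal sketch in Scala; the group id
"nightly-batch" and the toy workload are only illustrative:

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder()
      .appName("job-group-demo")
      .master("local[*]")
      .getOrCreate()
    val sc = spark.sparkContext

    // Tag every action submitted on this thread with a group id of our choosing.
    // interruptOnCancel = true asks Spark to interrupt task threads on cancel.
    sc.setJobGroup("nightly-batch", "nightly aggregation", interruptOnCancel = true)

    // This action now belongs to the group "nightly-batch".
    val count = sc.parallelize(1 to 1000000).map(_ * 2L).count()

    // From another thread you can inspect the group's jobs...
    val jobIds = sc.statusTracker.getJobIdsForGroup("nightly-batch")
    jobIds.flatMap(sc.statusTracker.getJobInfo).foreach { info =>
      println(s"job ${info.jobId()} is ${info.status()}")
    }

    // ...or cancel just those jobs, leaving the application running:
    // sc.cancelJobGroup("nightly-batch")

Cancelling the group stops only the jobs tagged with it; the SparkSession and
the underlying YARN application keep running.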

On Wed, Dec 5, 2018 at 3:07 PM Priya Matpadi <pmatp...@gmail.com> wrote:

> If you are deploying your Spark application on a YARN cluster:
> 1. ssh into the master node.
> 2. List the currently running applications and retrieve the application_id:
>         yarn application -list
> 3. Kill the application using the application_id (of the form
>    application_xxxxx_xxxx) from the output of the list command:
>         yarn application -kill <application_id>
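>
> If you'd rather do this programmatically, the same steps are possible through
> Hadoop's YarnClient API. A rough Scala sketch (it assumes the Hadoop client
> jars and a yarn-site.xml are on the classpath; the application name
> "my-spark-app" is made up for illustration):
>
>     import org.apache.hadoop.yarn.client.api.YarnClient
>     import org.apache.hadoop.yarn.conf.YarnConfiguration
>     import scala.collection.JavaConverters._
>
>     val client = YarnClient.createYarnClient()
>     client.init(new YarnConfiguration())  // picks up yarn-site.xml
>     client.start()
>
>     // Find the running application by name and kill it.
>     client.getApplications.asScala
>       .find(_.getName == "my-spark-app")
>       .foreach { report =>
>         println(s"killing ${report.getApplicationId}")
>         client.killApplication(report.getApplicationId)
>       }
>
>     client.stop()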
>
> On Wed, Dec 5, 2018 at 1:42 PM kant kodali <kanth...@gmail.com> wrote:
>
>> Hi All,
>>
>> How do I track batch jobs in Spark? For example, is there some ID or token
>> I can get after I spawn a batch job, which I can use to track its progress
>> or to kill the job itself?
>>
>> For Structured Streaming, we have StreamingQuery.id()
>>
>> Thanks!
>>
>
