Re: How to track batch jobs in spark ?

2018-12-06 Thread Gourav Sengupta
> can see which job/stage is running and determine what percentage of your job is complete.
>
> Even when getting the stage info, you only get the number of “tasks” complete v/s percentage complete.
>
> *From:* kant kodali
> *Date:* Thursday, December 6, 2018 at
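The point above about only getting task counts (not a true percentage) can be sketched with PySpark's `StatusTracker`, which exposes per-stage task counts; any "percent complete" has to be derived from them. A minimal sketch, assuming a live `SparkContext` named `sc`; the `percent_complete` helper is my own, not a Spark API:

```python
# Sketch: deriving a rough "percent complete" from PySpark's StatusTracker.
# percent_complete() is a hypothetical helper; statusTracker(),
# getActiveStageIds(), and getStageInfo() are real PySpark APIs.

def percent_complete(num_completed_tasks, num_tasks):
    """Task-level completion ratio for one stage, as a percentage."""
    if num_tasks == 0:
        return 0.0
    return 100.0 * num_completed_tasks / num_tasks

def report_active_stages(sc):
    """Print task-based progress for every active stage (needs a live SparkContext)."""
    tracker = sc.statusTracker()
    for stage_id in tracker.getActiveStageIds():
        info = tracker.getStageInfo(stage_id)
        if info is not None:
            pct = percent_complete(info.numCompletedTasks, info.numTasks)
            print(f"stage {stage_id}: {info.numCompletedTasks}/{info.numTasks} tasks ({pct:.0f}%)")
```

As the quoted message notes, this is task granularity, not job granularity: a stage at 50% of its tasks tells you little about how far the overall job is.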

Re: How to track batch jobs in spark ?

2018-12-06 Thread Thakrar, Jayesh
complete.

From: kant kodali
Date: Thursday, December 6, 2018 at 4:40 AM
To: Mark Hamstra
Cc: , "user @spark"
Subject: Re: How to track batch jobs in spark ?

Thanks for all responses.
1) I am not using YARN. I am using Spark Standalone.
2) yes I want to be able to kill the whole Application.
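For Spark Standalone (the poster's setup), killing the whole application is possible when the driver was submitted in cluster deploy mode: the standalone master exposes a REST submission endpoint, the same one `spark-submit --kill` talks to. A sketch, assuming cluster deploy mode; the host name and driver id below are placeholders:

```python
# Sketch for Spark Standalone (not YARN): cluster-mode drivers can be killed
# via the master's REST submission endpoint (default port 6066). The host and
# driver id are placeholders; /v1/submissions/kill/<id> is the path used by
# `spark-submit --kill`.
from urllib import request

def kill_url(master_host, driver_id, rest_port=6066):
    """Build the standalone master's kill endpoint URL for a given driver."""
    return f"http://{master_host}:{rest_port}/v1/submissions/kill/{driver_id}"

def kill_driver(master_host, driver_id):
    """POST to the master to ask it to kill the driver; returns the JSON reply."""
    req = request.Request(kill_url(master_host, driver_id), data=b"", method="POST")
    with request.urlopen(req) as resp:
        return resp.read().decode()
```

The CLI equivalent is `spark-submit --master spark://<master-host>:6066 --kill <driverId>`. This does not apply to client-deploy-mode drivers, which you stop by killing the driver process itself.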

Re: How to track batch jobs in spark ?

2018-12-06 Thread kant kodali
>> from output of list command
>> yarn application --kill
>>
>> On Wed, Dec 5, 2018 at 1:42 PM kant kodali wrote:
>>
>>> Hi All,
>>>
>>> How to track batch jobs in spark? For example, is there some id or token i can get after I spawn a batch job and use it to track the progress or to kill the batch job itself?

Re: How to track batch jobs in spark ?

2018-12-05 Thread Mark Hamstra
> yarn application --list
> 3. Kill the application using application_id of the form application_x_ from output of list command
> yarn application --kill
>
> On Wed, Dec 5, 2018 at 1:42 PM kant kodali wrote:
>
>> Hi All,
>>
>> How to track batch jobs in spark?

Re: How to track batch jobs in spark ?

2018-12-05 Thread Priya Matpadi
yarn application --kill

On Wed, Dec 5, 2018 at 1:42 PM kant kodali wrote:

> Hi All,
>
> How to track batch jobs in spark? For example, is there some id or token i can get after I spawn a batch job and use it to track the progress or to kill the batch job itself?
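The `yarn application --list` / `--kill` steps above are easy to script. A sketch that extracts application ids of the form `application_<timestamp>_<seq>` from the list output (the sample id in the test is illustrative):

```python
# Sketch: scripting the YARN CLI steps quoted above.
# extract_app_ids() is a small helper of mine; the id pattern
# application_<timestamp>_<sequence> is the standard YARN format.
import re

APP_ID_RE = re.compile(r"application_\d+_\d+")

def extract_app_ids(list_output):
    """Pull application ids out of `yarn application --list` output text."""
    return APP_ID_RE.findall(list_output)

# To actually kill one you would shell out, e.g.:
#   subprocess.run(["yarn", "application", "-kill", app_id], check=True)
```

As the later replies note, this only helps on YARN; the original poster turns out to be on Spark Standalone.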

How to track batch jobs in spark ?

2018-12-05 Thread kant kodali
Hi All,

How to track batch jobs in spark? For example, is there some id or token I can get after I spawn a batch job and use it to track the progress or to kill the batch job itself?

For Streaming, we have StreamingQuery.id()

Thanks!
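For the "id or token" asked about here, the closest batch analogue of `StreamingQuery.id()` is `sc.applicationId` for the application as a whole, plus job groups for individual batches within it: `setJobGroup` / `cancelJobGroup` (real PySpark APIs) let you tag a batch of jobs with your own token and later cancel just that batch. A sketch, assuming a live `SparkContext` named `sc`; the helper names are mine:

```python
# Sketch: tracking/killing a batch of Spark jobs by a client-side token.
# sc.applicationId, setJobGroup(), and cancelJobGroup() are real PySpark APIs;
# new_job_token() and run_tracked() are hypothetical helpers.
import uuid

def new_job_token(prefix="batch"):
    """Generate a client-side token to use as a Spark job-group id."""
    return f"{prefix}-{uuid.uuid4().hex}"

def run_tracked(sc, action, description="tracked batch"):
    """Run an action under a job group so it can be cancelled by token (needs a live SparkContext)."""
    token = new_job_token()
    sc.setJobGroup(token, description, interruptOnCancel=True)
    try:
        return token, action()
    finally:
        sc.setJobGroup("", "")  # clear the group for subsequent jobs

# From another thread, cancel everything submitted under the token:
#   sc.cancelJobGroup(token)
```

This tracks and cancels jobs *within* one application; killing the whole application is a cluster-manager concern, which is where the YARN vs. Standalone discussion in the replies comes in.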