So sorry. Your name was Pradeep !!
-Original Message-
From: Park Kyeong Hee [mailto:kh1979.p...@samsung.com]
Sent: Wednesday, August 03, 2016 11:24 AM
To: 'Pradeep'; 'user@spark.apache.org'
Subject: RE: Stop Spark Streaming Jobs
Hi, Pradeep,

Did you mean, how to kill the job?
If yes, you should kill the driver and follow these steps:
1. find the application ID - "yarn application -list"
2. stop it - "yarn application -kill <applicationId>"
3. check that the driver and executors are down - "yarn application -list"
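The three steps above can be sketched as a shell session. The application ID and job name below are hypothetical examples; `-kill` takes whatever ID `-list` actually reports for your job:

```shell
# 1. List running YARN applications to find the streaming job's application ID
yarn application -list

# Example output line (ID and name are illustrative):
#   application_1470123456789_0042  my-streaming-job  SPARK  ...

# 2. Kill the application; this stops the driver, which takes the executors down with it
yarn application -kill application_1470123456789_0042

# 3. Confirm the application no longer appears among the running applications
yarn application -list
```

Note that `yarn application -kill` terminates the job abruptly. As an aside not part of the original reply: for DStream-based streaming jobs, enabling `spark.streaming.stopGracefullyOnShutdown=true` and sending the driver a SIGTERM is a commonly used pattern for stopping the job only after in-flight batches complete.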
Thanks.
-Original Message-
From: Pradeep [mailto:pradeep.mi...@mail.com]
Sent: Wednesday, August 03, 2016 10:48 AM
Hi All,
My streaming job reads data from Kafka. The job is triggered and pushed to the
background with nohup.
What are the recommended ways to stop the job in either yarn-client or cluster mode?
Thanks,
Pradeep