Re: Stop Spark Streaming Jobs

2016-08-04 Thread Sandeep Nemuri
Also set spark.streaming.stopGracefullyOnShutdown to true.
If true, Spark shuts down the StreamingContext gracefully on JVM shutdown
rather than immediately.

http://spark.apache.org/docs/latest/configuration.html#spark-streaming
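
A minimal sketch of setting this when the context is created (the app name
and batch interval below are made up; the property can equally be passed to
spark-submit via --conf):

  import org.apache.spark.SparkConf
  import org.apache.spark.streaming.{Seconds, StreamingContext}

  val conf = new SparkConf()
    .setAppName("MyStreamingApp")  // hypothetical app name
    .set("spark.streaming.stopGracefullyOnShutdown", "true")

  val ssc = new StreamingContext(conf, Seconds(10))
  // ... define the Kafka input stream and processing here ...
  ssc.start()
  ssc.awaitTermination()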

-- 
Regards,
Sandeep Nemuri


Re: Stop Spark Streaming Jobs

2016-08-04 Thread Sandeep Nemuri
StreamingContext.stop(...) if using Scala
JavaStreamingContext.stop(...) if using Java
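
For example, a minimal sketch in Scala (assuming ssc is the running
StreamingContext; the stopGracefully flag makes it wait for in-flight
batches to finish before shutting down):

  // Stop the streaming context and the underlying SparkContext,
  // letting already-received data be processed first.
  ssc.stop(stopSparkContext = true, stopGracefully = true)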


-- 
Regards,
Sandeep Nemuri


Re: Stop Spark Streaming Jobs

2016-08-03 Thread Tony Lane
SparkSession exposes a stop() method.
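
A minimal sketch (Spark 2.0+; the app name is made up). Note that this also
stops the underlying SparkContext, so for a DStream job you would still use
StreamingContext.stop(...) if you need a graceful stop:

  import org.apache.spark.sql.SparkSession

  val spark = SparkSession.builder().appName("MyApp").getOrCreate()
  // ... run the job ...
  spark.stop()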



Re: Stop Spark Streaming Jobs

2016-08-02 Thread Pradeep
Thanks, Park. I am doing the same; I was trying to understand whether there
are other ways.

Thanks,
Pradeep

> On Aug 2, 2016, at 10:25 PM, Park Kyeong Hee <kh1979.p...@samsung.com> wrote:
> 
> So sorry. Your name was Pradeep !!




RE: Stop Spark Streaming Jobs

2016-08-02 Thread Park Kyeong Hee
Hi, Paradeep,


Did you mean how to kill the job?
If so, kill the driver as follows.

on yarn-client
1. find the driver pid - "ps -ef | grep <app name>"
2. kill it - "kill -9 <pid>"
3. check that the executors are down - "yarn application -list"

on yarn-cluster
1. find the driver's application ID - "yarn application -list"
2. stop it - "yarn application -kill <application ID>"
3. check that the driver and executors are down - "yarn application -list"
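
For example (the app name, pid, and application ID below are made up):

  ps -ef | grep MyStreamingApp    # suppose the driver pid is 12345
  kill -9 12345
  yarn application -kill application_1470123456789_0001

One caveat: kill -9 (SIGKILL) skips the JVM shutdown hooks, so
spark.streaming.stopGracefullyOnShutdown has no effect; send a plain
kill <pid> (SIGTERM) instead if you want a graceful stop.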


Thanks.

-----Original Message-----
From: Pradeep [mailto:pradeep.mi...@mail.com] 
Sent: Wednesday, August 03, 2016 10:48 AM
To: user@spark.apache.org
Subject: Stop Spark Streaming Jobs

Hi All,

My streaming job reads data from Kafka. The job is launched and pushed to
the background with nohup.

What are the recommended ways to stop the job, in either yarn-client or
yarn-cluster mode?

Thanks,
Pradeep
