Re: Stop Spark Streaming Jobs
Also set spark.streaming.stopGracefullyOnShutdown to true. If set, Spark shuts down the StreamingContext gracefully on JVM shutdown rather than immediately.

http://spark.apache.org/docs/latest/configuration.html#spark-streaming
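For reference, the flag can also be passed at submit time. A minimal sketch of such an invocation — the application class and jar name are placeholders, not from this thread:

```
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --conf spark.streaming.stopGracefullyOnShutdown=true \
  --class com.example.StreamingApp \
  streaming-app.jar
```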
Re: Stop Spark Streaming Jobs
StreamingContext.stop(...) if using Scala; JavaStreamingContext.stop(...) if using Java.

Regards,
Sandeep Nemuri
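As a sketch of how the two stop flags are typically used — assuming `ssc` is an already-created, running StreamingContext (the name is illustrative):

```scala
// Assuming `ssc` is the running StreamingContext.
// stopSparkContext = true also stops the underlying SparkContext;
// stopGracefully  = true waits for all received data to be processed first.
ssc.stop(stopSparkContext = true, stopGracefully = true)
```

The Java equivalent is `jssc.stop(true, true)` on a JavaStreamingContext.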
Re: Stop Spark Streaming Jobs
SparkSession exposes a stop() method.
Re: Stop Spark Streaming Jobs
Thanks Park. I am doing the same. Was trying to understand if there are other ways.

Thanks,
Pradeep
FW: Stop Spark Streaming Jobs
So sorry. Your name was Pradeep !!
RE: Stop Spark Streaming Jobs
Hi, Paradeep

Did you mean how to kill the job? If yes, you should kill the driver, then verify as follows.

On yarn-client:
1. Find the driver pid: ps -ef | grep <app name>
2. Kill it: kill -9 <pid>
3. Check that the executors went down: yarn application -list

On yarn-cluster:
1. Find the driver's application ID: yarn application -list
2. Stop it: yarn application -kill <application id>
3. Check that the driver and executors went down: yarn application -list

Thanks.
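One caveat with the yarn-client steps above: kill -9 sends SIGKILL, which bypasses JVM shutdown hooks, so no graceful stop can run; a plain kill (SIGTERM) does trigger them. A minimal sketch of wiring such a hook yourself — `ssc` here stands for your StreamingContext and is illustrative:

```scala
// Register a JVM shutdown hook so that a plain `kill <driver pid>`
// (SIGTERM) stops the streaming context gracefully before the JVM exits.
sys.addShutdownHook {
  ssc.stop(stopSparkContext = true, stopGracefully = true)
}
```

Note that Spark installs an equivalent hook for you when spark.streaming.stopGracefullyOnShutdown is set to true.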
Stop Spark Streaming Jobs
Hi All,

My streaming job reads data from Kafka. The job is triggered and pushed to the background with nohup.

What are the recommended ways to stop the job, in either yarn-client or yarn-cluster mode?

Thanks,
Pradeep

---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscr...@spark.apache.org