So sorry. Your name was Pradeep !!

-----Original Message-----
From: Park Kyeong Hee [mailto:[email protected]]
Sent: Wednesday, August 03, 2016 11:24 AM
To: 'Pradeep'; '[email protected]'
Subject: RE: Stop Spark Streaming Jobs
Hi, Paradeep. Did you mean how to kill the job? If yes, you should kill the driver as follows.

On yarn-client:
1. Find the driver's PID: "ps -ef | grep <your_jobs_main_class>"
2. Kill it: "kill -9 <pid>"
3. Check that the executors went down: "yarn application -list"

On yarn-cluster:
1. Find the driver's application ID: "yarn application -list"
2. Stop it: "yarn application -kill <app_ID>"
3. Check that the driver and executors went down: "yarn application -list"

Thanks.

-----Original Message-----
From: Pradeep [mailto:[email protected]]
Sent: Wednesday, August 03, 2016 10:48 AM
To: [email protected]
Subject: Stop Spark Streaming Jobs

Hi All,

My streaming job reads data from Kafka. The job is triggered and pushed to the background with nohup. What are the recommended ways to stop the job in either yarn-client or yarn-cluster mode?

Thanks,
Pradeep

---------------------------------------------------------------------
To unsubscribe e-mail: [email protected]
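The yarn-cluster steps above can be sketched as a small shell snippet. This is a minimal sketch, not an official recipe: the application name "MyStreamingApp" and the application ID shown are made-up placeholders, and the listing is a captured sample rather than live `yarn application -list` output, so the snippet is safe to run anywhere. In practice you would pipe the live command instead.

```shell
#!/bin/sh
# Step 1: find the driver's application ID.
# "MyStreamingApp" is a hypothetical job name -- substitute your own.
# Here we parse a captured sample listing; with a live cluster you would
# run:  yarn application -list | awk '/MyStreamingApp/ {print $1}'
list_output='application_1470187000_0042  MyStreamingApp  SPARK  user  default  RUNNING'
app_id=$(printf '%s\n' "$list_output" | awk '/MyStreamingApp/ {print $1}')
echo "found driver application: $app_id"

# Step 2: stop it (commented out so the sketch has no side effects):
# yarn application -kill "$app_id"

# Step 3: verify the driver and executors are gone:
# yarn application -list | grep "$app_id" || echo "stopped"
```

The same pattern works for yarn-client mode with `ps -ef | grep <your_jobs_main_class>` in step 1 and `kill -9 <pid>` in step 2.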
