Re: How to kill spark applications submitted using spark-submit reliably?

2015-11-22 Thread Sudhanshu Janghel
I have noticed that the UI takes some time to reflect the requested changes. Is
that the issue? Have you tried waiting for a few minutes after killing the
spark job from the terminal?

Kind Regards,
Sudhanshu

On 23 Nov 2015, at 1:43 a.m., Ted Yu  wrote:

>> If you ask about trapping the SIGKILL signal in your script, see the 
>> following:
>> 
>> http://linuxcommand.org/wss0160.php
>> 
>> Cheers
>  
>>> On Fri, Nov 20, 2015 at 10:02 PM, Vikram Kone  wrote:
>>> I tried adding a shutdown hook to my code but it didn't help. Still the same issue
>>> 
>>> 
 On Fri, Nov 20, 2015 at 7:08 PM, Ted Yu  wrote:
 Which Spark release are you using ?
 
 Can you pastebin the stack trace of the process running on your machine ?
 
 Thanks
 
> On Nov 20, 2015, at 6:46 PM, Vikram Kone  wrote:
> 
> Hi,
> I'm seeing a strange problem. I have a spark cluster in standalone mode. 
> I submit spark jobs from a remote node as follows from the terminal
> 
> spark-submit --master spark://10.1.40.18:7077  --class com.test.Ping 
> spark-jobs.jar
> 
> when the app is running and I press ctrl-C on the console terminal, the
> process is killed and so is the app in the spark master UI. When I go to
> the spark master UI, I see that this app is in state Killed under
> Completed applications, which is what I expected to see.
> 
> Now, I created a shell script as follows to do the same
> 
> #!/bin/bash
> spark-submit --master spark://10.1.40.18:7077  --class com.test.Ping 
> spark-jobs.jar
> echo $! > my.pid
> 
> When I execute the shell script from terminal, as follows
> 
> $> bash myscript.sh
> 
> The application is submitted correctly to spark master and I can see it
> as one of the running apps in the spark master UI. But when I kill the
> process in my terminal as follows
>
> $> kill $(cat my.pid)
>
> I see that the process is killed on my machine but the spark application
> is still running in spark master! It doesn't get killed.
>
> I noticed one more thing: when I launch the spark job via the shell
> script and kill the application from the spark master UI by clicking on
> "kill" next to the running application, it gets killed in the spark UI
> but I still see the process running on my machine.
>
> In both cases, I would expect the remote spark app to be killed and my
> local process to be killed.
>
> Why is this happening, and how can I kill a spark app launched via a
> shell script from the terminal without going to the spark master UI?
>
> I want to launch the spark app via a script and log the pid so I can
> monitor it remotely
> 
> thanks for the help
> 


Re: How to kill spark applications submitted using spark-submit reliably?

2015-11-22 Thread Ted Yu
If you ask about trapping the SIGKILL signal in your script, see the
following:

http://linuxcommand.org/wss0160.php

Cheers
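As an aside, the trap described there works for SIGINT/SIGTERM; SIGKILL (kill -9) cannot be trapped. A rough, untested sketch of such a wrapper, reusing the script from the original post, might look like this (note that $! only holds a PID when spark-submit is started in the background with &):

#!/bin/bash
# Start spark-submit in the background so $! actually captures its PID.
spark-submit --master spark://10.1.40.18:7077 --class com.test.Ping spark-jobs.jar &
APP_PID=$!
echo "$APP_PID" > my.pid

# Forward signals delivered to this wrapper to the spark-submit child,
# so its JVM shutdown hooks run just as they do on an interactive ctrl-C.
# SIGKILL cannot be trapped or forwarded.
trap 'kill -INT "$APP_PID" 2>/dev/null' INT
trap 'kill -TERM "$APP_PID" 2>/dev/null' TERM

wait "$APP_PID"
# A second wait picks up the real exit status if the first wait
# was interrupted by a trapped signal.
wait "$APP_PID"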

On Fri, Nov 20, 2015 at 10:02 PM, Vikram Kone  wrote:

> I tried adding a shutdown hook to my code but it didn't help. Still the
> same issue
>
>
> On Fri, Nov 20, 2015 at 7:08 PM, Ted Yu  wrote:
>
>> Which Spark release are you using ?
>>
>> Can you pastebin the stack trace of the process running on your machine ?
>>
>> Thanks
>>
>> On Nov 20, 2015, at 6:46 PM, Vikram Kone  wrote:
>>
>> Hi,
>> I'm seeing a strange problem. I have a spark cluster in standalone mode.
>> I submit spark jobs from a remote node as follows from the terminal
>>
>> spark-submit --master spark://10.1.40.18:7077  --class com.test.Ping
>> spark-jobs.jar
>>
>> when the app is running and I press ctrl-C on the console terminal, the
>> process is killed and so is the app in the spark master UI. When I go to
>> the spark master UI, I see that this app is in state Killed under
>> Completed applications, which is what I expected to see.
>>
>> Now, I created a shell script as follows to do the same
>>
>> #!/bin/bash
>> spark-submit --master spark://10.1.40.18:7077  --class com.test.Ping
>> spark-jobs.jar
>> echo $! > my.pid
>>
>> When I execute the shell script from terminal, as follows
>>
>> $> bash myscript.sh
>>
>> The application is submitted correctly to spark master and I can see it
>> as one of the running apps in the spark master UI. But when I kill the
>> process in my terminal as follows
>>
>> $> kill $(cat my.pid)
>>
>> I see that the process is killed on my machine but the spark application
>> is still running in spark master! It doesn't get killed.
>>
>> I noticed one more thing: when I launch the spark job via the shell
>> script and kill the application from the spark master UI by clicking on
>> "kill" next to the running application, it gets killed in the spark UI
>> but I still see the process running on my machine.
>>
>> In both cases, I would expect the remote spark app to be killed and my
>> local process to be killed.
>>
>> Why is this happening, and how can I kill a spark app launched via a
>> shell script from the terminal without going to the spark master UI?
>>
>> I want to launch the spark app via a script and log the pid so I can
>> monitor it remotely
>>
>> thanks for the help
>>
>>
>


Re: How to kill spark applications submitted using spark-submit reliably?

2015-11-20 Thread Ted Yu
Interesting, SPARK-3090 installs a shutdown hook for stopping the SparkContext.

FYI

On Fri, Nov 20, 2015 at 7:12 PM, Stéphane Verlet 
wrote:

> I solved the first issue by adding a shutdown hook in my code. The
> shutdown hook gets called when you exit your script (ctrl-C, kill … but
> not kill -9)
>
> val shutdownHook = scala.sys.addShutdownHook {
>   try {
>     sparkContext.stop()
>     // Make sure to kill any other threads or thread pool you may be running
>   } catch {
>     case e: Exception => {
>       ...
>     }
>   }
> }
>
> For the other issue, killing from the UI: I also had that issue. It was
> caused by a thread pool that I use.
>
> So I surrounded my code with a try/finally block to guarantee that the
> thread pool was shut down when spark stopped.
>
> I hope this helps.
>
> Stephane
> ​
>
> On Fri, Nov 20, 2015 at 7:46 PM, Vikram Kone  wrote:
>
>> Hi,
>> I'm seeing a strange problem. I have a spark cluster in standalone mode.
>> I submit spark jobs from a remote node as follows from the terminal
>>
>> spark-submit --master spark://10.1.40.18:7077  --class com.test.Ping
>> spark-jobs.jar
>>
>> when the app is running and I press ctrl-C on the console terminal, the
>> process is killed and so is the app in the spark master UI. When I go to
>> the spark master UI, I see that this app is in state Killed under
>> Completed applications, which is what I expected to see.
>>
>> Now, I created a shell script as follows to do the same
>>
>> #!/bin/bash
>> spark-submit --master spark://10.1.40.18:7077  --class com.test.Ping
>> spark-jobs.jar
>> echo $! > my.pid
>>
>> When I execute the shell script from terminal, as follows
>>
>> $> bash myscript.sh
>>
>> The application is submitted correctly to spark master and I can see it
>> as one of the running apps in the spark master UI. But when I kill the
>> process in my terminal as follows
>>
>> $> kill $(cat my.pid)
>>
>> I see that the process is killed on my machine but the spark application
>> is still running in spark master! It doesn't get killed.
>>
>> I noticed one more thing: when I launch the spark job via the shell
>> script and kill the application from the spark master UI by clicking on
>> "kill" next to the running application, it gets killed in the spark UI
>> but I still see the process running on my machine.
>>
>> In both cases, I would expect the remote spark app to be killed and my
>> local process to be killed.
>>
>> Why is this happening, and how can I kill a spark app launched via a
>> shell script from the terminal without going to the spark master UI?
>>
>> I want to launch the spark app via a script and log the pid so I can
>> monitor it remotely
>>
>> thanks for the help
>>
>>
>


Re: How to kill spark applications submitted using spark-submit reliably?

2015-11-20 Thread Vikram Kone
I tried adding a shutdown hook to my code but it didn't help. Still the same issue.


On Fri, Nov 20, 2015 at 7:08 PM, Ted Yu  wrote:

> Which Spark release are you using ?
>
> Can you pastebin the stack trace of the process running on your machine ?
>
> Thanks
>
> On Nov 20, 2015, at 6:46 PM, Vikram Kone  wrote:
>
> Hi,
> I'm seeing a strange problem. I have a spark cluster in standalone mode. I
> submit spark jobs from a remote node as follows from the terminal
>
> spark-submit --master spark://10.1.40.18:7077  --class com.test.Ping
> spark-jobs.jar
>
> when the app is running and I press ctrl-C on the console terminal, the
> process is killed and so is the app in the spark master UI. When I go to
> the spark master UI, I see that this app is in state Killed under
> Completed applications, which is what I expected to see.
>
> Now, I created a shell script as follows to do the same
>
> #!/bin/bash
> spark-submit --master spark://10.1.40.18:7077  --class com.test.Ping
> spark-jobs.jar
> echo $! > my.pid
>
> When I execute the shell script from terminal, as follows
>
> $> bash myscript.sh
>
> The application is submitted correctly to spark master and I can see it
> as one of the running apps in the spark master UI. But when I kill the
> process in my terminal as follows
>
> $> kill $(cat my.pid)
>
> I see that the process is killed on my machine but the spark application
> is still running in spark master! It doesn't get killed.
>
> I noticed one more thing: when I launch the spark job via the shell
> script and kill the application from the spark master UI by clicking on
> "kill" next to the running application, it gets killed in the spark UI
> but I still see the process running on my machine.
>
> In both cases, I would expect the remote spark app to be killed and my
> local process to be killed.
>
> Why is this happening, and how can I kill a spark app launched via a
> shell script from the terminal without going to the spark master UI?
>
> I want to launch the spark app via a script and log the pid so I can
> monitor it remotely
>
> thanks for the help
>
>


Re: How to kill spark applications submitted using spark-submit reliably?

2015-11-20 Thread Vikram Kone
Spark 1.4.1

On Friday, November 20, 2015, Ted Yu  wrote:

> Which Spark release are you using ?
>
> Can you pastebin the stack trace of the process running on your machine ?
>
> Thanks
>
> On Nov 20, 2015, at 6:46 PM, Vikram Kone  > wrote:
>
> Hi,
> I'm seeing a strange problem. I have a spark cluster in standalone mode. I
> submit spark jobs from a remote node as follows from the terminal
>
> spark-submit --master spark://10.1.40.18:7077  --class com.test.Ping
> spark-jobs.jar
>
> when the app is running and I press ctrl-C on the console terminal, the
> process is killed and so is the app in the spark master UI. When I go to
> the spark master UI, I see that this app is in state Killed under
> Completed applications, which is what I expected to see.
>
> Now, I created a shell script as follows to do the same
>
> #!/bin/bash
> spark-submit --master spark://10.1.40.18:7077  --class com.test.Ping
> spark-jobs.jar
> echo $! > my.pid
>
> When I execute the shell script from terminal, as follows
>
> $> bash myscript.sh
>
> The application is submitted correctly to spark master and I can see it
> as one of the running apps in the spark master UI. But when I kill the
> process in my terminal as follows
>
> $> kill $(cat my.pid)
>
> I see that the process is killed on my machine but the spark application
> is still running in spark master! It doesn't get killed.
>
> I noticed one more thing: when I launch the spark job via the shell
> script and kill the application from the spark master UI by clicking on
> "kill" next to the running application, it gets killed in the spark UI
> but I still see the process running on my machine.
>
> In both cases, I would expect the remote spark app to be killed and my
> local process to be killed.
>
> Why is this happening, and how can I kill a spark app launched via a
> shell script from the terminal without going to the spark master UI?
>
> I want to launch the spark app via a script and log the pid so I can
> monitor it remotely
>
> thanks for the help
>
>


Re: How to kill spark applications submitted using spark-submit reliably?

2015-11-20 Thread Stéphane Verlet
I am not sure; I think it has to do with the signal sent to the process
and how the JVM handles it.

Ctrl-C sends a SIGINT, whereas the kill command sends a TERM signal by default.
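If that is the cause, an untested workaround is to send the spark-submit process the same signal the terminal does, instead of the default TERM:

# kill sends SIGTERM by default; pressing ctrl-C delivers SIGINT.
# Sending SIGINT explicitly mimics the interactive ctrl-C case.
kill -INT $(cat my.pid)
# equivalently
kill -s SIGINT $(cat my.pid)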



On Fri, Nov 20, 2015 at 8:21 PM, Vikram Kone  wrote:

> Thanks for the info Stephane.
> Why does CTRL-C in the terminal running spark-submit kill the app in
> spark master correctly without any explicit shutdown hooks in the code?
> Can you explain why we need to add the shutdown hook to kill it when
> launched via a shell script?
> For the second issue, I'm not using any thread pool, so I'm not sure why
> killing the app in the spark UI doesn't kill the process launched via the script.
>
>
> On Friday, November 20, 2015, Stéphane Verlet 
> wrote:
>
>> I solved the first issue by adding a shutdown hook in my code. The
>> shutdown hook gets called when you exit your script (ctrl-C, kill … but
>> not kill -9)
>>
>> val shutdownHook = scala.sys.addShutdownHook {
>>   try {
>>     sparkContext.stop()
>>     // Make sure to kill any other threads or thread pool you may be running
>>   } catch {
>>     case e: Exception => {
>>       ...
>>     }
>>   }
>> }
>>
>> For the other issue, killing from the UI: I also had that issue. It was
>> caused by a thread pool that I use.
>>
>> So I surrounded my code with a try/finally block to guarantee that the
>> thread pool was shut down when spark stopped.
>>
>> I hope this helps.
>>
>> Stephane
>> ​
>>
>> On Fri, Nov 20, 2015 at 7:46 PM, Vikram Kone 
>> wrote:
>>
>>> Hi,
>>> I'm seeing a strange problem. I have a spark cluster in standalone mode.
>>> I submit spark jobs from a remote node as follows from the terminal
>>>
>>> spark-submit --master spark://10.1.40.18:7077  --class com.test.Ping
>>> spark-jobs.jar
>>>
>>> when the app is running and I press ctrl-C on the console terminal, the
>>> process is killed and so is the app in the spark master UI. When I go to
>>> the spark master UI, I see that this app is in state Killed under
>>> Completed applications, which is what I expected to see.
>>>
>>> Now, I created a shell script as follows to do the same
>>>
>>> #!/bin/bash
>>> spark-submit --master spark://10.1.40.18:7077  --class com.test.Ping
>>> spark-jobs.jar
>>> echo $! > my.pid
>>>
>>> When I execute the shell script from terminal, as follows
>>>
>>> $> bash myscript.sh
>>>
>>> The application is submitted correctly to spark master and I can see it
>>> as one of the running apps in the spark master UI. But when I kill the
>>> process in my terminal as follows
>>>
>>> $> kill $(cat my.pid)
>>>
>>> I see that the process is killed on my machine but the spark application
>>> is still running in spark master! It doesn't get killed.
>>>
>>> I noticed one more thing: when I launch the spark job via the shell
>>> script and kill the application from the spark master UI by clicking on
>>> "kill" next to the running application, it gets killed in the spark UI
>>> but I still see the process running on my machine.
>>>
>>> In both cases, I would expect the remote spark app to be killed and my
>>> local process to be killed.
>>>
>>> Why is this happening, and how can I kill a spark app launched via a
>>> shell script from the terminal without going to the spark master UI?
>>>
>>> I want to launch the spark app via a script and log the pid so I can
>>> monitor it remotely
>>>
>>> thanks for the help
>>>
>>>
>>


Re: How to kill spark applications submitted using spark-submit reliably?

2015-11-20 Thread varun sharma
I do this in my stop script to kill the application: kill -s SIGTERM `pgrep
-f StreamingApp`
To stop it forcefully: pkill -9 -f "StreamingApp"
StreamingApp is the name of the class I submitted.

I also have a shutdown hook thread to stop it gracefully:

sys.ShutdownHookThread {
  logInfo("Gracefully stopping StreamingApp")
  ssc.stop(true, true)
  logInfo("StreamingApp stopped")
}
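Untested, but the two commands above could be folded into one stop script that tries a graceful stop first and only falls back to kill -9 if the app does not exit:

#!/bin/bash
# Graceful SIGTERM first, SIGKILL only as a last resort.
APP_CLASS="StreamingApp"

pkill -TERM -f "$APP_CLASS" || { echo "no $APP_CLASS process found"; exit 1; }

# Give the shutdown hook time to stop the StreamingContext cleanly.
for i in $(seq 1 30); do
  pgrep -f "$APP_CLASS" > /dev/null || exit 0
  sleep 2
done

echo "$APP_CLASS still running after 60s, sending SIGKILL"
pkill -9 -f "$APP_CLASS"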

I am also not able to kill the application from the Spark UI.


On Sat, Nov 21, 2015 at 11:32 AM, Vikram Kone  wrote:

> I tried adding a shutdown hook to my code but it didn't help. Still the
> same issue
>
>
> On Fri, Nov 20, 2015 at 7:08 PM, Ted Yu  wrote:
>
>> Which Spark release are you using ?
>>
>> Can you pastebin the stack trace of the process running on your machine ?
>>
>> Thanks
>>
>> On Nov 20, 2015, at 6:46 PM, Vikram Kone  wrote:
>>
>> Hi,
>> I'm seeing a strange problem. I have a spark cluster in standalone mode.
>> I submit spark jobs from a remote node as follows from the terminal
>>
>> spark-submit --master spark://10.1.40.18:7077  --class com.test.Ping
>> spark-jobs.jar
>>
>> when the app is running and I press ctrl-C on the console terminal, the
>> process is killed and so is the app in the spark master UI. When I go to
>> the spark master UI, I see that this app is in state Killed under
>> Completed applications, which is what I expected to see.
>>
>> Now, I created a shell script as follows to do the same
>>
>> #!/bin/bash
>> spark-submit --master spark://10.1.40.18:7077  --class com.test.Ping
>> spark-jobs.jar
>> echo $! > my.pid
>>
>> When I execute the shell script from terminal, as follows
>>
>> $> bash myscript.sh
>>
>> The application is submitted correctly to spark master and I can see it
>> as one of the running apps in the spark master UI. But when I kill the
>> process in my terminal as follows
>>
>> $> kill $(cat my.pid)
>>
>> I see that the process is killed on my machine but the spark application
>> is still running in spark master! It doesn't get killed.
>>
>> I noticed one more thing: when I launch the spark job via the shell
>> script and kill the application from the spark master UI by clicking on
>> "kill" next to the running application, it gets killed in the spark UI
>> but I still see the process running on my machine.
>>
>> In both cases, I would expect the remote spark app to be killed and my
>> local process to be killed.
>>
>> Why is this happening, and how can I kill a spark app launched via a
>> shell script from the terminal without going to the spark master UI?
>>
>> I want to launch the spark app via a script and log the pid so I can
>> monitor it remotely
>>
>> thanks for the help
>>
>>
>


-- 
*VARUN SHARMA*
*Flipkart*
*Bangalore*


Re: How to kill spark applications submitted using spark-submit reliably?

2015-11-20 Thread Stéphane Verlet
I solved the first issue by adding a shutdown hook in my code. The shutdown
hook gets called when you exit your script (ctrl-C, kill … but not kill -9)

val shutdownHook = scala.sys.addShutdownHook {
  try {
    sparkContext.stop()
    // Make sure to kill any other threads or thread pool you may be running
  } catch {
    case e: Exception => {
      ...
    }
  }
}

For the other issue, killing from the UI: I also had that issue. It was
caused by a thread pool that I use.

So I surrounded my code with a try/finally block to guarantee that the
thread pool was shut down when spark stopped.
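A rough, untested sketch of that pattern (the object name and pool size are made up for illustration; only sparkContext.stop() comes from the snippet above):

import java.util.concurrent.{Executors, TimeUnit}

import org.apache.spark.{SparkConf, SparkContext}

object PingWithPool {  // hypothetical driver main class
  def main(args: Array[String]): Unit = {
    val sparkContext = new SparkContext(new SparkConf().setAppName("Ping"))
    val pool = Executors.newFixedThreadPool(4)  // the thread pool the job uses
    try {
      // ... submit work to the pool and run Spark actions here ...
    } finally {
      // Shut the pool down even when the app is killed from the master UI,
      // so no non-daemon threads keep the local JVM alive afterwards.
      pool.shutdown()
      if (!pool.awaitTermination(30, TimeUnit.SECONDS)) pool.shutdownNow()
      sparkContext.stop()
    }
  }
}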

I hope this helps.

Stephane
​

On Fri, Nov 20, 2015 at 7:46 PM, Vikram Kone  wrote:

> Hi,
> I'm seeing a strange problem. I have a spark cluster in standalone mode. I
> submit spark jobs from a remote node as follows from the terminal
>
> spark-submit --master spark://10.1.40.18:7077  --class com.test.Ping
> spark-jobs.jar
>
> when the app is running and I press ctrl-C on the console terminal, the
> process is killed and so is the app in the spark master UI. When I go to
> the spark master UI, I see that this app is in state Killed under
> Completed applications, which is what I expected to see.
>
> Now, I created a shell script as follows to do the same
>
> #!/bin/bash
> spark-submit --master spark://10.1.40.18:7077  --class com.test.Ping
> spark-jobs.jar
> echo $! > my.pid
>
> When I execute the shell script from terminal, as follows
>
> $> bash myscript.sh
>
> The application is submitted correctly to spark master and I can see it
> as one of the running apps in the spark master UI. But when I kill the
> process in my terminal as follows
>
> $> kill $(cat my.pid)
>
> I see that the process is killed on my machine but the spark application
> is still running in spark master! It doesn't get killed.
>
> I noticed one more thing: when I launch the spark job via the shell
> script and kill the application from the spark master UI by clicking on
> "kill" next to the running application, it gets killed in the spark UI
> but I still see the process running on my machine.
>
> In both cases, I would expect the remote spark app to be killed and my
> local process to be killed.
>
> Why is this happening, and how can I kill a spark app launched via a
> shell script from the terminal without going to the spark master UI?
>
> I want to launch the spark app via a script and log the pid so I can
> monitor it remotely
>
> thanks for the help
>
>


Re: How to kill spark applications submitted using spark-submit reliably?

2015-11-20 Thread Vikram Kone
Thanks for the info Stephane.
Why does CTRL-C in the terminal running spark-submit kill the app in spark
master correctly without any explicit shutdown hooks in the code? Can you
explain why we need to add the shutdown hook to kill it when launched via a
shell script?
For the second issue, I'm not using any thread pool, so I'm not sure why
killing the app in the spark UI doesn't kill the process launched via the script.

On Friday, November 20, 2015, Stéphane Verlet 
wrote:

> I solved the first issue by adding a shutdown hook in my code. The
> shutdown hook gets called when you exit your script (ctrl-C, kill … but
> not kill -9)
>
> val shutdownHook = scala.sys.addShutdownHook {
>   try {
>     sparkContext.stop()
>     // Make sure to kill any other threads or thread pool you may be running
>   } catch {
>     case e: Exception => {
>       ...
>     }
>   }
> }
>
> For the other issue, killing from the UI: I also had that issue. It was
> caused by a thread pool that I use.
>
> So I surrounded my code with a try/finally block to guarantee that the
> thread pool was shut down when spark stopped.
>
> I hope this helps.
>
> Stephane
> ​
>
> On Fri, Nov 20, 2015 at 7:46 PM, Vikram Kone  > wrote:
>
>> Hi,
>> I'm seeing a strange problem. I have a spark cluster in standalone mode.
>> I submit spark jobs from a remote node as follows from the terminal
>>
>> spark-submit --master spark://10.1.40.18:7077  --class com.test.Ping
>> spark-jobs.jar
>>
>> when the app is running and I press ctrl-C on the console terminal, the
>> process is killed and so is the app in the spark master UI. When I go to
>> the spark master UI, I see that this app is in state Killed under
>> Completed applications, which is what I expected to see.
>>
>> Now, I created a shell script as follows to do the same
>>
>> #!/bin/bash
>> spark-submit --master spark://10.1.40.18:7077  --class com.test.Ping
>> spark-jobs.jar
>> echo $! > my.pid
>>
>> When I execute the shell script from terminal, as follows
>>
>> $> bash myscript.sh
>>
>> The application is submitted correctly to spark master and I can see it
>> as one of the running apps in the spark master UI. But when I kill the
>> process in my terminal as follows
>>
>> $> kill $(cat my.pid)
>>
>> I see that the process is killed on my machine but the spark application
>> is still running in spark master! It doesn't get killed.
>>
>> I noticed one more thing: when I launch the spark job via the shell
>> script and kill the application from the spark master UI by clicking on
>> "kill" next to the running application, it gets killed in the spark UI
>> but I still see the process running on my machine.
>>
>> In both cases, I would expect the remote spark app to be killed and my
>> local process to be killed.
>>
>> Why is this happening, and how can I kill a spark app launched via a
>> shell script from the terminal without going to the spark master UI?
>>
>> I want to launch the spark app via a script and log the pid so I can
>> monitor it remotely
>>
>> thanks for the help
>>
>>
>


Re: How to kill spark applications submitted using spark-submit reliably?

2015-11-20 Thread Ted Yu
Which Spark release are you using ?

Can you pastebin the stack trace of the process running on your machine ?

Thanks
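As a side note, a thread dump of the local spark-submit/driver JVM can be captured with the standard JDK tools, assuming my.pid really holds that JVM's PID:

# list running JVMs to confirm which one the PID in my.pid belongs to
jps -lm
# dump all thread stacks (with lock info) to a file that can be pastebinned
jstack -l "$(cat my.pid)" > spark-submit-stack.txt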

> On Nov 20, 2015, at 6:46 PM, Vikram Kone  wrote:
> 
> Hi,
> I'm seeing a strange problem. I have a spark cluster in standalone mode. I 
> submit spark jobs from a remote node as follows from the terminal
> 
> spark-submit --master spark://10.1.40.18:7077  --class com.test.Ping 
> spark-jobs.jar
> 
> when the app is running and I press ctrl-C on the console terminal, the
> process is killed and so is the app in the spark master UI. When I go to
> the spark master UI, I see that this app is in state Killed under
> Completed applications, which is what I expected to see.
> 
> Now, I created a shell script as follows to do the same
> 
> #!/bin/bash
> spark-submit --master spark://10.1.40.18:7077  --class com.test.Ping 
> spark-jobs.jar
> echo $! > my.pid
> 
> When I execute the shell script from terminal, as follows
> 
> $> bash myscript.sh
> 
> The application is submitted correctly to spark master and I can see it
> as one of the running apps in the spark master UI. But when I kill the
> process in my terminal as follows
>
> $> kill $(cat my.pid)
>
> I see that the process is killed on my machine but the spark application
> is still running in spark master! It doesn't get killed.
>
> I noticed one more thing: when I launch the spark job via the shell
> script and kill the application from the spark master UI by clicking on
> "kill" next to the running application, it gets killed in the spark UI
> but I still see the process running on my machine.
>
> In both cases, I would expect the remote spark app to be killed and my
> local process to be killed.
>
> Why is this happening, and how can I kill a spark app launched via a
> shell script from the terminal without going to the spark master UI?
>
> I want to launch the spark app via a script and log the pid so I can
> monitor it remotely
> 
> thanks for the help
>