Re: killing spark job which is submitted using SparkSubmit

2016-05-06 Thread satish saley
Thank you Anthony. I am clearer on yarn-cluster and yarn-client now.

On Fri, May 6, 2016 at 1:05 PM, Anthony May  wrote:

> Making the master yarn-cluster means that the driver then runs on YARN
> itself, not just the executor nodes. It is therefore independent of your
> application and can only be stopped via YARN commands, or by completing
> on its own if it's a batch job. The simplest way to tie the driver to
> your app is to pass yarn-client as the master instead.
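
A minimal sketch of that change, assuming the same in-process
SparkSubmit.main pattern used later in the thread; the class name is
illustrative and the argument values are taken from Satish's list below:

    import org.apache.spark.deploy.SparkSubmit;

    class ClientModeSubmit {
        public static void main(String[] args) {
            // In yarn-client mode the driver runs inside this JVM, so killing
            // this application also takes the driver down; YARN then cleans up
            // the executors belonging to the dead application.
            String[] sparkArgs = {
                "--master", "yarn-client",
                "--name", "pysparkexample",
                "--executor-memory", "1G",
                "--driver-memory", "1G",
                "--conf", "spark.yarn.historyServer.address=http://localhost:18080",
                "--conf", "spark.eventLog.enabled=true",
                "--verbose",
                "pi.py"
            };
            SparkSubmit.main(sparkArgs);
        }
    }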
>
> On Fri, May 6, 2016 at 2:00 PM satish saley 
> wrote:
>
>> Hi Anthony,
>>
>> I am passing
>>
>> --master
>> yarn-cluster
>> --name
>> pysparkexample
>> --executor-memory
>> 1G
>> --driver-memory
>> 1G
>> --conf
>> spark.yarn.historyServer.address=http://localhost:18080
>> --conf
>> spark.eventLog.enabled=true
>>
>> --verbose
>>
>> pi.py
>>
>>
>> I am able to run the job successfully. I just want to get it killed 
>> automatically whenever I kill my application.
>>
>>
>> On Fri, May 6, 2016 at 11:58 AM, Anthony May 
>> wrote:
>>
>>> Greetings Satish,
>>>
>>> What are the arguments you're passing in?
>>>
>>> On Fri, 6 May 2016 at 12:50 satish saley 
>>> wrote:
>>>
 Hello,

 I am submitting a Spark job using SparkSubmit. When I kill my
 application, it does not kill the corresponding Spark job. How would I kill
 the corresponding Spark job? I know one way is to run SparkSubmit again
 with the appropriate options, but is there any way to tell SparkSubmit this
 at the time of job submission itself? Here is my code:


    import org.apache.spark.deploy.SparkSubmit;

    class MyClass {

        public static void main(String[] args) {
            // preparing args
            SparkSubmit.main(args);
        }
    }


>>
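
For the yarn-cluster route discussed above, the only external stop is
through YARN itself. A rough sketch of that, assuming the YARN application
ID has already been obtained out of band (for example from the submission
logs or the ResourceManager UI); the class name and the ID shown are
placeholders, not values from the thread:

    class YarnKillSketch {
        public static void main(String[] args) throws Exception {
            // Placeholder ID: the real one must be discovered separately,
            // e.g. from spark-submit's output or the ResourceManager UI.
            String applicationId = "application_1462500000000_0001";

            // Shell out to the standard YARN CLI: yarn application -kill <id>
            new ProcessBuilder("yarn", "application", "-kill", applicationId)
                    .inheritIO()
                    .start()
                    .waitFor();
        }
    }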

