Re: killing spark job which is submitted using SparkSubmit
Thank you, Anthony. I am clearer on yarn-cluster and yarn-client now.
From: Anthony May, Fri, May 6, 2016 at 1:05 PM
Making the master yarn-cluster means that the driver runs on YARN, not just the executors. It is then independent of your application and can only be killed via YARN commands (or by completing, if it's a batch job). The simplest way to tie the driver to your app is to pass yarn-client as the master instead, so that the driver runs inside your application's own process.
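For a yarn-cluster driver, killing it from the command line might look like the following sketch (the application ID shown is a placeholder; the real one comes from `yarn application -list` or the submission logs):

```shell
# List running YARN applications to find the Spark job's application ID
yarn application -list

# Kill the application by ID (the ID below is a placeholder)
yarn application -kill application_1462500000000_0001
```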
From: satish saley, Fri, May 6, 2016 at 2:00 PM
Hi Anthony,

I am passing:

--master yarn-cluster
--name pysparkexample
--executor-memory 1G
--driver-memory 1G
--conf spark.yarn.historyServer.address=http://localhost:18080
--conf spark.eventLog.enabled=true
--verbose
pi.py

I am able to run the job successfully. I just want it to be killed automatically whenever I kill my application.
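For reference, those same arguments as a single spark-submit invocation (assuming `spark-submit` is on the PATH and `pi.py` is in the current directory) would be:

```shell
spark-submit \
  --master yarn-cluster \
  --name pysparkexample \
  --executor-memory 1G \
  --driver-memory 1G \
  --conf spark.yarn.historyServer.address=http://localhost:18080 \
  --conf spark.eventLog.enabled=true \
  --verbose \
  pi.py
```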
From: Anthony May, Fri, May 6, 2016 at 11:58 AM
Greetings Satish,

What are the arguments you're passing in?

On Fri, 6 May 2016 at 12:50, satish saley wrote:
> Hello,
>
> I am submitting a spark job using SparkSubmit. When I kill my application, it does not kill the corresponding spark job. How would I kill the corresponding spark job? I know one way is to use SparkSubmit again with appropriate options. Is there any way, though, by which I can tell SparkSubmit this at the time of job submission itself? Here is my code:
>
> import org.apache.spark.deploy.SparkSubmit;
>
> class MyClass {
>     public static void main(String[] args) {
>         // preparing args
>         SparkSubmit.main(args);
>     }
> }
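One sketch of an alternative to calling SparkSubmit.main directly: Spark's launcher API (org.apache.spark.launcher.SparkLauncher, available since Spark 1.6) returns a handle to the submitted application, so the parent process can kill the Spark job when it shuts down. The app resource, app name, and memory settings below mirror the arguments from this thread but are placeholders; this is an illustration, not the method discussed above.

```java
import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

class MyClass {
    public static void main(String[] args) throws Exception {
        // Submit through the launcher API instead of SparkSubmit.main,
        // which yields a handle to the running application.
        SparkAppHandle handle = new SparkLauncher()
                .setAppResource("pi.py")          // placeholder app resource
                .setMaster("yarn-client")         // ties the driver to this process
                .setAppName("pysparkexample")
                .setConf("spark.executor.memory", "1g")
                .setConf("spark.driver.memory", "1g")
                .startApplication();

        // Kill the Spark job automatically when this JVM exits.
        Runtime.getRuntime().addShutdownHook(new Thread(handle::kill));
    }
}
```

With yarn-client as master, the driver already dies with the launching application; the shutdown hook additionally asks YARN to tear down the job rather than leaving it to time out.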