Hi,

Read the docs at http://spark.apache.org/docs/latest/spark-standalone.html,
since Spark Standalone seems to be the cluster manager the OP uses.
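One way to reproduce SPARK-13979 without killing the shell itself is to run against a standalone master, so the executor runs in a separate CoarseGrainedExecutorBackend JVM that can be killed on its own. A rough sketch only; it assumes SPARK_HOME points at a 2016-era Spark install and the master runs on the local host:

```shell
# Sketch, not a verified recipe: start a standalone master and one worker,
# then attach a shell to the master (host/port are assumptions).
cd "$SPARK_HOME"
./sbin/start-master.sh                               # master at spark://$(hostname):7077
./sbin/start-slave.sh "spark://$(hostname):7077"     # worker that will spawn executors
./bin/spark-shell --master "spark://$(hostname):7077"

# From a second terminal: kill the executor JVM, not the SparkSubmit process,
# so the shell stays alive and you can keep sending commands to it.
jps -lm | grep CoarseGrainedExecutorBackend
kill <executor-pid>    # <executor-pid> is a placeholder for the pid shown by jps
```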

Best regards,
Jacek Laskowski
----
https://medium.com/@jaceklaskowski/
Mastering Apache Spark http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski


On Thu, Jul 7, 2016 at 11:26 PM, Mr rty ff <yash...@yahoo.com> wrote:
> Hi
> I am sorry, but it's still not clear.
> Do you mean ./bin/spark-shell --master local?
> And what do I do after that? Killing the
> org.apache.spark.deploy.SparkSubmit --master local --class
> org.apache.spark.repl.Main --name Spark shell spark-shell
> process will kill the shell, so I couldn't send the commands.
> Thanks
> Thanks
>
>
> On Friday, July 8, 2016 12:05 AM, Jacek Laskowski <ja...@japila.pl> wrote:
>
>
> Hi,
>
> Then use --master with Spark Standalone, YARN, or Mesos.
>
> Best regards,
> Jacek Laskowski
> ----
> https://medium.com/@jaceklaskowski/
> Mastering Apache Spark http://bit.ly/mastering-apache-spark
> Follow me at https://twitter.com/jaceklaskowski
>
>
> On Thu, Jul 7, 2016 at 10:35 PM, Mr rty ff <yash...@yahoo.com> wrote:
>> I don't think it's the proper way to recreate the bug, because I should
>> continue to send commands to the shell.
>> They're talking about killing the CoarseGrainedExecutorBackend.
>>
>>
>> On Thursday, July 7, 2016 11:32 PM, Jacek Laskowski <ja...@japila.pl>
>> wrote:
>>
>>
>> Hi,
>>
>> It appears you're running in local mode (local[*] assumed), so killing
>> spark-shell *will* kill the one and only executor -- the driver :)
>>
>> Best regards,
>> Jacek Laskowski
>> ----
>> https://medium.com/@jaceklaskowski/
>> Mastering Apache Spark http://bit.ly/mastering-apache-spark
>> Follow me at https://twitter.com/jaceklaskowski
>>
>>
>> On Thu, Jul 7, 2016 at 10:27 PM, Mr rty ff <yash...@yahoo.com> wrote:
>>> This is what I get when I run the command:
>>> 946 sun.tools.jps.Jps -lm
>>> 7443 org.apache.spark.deploy.SparkSubmit --class
>>> org.apache.spark.repl.Main
>>> --name Spark shell spark-shell
>>> I don't think I should kill the SparkSubmit process.
>>>
>>>
>>>
>>> On Thursday, July 7, 2016 9:58 PM, Jacek Laskowski <ja...@japila.pl>
>>> wrote:
>>>
>>>
>>> Hi,
>>>
>>> Use jps -lm and see the processes on the machine(s) to kill.
>>>
>>> Best regards,
>>> Jacek Laskowski
>>> ----
>>> https://medium.com/@jaceklaskowski/
>>> Mastering Apache Spark http://bit.ly/mastering-apache-spark
>>> Follow me at https://twitter.com/jaceklaskowski
>>>
>>>
>>> On Wed, Jul 6, 2016 at 9:49 PM, Mr rty ff <yash...@yahoo.com.invalid>
>>> wrote:
>>>> Hi
>>>> I'd like to recreate this bug:
>>>> https://issues.apache.org/jira/browse/SPARK-13979
>>>> They're talking about stopping Spark executors. It's not clear exactly
>>>> how I stop the executors.
>>>> Thanks
>>>
>>>
>>> ---------------------------------------------------------------------
>>> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org

