Find the *start-master.sh* file with the following command, executed as root: *find / -name
"start-master.sh" -print*.
Check that the directory it reports is actually on your $PATH.
As a first step, go to the directory containing *start-master.sh* with the *cd*
command and run it with *./start-master.sh*.
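
For example, a minimal sequence, assuming the install is under /opt/spark as
elsewhere in this thread:

find / -name "start-master.sh" -print 2>/dev/null
# typically returns /opt/spark/sbin/start-master.sh
echo $PATH | tr ':' '\n' | grep -i spark    # is that sbin directory on the PATH?
cd /opt/spark/sbin
./start-master.sh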

@*JB*Δ <http://jbigdata.fr/jbigdata/index.html>



On Wed, 31 Mar 2021 at 12:35, GUINKO Ferdinand <tonguimferdin...@guinko.net>
wrote:

>
> I have exited from the container, and logged in using:
>
> sudo docker run -it -p8080:8080 ubuntu
>
> Then I tried to start the standalone Spark master server by running:
>
> start-master.sh
>
> and got the following message:
>
> bash: start-master.sh: command not found
>
> So I went through setting the environment variables again, running:
>
> echo "export SPARK_HOME=/opt/spark" >> ~/.profile
> export 
> PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/bin:/sbin:$SPARK_HOME/bin:$SPARK_HOME/sbin
> echo $PATH
> echo "export PYSPARK_PYTHON=/usr/bin/python3" >> ~/.profile
>
> Here is the output of the file .profile:
>
> root@291b0eb654ea:/# cat ~/.profile
> # ~/.profile: executed by Bourne-compatible login shells.
>
> if [ "$BASH" ]; then
>   if [ -f ~/.bashrc ]; then
>     . ~/.bashrc
>   fi
> fi
>
> mesg n 2> /dev/null || true
> export SPARK_HOME=/opt/spark
> export 
> PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/bin:/sbin:/opt/spark/bin:/opt/spark/sbin
> export PYSPARK_PYTHON=/usr/bin/python3
>
> I also typed this command:
>
> root@291b0eb654ea:/# source ~/.profile
>
> I am still getting the following message:
>
> bash: start-master.sh: command not found
>
> What am I missing, please?
>
> --
> «What the world needs most is men: not men who can be bought and sold, but
> men who are deeply loyal and upright, men who do not fear to call sin by its
> name, men whose conscience is as true to duty as the compass is to the pole,
> men who would stand for justice and truth even if the universe collapsed.»
> Ellen Gould WHITE, Education, p. 55
>
> On Tuesday, March 30, 2021 21:01 GMT, Mich Talebzadeh <
> mich.talebza...@gmail.com> wrote:
>
>
> Those two export lines mean: set SPARK_HOME and PATH *as environment
> variables* in the session you have in Ubuntu.
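>
> For example, a quick way to confirm they are set in the current session (a
> sketch, nothing Spark-specific):
>
> export SPARK_HOME=/opt/spark
> echo $SPARK_HOME        # prints /opt/spark
> env | grep SPARK_HOME   # exported, so visible to child processes
>
> Note that a plain export only lasts for the current shell; putting the line
> in ~/.profile makes it apply to future login shells as well.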
>
> Check this website for more info
>
> bash - How do I add environment variables? - Ask Ubuntu
> <https://askubuntu.com/questions/58814/how-do-i-add-environment-variables#:~:text=To%20set%20an%20environment%20variable%20everytime%2C%20use%20the%20export%20command,script%20it%20will%20not%20work.>
>
>
> If you are familiar with Windows, these are equivalent to Windows
> environment variables. For example, note SPARK_HOME
>
>
> [image: image.png]
>
>
> Next: you are now trying to start Spark in standalone mode. Check the
> following link
>
> Spark Standalone Mode - Spark 3.1.1 Documentation (apache.org)
> <http://spark.apache.org/docs/latest/spark-standalone.html>
>
> to learn more about it
>
> Also check the log file generated by invoking start-master.sh, shown in your
> output:
>
> starting org.apache.spark.deploy.master.Master, logging to
> */opt/spark/logs/spark--org.apache.spark.deploy.master.Master-1-23d865d7f117.out*
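>
> For example, something like this (the exact file name comes from your output
> above):
>
> tail -50 /opt/spark/logs/spark--org.apache.spark.deploy.master.Master-1-23d865d7f117.out
>
> If the master really started, the log should mention the spark:// URL it is
> listening on and the web UI port (8080 by default).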
>
> HTH
>
>
>
>
>    view my Linkedin profile
> <https://www.linkedin.com/in/mich-talebzadeh-ph-d-5205b2/>
>
>
>
> *Disclaimer:* Use it at your own risk. Any and all responsibility for any
> loss, damage or destruction of data or any other property which may arise
> from relying on this email's technical content is explicitly disclaimed.
> The author will in no case be liable for any monetary damages arising from
> such loss, damage or destruction.
>
>
>
> On Tue, 30 Mar 2021 at 20:48, GUINKO Ferdinand <
> tonguimferdin...@guinko.net> wrote:
>
>> This is what I have now:
>>
>> root@33z261w1a18:/opt/spark# *export SPARK_HOME=/opt/spark*
>> root@33z261w1a18:/opt/spark# *export
>> PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/bin:/sbin:$SPARK_HOME/bin:$SPARK_HOME/sbin*
>> root@33z261w1a18:/opt/spark# *echo $PATH*
>>
>> /usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/bin:/sbin:/opt/spark/bin:/opt/spark/sbin
>> root@33z261w1a18:/opt/spark# *start-master.sh*
>> starting org.apache.spark.deploy.master.Master, logging to
>> /opt/spark/logs/spark--org.apache.spark.deploy.master.Master-1-23d865d7f117.out
>> root@33z261w1a18:/opt/spark#
>>
>> It seems that SPARK was trying to start but couldn't.
>>
>> Could you please explain to me what the following lines mean and do:
>>
>> export SPARK_HOME=/opt/spark
>> export
>> PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/bin:/sbin:$SPARK_HOME/bin:$SPARK_HOME/sbin
>>
>> then
>>
>> echo $PATH
>>
>>
>> Thank you for the assistance.
>>
>>
>> On Tuesday, March 30, 2021 15:00 GMT, Mich Talebzadeh <
>> mich.talebza...@gmail.com> wrote:
>>
>>
>> OK, simple:
>>
>> export SPARK_HOME=/opt/spark
>> export
>> PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/bin:/sbin:$SPARK_HOME/bin:$SPARK_HOME/sbin
>>
>> then
>>
>> echo $PATH
>>
>> which start-master.sh
>>
>> and send the output
>>
>> HTH
>>
>>
>>
>>
>>
>>
>>
>>
>> On Tue, 30 Mar 2021 at 13:23, GUINKO Ferdinand <
>> tonguimferdin...@guinko.net> wrote:
>>
>>> Where is SPARK? I also tried this:
>>>
>>> root@33z261w1a18:/opt/spark# *whereis spark-shell*
>>> spark-shell: /opt/spark/bin/spark-shell2.cmd
>>> /opt/spark/bin/spark-shell.cmd /opt/spark/bin/spark-shell
>>>
>>>
>>> On Tuesday, March 30, 2021 09:00 GMT, Mich Talebzadeh <
>>> mich.talebza...@gmail.com> wrote:
>>>
>>>
>>> OK
>>>
>>> where is SPARK on your PATH?
>>>
>>> Try editing your .profile with a simple editor of your choice (vi, vim,
>>> etc. if you are on Linux):
>>>
>>> export
>>> PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/bin:/sbin*:$PATH*
>>> export SPARK_HOME=/opt/spark
>>> ## Add it to the PATH!!!!
>>> export PATH=$SPARK_HOME/bin:$SPARK_HOME/sbin:$PATH
>>>
>>>
>>> Once you have done that, run
>>>
>>> . ./.profile
>>>
>>> send the output of
>>>
>>> echo $PATH
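>>>
>>> and, as a check (a sketch; the path assumes SPARK_HOME=/opt/spark):
>>>
>>> which start-master.sh
>>> # should print /opt/spark/sbin/start-master.sh once sbin is on the PATH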
>>>
>>> HTH
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>> On Tue, 30 Mar 2021 at 09:23, GUINKO Ferdinand <
>>> tonguimferdin...@guinko.net> wrote:
>>>
>>>> Good day Talebzadeh,
>>>>
>>>> here are the outputs:
>>>>
>>>> *cat ~/.profile*
>>>>
>>>> root@33z261w1a18:/home# cat ~/.profile
>>>> # ~/.profile: executed by Bourne-compatible login shells.
>>>>
>>>> if [ "$BASH" ]; then
>>>>   if [ -f ~/.bashrc ]; then
>>>>     . ~/.bashrc
>>>>   fi
>>>> fi
>>>>
>>>> mesg n 2> /dev/null || true
>>>> export SPARK_HOME=/opt/spark
>>>> export
>>>> PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/bin:/sbin
>>>> export PYSPARK_PYTHON=/usr/bin/python3
>>>>
>>>>
>>>>  *echo $PATH*
>>>>
>>>> /usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/bin:/sbin
>>>>
>>>> Thank you.
>>>>
>>>>
>>>> On Monday, March 29, 2021 22:24 GMT, Mich Talebzadeh <
>>>> mich.talebza...@gmail.com> wrote:
>>>>
>>>>
>>>> OK just do
>>>>
>>>> cat ~/.profile
>>>>
>>>> and send the content please
>>>>
>>>> also do
>>>>
>>>>  echo $PATH
>>>>
>>>> and send the output as well.
>>>>
>>>> HTH
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>> On Mon, 29 Mar 2021 at 22:06, GUINKO Ferdinand <
>>>> tonguimferdin...@guinko.net> wrote:
>>>>
>>>>> Hi Talebzadeh,
>>>>>
>>>>> thank you for your answer. I am sorry, but I don't understand what I
>>>>> need to do. It seems that I have already put *SPARK_HOME/sbin* in my path,
>>>>> as I have this line among the 3 lines I sent in my initial email:
>>>>>
>>>>> echo "export PATH=$PATH:$SPARK_HOME/bin:$SPARK_HOME/sbin" >> ~/.profile
>>>>>
>>>>> Is that right? Also, should I remove the "echo", as you don't seem to
>>>>> have it in your commands?
>>>>>
>>>>> I am following this tutorial:
>>>>> https://phoenixnap.com/kb/install-spark-on-ubuntu but installing
>>>>> Spark 3.1.1 with Hadoop 2.7.
>>>>>
>>>>> Thank you for any help.
>>>>>
>>>>> Ferdinand
>>>>>
>>>>> On Monday, March 29, 2021 18:57 GMT, Mich Talebzadeh <
>>>>> mich.talebza...@gmail.com> wrote:
>>>>>
>>>>>
>>>>> In my case
>>>>>
>>>>> export SPARK_HOME=/d4T/hduser/spark-3.1.1-bin-hadoop3.2
>>>>> export PATH=$SPARK_HOME/bin:*$SPARK_HOME/sbin*:$PATH
>>>>>
>>>>>  which start-master.sh
>>>>> /d4T/hduser/spark-3.1.1-bin-hadoop3.2/sbin/start-master.sh
>>>>>
>>>>> You need to add *$SPARK_HOME/sbin* to your path as well
>>>>>
>>>>> HTH
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>> On Mon, 29 Mar 2021 at 19:19, GUINKO Ferdinand <
>>>>> tonguimferdin...@guinko.net> wrote:
>>>>>
>>>>>> Hi,
>>>>>>
>>>>>> I have installed Docker, Spark 3.1.1 and Hadoop 2.7 on Ubuntu 18.04.
>>>>>>
>>>>>> After executing the following 3 lines
>>>>>>
>>>>>> echo "export SPARK_HOME=/opt/spark" >> ~/.profile
>>>>>> echo "export PATH=$PATH:$SPARK_HOME/bin:$SPARK_HOME/sbin" >>
>>>>>> ~/.profile
>>>>>> echo "export PYSPARK_PYTHON=/usr/bin/python3" >> ~/.profile
>>>>>>
>>>>>> I tried to start the master server by typing
>>>>>>
>>>>>> start-master.sh
>>>>>>
>>>>>> and got the following error message
>>>>>>
>>>>>> bash: start-master.sh: command not found
>>>>>>
>>>>>> Thank you for any help.
>>>>>> Ferdinand
>>>>>
>>>>>
>>>>>
>>>>
>>>>
>>>>
>>>
>>>
>>>
>>
>>
>>
>
>
>
