Re: Ubuntu 18.04: Docker: start-master.sh: command not found

2021-03-30 Thread Mich Talebzadeh
OK, on your path, where is SPARK? Try to edit your .profile with a simple editor of your choice (vi, vim etc. if you are on Linux): export PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/bin:/sbin:$PATH export SPARK_HOME=/opt/spark ## Add it to the PATH export PATH=$SPARK_HOM

Re: convert java dataframe to pyspark dataframe

2021-03-30 Thread Aditya Singh
Hi Sean, Thanks a lot for replying, and apologies for the late reply (I somehow missed this mail before), but I am under the impression that passing the py4j.java_gateway.JavaGateway object lets PySpark access the Spark context created on the Java side. My use case is exactly what you mentioned

Re: Ubuntu 18.04: Docker: start-master.sh: command not found

2021-03-30 Thread GUINKO Ferdinand
Good day Talebzadeh, here are the outputs: cat ~/.profile root@33z261w1a18:/home# cat ~/.profile # ~/.profile: executed by Bourne-compatible login shells. if [ "$BASH" ]; then   if [ -f ~/.bashrc ]; then     . ~/.bashrc   fi fi mesg n 2> /dev/null || true export SPARK_HOME=/opt/spark export P

Re: Ubuntu 18.04: Docker: start-master.sh: command not found

2021-03-30 Thread GUINKO Ferdinand
The Spark installation I am trying to do is on an Ubuntu image on Docker, so there is no text editor installed on the Ubuntu image I have installed. I installed gedit but had some graphics issues and couldn't launch it. When I tried to launch gedit, I got the following error message: root@33

Re: Ubuntu 18.04: Docker: start-master.sh: command not found

2021-03-30 Thread GUINKO Ferdinand
Where is SPARK? I also tried this: root@33z261w1a18:/opt/spark# whereis spark-shell spark-shell: /opt/spark/bin/spark-shell2.cmd /opt/spark/bin/spark-shell.cmd /opt/spark/bin/spark-shell --  "What the world needs most is men, not men who can be bought and who sell themselves,

Re: Ubuntu 18.04: Docker: start-master.sh: command not found

2021-03-30 Thread Mich Talebzadeh
OK, simple: export SPARK_HOME=/opt/spark export PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/bin:/sbin:$SPARK_HOME/bin:$SPARK_HOME/sbin then: echo $PATH which start-master.sh and send the output. HTH. View my LinkedIn profile

Re: Ubuntu 18.04: Docker: start-master.sh: command not found

2021-03-30 Thread GUINKO Ferdinand
This is what I am having now: root@33z261w1a18:/opt/spark# export SPARK_HOME=/opt/spark root@33z261w1a18:/opt/spark# export PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/bin:/sbin:$SPARK_HOME/bin:$SPARK_HOME/sbin root@33z261w1a18:/opt/spark# echo $PATH /usr/local/sbin:/usr/lo
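The sequence in this exchange can be sketched as one self-contained snippet, assuming Spark is unpacked at /opt/spark as in the thread. Note that plain export only lasts for the current shell session; appending the same lines to ~/.profile, as suggested earlier, makes them persist across logins.

```shell
# Sketch of the fix discussed above, assuming Spark lives at /opt/spark
export SPARK_HOME=/opt/spark
export PATH=$PATH:$SPARK_HOME/bin:$SPARK_HOME/sbin

# Verify the result: echo the PATH, then ask the shell where it resolves
# the script ('command -v' is the portable equivalent of 'which')
echo "$PATH"
command -v start-master.sh || echo "start-master.sh not found on PATH"
```

If `command -v` prints nothing, the script is still not reachable, which usually means $SPARK_HOME points at the wrong directory or the sbin directory is missing.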

Re: Ubuntu 18.04: Docker: start-master.sh: command not found

2021-03-30 Thread Mich Talebzadeh
Those two export lines mean: set SPARK_HOME and PATH as environment variables in the session you have in Ubuntu. Check this website for more info: bash - How do I add environment variables? - Ask Ubuntu

Re: convert java dataframe to pyspark dataframe

2021-03-30 Thread Khalid Mammadov
Hi Aditya, I think your original question was how to convert a DataFrame from a Spark session created on Java/Scala to a DataFrame on a Spark session created from Python (PySpark). So, as I have answered on your SO question: there is a missing call to entry_point before calling getDf() in

Re: Ubuntu 18.04: Docker: start-master.sh: command not found

2021-03-30 Thread GUINKO Ferdinand
Thank you for your assistance! --  "What the world needs most is men, not men who can be bought and who sell themselves, but men who are profoundly loyal and upright, men who do not fear to call sin by its name, men whose conscience is as faithful