To be specific:

1. Check the log files on both the master and the worker to see if there
   are any errors.
2. If your browser is not running on the same machine as the Spark
   cluster, use the host's external IP instead of the localhost IP when
   launching the worker.
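For example, a minimal sketch of step 2 (the address 192.0.2.10 is a
placeholder; substitute your host's actual external IP):

```shell
# Hypothetical external IP -- replace with your host's real address.
MASTER_HOST=192.0.2.10

# Bind the master to the external address instead of localhost.
"$SPARK_HOME"/sbin/start-master.sh --host "$MASTER_HOST"

# Point the worker at the externally reachable master URL.
"$SPARK_HOME"/sbin/start-worker.sh "spark://$MASTER_HOST:7077"
```

With the master bound this way, the web UI and port 7077 are reachable
from other machines, not just via 127.0.0.1.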

Hope this helps...
-- ND
On 3/9/22 9:23 AM, Sean Owen wrote:
Did it start successfully? What do you mean ports were not opened?

On Wed, Mar 9, 2022 at 3:02 AM <capitnfrak...@free.fr> wrote:

    Hello

    I have Spark 3.2.0 deployed on localhost in standalone mode.
    I didn't even run the start-master and start-worker commands:

         start-master.sh
         start-worker.sh spark://127.0.0.1:7077


    And the ports (such as 7077) were not opened there.
    But I can still log into pyspark and run jobs.

    Why does this happen?

    Thanks.

