You can run Spark in local mode, which doesn't require any standalone master
or worker.
Are you sure you're not using local mode? Are you sure the daemons aren't
running?
What Spark master URL are you passing?
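One quick way to answer that from inside the shell (a sketch, assuming the stock pyspark REPL): inspect sc.master. When no --master option is passed, pyspark defaults to local[*], which runs everything in a single JVM and needs no daemons at all:

```shell
$ pyspark
>>> sc.master
'local[*]'
```

If that prints 'local[*]' rather than a spark://host:7077 URL, the jobs never touch the standalone master or worker.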
On Wed, Mar 9, 2022 at 7:35 PM wrote:
> What I tried to say is, I didn't start spark
What I tried to say is: I didn't start the Spark master/worker at all for a
standalone deployment.
But I can still log into pyspark and run the job. I don't know why.
$ ps -efw | grep spark
$ netstat -ntlp

Neither command's output shows anything Spark-related.
And I manage this machine myself.
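Those two checks can also be scripted. A minimal sketch in plain Python (the port_open helper below is my own illustration, not part of Spark) that probes the standalone master's default port 7077 the same way netstat -ntlp would:

```python
import socket

def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if something is listening on host:port."""
    try:
        # create_connection raises OSError (e.g. ConnectionRefusedError)
        # when nothing is listening on the target port.
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# With no standalone master running, the default master port should be
# closed, so port_open("127.0.0.1", 7077) is expected to return False.
```

If this returns False for 7077 while pyspark still runs jobs, the shell is almost certainly using local mode rather than a standalone master.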
To be specific:
1. Check the log files on both the master and the worker and see if there
are any errors.
2. If you are not running your browser on the same machine as the
Spark cluster, use the host's external IP instead of the
localhost IP when launching the worker.
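For point 2, the launch commands would look roughly like this (a sketch; 192.168.1.10 is a hypothetical external IP, substitute your host's real address):

```shell
# Hypothetical external IP; replace with your host's actual address.
./sbin/start-master.sh --host 192.168.1.10
./sbin/start-worker.sh spark://192.168.1.10:7077
```

Binding to the external IP rather than 127.0.0.1 lets the web UI and the worker registration be reached from other machines.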
Hope this helps...
-- ND
On 3/9/22
Did it start successfully? What do you mean ports were not opened?
On Wed, Mar 9, 2022 at 3:02 AM wrote:
> Hello
>
> I have Spark 3.2.0 deployed on localhost in standalone mode.
> I didn't even run the master and worker start commands:
>
> start-master.sh
> start-worker.sh
Hello
I have Spark 3.2.0 deployed on localhost in standalone mode.
I didn't even run the master and worker start commands:
start-master.sh
start-worker.sh spark://127.0.0.1:7077
And the ports (such as 7077) were not open there.
But I can still log into pyspark and run jobs.