There is an Executors tab on Spark Connect. Its contents are generally
similar to the Workers section of the Spark Master UI.

You might need to specify the --master option in your Spark Connect command
if you haven't done so yet.
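
For example, a minimal sketch of starting the Spark Connect server against
a standalone master (the master host, port, and package version below are
placeholders, adjust them to your deployment):

    # <master-host>:7077 and the spark-connect version are assumptions
    ./sbin/start-connect-server.sh \
      --master spark://<master-host>:7077 \
      --packages org.apache.spark:spark-connect_2.12:3.5.1

Without --master, the server typically falls back to a local master, so
everything runs in a single JVM on that one machine, which would match the
behaviour you are seeing.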

On Tue, 6 Aug, 2024, 14:19 Ilango, <elango...@gmail.com> wrote:

>
> Hi all,
>
> I am evaluating the use of Spark Connect with my Spark standalone
> cluster, which has a master node and 3 worker nodes. I have successfully
> created a Spark Connect connection. However, when I submit Spark SQL
> queries, the jobs are executed only on the master node, and I do not
> observe any executors running on the worker nodes, despite requesting 4
> executors.
>
> I would appreciate clarification on whether a Spark standalone cluster is
> supported for use with Spark Connect.
>
> If so, how can I leverage the existing standalone cluster's worker
> nodes?
>
> Thanks,
> Elango
>
