Thanks Prabodh. Passing the --master option in the spark connect command worked
like a charm. I am able to submit Spark Connect jobs to my existing standalone
cluster.
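
For anyone else who hits this: a quick way to confirm the worker nodes are now
doing the work is to run a small action from the Connect session and check the
Executors tab. This is just a rough sketch; the host below is a placeholder.

from pyspark.sql import SparkSession

# Point the session at the Spark Connect server's gRPC endpoint (default port 15002).
spark = SparkSession.builder \
    .remote("sc://<spark-stand-alone-master-node-ip>:15002") \
    .getOrCreate()

# A small distributed job; its tasks should now run on executors on the worker
# nodes instead of everything staying on the master node.
print(spark.range(10_000_000).selectExpr("sum(id)").collect())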

Thanks for saving my day once again :)

Thanks,
Elango


On Tue, 6 Aug 2024 at 6:08 PM, Prabodh Agarwal <prabodh1...@gmail.com>
wrote:

> Do you get some error on passing the master option to your spark connect
> command?
>
> On Tue, 6 Aug, 2024, 15:36 Ilango, <elango...@gmail.com> wrote:
>
>>
>>
>>
>> Thanks Prabodh. I'm having an issue with the Spark Connect connection: the
>> `spark.master` value shows as `local[*]` in the Spark Connect UI, whereas the
>> actual master node of our Spark standalone cluster is different. I am passing
>> that master node IP in the Spark Connect connection, but it is still not set
>> correctly. Could you please help me update this configuration to reflect the
>> correct master node value?
>>
>>
>>
>> This is my Spark Connect connection:
>>
>>
>>
>> from pyspark.sql import SparkSession
>>
>> # Connect to the Spark Connect server's gRPC endpoint (default port 15002).
>> spark = SparkSession.builder \
>>     .remote("sc://<spark-stand-alone-master-node-ip>:15002") \
>>     .getOrCreate()
>>
>>
>> Thanks,
>> Elango
>>
>>
>> On Tue, 6 Aug 2024 at 5:45 PM, Prabodh Agarwal <prabodh1...@gmail.com>
>> wrote:
>>
>>> There is an Executors tab in the Spark Connect UI. Its contents are generally
>>> similar to the Workers section of the Spark master UI.
>>>
>>> You might need to specify the --master option in your spark connect command
>>> if you haven't done so yet.
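>>>
>>> A rough sketch of what that can look like (the host is a placeholder, and the
>>> Spark/Scala versions are just an example; 7077 is the standalone master's
>>> default port):
>>>
>>> ./sbin/start-connect-server.sh \
>>>   --master spark://<spark-stand-alone-master-node-ip>:7077 \
>>>   --packages org.apache.spark:spark-connect_2.12:3.5.1
>>>
>>> On a standalone cluster, the number and size of executors is usually shaped
>>> with --conf spark.cores.max=... and --conf spark.executor.cores=... on the
>>> same command.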
>>>
>>> On Tue, 6 Aug, 2024, 14:19 Ilango, <elango...@gmail.com> wrote:
>>>
>>>>
>>>> Hi all,
>>>>
>>>> I am evaluating the use of Spark Connect with my Spark standalone
>>>> cluster, which has a master node and 3 worker nodes. I have successfully
>>>> created a Spark Connect connection. However, when submitting Spark SQL
>>>> queries, the jobs are being executed only on the master node, and I do not
>>>> observe any executors running on the worker nodes, despite requesting 4
>>>> executors.
>>>>
>>>>
>>>>
>>>> I would appreciate clarification on whether a Spark standalone cluster is
>>>> supported for use with Spark Connect.
>>>>
>>>> If so, how can I leverage the existing Spark standalone cluster's worker
>>>> nodes?
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>> Thanks,
>>>> Elango
>>>>
>>>
