Is there any plan to support PySpark running in cluster mode on a
standalone deployment?

There is a well-known survey reporting that more than 50% of users run
Spark in the standalone configuration.
Making PySpark work in cluster mode on standalone would help a lot with
high availability for Python Spark applications.

Currently only the YARN deployment supports it, and bringing in a full
YARN installation just for this feature is not fun at all...

Does anyone have a time estimate for this?



-- 
Regards,
Ofer Eliassaf
