I think the answer is yes: Python is needed on the executor nodes. The driver interprets your script, but each executor's JVM launches Python worker processes to run the user code and the PySpark library shipped in pyspark.zip, so a Python interpreter must be installed on every executor node.
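For completeness, here is a minimal sketch of telling Spark which interpreter to use on the cluster. The paths and the script name `my_job.py` are placeholders, not from this thread; adjust them for your environment.

```shell
# Executors spawn Python worker processes, so an interpreter must exist on
# every node. PYSPARK_PYTHON picks which one is used (example path below).
export PYSPARK_PYTHON=/usr/bin/python2.7

# In YARN cluster mode the driver runs in the application master, so the
# variable must also be set there (and on executors) via Spark config.
# my_job.py is a hypothetical PySpark application script.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --conf spark.yarn.appMasterEnv.PYSPARK_PYTHON=/usr/bin/python2.7 \
  --conf spark.executorEnv.PYSPARK_PYTHON=/usr/bin/python2.7 \
  my_job.py
```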
On Tue, Sep 29, 2015 at 2:08 PM, Ranjana Rajendran <ranjana.rajend...@gmail.com> wrote:

> Hi,
>
> Does a python spark program (which makes use of pyspark) submitted in
> cluster mode need python on the executor nodes? Isn't the python program
> interpreted on the client node from where the job is submitted, and then
> the executors run in the JVM of each of the executor nodes?
>
> Thank you,
> Ranjana