Hi all,

In a PySpark application, is the Python process itself the driver, or does Spark start a separate driver process? If the Python process is the driver, then how does setting "spark.submit.deployMode" to "cluster" in the Spark conf come into play?

from pyspark import SparkConf, SparkContext

conf = (
    SparkConf()
    .setMaster("yarn")
    .set("spark.submit.deployMode", "cluster")
)
sc = SparkContext(conf=conf)

Is the SparkContext created on the YARN application master, or on the machine where this Python process is running?
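
To make the question concrete, here is a minimal sketch of what I could run with the conf shown above; the hostname check is only for illustration, to see where the driver-side code executes versus where the tasks run:

import socket

from pyspark import SparkConf, SparkContext

conf = (
    SparkConf()
    .setMaster("yarn")
    .set("spark.submit.deployMode", "cluster")
)
sc = SparkContext(conf=conf)

# Host where this driver-side code is running
print("driver host:", socket.gethostname())

# Hosts where the executor tasks actually run
executor_hosts = (
    sc.parallelize(range(4), 4)
      .map(lambda _: socket.gethostname())
      .collect()
)
print("executor hosts:", executor_hosts)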

Thanks
Nikhil
