Hey there,

How can I get the same SparkContext in two different Python processes?

Let's say I create a context in Process A, and then I want a Python subprocess, Process B, to get the SparkContext created by Process A. How can I achieve that?

I've tried pyspark.sql.SparkSession.builder.appName("spark").getOrCreate() in
Process B, but it creates a brand-new SparkContext instead of attaching to the
existing one.
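For reference, here is a minimal sketch of what I'm doing; the file names and
the subprocess launch are just placeholders for my actual setup:

# process_a.py -- creates the context, then spawns Process B
import subprocess
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("spark").getOrCreate()
# launch Process B while this context is still alive
subprocess.run(["python", "process_b.py"])

# process_b.py -- runs as a separate Python process
from pyspark.sql import SparkSession

# I expected this to attach to the context created in Process A,
# but it starts a brand-new SparkContext instead.
spark = SparkSession.builder.appName("spark").getOrCreate()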
