Hi Andrew,
Thanks a lot for your response. I am aware of the '--master' flag of the
spark-submit command. However, I would like to create the SparkContext
inside my code.
Maybe I should elaborate a little further: I would like to reuse, e.g.,
the result of a Spark computation inside my
Hi Andreas,
I believe the distinction is not between standalone and YARN mode, but
between client and cluster mode.
In client mode, your spark-submit JVM runs your driver code. In cluster
mode, one of the workers (or NodeManagers, if you're using YARN) in the
cluster runs your driver code. In the
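To illustrate the distinction, the deploy mode is selected with the `--deploy-mode` flag of spark-submit (the master URL and jar name below are placeholders):

```shell
# Client mode (the default): the driver runs inside the spark-submit JVM itself.
spark-submit --master spark://SPARK_MASTER:7077 --deploy-mode client my-app.jar

# Cluster mode: the driver is launched on one of the workers
# (or on a NodeManager when running on YARN).
spark-submit --master spark://SPARK_MASTER:7077 --deploy-mode cluster my-app.jar
```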
Hi all,
when running the Spark cluster in standalone mode, I am able to create the
SparkContext from Java via the following code snippet:
SparkConf conf = new SparkConf()
    .setAppName("MySparkApp")
    .setMaster("spark://SPARK_MASTER:7077")
    .setJars(jars);
JavaSparkContext sc = new JavaSparkContext(conf);