Hi
I was wondering why the SPARK_WORKER_OPTS that I set in conf/spark-env.sh
were not passed to the executors, and I noticed the following line in
ExecutorRunner.scala (Spark 0.8.0):
116: val workerLocalOpts =
  Option(getenv("SPARK_JAVA_OPTS")).map(Utils.splitCommandString).getOrElse(Nil)
Is SPARK_JAVA_OPTS supposed to be SPARK_WORKER_OPTS on this line? The next
line already adds the options from SPARK_JAVA_OPTS:
117: val userOpts =
  getAppEnv("SPARK_JAVA_OPTS").map(Utils.splitCommandString).getOrElse(Nil)
The options from both workerLocalOpts and userOpts are then aggregated into
the executor command line on this line:
126: Seq("-cp", classPath) ++ libraryOpts ++ workerLocalOpts ++ userOpts ++ memoryOpts
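For context, the kind of entry I have in conf/spark-env.sh looks roughly like
the excerpt below; the -D property values are just placeholders, not my actual
settings:

```shell
# conf/spark-env.sh (excerpt; placeholder values for illustration only)
SPARK_WORKER_OPTS="-Dspark.worker.timeout=120"
SPARK_JAVA_OPTS="-Dspark.speculation=true"
```

My expectation was that options set via SPARK_WORKER_OPTS would reach the
executors, which is why line 116 caught my attention.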
Best regards,
Markus Losoi ([email protected])