From: Andrei [mailto:faithlessfri...@gmail.com]
Sent: Thursday, April 14, 2016 5:45 AM
To: Sun, Rui
Cc: user
Subject: Re: How does spark-submit handle Python scripts (and how to repeat it)?

Julia can pick up the env var, and set the system properties or directly fill the
configurations into a SparkConf, and then create a SparkContext. You can refer to
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/deploy/PythonRunner.scala#L47
and
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/deploy/RRunner.scala#L65
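The env-var/system-property hand-off described above can be sketched as follows. This is a minimal illustration under stated assumptions, not Spark's actual code: `resolve` is a made-up helper, and the env-var name `MASTER` is only an example.

```scala
// Minimal sketch of the hand-off the reply describes: a runner resolves each
// Spark setting from a JVM system property first (what spark-submit sets),
// then an environment variable, then a default, before placing it into a
// SparkConf. `resolve` and the env-var name `MASTER` are illustrative.
object ConfHandoff {
  def resolve(prop: String, envVar: String, default: String): String =
    sys.props.get(prop)              // e.g. -Dspark.master=... set by the launcher JVM
      .orElse(sys.env.get(envVar))   // e.g. MASTER exported in the environment
      .getOrElse(default)            // local fallback

  def main(args: Array[String]): Unit = {
    // Simulate what spark-submit would normally have done for us:
    sys.props("spark.master") = "local[2]"
    println(resolve("spark.master", "MASTER", "local[*]"))  // prints local[2]
  }
}
```

The same resolved values would then be fed into a SparkConf before constructing the SparkContext, which is essentially what PythonRunner and RRunner arrange for their respective interpreters.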
From: Andrei [mailto:faithlessfri...@gmail.com]
Subject: Re: How does spark-submit handle Python scripts (and how to repeat it)?
> One part is passing the command line options, like “--master”, from the JVM
> launched by spark-submit to the JVM where SparkContext resides.

Since I have full control over both - the JVM and Julia parts - I can pass
whatever options to both. But what exactly should be passed? Currently pipeline l
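One concrete way to move such options across the process boundary is to export them into the child interpreter's environment, which is roughly what PythonRunner does for its py4j gateway port. The variable name `JULIA_SPARK_MASTER` below is invented purely for illustration.

```scala
// Sketch: pass options such as "--master" from the JVM launched by
// spark-submit to a child interpreter by exporting them as environment
// variables of the child process. JULIA_SPARK_MASTER is a hypothetical
// name; PythonRunner does the analogous thing for its gateway port.
object ChildLaunch {
  def launcherFor(script: String, master: String,
                  conf: Map[String, String]): java.lang.ProcessBuilder = {
    val pb = new java.lang.ProcessBuilder("julia", script)
    val env = pb.environment()
    env.put("JULIA_SPARK_MASTER", master)          // hypothetical variable
    conf.foreach { case (k, v) => env.put(k, v) }  // any further settings
    pb.inheritIO()                                 // share stdout/stderr
    pb
  }
}
```

The Julia side would then read the exported variables (e.g. `ENV["JULIA_SPARK_MASTER"]`) on startup and fill them into its SparkConf equivalent.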
There is much deployment preparation work handling the different deployment modes
for pyspark and SparkR in SparkSubmit. It is difficult to summarize briefly; you
had better refer to the source code. Supporting running Julia scripts in
SparkSubmit is more than implementing a ‘JuliaRunner’. One part is passing the
command line options, like “--master”, from the JVM launched by spark-submit to
the JVM where SparkContext resides.
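For what it's worth, a ‘JuliaRunner’ skeleton in the spirit of PythonRunner/RRunner might look like the sketch below. This is strictly a hypothetical illustration, not Spark code: it takes the user script as the first argument, forwards the rest to the `julia` process, and propagates the child's exit code.

```scala
// Hypothetical skeleton of a 'JuliaRunner', loosely modeled on Spark's
// PythonRunner: the first argument is the user script, the remaining
// arguments are forwarded to the julia process, and the runner exits
// with the child's status code. Illustration only, not part of Spark.
object JuliaRunner {
  def buildCommand(args: Array[String]): Seq[String] = {
    require(args.nonEmpty, "usage: JuliaRunner <script.jl> [app args...]")
    Seq("julia", args.head) ++ args.tail
  }

  def main(args: Array[String]): Unit = {
    val pb = new java.lang.ProcessBuilder(buildCommand(args): _*)
    pb.inheritIO()                  // let the script share our stdio
    sys.exit(pb.start().waitFor())  // propagate the child's exit code
  }
}
```

The harder parts, as noted above, are the ones this sketch omits: handling the different deployment modes and getting the spark-submit options into the JVM that actually hosts the SparkContext.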