Saisai Shao created SPARK-24377:
-----------------------------------

             Summary: Make --py-files work in non pyspark application
                 Key: SPARK-24377
                 URL: https://issues.apache.org/jira/browse/SPARK-24377
             Project: Spark
          Issue Type: Bug
          Components: Spark Submit
    Affects Versions: 2.3.0
            Reporter: Saisai Shao


Some Spark applications, although they are Java programs, require not only jar 
dependencies but also Python dependencies. One example is Livy's remote 
SparkContext application: it is essentially an embedded REPL for 
Scala/Python/R, so it needs to load not only jar dependencies but also Python 
and R dependencies.

Currently --py-files only works for a pyspark application, so it does not work 
in the case above. This issue proposes to remove that restriction, as sketched 
in the example below.
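
As an illustration of the intended behavior (the class name and file names 
below are placeholders, not part of this proposal), a submission like the 
following should distribute the listed Python files and put them on 
PYTHONPATH even though the main class is a Java/Scala program:

  ./bin/spark-submit \
    --master yarn \
    --deploy-mode cluster \
    --class com.example.MyJavaApp \
    --py-files deps.zip,util.py \
    my-java-app.jar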

Also we tested that "spark.submit.pyFiles" only supports quite limited scenario 
(client mode with local deps), so here also expand the usage of 
"spark.submit.pyFiles" to be alternative of --py-files.


