shanyu zhao created SPARK-26011:
-----------------------------------

             Summary: pyspark app with "spark.jars.packages" config does not work
                 Key: SPARK-26011
                 URL: https://issues.apache.org/jira/browse/SPARK-26011
             Project: Spark
          Issue Type: Bug
          Components: Spark Submit
    Affects Versions: 2.4.0, 2.3.2
            Reporter: shanyu zhao


Command "pyspark --packages" works as expected, but if submitting a livy 
pyspark job with "spark.jars.packages" config, the downloaded packages are not 
added to python's sys.path therefore the package is not available to use.

For example, this command works:

pyspark --packages Azure:mmlspark:0.14

However, opening a pyspark session from a Jupyter notebook with the sparkmagic kernel fails:

%%configure -f {"conf": {"spark.jars.packages": "Azure:mmlspark:0.14"}}
import mmlspark

The root cause is that SparkSubmit decides whether an application is a pyspark app from the suffix of the primary resource, but Livy passes "spark-internal" as the primary resource when calling spark-submit, so args.isPython is false in SparkSubmit.scala and the resolved packages are never added to the Python path.
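
A simplified sketch of the relevant logic (an illustration assuming Spark 2.x's SparkSubmit; the name isPythonApp and the surrounding plumbing are abbreviated here, not the exact source):

// SparkSubmit only treats a job as a pyspark app when the primary
// resource looks like Python code or the pyspark shell.
def isPythonApp(primaryResource: String): Boolean =
  primaryResource != null &&
    (primaryResource.endsWith(".py") || primaryResource == "pyspark-shell")

// The jars resolved from spark.jars.packages are merged into pyFiles
// (and from there into PYTHONPATH / sys.path) only for Python apps.
// Livy submits with primaryResource = "spark-internal", so this branch
// is skipped and the downloaded packages never reach sys.path.
if (isPythonApp(args.primaryResource)) {
  args.pyFiles = mergeFileLists(args.pyFiles, resolvedMavenCoordinates)
}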


