[ https://issues.apache.org/jira/browse/SPARK-26011?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Apache Spark reassigned SPARK-26011:
------------------------------------

    Assignee:     (was: Apache Spark)

> pyspark app with "spark.jars.packages" config does not work
> -----------------------------------------------------------
>
>                 Key: SPARK-26011
>                 URL: https://issues.apache.org/jira/browse/SPARK-26011
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Submit
>    Affects Versions: 2.3.2, 2.4.0
>            Reporter: shanyu zhao
>            Priority: Major
>
> The command "pyspark --packages" works as expected, but when a Livy
> pyspark job is submitted with the "spark.jars.packages" config, the
> downloaded packages are not added to Python's sys.path, so the package
> cannot be used.
>
> For example, this command works:
>
> pyspark --packages Azure:mmlspark:0.14
>
> However, opening a pyspark session from a Jupyter notebook with the
> sparkmagic kernel fails:
>
> %%configure -f {"conf": {"spark.jars.packages": "Azure:mmlspark:0.14"}}
> import mmlspark
>
> The root cause is that SparkSubmit decides whether an app is a pyspark
> app from the suffix of the primary resource, but Livy passes
> "spark-internal" as the primary resource when calling spark-submit, so
> args.isPython is set to false in SparkSubmit.scala.

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
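The failure mode described in the report can be sketched as follows. This is a minimal illustration in Python, not the actual SparkSubmit.scala code; the function name and the "pyspark-shell" placeholder constant are assumptions used only to demonstrate the suffix-based check:

```python
# Sketch of the suffix-based detection described in the report.
# NOTE: illustrative only; names do not match SparkSubmit.scala internals.
PYSPARK_SHELL = "pyspark-shell"  # hypothetical placeholder constant

def is_python_app(primary_resource: str) -> bool:
    """Treat the job as a pyspark app only when the primary resource
    ends in .py (or is the pyspark shell placeholder)."""
    return primary_resource.endswith(".py") or primary_resource == PYSPARK_SHELL

# "pyspark --packages ..." submits a .py resource, so the check passes:
print(is_python_app("job.py"))          # True
# Livy submits the placeholder "spark-internal", so the check fails and
# the downloaded packages never reach Python's sys.path:
print(is_python_app("spark-internal"))  # False
```

Under this reading, any fix would need SparkSubmit to recognize a Python app by something other than the primary resource's suffix when a launcher like Livy passes "spark-internal".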