[ https://issues.apache.org/jira/browse/SPARK-26011?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Sean Owen resolved SPARK-26011.
-------------------------------
       Resolution: Fixed
    Fix Version/s: 2.4.1
                   3.0.0
                   2.3.3

Issue resolved by pull request 23009
[https://github.com/apache/spark/pull/23009]

> pyspark app with "spark.jars.packages" config does not work
> -----------------------------------------------------------
>
>                 Key: SPARK-26011
>                 URL: https://issues.apache.org/jira/browse/SPARK-26011
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Submit
>    Affects Versions: 2.3.2, 2.4.0
>            Reporter: shanyu zhao
>            Assignee: shanyu zhao
>            Priority: Major
>             Fix For: 2.3.3, 3.0.0, 2.4.1
>
> The command "pyspark --packages" works as expected, but when a Livy pyspark
> job is submitted with the "spark.jars.packages" config, the downloaded
> packages are not added to Python's sys.path, so the packages are not
> available for use.
> For example, this command works:
> pyspark --packages Azure:mmlspark:0.14
> However, using a Jupyter notebook with the sparkmagic kernel to open a
> pyspark session fails:
> %%configure -f {"conf": {"spark.jars.packages": "Azure:mmlspark:0.14"}}
> import mmlspark
> The root cause is that SparkSubmit decides whether an app is a pyspark app
> from the suffix of the primary resource, but Livy passes "spark-internal" as
> the primary resource when calling spark-submit, so args.isPython is set to
> false in SparkSubmit.scala.
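For illustration, here is a minimal Scala sketch of the suffix-based detection
described above. This is not Spark's actual source; PYSPARK_SHELL,
SPARK_INTERNAL, and isPython are simplified stand-ins for the logic in
SparkSubmit.scala.

object PySparkDetectionSketch {
  // Simplified stand-ins; Spark's real constants and argument handling differ.
  val PYSPARK_SHELL = "pyspark-shell"
  val SPARK_INTERNAL = "spark-internal"

  // Sketch of the check: an app is treated as a pyspark app only when the
  // primary resource is a .py file or the pyspark shell.
  def isPython(primaryResource: String): Boolean =
    primaryResource.endsWith(".py") || primaryResource == PYSPARK_SHELL

  def main(args: Array[String]): Unit = {
    println(isPython("my_app.py"))    // true: normal pyspark submission
    println(isPython(PYSPARK_SHELL))  // true: interactive pyspark shell
    // Livy submits "spark-internal", so the check returns false and the jars
    // resolved from spark.jars.packages never reach Python's sys.path.
    println(isPython(SPARK_INTERNAL)) // false
  }
}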