[ https://issues.apache.org/jira/browse/SPARK-6047?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Josh Rosen resolved SPARK-6047.
-------------------------------
    Resolution: Duplicate

> pyspark - class loading on driver failing with --jars and --packages
> --------------------------------------------------------------------
>
>                 Key: SPARK-6047
>                 URL: https://issues.apache.org/jira/browse/SPARK-6047
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark, Spark Submit
>    Affects Versions: 1.3.0
>            Reporter: Burak Yavuz
>
> Because py4j uses the system ClassLoader instead of the thread's
> contextClassLoader, jars added dynamically through Spark Submit cannot be
> loaded in the driver. This causes `Py4JError: Trying to call a package`
> errors.
> Since `--packages` are usually downloaded from a remote repository before
> runtime, adding them explicitly to `--driver-class-path` is not an option,
> as it is with `--jars`. One solution is to move the fetching of `--packages`
> into the SparkSubmitDriverBootstrapper and add them to the driver class
> path there. A more complete solution can be achieved through [SPARK-4924].

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
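The root cause described above — a class loader installed as the thread's context class loader is invisible to code that resolves classes through the system class loader — can be illustrated with a small, self-contained sketch. This is not Spark's or py4j's actual code; the class name `ClassLoaderDemo` and the empty `URLClassLoader` are stand-ins for the loader Spark Submit uses when it adds `--jars`/`--packages` at runtime:

```java
import java.net.URL;
import java.net.URLClassLoader;

public class ClassLoaderDemo {

    // Returns true when the context class loader and the system class
    // loader have diverged -- the situation that breaks py4j, because
    // py4j resolves classes via the system class loader only.
    static boolean loadersDiverge() {
        // Stand-in for the loader that holds dynamically added jars.
        ClassLoader extra = new URLClassLoader(new URL[0],
                ClassLoaderDemo.class.getClassLoader());

        // Spark can install this loader on the current thread...
        Thread.currentThread().setContextClassLoader(extra);

        // ...but the system class loader is unaffected, so anything
        // only reachable through "extra" stays invisible to py4j.
        return Thread.currentThread().getContextClassLoader()
                != ClassLoader.getSystemClassLoader();
    }

    public static void main(String[] args) {
        System.out.println(loadersDiverge()); // prints "true"
    }
}
```

Classes reachable only through `extra` would load fine via the context class loader, yet `Class.forName` against the system loader would fail, which surfaces in PySpark as `Py4JError: Trying to call a package`.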
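As the report notes, the manual workaround only exists for `--jars`, since those paths are known up front. A hypothetical invocation (the jar path and application file are placeholders) would place the same jar on the driver class path explicitly so the system class loader py4j uses can see it:

```shell
# Workaround for --jars only: --packages are resolved at runtime,
# so there is no path to hand to --driver-class-path up front.
spark-submit \
  --jars /path/to/extra.jar \
  --driver-class-path /path/to/extra.jar \
  my_app.py
```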