Cory Maklin created SPARK-34870:
-----------------------------------

             Summary: Jars downloaded with the --packages argument are not added to the classpath for executors.
                 Key: SPARK-34870
                 URL: https://issues.apache.org/jira/browse/SPARK-34870
             Project: Spark
          Issue Type: Bug
          Components: Spark Submit
    Affects Versions: 3.0.0
         Environment: Spark worker running inside a Kubernetes pod with a Bitnami Spark image, and the driver running inside a Jupyter Spark Kubernetes pod.
            Reporter: Cory Maklin


When Spark is run in local mode, the packages work as expected. However, when Spark is run in client mode, the jars downloaded via `--packages` (or `spark.jars.packages`) are copied to each executor's work directory ($SPARK_HOME/work/<app id>/<executor id>) but are never added to the executor classpath.

It might be worth noting that `spark.jars` does add the jars to the executor classpath, but unlike `spark.jars.packages` it doesn't automatically download the jars' transitive dependencies, as sketched below.
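
For illustration, the `spark.jars` alternative looks roughly like the sketch below (the jar paths are made up for the example, and SPARK_MASTER/APP_NAME are the same placeholders used further down); every transitive dependency has to be listed by hand, which is exactly what `spark.jars.packages` is meant to avoid:

```
from pyspark.sql import SparkSession

# spark.jars alternative (illustrative paths): these jars do end up on the
# executor classpath, but every transitive dependency must be listed manually.
spark = SparkSession.builder \
    .master(SPARK_MASTER) \
    .appName(APP_NAME) \
    .config("spark.jars",
            "/path/to/library.jar,"
            "/path/to/dependency-1.jar,"
            "/path/to/dependency-2.jar") \
    .getOrCreate()
```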

 

```
from pyspark.sql import SparkSession

# Abridged: how the session is created in our setup.
spark = SparkSession.builder \
    .master(SPARK_MASTER) \
    .appName(APP_NAME) \
...
    .config("spark.jars.packages", DEPENDENCY_PACKAGES) \
...
    .getOrCreate()
```
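
If it helps with triage, a rough diagnostic sketch is below (not taken from our production code). It assumes the session above has been created, uses the internal `_jsc` handle to reach the Scala `SparkContext.listJars()`, and assumes /opt/bitnami/spark as the Bitnami image's SPARK_HOME:

```
import glob
import os

# Driver side: print the jars the driver has registered for shipping to
# executors (the artifacts resolved from spark.jars.packages would be
# expected to appear in this list).
print(spark.sparkContext._jsc.sc().listJars())

# Executor side: list the jars Spark copied into the executor work directory
# ($SPARK_HOME/work/<app id>/<executor id>), falling back to the assumed
# Bitnami default if SPARK_HOME isn't set in the worker environment.
def jars_in_work_dir(_):
    spark_home = os.environ.get("SPARK_HOME", "/opt/bitnami/spark")
    return glob.glob(os.path.join(spark_home, "work", "*", "*", "*.jar"))

print(spark.sparkContext.parallelize([0], 1).flatMap(jars_in_work_dir).collect())
```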

 


