That isn't a bug; you can't change the classpath once the JVM is executing. `spark.jars.packages` is resolved (via Ivy) while the driver is launching, so setting it afterwards has no effect.
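A minimal sketch of the behavior, assuming a plain local PySpark session (the Maven coordinates below are placeholders for illustration, not from the original thread):

    from pyspark import SparkConf
    from pyspark.sql import SparkSession

    # Dependencies listed in spark.jars.packages are resolved while the
    # driver JVM is launching, so they must be set before getOrCreate():
    conf = SparkConf().set(
        "spark.jars.packages", "org.apache.spark:spark-avro_2.12:3.2.1"
    )
    spark = SparkSession.builder.config(conf=conf).getOrCreate()

    # Calling getOrCreate() again with different packages just returns the
    # already-running session; the new coordinate (hypothetical here) is
    # never added to the classpath.
    spark2 = (
        SparkSession.builder
        .config("spark.jars.packages", "com.example:extra-dep:1.0")
        .getOrCreate()
    )
    assert spark2 is spark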

On Wed, Mar 9, 2022 at 7:11 AM Rafał Wojdyła <ravwojd...@gmail.com> wrote:

> Hi,
> My use case: I have a long-running process (orchestrator) with multiple
> tasks, and some tasks may require extra Spark dependencies. It seems that
> once the Spark context is started, it's not possible to update
> `spark.jars.packages`? I have reported an issue at
> https://issues.apache.org/jira/browse/SPARK-38438, together with a
> workaround ("hard reset of the cluster"). I wonder if anyone has a solution
> for this?
> Cheers - Rafal
>
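For reference, a rough sketch of the "hard reset" idea from the JIRA ticket: since the packages are only resolved when a driver JVM launches, the workaround is to stop the session *and* discard the py4j gateway so the next session starts a fresh JVM. This leans on underscore-prefixed PySpark internals, which are version-dependent; treat it as an assumption-laden illustration, not a supported API.

    import os
    from pyspark import SparkContext
    from pyspark.sql import SparkSession

    def hard_reset(spark, packages):
        # Stop the current session/context.
        spark.stop()
        # Drop the py4j gateway so a new driver JVM is launched next time;
        # SparkContext._gateway / _jvm are internals and may change between
        # PySpark versions.
        if SparkContext._gateway is not None:
            SparkContext._gateway.shutdown()
        SparkContext._gateway = None
        SparkContext._jvm = None
        # These env vars make PySpark attach to an existing gateway; clear
        # them so launch happens from scratch.
        os.environ.pop("PYSPARK_GATEWAY_PORT", None)
        os.environ.pop("PYSPARK_GATEWAY_SECRET", None)
        # The new JVM resolves the updated package list at launch.
        return (
            SparkSession.builder
            .config("spark.jars.packages", ",".join(packages))
            .getOrCreate()
        )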
